US20170359691A1 - Method and system for location determination and navigation using textual information - Google Patents
- Publication number
- US20170359691A1 (application Ser. No. 15/632,862)
- Authority
- United States
- Prior art keywords
- textual information
- wireless device
- images
- text
- sources
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
- H—ELECTRICITY
  - H04W—WIRELESS COMMUNICATION NETWORKS
    - H04W4/00—Services specially adapted for wireless communication networks; Facilities therefor
    - H04W4/02—Services making use of location information
    - H04W4/023—Services making use of location information using mutual or relative location information between multiple location based services [LBS] targets or of distance thresholds
    - H04W4/024—Guidance services
    - H04W4/025—Services making use of location information using location based information parameters
    - H04W4/026—Services making use of location information using location based information parameters using orientation information, e.g. compass
    - H04W4/043
    - H04W64/00—Locating users or terminals or network equipment for network management purposes, e.g. mobility management
- G—PHYSICS
  - G06F—ELECTRIC DIGITAL DATA PROCESSING
    - G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
    - G06F16/20—Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    - G06F16/22—Indexing; Data structures therefor; Storage structures
    - G06F17/30312
  - G06K9/18
  - G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    - G06T7/00—Image analysis
    - G06T7/70—Determining position or orientation of objects or cameras
  - G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    - G06V30/00—Character recognition; Recognising digital ink; Document-oriented image-based pattern recognition
    - G06V30/10—Character recognition
    - G06V30/22—Character recognition characterised by the type of writing
    - G06V30/224—Character recognition characterised by the type of writing of printed characters having additional code marks or containing code marks
Definitions
- Certain embodiments of the invention relate to wireless device positioning. More specifically, certain embodiments of the invention relate to a method and system for location determination and navigation using textual information.
- Wireless communication devices with global navigation satellite system (GNSS) capability are becoming more prevalent. These devices depend on RF signals received from satellites for calculating position. However, these satellite signals are weak and are attenuated inside buildings such that wireless devices can no longer obtain a lock on the signals and thus can no longer determine location.
- FIG. 1 is a block diagram of an exemplary wireless device with positioning capability, in accordance with an embodiment of the invention.
- FIG. 2A is a block diagram illustrating an exemplary building with textual information, in accordance with an embodiment of the invention.
- FIG. 2B is a diagram illustrating an exemplary wireless device positioning performed inside a structure based on signs at entrances to the structure, in accordance with an embodiment of the invention.
- FIG. 2C is a diagram illustrating an exemplary wireless device positioning performed inside a structure based on a user's path inside the structure, in accordance with an embodiment of the invention.
- FIG. 2D is a diagram illustrating an exemplary wireless device navigation, in accordance with an embodiment of the invention.
- FIG. 3 is a diagram illustrating an exemplary wireless device for positioning, in accordance with an embodiment of the invention.
- FIG. 4 is a block diagram illustrating exemplary steps in determining location without GNSS, in accordance with an embodiment of the invention.
- FIG. 5 is a block diagram illustrating exemplary steps for navigation without GNSS, in accordance with an embodiment of the invention.
- FIG. 6 is a flow diagram illustrating exemplary steps in an enhanced GNSS positioning, in accordance with an embodiment of the invention.
- FIG. 7 is a flow diagram illustrating exemplary steps in image textual information extraction, in accordance with an embodiment of the invention.
- FIG. 8 is a flow diagram illustrating exemplary steps for accurate wireless device positioning, in accordance with an embodiment of the invention.
- FIGS. 9A-9C are diagrams illustrating an exemplary wireless device positioning based on text perspective, in accordance with an embodiment of the invention.
- FIG. 10 is a flow diagram illustrating an exemplary distance determination using perspective, in accordance with an embodiment of the invention.
- FIG. 11 is a flow diagram illustrating an exemplary distance determination using perspective, in accordance with an embodiment of the invention.
- Certain aspects of the invention may be found in a method and system for location determination and navigation using textual information.
- Exemplary aspects of the invention may comprise capturing one or more images of one or more sources of textual information in the vicinity of a wireless communication device.
- Text may be extracted from the one or more sources of textual information, and a position of the wireless device may be determined based on a comparison of the extracted text in the captured one or more images to text in a stored database of textual information.
- An orientation of the text may be sensed in the captured one or more images relative to the wireless device.
- An orientation of the wireless device may be utilized in conjunction with the extracted text for the position determining.
- The orientation and the extracted text may be utilized in conjunction with determined distances from the one or more sources of textual information for the position determining.
- Locations of the sources of textual information and/or the captured one or more images may be stored in the database of textual information.
- An instruction to capture one or more images in a different orientation may be received when the positioning does not meet an accuracy requirement.
- The database of textual information may be downloaded when GNSS signals sufficient for positioning are no longer received by the wireless communication device.
- A distance from one or more of the sources of textual information in the vicinity of the wireless communication device may be determined based on known optical properties of a camera in the wireless communication device.
- The optical properties may comprise focal length and/or focus setting. The determined distance may be used to determine an accurate location based on the captured one or more images.
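The description does not give a formula for the distance determination, but a minimal sketch using the standard pinhole-camera relation (assuming a focal length expressed in pixels and a known real-world letter height from the database; the function name and values are illustrative) could look like:

```python
def distance_from_text(focal_length_px: float,
                       known_letter_height_m: float,
                       measured_letter_height_px: float) -> float:
    """Estimate camera-to-sign distance with the pinhole model:
    distance = focal_length * real_height / image_height."""
    if measured_letter_height_px <= 0:
        raise ValueError("letter height in the image must be positive")
    return focal_length_px * known_letter_height_m / measured_letter_height_px

# A 0.5 m tall sign letter spanning 100 px with a 3000 px focal length
# works out to 15 m of distance.
print(distance_from_text(3000.0, 0.5, 100.0))
```

The same relation can be inverted to predict how large known text should appear at a hypothesized position, which is useful for rejecting false matches.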
- FIG. 1 is a block diagram of an exemplary wireless device with positioning capability, in accordance with an embodiment of the invention.
- Referring to FIG. 1, there is shown a wireless device 101 and a server 105, where the server 105 may comprise a database 107.
- The wireless device 101 may comprise any device (e.g. a smart phone) or vehicle whose user may desire to know the location of such device or vehicle.
- The wireless device 101 may comprise a global navigation satellite system (GNSS) receiver that may be operable to receive medium Earth orbit (MEO) satellite signals and low Earth orbit (LEO) satellite signals.
- MEO satellites comprise GPS satellites, for example, while LEO satellites comprise Iridium communication satellites, for example.
- Medium Earth orbit satellites may be at a height of about 12,000 miles above the surface of the Earth, compared to about 500 miles above the surface for low Earth orbit satellites. Therefore, the signal strength of LEO satellite signals is much stronger than that of MEO satellite signals.
- LEO satellites may be used for telecommunication systems, such as satellite phones, whereas MEO satellite systems may be utilized for location and navigation applications.
- Satellite signals may be attenuated when the wireless device 101 enters a building, such that the wireless device may not be able to utilize satellite signals to determine its location. In this instance, it may be desirable to determine the location of the wireless device 101 utilizing other techniques.
- The wireless device 101 may determine its location via GNSS when outside the building 103.
- The wireless device 101 may utilize textual information to determine its location.
- Textual information may be input to the wireless device 101 via a photo, a series of photos, or video taken by a camera in the wireless device 101 . These may be combined with orientation (elevation, azimuth, and rotation) of the camera captured by a compass, gyroscope, gravity sensor, or other kind of sensor present in the camera. Optical focus and/or distance sensors on the camera may estimate the distance of the wireless device 101 to the textual information.
- The camera in the wireless device may comprise an auto-detect function for text, such as optical character recognition, and may automatically perform an auto-zoom into text characters to improve recognition and positioning. The accuracy of the character recognition may be improved by making the text a larger portion of the captured image, increasing the ability to detect fonts, colors, or other features or characteristics of the text.
- The textual information may include the position and orientation of the letters in three dimensions, the direction of the text, the absolute size of the letters, the fonts used, the language, slang, dialect, or patois in the text, the color and/or lighting of the text, defects, blemishes, and the presence of bar codes nearby, for example.
- Other marks, such as trademarks or any graphical feature usually associated with particular text, for example, may be considered in conjunction with the actual text captured to assist in determining the position.
- The context of the text may be utilized to assist in the correlation. For example, if the text is taken from a store sign (inferred from being present in a shopping mall and looking upward), this knowledge may be utilized to narrow the database search to store names.
- If the text is from a billboard or from a sign giving directions to pedestrians, a different subset of the database may be searched.
- The textual information in images, the elements' relative positions to each other, and their distance from the camera, obtained by the wireless device 101, may be compared to the known textual information of the building in one or more databases, such as the database 107, either stored locally or obtained from a remote server 105.
- The server may be accessed via a cellular network or the Internet via an access point, for example.
- The wireless device 101 may download a textual information map of the building 103 upon entering and losing GNSS signals; it may download the textual features of the building; or it may use satellite information and publicly available maps to estimate the locations of text related to known stores or other facilities in the building.
- A map of a shopping center may comprise the location of stores in the facility, such that a captured image of a store's name of the appropriate size for the sign in front of the store may be utilized to accurately determine the position of the wireless device 101.
- The wireless device 101 may store textual information maps of buildings commonly visited by the user of the device 101.
- The wireless device 101 may download textual information maps of locations that the user of the wireless device enters or plans to enter in the future. For example, if a user has entered an address of a shopping mall or a sports arena into a navigation program, the wireless device may download one or more textual information maps associated with the destination or destinations.
- The wireless device 101 may download textual information maps for that sports arena, including seating sections, for example, prior to the event.
- The wireless device 101 may download textual information maps, including terminal and gate numbers and/or airline names, for the airports to be utilized on the trip.
- The wireless device 101 may determine its position by using estimated orientation and (when available) distance relative to textual elements, and employing triangulation methods, such as is done in orienteering, using the azimuth reading (e.g. provided by a compass) of several textual reference points to pinpoint one's location on a map. It may use more sophisticated trilateration calculations, such as is done by GNSS devices, once distance to textual elements is estimated.
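As a rough illustration of the orienteering-style resection described above (this is a sketch, not the patented implementation), the following intersects the bearing lines to two textual landmarks with known map coordinates; the function name and the east/north coordinate convention are assumptions:

```python
import math

def triangulate(l1, b1, l2, b2):
    """Resect the device position from two landmarks with known map
    coordinates (x = east, y = north) and the compass bearings, in
    degrees clockwise from north, at which the device observes them."""
    d1 = (math.sin(math.radians(b1)), math.cos(math.radians(b1)))
    d2 = (math.sin(math.radians(b2)), math.cos(math.radians(b2)))
    # The device P satisfies P = l1 - t1*d1 = l2 - t2*d2 with t1, t2 > 0,
    # i.e. it lies behind each landmark along the observed bearing.
    det = d2[0] * d1[1] - d1[0] * d2[1]
    if abs(det) < 1e-9:
        raise ValueError("bearings are parallel; choose other landmarks")
    rx, ry = l1[0] - l2[0], l1[1] - l2[1]
    t1 = (d2[0] * ry - d2[1] * rx) / det
    return (l1[0] - t1 * d1[0], l1[1] - t1 * d1[1])

# A sign 10 m due north (bearing 0°) and another 10 m due east
# (bearing 90°) place the device at the origin.
print(triangulate((0.0, 10.0), 0.0, (10.0, 0.0), 90.0))
```

With estimated distances as well as bearings, the same two landmarks over-determine the position, which allows a consistency check on the OCR matches.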
- The orientation of the captured text with respect to the wireless device 101 may be determined based on the orientation of the wireless device 101 in space as measured using internal sensors such as a compass and MEMS accelerometers, for example.
- The wireless device 101 may compare one or more captured textual elements, such as store signs, for example, to textual elements in stored or retrieved databases and known to be in the vicinity of the wireless device when it lost GNSS lock upon entering the building 103. By comparing a captured word, phrase, or character to a known textual element, and finding a match, or a correlation above a threshold level, the wireless device may then calculate its location based on the known map of the building 103. This is shown in further detail in FIGS. 2A-2D.
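The patent does not specify a matching algorithm for "a correlation above a threshold level"; one hedged sketch uses Python's difflib similarity ratio to tolerate OCR noise (the sign map and threshold are invented for illustration):

```python
from difflib import SequenceMatcher

def best_text_match(extracted: str, known_signs: dict, threshold: float = 0.8):
    """Match noisy OCR output against known sign text; accept the best
    candidate only when its similarity ratio clears the threshold.
    known_signs maps sign text to its (x, y) location on the building map."""
    def score(sign):
        return SequenceMatcher(None, extracted.upper(), sign.upper()).ratio()
    best = max(known_signs, key=score)
    return (best, known_signs[best]) if score(best) >= threshold else (None, None)

signs = {"GAP": (12.0, 4.0), "RESTROOMS": (15.0, 4.0)}
# OCR misread the letter 'O' as the digit '0'; the match still succeeds.
print(best_text_match("RESTRO0MS", signs))
```

Returning no match below the threshold is what would trigger the fallback behaviors described later, such as asking the user to capture additional images.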
- FIG. 2A is a block diagram illustrating an exemplary building with textual information, in accordance with an embodiment of the invention.
- Referring to FIG. 2A, there is shown a shopping mall 200 comprising stores and hallways with textual signs 201A-201E.
- There is also shown the wireless device 101, which may be located at location (x, y, z) and at an orientation (elevation, azimuth, rotation), and which may be operable to determine its location based on a comparison of textual information obtained from images captured by a camera in the wireless device 101 and known textual information in a pre-stored or retrieved database.
- The wireless device 101 may be operable to download a map, or textual information database, comprising visible examples of text in the shopping mall 200 when the device enters the facility and loses satellite positioning signals.
- The wireless device 101 may download a textual information database when the user of the wireless device 101 activates a positioning function on the device and no satellite positioning signals are present.
- The wireless device 101 may have textual information maps stored internally for various buildings frequented by the user, or automatically downloaded whenever in the vicinity of a structure that attenuates satellite positioning signals.
- The wireless device 101 may capture one or more images, or video, of the textual information visible in the shopping mall 200.
- The wireless device 101 may be used to scan from left to right, as illustrated by the curved arrow in FIG. 2A, capturing an image comprising the signs 201A-201E.
- The captured image or images and their orientation to textual elements may be compared to the known signs or other textual information in the shopping mall 200 from a pre-stored or downloaded textual information database.
- The database may comprise a map of the stores, entrances, restrooms, and other features of the shopping mall 200, such that captured text that matches these elements may indicate a position of the wireless device 101.
- The database may comprise basic elements, such as the simple text itself of various recognizable features of the shopping mall 200, such as stores, restrooms, and exits.
- The database may comprise the 3D location of the textual elements, and images or abstractions of the images of signs to allow for easy identification from any orientation relative to the element (including characteristics such as color, size, font, language, dialect, and patois of the text).
- The wireless device 101 may find in the database the most likely textual elements that correspond to those in the images.
- The wireless device may utilize optical character recognition to determine the text captured in the image and compare it directly to text stored in the database that is known to be within the shopping mall 200. Text extraction from captured images may be performed by the wireless device 101 and/or by the system comprising the database.
- The wireless device 101 may then compare the captured image to a database comprising the known text. For example, a processor in the wireless device 101 may determine a coarse location by determining that the text "GAP" from the image corresponds to the location of the Gap store, particularly since it is adjacent to the "Restrooms" text, which is known to be adjacent to the Gap in the database. In addition, given the known optical properties of the camera in the wireless device, e.g., focal length and zoom, and the size of the GAP text in the sign as stored in the database, the wireless device 101 may thus calculate its distance from the Gap store.
- The sign 201E may comprise a circular or other shaped spot or location on the floor of the shopping mall 200 with a textual identifier and a known position in the database. Accordingly, a user may capture an image of the sign 201E while standing above it and then determine an accurate location from the database. Similarly, the wireless device 101 may capture one or more images of signs or other text in the vicinity while standing atop the spot or location. In this manner, by comparing the captured text with text in the database in conjunction with the known position of the spot or location, the wireless device 101 may determine an accurate position and then be utilized to navigate through the shopping mall 200. Similar spots or locations (with or without identifying signs) with known locations may be placed throughout the shopping mall 200 to assist in accurate positioning.
- The wireless device 101 may then display a map for the user to navigate to a desired location, such as an emergency exit, handicap-accessible restroom, or other desired location in the shopping mall 200, for example.
- The wireless device 101 may then calculate its location on a constant or periodic basis, assuming the camera continues to take images as the user travels throughout the shopping mall 200.
- The invention is not limited to location determination in a shopping mall, but may be utilized in any location where GNSS positioning is not available and a database of known textual characteristics is available. For example, this may include sports arenas, train stations, airports, hospitals, or office buildings.
- Velocity can be estimated from successive position fixes, thereby providing another input to the inertial guidance system in the wireless device 101, which may comprise a compass and inertial sensors, for example.
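A velocity estimate from two successive fixes can be sketched as follows (a simple finite difference; the function name, units, and east/north convention are assumptions, not the patent's method):

```python
import math

def estimate_velocity(fix_a, fix_b):
    """Estimate speed (m/s) and heading (degrees clockwise from north)
    from two timestamped position fixes (x, y, t), with x = east,
    y = north in metres and t in seconds."""
    dx, dy = fix_b[0] - fix_a[0], fix_b[1] - fix_a[1]
    dt = fix_b[2] - fix_a[2]
    if dt <= 0:
        raise ValueError("fixes must be in time order")
    speed = math.hypot(dx, dy) / dt
    heading = math.degrees(math.atan2(dx, dy)) % 360.0
    return speed, heading

# Five seconds between fixes, 5 m of displacement to the north-east.
print(estimate_velocity((0.0, 0.0, 0.0), (3.0, 4.0, 5.0)))
```

Such estimates could be blended with inertial sensor output, e.g. as the measurement input of a simple filter, though the patent leaves the fusion method open.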
- The wireless device 101 may utilize textual or structural elements for positioning purposes to supplement any positioning technology, such as GPS, assisted-GPS, WiFi triangulation, inertial sensors/accelerometers, or a hybrid positioning technique comprising two or more of such techniques.
- The wireless device 101 may be operable to capture images of structural elements in the shopping center 220 and determine its location based on the known structures in the stored database. Use of captured images of structures for positioning and navigation is described further in related U.S. patent application Ser. No. 13/309,081 (Attorney Docket No. 24772US01) filed on Dec. 1, 2011, which is herein incorporated by reference.
- FIG. 2B is a diagram illustrating an exemplary wireless device positioning performed inside a structure based on signs at entrances to the structure, in accordance with an embodiment of the invention.
- Referring to FIG. 2B, there is shown a shopping center 220 with various stores, the wireless device 101, and entrance/exit signs 221A-221D.
- The wireless device 101 may be operable to determine its position inside the shopping center 220 by determining the GNSS location of the device at a particular entrance, such as outside the entrance labeled with the entrance/exit sign 221A, for example. In other words, the device knows that it has entered the shopping center 220 at that entrance, since, for example, the textual information database for the shopping center 220 includes GNSS positions for each entrance to the building. This will enable the wireless device 101 to obtain an initial position for navigation within the shopping center 220 without the use of GNSS, in accordance with the textual information database for the shopping center 220, as described herein.
- The wireless device 101 may be operable to determine its initial position within the shopping center 220 without GNSS.
- The wireless device may have lost GNSS reception when it entered the shopping center 220, or may have had GNSS recently switched off or disabled, for example.
- The wireless device 101 may be operable to determine its location through the identification of signs posted at the nearest entrance.
- The wireless device 101 may comprise a compass such that it can determine what direction the wireless device 101 is facing when capturing an image of an entrance/exit sign, such as the entrance/exit sign 221A.
- The locations and directional facings of the entrance/exit signs 221A-221D of the shopping center 220 may be known, i.e., stored in a textual information database.
- An initial position can be determined by comparing the text from the captured image of the entrance/exit sign, together with the direction the device 101 was facing when the image was captured, with the corresponding information stored in the textual information database (i.e., with textual information representative of each entrance, such as "South Entrance", and what direction each entrance sign is facing).
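The entrance lookup described above might be sketched as follows, assuming a database mapping each sign's text to a position and facing; the function name, the dictionary layout, and the 45° heading tolerance are all illustrative assumptions:

```python
def locate_by_entrance_sign(sign_text, device_heading_deg, entrances):
    """Resolve an initial indoor position from an entrance sign.
    entrances maps sign text to (position, sign_facing_deg). A user
    photographing a sign faces roughly opposite the sign's facing, so
    the compass heading is sanity-checked against that expectation."""
    if sign_text not in entrances:
        return None
    position, facing = entrances[sign_text]
    expected = (facing + 180.0) % 360.0
    # Smallest angular difference between headings, in [0, 180].
    diff = abs((device_heading_deg - expected + 180.0) % 360.0 - 180.0)
    if diff > 45.0:          # heading inconsistent with this entrance
        return None
    return position

doors = {"SOUTH ENTRANCE": ((50.0, 0.0), 180.0),
         "NORTH ENTRANCE": ((50.0, 80.0), 0.0)}
# Facing almost due north at a south-facing sign: a consistent match.
print(locate_by_entrance_sign("SOUTH ENTRANCE", 355.0, doors))
```

The heading check matters when several entrances carry similar text, such as multiple numbered doors along the same wall.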
- The wireless device 101 may then determine its position by calculating a distance from the entrance/exit sign 221A using the captured image.
- The textual information in the sign in the captured image may be utilized to narrow the location down to near one particular entrance.
- The wireless device may thus be utilized to input data to a database for the shopping center 220.
- The wireless device 101 may have established its position before entering the shopping center 220 and then acquired images and/or video of signs and other textual information once inside.
- The wireless device 101 may obtain spatial data in conjunction with captured images and/or video utilizing a pedometer and an altimeter (if necessary).
- FIG. 2C is a diagram illustrating an exemplary wireless device positioning performed inside a structure based on a user's path inside the structure, in accordance with an embodiment of the invention.
- Referring to FIG. 2C, there is shown a shopping center 230 with various stores, the wireless device 101, the entrance/exit signs 221A-221D, and a user path 231.
- The wireless device 101 may be operable to determine its position within the shopping center 230 without GNSS.
- The wireless device 101 may comprise a compass such that it can determine what direction the wireless device 101 is facing, and may also comprise a pedometer for determining the distance the user of the wireless device 101 has traveled based on the number of steps taken since the last GNSS position was determined.
- The wireless device 101 may determine its position by calculating a distance traveled using a pedometer and an altimeter, if necessary (i.e., if the user has traveled to a different level or floor of the shopping center 230), in conjunction with the direction traveled as determined by a compass.
- The distance from the last known GNSS position may be determined by integrating the steps taken over the direction that the wireless device 101 traveled as determined by a compass, for example.
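The step-integration above amounts to dead reckoning; a minimal sketch (the 0.75 m stride length and the east/north convention are assumptions for illustration) could be:

```python
import math

def dead_reckon(start, legs, stride_m=0.75):
    """Integrate pedometer steps over compass headings (degrees clockwise
    from north) from a last known fix; x = east, y = north, in metres.
    legs is a list of (step_count, heading_deg) pairs."""
    x, y = start
    for count, heading_deg in legs:
        h = math.radians(heading_deg)
        x += count * stride_m * math.sin(h)
        y += count * stride_m * math.cos(h)
    return (x, y)

# 20 steps north, then 10 steps east, from the entrance at (0, 0).
print(dead_reckon((0.0, 0.0), [(20, 0.0), (10, 90.0)]))
```

Because stride length varies per user and heading errors accumulate, periodic re-fixes against sign text (as described in the next passage) keep the drift bounded.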
- The wireless device 101 may be operable to track its position via captured images and/or video. For example, images may be taken periodically such that the wireless device 101 may update its position by calculating its distance from captured images as compared to a textual information database. For example, the user of the wireless device 101 may capture images of signs on storefronts, and the device may determine its distance from the storefronts and compare the text to a textual information database with known store locations to accurately determine its position.
- FIG. 2D is a diagram illustrating an exemplary wireless device navigation, in accordance with an embodiment of the invention. Referring to FIG. 2D , there is shown a shopping center 250 , the wireless device 101 , various stores, a shoe store 251 , and a desired store 253 .
- The wireless device 101 may be operable to determine its position, and then may be able to navigate the user of the wireless device 101 to a desired location, such as the desired store 253, from its present position near the shoe store 251, all without GNSS.
- The wireless device 101 may determine its initial or present position from an image of the text of a nearby store sign, or any other signage or textual information, such as support column labels, in the vicinity of the wireless device 101 that corresponds to textual information in a stored structural database or map, as discussed above.
- The user of the device may then enter a desired location or destination, such as the desired store 253, into the wireless device 101 via textual or audio inputs, for example.
- The wireless device 101 may also be operable to determine the optimal path to reach the desired store 253, based on the stored map of the shopping center 250, and may display all or part of a map demonstrating the steps to be taken by the user to get to the destination. For example, the wireless device 101 may display a top-view surface map of the shopping center 250, and overlay a path or direction on the map for the user to follow, with instructions and/or arrows for navigation assistance. Alternatively (or additionally), the wireless device 101 may display a superimposed augmented reality, with instructions, arrows, and/or a direction overlaid on an image of the shopping center 250 in the direction in which the user should be facing to reach the desired destination. The wireless device 101 may then, using the compass and pedometer (and altimeter, if necessary), track and display the user's progress along the path to, or in the direction toward, the desired destination.
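The patent leaves the path computation open; over a stored map modeled as a walkway graph, a breadth-first search is one simple way to find a shortest route (the mall graph below is invented for illustration):

```python
from collections import deque

def shortest_route(adjacency, start, goal):
    """Breadth-first search over a walkway graph of the stored map;
    nodes are named locations, edges are directly walkable segments.
    Returns the shortest node list from start to goal, or None."""
    queue = deque([[start]])
    seen = {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for nxt in adjacency.get(path[-1], []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None

mall = {"shoe store": ["hall A"],
        "hall A": ["shoe store", "hall B"],
        "hall B": ["hall A", "desired store"],
        "desired store": ["hall B"]}
print(shortest_route(mall, "shoe store", "desired store"))
```

With per-edge walking distances, Dijkstra's algorithm would replace BFS; the returned node list is what the map overlay or augmented-reality arrows would render.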
- The wireless device 101 may utilize textual features from captured images to track the progress of the user toward the desired store 253.
- Text from storefront signs may be used for accurate positioning.
- The updated positioning may enable an updated augmented reality display, such that the displayed image continues to match that of the surroundings when following the appropriate route.
- The wireless device 101 may continue to track position via a pedometer, compass, and/or an altimeter, such that when the wireless device 101 is again retrieved to check the route, it may still have an accurate calculated position.
- The wireless device 101 may reestablish its position using a captured image of a sign or other textual information to verify its location with respect to the desired destination.
- The wireless device 101 may incorporate a conversational element to positioning. Accordingly, if the wireless device 101 determines a position but does not have a high accuracy probability, it may ask the user questions to improve the positioning accuracy. For example, the wireless device may ask the user, "Do you see a movie theater to your right?" If the answer does not confirm the estimated position, the wireless device 101 may request that the user capture one or more additional images for positioning purposes, or it may request that the user scan to the left or right to determine whether expected textual information is present.
- The conversational element may also be extended to assist in the navigational aspects of the present invention.
- The wireless device 101 may again establish a route by determining its position without the use of GNSS, and comparing the determined position to that of the desired store 253 in the stored map. This position may be reestablished by capturing an image of the exited storefront's sign or other signs in the vicinity, for example, as described herein. This information can then be used to guide the user, via the map or superimposed augmented reality, back onto the previous path or in the proper direction, or onto a new path or in a new direction, to the desired destination (i.e., the desired store 253).
- FIG. 3 is a block diagram illustrating an exemplary wireless device for positioning, in accordance with an embodiment of the invention.
- Referring to FIG. 3, there is shown the wireless device 101 comprising a global navigation satellite system (GNSS) module 301, a processor 303, an RF module 305, a memory 307, and a camera 309.
- An access point or cellular tower 311 and a remote textual information database 313 are also shown.
- The GNSS module 301 may comprise an RF receiver (Rx) path for receiving satellite signals for positioning functions.
- The GNSS module 301 may be operable to down-convert received RF signals to baseband and subsequently demodulate the baseband signals to obtain an accurate clock signal, such as a GPS clock signal.
- The RF module 305 may comprise one or more RF Rx and transmit (Tx) paths for communicating with cellular towers or wireless access points, for example.
- The RF module 305 may comprise one or more antennas, low-noise amplifiers (LNAs), power amplifiers, mixers, local oscillators, variable gain amplifiers, filters, and analog-to-digital converters (ADCs), for example.
- The RF module may thus be operable to receive RF signals, amplify the signals before down-converting to baseband, filter out noise signals, and convert the resulting filtered signals to digital signals for processing by the processor 303.
- The RF module may be operable to convert digital baseband signals to analog signals, upconvert the analog baseband signals to RF, amplify the resulting RF signals, and transmit the amplified signals via an antenna.
- The memory 307 may comprise a programmable memory module that may be operable to store software and data, for example, for the operation of the wireless device 101. Furthermore, the memory 307 may store downloaded textual information databases that may be utilized by the processor 303 to determine location without a GNSS signal.
- The processor 303 may comprise a general-purpose processor, such as a reduced instruction set computing (RISC) processor, for example, that may be operable to control the functions of the wireless device.
- the processor 303 may enable the GNSS module 301 when a user indicates a desire to determine their location.
- the processor may utilize images captured by the camera 309 to determine location when no GNSS signal is present.
- the processor 303 may be operable to perform optical character recognition (OCR) for extracting textual data from captured images.
- the processor may correlate a previously determined GNSS location to a stored or downloaded textual information map of a building or other structure that the user of the wireless device 101 has entered.
- the camera 309 may be operable to capture still and/or video images via a charge-coupled device (CCD) or complementary metal-oxide semiconductor (CMOS) imaging sensor and associated optical components, such as lenses and readout circuitry.
- the optical components may comprise one or more lenses with known focal lengths for determining the distance to an object that is in focus, for example.
- the camera 309 may be operable to obtain information on text within images including attributes such as position and orientation of the letters in three dimensions, the direction of the text, the absolute size of the letters, the fonts used, the language, dialect or patois in the text, the color and/or lighting of the text, and the presence of bar codes nearby, for example.
- the processor 303 and the camera 309 may thus be operable to search for the text in an image using any of the above attributes in combination, and use the perspective on the text (angle/skew/relative size of words/absolute size of the text where available) to determine the location of the wireless device 101 .
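- The attribute-based search for sign text described above can be sketched as follows. This is an illustrative sketch only, not the patented implementation; the attribute names, thresholds, and sample OCR output are all assumptions:

```python
# Minimal sketch: rank OCR'd words by how well their attributes match
# what is expected of a storefront sign (large, roughly horizontal text).
# Attribute names and thresholds here are illustrative assumptions.

def rank_sign_candidates(words, min_height_px=40, max_skew_deg=15):
    """words: list of dicts with 'text', 'height', 'skew', 'conf'."""
    candidates = [w for w in words
                  if w["height"] >= min_height_px
                  and abs(w["skew"]) <= max_skew_deg]
    # Highest-confidence candidates first.
    return sorted(candidates, key=lambda w: -w["conf"])

# Hypothetical OCR output for one captured image:
ocr_output = [
    {"text": "EXIT", "height": 80, "skew": 2, "conf": 0.96},
    {"text": "sale", "height": 12, "skew": 1, "conf": 0.90},
    {"text": "SHOES", "height": 60, "skew": 40, "conf": 0.88},
]
best = rank_sign_candidates(ocr_output)
```

Only "EXIT" survives the filters here: "sale" is too small and "SHOES" is too skewed to be read straight-on.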
- the access point or cellular tower 311 may be operable to provide wireless connectivity to the wireless device 101 via the RF module 305 .
- the access point or cellular tower 311 may enable access to the remote textual information database 313 via the Internet or other network.
- the remote textual information database 313 may comprise data relating to textual information found within a building or other location where GNSS signals are not available.
- the remote textual information database 313 may comprise text of all stores in a shopping center or seating sections of a sports arena with their associated locations.
- the wireless device 101 may download data from the remote textual information database 313 when entering a building or may download such data at some time prior to entering.
- FIG. 4 is a block diagram illustrating exemplary steps in determining location without GNSS, in accordance with an embodiment of the invention.
- the exemplary method illustrated in FIG. 4 may, for example, share any or all functional aspects discussed previously with regard to FIGS. 1-3 .
- the wireless device may determine its location via GNSS (e.g., GPS).
- In step 405, if GNSS signals are still available for positioning purposes, the exemplary steps may return to step 403 for continued GNSS positioning. If there are no GNSS signals, such as when the wireless device enters a building or other facility that attenuates GNSS signals below a threshold required for positioning purposes, the exemplary steps may proceed to step 407 where the wireless device may take one or more photo images or videos of the textual features of the surroundings. For example, this may include store signs, seating sections, exit/entrance signs, etc.
- the text in the images may be extracted utilizing optical character recognition, for example, and characteristics of the text, such as its direction, size, font, color, language, dialect, and patois, may also be captured.
- the wireless device may determine its distance from the sign or other source of textual information using the known optical characteristics of the camera, for example.
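- The distance estimate from known optical characteristics can be sketched with the standard pinhole-camera relation d = f * H / h, where f is the focal length, H the real letter height, and h the letter height as imaged on the sensor. This is a sketch under that simple model; the numeric values are illustrative assumptions:

```python
# Sketch of distance estimation from a sign's known letter height using
# the pinhole-camera relation d = f * H / h. Units: focal length and
# imaged height in millimeters, real height and distance in meters.

def distance_to_text(focal_length_mm, real_height_m, image_height_mm):
    """Distance (meters) to text of known physical height."""
    return focal_length_mm * real_height_m / image_height_mm

# Hypothetical example: 0.3 m letters imaged at 1.5 mm on the sensor
# of a camera with a 25 mm (35mm-equivalent) focal length:
d = distance_to_text(25.0, 0.30, 1.5)  # -> 5.0 meters
```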
- the extracted information may be uploaded to a server comprising a textual information database.
- the server may download a textual information database to the wireless device.
- the server may compare the text in the captured images to the stored and/or retrieved textual information database.
- the wireless device may take an image of a store sign whose text matches a store name, with an associated location, stored in a textual information database in the wireless device or on a remote server. The device or server may then calculate an accurate position based on the distance from the sign, as determined by the size of the text and its orientation with respect to the wireless device.
- the search of the database may be narrowed geographically when the wireless device is known to be in a specific area. For example, if a wireless device is known to be in a certain shopping center, words that correlate to the known stores or businesses in that shopping center would be searched with a higher priority since they would be more likely to be captured by the wireless device.
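- The geographically narrowed search can be sketched as a two-tier fuzzy match: text from businesses known to be in the current shopping center is tried first, and the wider database only as a fallback. This is an illustrative sketch; the store names, cutoff, and use of `difflib` are assumptions, not the patented implementation:

```python
# Sketch of geographically prioritized matching of OCR'd sign text.
import difflib

def match_sign_text(ocr_text, local_names, global_names, cutoff=0.6):
    """Prefer matches among nearby businesses before searching globally."""
    for pool in (local_names, global_names):
        hits = difflib.get_close_matches(ocr_text, pool, n=1, cutoff=cutoff)
        if hits:
            return hits[0]
    return None

# Hypothetical databases; OCR has slightly misread the sign:
local = ["Main Street Coffee", "Valley Shoes", "Book Nook"]
global_db = ["Valley Shoe Repair", "City Coffee"]
result = match_sign_text("Valey Shoes", local, global_db)  # -> "Valley Shoes"
```

The fuzzy cutoff also tolerates OCR errors such as the dropped letter in "Valey".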
- the optical properties of the camera system in the wireless device may assist in determining an accurate location by determining a distance to imaged text.
- the textual information database may comprise a seating map with seating section names/numbers and their position. This information may then be utilized to indicate to the user where they are with respect to their desired seats.
- capturing more than one source of text in an image may provide accurate positioning based on the perspective seen between the different sources, such as one sign being closer than, and at a particular angle from, another sign.
- UPC bar codes or other bar or 2-dimensional codes may be utilized to determine a position.
- a wireless device may take an image of or read one or more product UPC bar codes, or bar codes otherwise identifying a specific section, pillar, shelf, etc., in a large store where the locations of the products, sections and shelves are known. It can be determined from the captured image of the code or from a reading of the code that the user is in a certain section of the store (e.g. “tools”). If the user desires to find the section for “plumbing”, the user can then enter text or speak into the wireless device (e.g., inputs indicating “plumbing”) and be directed along a path toward the plumbing section (with navigation assistance as set forth further herein).
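- The bar-code scenario above can be sketched as a lookup from a decoded shelf code to a store section, plus a coarse compass bearing toward the requested section. The section map, codes, and coordinates are all illustrative assumptions:

```python
# Sketch: map a decoded shelf bar code to a store section and compute a
# compass-style bearing (north = +y) toward a requested section.
import math

# Hypothetical code -> (section name, (x, y) floor position in meters):
SECTION_MAP = {
    "012345": ("tools", (10.0, 5.0)),
    "067890": ("plumbing", (40.0, 25.0)),
}

def direction_to_section(scanned_code, target_section):
    """Return (current section, target section, bearing in degrees)."""
    name, (x0, y0) = SECTION_MAP[scanned_code]
    for code, (sec, (x1, y1)) in SECTION_MAP.items():
        if sec == target_section:
            bearing = math.degrees(math.atan2(x1 - x0, y1 - y0)) % 360
            return name, sec, bearing
    raise KeyError(target_section)

# User scans a code in "tools" and asks for "plumbing":
here, there, bearing = direction_to_section("012345", "plumbing")
```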
- the textual information map may be pre-stored in the wireless device, or may be downloaded at the time of entry into the building. For example, if a calendar in a wireless device indicates that the user will be at a sports event at a particular time, the wireless device may download a seating map with associated seat section names and numbers for the sports arena prior to arrival at the event. In another exemplary scenario, the wireless device may download a textual information map for the nearest structure as determined by the last GNSS determined position when GNSS signals were lost.
- the exemplary steps may end at end step 413 , or may continue back to step 405 if further positioning is desired.
- FIG. 5 is a block diagram illustrating exemplary steps for navigation without GNSS, in accordance with an embodiment of the invention.
- the exemplary method illustrated in FIG. 5 may, for example, share any or all functional aspects discussed previously with regard to FIGS. 1-4 .
- the wireless device may determine its location without GNSS, but with knowledge that it is within a particular structure, such as a shopping center or sports arena, for example.
- the position may be determined by measuring a distance to a sign with textual information whose location is stored in a textual information database.
- the wireless device may take one or more photo images or videos of the textual features of the surroundings. For example, this may include store signs, entrance/exit signs, billboards, sports arena seating section signs, etc.
- Knowledge of the optical properties of the camera system in the wireless device may assist in determining an accurate location by determining a distance to imaged text.
- In step 505, if the user of the wireless device has entered a desired destination either textually or vocally, the exemplary steps may proceed to step 507 , and if there is no desired destination, the exemplary steps may proceed to end step 513 .
- the wireless device may generate and display a navigation map comprising a top-view surface map with a path or direction overlaid thereon, or an augmented reality with a path or direction overlaid on an image of the surroundings in the direction of the desired location.
- the position of the wireless device may be tracked or monitored utilizing periodically captured images and/or video, and/or by tracking distance utilizing a pedometer in conjunction with a compass (and altimeter, if necessary).
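- The pedometer-plus-compass tracking between image fixes amounts to dead reckoning: each detected step advances the position along the current compass heading. The step length and headings below are illustrative assumptions:

```python
# Sketch of dead reckoning between image-based position fixes.
import math

def dead_reckon(start, headings_deg, step_length_m=0.75):
    """Advance (x, y) by one step per compass heading (north = +y)."""
    x, y = start
    for heading in headings_deg:
        x += step_length_m * math.sin(math.radians(heading))
        y += step_length_m * math.cos(math.radians(heading))
    return x, y

# Four steps due east (90 degrees) from the last image-based fix:
pos = dead_reckon((0.0, 0.0), [90, 90, 90, 90])
```

In practice the accumulated drift would be reset each time a new sign image yields a fresh absolute fix.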
- the wireless device may track and display the user's progress along the path to, or in the direction toward, the desired destination.
- In step 511, if the wireless device is at the desired location, the exemplary steps may proceed to end step 513 , and if not, may continue back to step 507 for further navigation.
- FIG. 6 is a flow diagram illustrating exemplary steps in an enhanced GNSS positioning, in accordance with an embodiment of the invention.
- the exemplary method illustrated in FIG. 6 may, for example, share any or all functional aspects discussed previously with regard to FIGS. 1-5 .
- if the GNSS signal is weak, the exemplary steps may proceed to step 605 where the wireless device may take one or more images or video of surrounding sources of textual information. If the GNSS signal is strong, and thus the accuracy of the positioning is high, the exemplary steps may proceed to step 609 for further GNSS positioning followed by end step 611 .
- the text in the captured images or video may be compared to text in a database comprising text associated with known positions to determine a more accurate location. For example, if GNSS has determined that the wireless device is on a particular street with about 100 meter accuracy, the wireless device may take images of one or more street signs or building signs. The text of the signs and their locations may be stored in the database, enabling the wireless device to accurately determine its location, despite the weak GNSS signal, followed by end step 611 .
- FIG. 7 is a flow diagram illustrating exemplary steps in image textual information extraction, in accordance with an embodiment of the invention.
- the exemplary method illustrated in FIG. 7 may, for example, share any or all functional aspects discussed previously with regard to FIGS. 1-6 .
- the wireless device may determine the building or other attenuating structure that it is within based on its last known position and, if available, velocity and/or elapsed time.
- image processing algorithms such as optical character recognition (OCR) may be utilized to extract text of captured signs or other sources of text within the building and the wireless device may establish the orientations and distances of the extracted text.
- the wireless device may also determine its distance from the text source.
- the wireless device may communicate images and/or the extracted text, orientations, and distances to a server comprising one or more textual information databases.
- the wireless device may upload raw images to the server for the extraction of text and its attributes.
- the server may search the database for geolocation based on the text and its attributes.
- In step 711, based on knowledge of the position of the text found in the database, and the relative position of the camera, the precise position of the camera is determined by the server, or alternatively by the wireless device. If the positions of the different text that is captured are known with a varying degree of certainty, estimators such as a Kalman filter can be used to obtain an optimal estimate of the camera location. The exemplary steps may then proceed to end step 713 .
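- The static special case of the estimator mentioned above is an inverse-variance weighted mean of the individual text-based position estimates, which is what a Kalman filter reduces to when the device is stationary. This is an illustrative sketch with assumed numbers:

```python
# Sketch of fusing several text-based position estimates that are known
# with differing certainty, via an inverse-variance weighted mean.

def fuse_estimates(estimates):
    """estimates: list of ((x, y), variance) pairs -> fused (x, y)."""
    wsum = sum(1.0 / var for _, var in estimates)
    x = sum(p[0] / var for p, var in estimates) / wsum
    y = sum(p[1] / var for p, var in estimates) / wsum
    return x, y

# A confident fix (variance 1) and a weaker one (variance 4):
fused = fuse_estimates([((10.0, 20.0), 1.0), ((12.0, 22.0), 4.0)])
```

The fused position falls closer to the more certain estimate, as expected.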
- FIG. 8 is a flow diagram illustrating exemplary steps for accurate wireless device positioning, in accordance with an embodiment of the invention.
- the exemplary method illustrated in FIG. 8 may, for example, share any or all functional aspects discussed previously with regard to FIGS. 1-7 .
- image processing algorithms may be utilized to extract textual information from one or more images or videos captured by the wireless device of a sign or other textual information source in a building or other attenuating structure.
- the wireless device may establish orientations and distances of the text with respect to the wireless device.
- the wireless device may communicate extracted text, orientations, fonts, size, language, color, and distances of the elements to a server comprising one or more textual information databases.
- the server may determine the best matches between received data and structural elements within its database, and may order the results by the quality, Qi, of the match.
- the exemplary steps may include one or both of steps 809 and 811 .
- the server may start with the highest quality matches, Mi, and use the orientation and distance to the wireless device to determine the estimated position of the wireless device, Pi.
- the server may start with the highest quality match pairs, Mi and Mj, that have the largest orientation difference, and use them to triangulate the position Pij.
- In step 813, the server may find the matches Mi that have positions Pi and Pij that cluster most closely around a position Px, using a distance metric weighted by the quality of match and triangulation.
- In step 815, if there is a Px which has the largest cluster, including data from previous iterations, the exemplary steps may proceed to step 817 , where the server may report Px as the likely location of the wireless device. The accuracy may be given by the variance of the positions Pi weighted by the quality of the match and triangulation.
- If no such Px is found in step 815, the exemplary steps may proceed to step 821 where the wireless device may be requested to take one or more images at a different orientation, preferably recommending an orientation with prominent text and a large orientation difference, before proceeding to step 803 for a renewed extraction process.
- In step 819, if the positioning accuracy meets a user or application requirement, the exemplary steps may proceed to end step 823 , and if not, the exemplary steps may proceed to step 821 for further image capture.
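- The clustering in steps 813-815 can be sketched as picking, among all candidate positions Pi, the one with the most quality-weighted support within some radius. The radius, candidate positions, and weights below are illustrative assumptions:

```python
# Sketch of the clustering step: among candidate positions, pick the one
# whose neighborhood carries the most match-quality weight.
import math

def densest_position(candidates, radius=2.0):
    """candidates: list of ((x, y), quality) -> (x, y) with max support."""
    def support(center):
        cx, cy = center
        return sum(q for (x, y), q in candidates
                   if math.hypot(x - cx, y - cy) <= radius)
    return max((p for p, _ in candidates), key=support)

# Three estimates agree near (5, 5); one outlier sits at (20, 1):
cands = [((5.0, 5.0), 0.9), ((5.5, 5.2), 0.8),
         ((20.0, 1.0), 0.95), ((5.2, 4.8), 0.7)]
px = densest_position(cands)
```

The lone high-quality outlier is rejected because it has no supporting neighbors.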
- FIG. 9 is a diagram illustrating an exemplary wireless device positioning based on text perspective, in accordance with an embodiment of the invention. Referring to FIG. 9 , there are shown three different perspectives, FIGS. 9A-9C , of the same text as captured by the wireless device 101 . The perspective, or orientation, of the text in conjunction with the knowledge of the text characteristics and location may be utilized to determine an accurate position of the wireless device 101 .
- the height of the letters of the text as viewed by the wireless device may appear to be the same on the left side as the right side, i.e., the “T”s are of the same height, or are of the same height ratio as expected if looking straight at the text.
- the distance to the two outer letters may be measured to be the same by the wireless device based on focus settings and focal length of the optics in the camera of the wireless device. Accordingly, because the location of the text is stored in the textual information database, the wireless device may then accurately determine its location, since it is at an accurately measured distance from a known location.
- the text characters may appear to be larger on the left side as compared to the right, indicating that the left side is closer than the right side to the wireless device.
- the wireless device may accurately determine its position from a measurement of the relative heights of the letters and a single measurement of the distance to the letters.
- the observed relative heights of the outer letters, when compared to the actual height ratio as stored in the database, may provide a measurement of the lateral displacement from the straight-on view of the sign shown in FIG. 9A .
- the wireless device may determine distances to both the edges of the left and right letters or between the first and last letters, based on focus settings and focal length of the optics in the camera of the wireless device. These distances, along with the known distance between the first and last letters or left and right edges of the text (i.e., the width of the sign) from the textual information database, give the three sides of the triangle formed between the wireless device and the outer letters.
- the text characters may appear to be larger on the right side as compared to the left, indicating that the right side is closer than the left side to the wireless device.
- the wireless device may accurately determine its position from a measurement of the relative heights of the letters and a single measurement of the distance to the letters.
- the wireless device may determine distances to the left and right edges of the letters or between the first and last letters, based on focus settings and focal length of the optics in the camera of the wireless device. These distances, along with the known distance between the first and last letters or left and right edges of the text (i.e., the width of the sign) from the textual information database, give the three sides of the triangle formed between the wireless device and the outer letters.
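- The triangle formed by the device and the two outer letters fixes the device's position in the sign's own coordinate frame: placing the sign along the x-axis from (0, 0) to (w, 0), the three known sides give x by the law of cosines and y by Pythagoras. This is an illustrative sketch with assumed distances:

```python
# Sketch: locate the camera from the three triangle sides described
# above -- distance d1 to the first letter, d2 to the last letter, and
# the known sign width w (sign lies on the x-axis from (0,0) to (w,0)).
import math

def locate_from_sign(d1, d2, w):
    """Return (x, y) of the camera; y is the distance out from the sign."""
    x = (d1 * d1 - d2 * d2 + w * w) / (2.0 * w)
    y = math.sqrt(max(d1 * d1 - x * x, 0.0))
    return x, y

# Hypothetical 2 m wide sign, 5.0 m to its left edge, 4.2 m to its right:
pos = locate_from_sign(5.0, 4.2, 2.0)
```

Note the mirror ambiguity (the device could be on either side of the sign); in practice the side is known from the fact that the text is readable.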
- FIG. 10 is a flow diagram illustrating an exemplary distance determination using perspective, in accordance with an embodiment of the invention.
- the exemplary method illustrated in FIG. 10 may, for example, share any or all functional aspects discussed previously with regard to FIGS. 1-9 .
- the wireless device may determine the building or other GNSS-signal-attenuating structure in which it is located, based upon its last known GNSS position, for example.
- the wireless device may capture images of a source of text, followed by step 1007 , where the wireless device may determine the distances to the outer letters of the sign or other source of text. This may be determined using known focus settings and focal lengths of optics in the camera or cameras in the wireless device.
- the location of the source of the text and the distance between the outer or inner edges of the outer letters may be downloaded from a textual information database.
- the database may be stored in a remote server, or may be downloaded to the wireless device when entering the building.
- the wireless device may triangulate its position from the three sides of the triangle as defined by the wireless device and the outer letters of the text source, followed by end step 1013 .
- FIG. 11 is a flow diagram illustrating an exemplary distance determination using perspective, in accordance with an embodiment of the invention.
- the exemplary method illustrated in FIG. 11 may, for example, share any or all functional aspects discussed previously with regard to FIGS. 1-10 .
- the wireless device may determine the building or other GNSS-signal-attenuating structure in which it is located, based upon its last known GNSS position, for example.
- the wireless device may capture images of a source of text, followed by step 1107 , where the wireless device may determine a distance to the sign or other source of text and the height, or relative height, of outer letters in the sign. This may be determined using known focus settings and focal lengths of optics in the camera or cameras in the wireless device, although a relative height measurement would have fewer requirements.
- the location of the source of the text and the height and/or relative height of the outer letters may be downloaded from a textual information database.
- the database may be stored in a remote server, or may be downloaded to the wireless device when entering the building.
- the wireless device may triangulate its position from the measured distance between the sign and the wireless device in conjunction with the lateral distance as determined by the outer letter height ratio when compared to the actual height ratio, followed by end step 1113 .
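- The height-ratio method above can be sketched by recovering each outer letter's distance from its apparent size via the pinhole relation, then converting the two distances and the known sign width into a lateral offset. All quantities below are illustrative assumptions:

```python
# Sketch of the height-ratio method: per-letter distances from apparent
# letter heights, then lateral offset from the resulting triangle.

def letter_distance(focal_mm, real_height_m, apparent_mm):
    """Pinhole relation: distance = f * H / h."""
    return focal_mm * real_height_m / apparent_mm

def lateral_offset(d_left, d_right, width):
    """x-offset of the camera from the sign's left edge (sign on x-axis)."""
    return (d_left ** 2 - d_right ** 2 + width ** 2) / (2.0 * width)

# Hypothetical 25 mm lens and 0.3 m letters on a 2 m wide sign; the left
# letter images smaller than the right, so the right edge is closer:
d_l = letter_distance(25.0, 0.30, 1.25)   # -> 6.0 m
d_r = letter_distance(25.0, 0.30, 1.50)   # -> 5.0 m
x = lateral_offset(d_l, d_r, 2.0)
```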
- a method and system may comprise capturing one or more images of one or more sources of textual information 201 A- 201 D, 221 A- 221 D in the vicinity of a wireless communication device 101 .
- Text may be extracted from the one or more sources of textual information 201 A- 201 D, 221 A- 221 D and a position of the wireless device 101 may be determined based on a comparison of the extracted text in the captured one or more images to text in a stored database 107 , 313 of textual information.
- An orientation of the text may be sensed in the captured one or more images relative to the wireless device 101 .
- An orientation of the wireless device 101 may be utilized in conjunction with the extracted text for the position determining.
- the orientation and the extracted text may be utilized in conjunction with determined distances from the one or more sources of textual information 201 A- 201 D, 221 A- 221 D for the position determining.
- Locations of the sources of textual information 201 A- 201 D, 221 A- 221 D and/or the captured one or more images may be stored in the database of textual information 107 , 313 .
- An instruction to capture one or more images in a different orientation may be received when the positioning does not meet an accuracy requirement.
- the database of textual information 107 , 313 may be downloaded when GNSS signals sufficient for positioning are no longer received by the wireless communication device 101 .
- a distance from one or more of the sources of textual information 201 A- 201 D, 221 A- 221 D in the vicinity of the wireless communication device 101 may be determined based on known optical properties of a camera 309 in the wireless communication device 101 .
- the optical properties may comprise focal length and/or focus setting.
- the determined distance may be used to determine an accurate location based on the captured one or more images.
- Other embodiments of the invention may provide a non-transitory computer readable medium and/or storage medium, and/or a non-transitory machine readable medium and/or storage medium, having stored thereon, a machine code and/or a computer program having at least one code section executable by a machine and/or a computer, thereby causing the machine and/or computer to perform the steps as described herein for location determination and navigation using textual information.
- aspects of the invention may be realized in hardware, software, firmware or a combination thereof.
- the invention may be realized in a centralized fashion in at least one computer system or in a distributed fashion where different elements are spread across several interconnected computer systems. Any kind of computer system or other apparatus adapted for carrying out the methods described herein is suited.
- a typical combination of hardware, software and firmware may be a general-purpose computer system with a computer program that, when being loaded and executed, controls the computer system such that it carries out the methods described herein.
- One embodiment of the present invention may be implemented as a board level product, as a single chip, as an application specific integrated circuit (ASIC), or with varying levels of integration on a single chip with other portions of the system as separate components.
- the degree of integration of the system will primarily be determined by speed and cost considerations. Because of the sophisticated nature of modern processors, it is possible to utilize a commercially available processor, which may be implemented external to an ASIC implementation of the present system. Alternatively, if the processor is available as an ASIC core or logic block, then the commercially available processor may be implemented as part of an ASIC device with various functions implemented as firmware.
- the present invention may also be embedded in a computer program product, which comprises all the features enabling the implementation of the methods described herein, and which when loaded in a computer system is able to carry out these methods.
- Computer program in the present context may mean, for example, any expression, in any language, code or notation, of a set of instructions intended to cause a system having an information processing capability to perform a particular function either directly or after either or both of the following: a) conversion to another language, code or notation; b) reproduction in a different material form.
- other meanings of computer program within the understanding of those skilled in the art are also contemplated by the present invention.
Description
- This application is a continuation of application Ser. No. 15/012,494 filed on Feb. 1, 2016, which is a continuation of application Ser. No. 13/328,413 filed on Dec. 16, 2011, now U.S. Pat. No. 9,253,607, each of which is hereby incorporated herein by reference in its entirety.
- Certain embodiments of the invention relate to wireless device positioning. More specifically, certain embodiments of the invention relate to a method and system for location determination and navigation using textual information.
- Wireless communication devices with global navigation satellite system (GNSS) capability are becoming more prevalent. These devices depend on RF signals received from satellites for calculating position. However, these satellite signals are weak and are attenuated inside buildings, such that wireless devices can no longer obtain a lock on the signals and thus can no longer determine location.
- Further limitations and disadvantages of conventional and traditional approaches will become apparent to one of skill in the art, through comparison of such systems with the present invention as set forth in the remainder of the present application with reference to the drawings.
- A system and/or method for location determination and navigation using textual information, substantially as shown in and/or described in connection with at least one of the figures, as set forth more completely in the claims.
- Various advantages, aspects and novel features of the present invention, as well as details of an illustrated embodiment thereof, will be more fully understood from the following description and drawings.
-
FIG. 1 is a block diagram of an exemplary wireless device with positioning capability, in accordance with an embodiment of the invention. -
FIG. 2A is a block diagram illustrating an exemplary building with textual information, in accordance with an embodiment of the invention. -
FIG. 2B is a diagram illustrating an exemplary wireless device positioning performed inside a structure based on signs at entrances to the structure, in accordance with an embodiment of the invention. -
FIG. 2C is a diagram illustrating an exemplary wireless device positioning performed inside a structure based on a user's path inside the structure, in accordance with an embodiment of the invention. -
FIG. 2D is a diagram illustrating an exemplary wireless device navigation, in accordance with an embodiment of the invention. -
FIG. 3 is a diagram illustrating an exemplary wireless device for positioning, in accordance with an embodiment of the invention. -
FIG. 4 is a block diagram illustrating exemplary steps in determining location without GNSS, in accordance with an embodiment of the invention. -
FIG. 5 is a block diagram illustrating exemplary steps for navigation without GNSS, in accordance with an embodiment of the invention. -
FIG. 6 is a flow diagram illustrating exemplary steps in an enhanced GNSS positioning, in accordance with an embodiment of the invention. -
FIG. 7 is a flow diagram illustrating exemplary steps in image textual information extraction, in accordance with an embodiment of the invention. -
FIG. 8 is a flow diagram illustrating exemplary steps for accurate wireless device positioning, in accordance with an embodiment of the invention. -
FIGS. 9A-9C are diagrams illustrating exemplary wireless device positioning based on text perspective, in accordance with an embodiment of the invention. -
FIG. 10 is a flow diagram illustrating an exemplary distance determination using perspective, in accordance with an embodiment of the invention. -
FIG. 11 is a flow diagram illustrating an exemplary distance determination using perspective, in accordance with an embodiment of the invention.
- Certain aspects of the invention may be found in a method and system for location determination and navigation using textual information. Exemplary aspects of the invention may comprise capturing one or more images of one or more sources of textual information in the vicinity of the wireless communication device. Text may be extracted from the one or more sources of textual information, and a position of the wireless device may be determined based on a comparison of the extracted text in the captured one or more images to text in a stored database of textual information. An orientation of the text may be sensed in the captured one or more images relative to the wireless device. An orientation of the wireless device may be utilized in conjunction with the extracted text for the position determining. The orientation and the extracted text may be utilized in conjunction with determined distances from the one or more sources of textual information for the position determining. Locations of the sources of textual information and/or the captured one or more images may be stored in the database of textual information. An instruction to capture one or more images in a different orientation may be received when the positioning does not meet an accuracy requirement. The database of textual information may be downloaded when GNSS signals sufficient for positioning are no longer received by the wireless communication device. A distance from one or more of the sources of textual information in the vicinity of the wireless communication device may be determined based on known optical properties of a camera in the wireless communication device. The optical properties may comprise focal length and/or focus setting. The determined distance may be used to determine an accurate location based on the captured one or more images.
-
FIG. 1 is a diagram of an exemplary wireless device with positioning capability, in accordance with an embodiment of the invention. Referring to FIG. 1 , there are shown a wireless device 101 and a server 105 , which may comprise a database 107 . The wireless device 101 may comprise any device (e.g. smart phone) or vehicle where its user may desire to know the location of such device or vehicle. The wireless device 101 may comprise a global navigation satellite system (GNSS) receiver that may be operable to receive medium Earth orbit (MEO) satellite signals and low Earth orbit (LEO) satellite signals.
- There are also shown MEO satellites (e.g. GPS satellites), and LEO satellites (e.g. Iridium communication satellites). Medium Earth orbit satellites may be at a height of about 12,000 miles above the surface of the Earth, compared to about 500 miles above the surface for low Earth orbit satellites. Therefore, the signal strength of LEO satellite signals is much stronger than that of MEO satellite signals. LEO satellites may be used for telecommunication systems, such as satellite phones, whereas MEO satellite systems may be utilized for location and navigation applications.
- In certain circumstances, satellite signals may be attenuated when the
wireless device 101 enters a building, such that the wireless device may not be able to utilize satellite signals to determine its location. In this instance, it may be desirable to determine the location of the wireless device 101 utilizing other techniques. - In an exemplary embodiment, the
wireless device 101 may determine its location via GNSS when outside the building 103. In instances where the building 103 attenuates satellite signals to such a level that the wireless device 101 can no longer obtain a lock for GNSS positioning purposes, the wireless device 101 may utilize textual information to determine its location. - Textual information may be input to the
wireless device 101 via a photo, a series of photos, or video taken by a camera in the wireless device 101. These may be combined with orientation (elevation, azimuth, and rotation) of the camera captured by a compass, gyroscope, gravity sensor, or other kind of sensor present in the camera. Optical focus and/or distance sensors on the camera may estimate the distance of the wireless device 101 to the textual information. Furthermore, the camera in the wireless device may comprise an auto-detect function for text, such as optical character recognition, and may automatically perform an auto-zoom into text characters to improve recognition and positioning. The accuracy of the character recognition may be improved by making the text a larger portion of the captured image, increasing the ability to detect fonts, colors, or other features or characteristics of the text. - The textual information may include the position and orientation of the letters in three dimensions, the direction of the text, the absolute size of the letters, the fonts used, the language, slang, dialect or patois in the text, the color and/or lighting of the text, defects, blemishes, and the presence of bar codes nearby, for example. Similarly, other marks, such as trademarks or any graphical feature usually associated with particular text, for example, may be considered in conjunction with the actual text captured to assist in determining the position. Furthermore, the context of the text may be utilized to assist in the correlation. For example, if the text is taken from a store sign (inferred from being present in a shopping mall and looking upward), this knowledge may be utilized to narrow the database search to store names. Similarly, if the text is from a billboard or from a sign giving directions to pedestrians, a different subset of the database may be searched. The textual information in images, their relative position to each other and distance from the camera, obtained by the
wireless device 101 may be compared to the known textual information of the building in one or more databases, such as the database 107, either stored locally or obtained from a remote server 105. The server may be accessed via a cellular network or the Internet via an access point, for example. - In an exemplary scenario, the
wireless device 101 may download a textual information map of the building 103 upon entering and losing GNSS signals; it may download the textual features of the building; or it may use satellite information and publicly available maps to estimate the locations of text related to known stores or other facilities in the building. For example, a map of a shopping center may comprise the location of stores in the facility, such that a captured image of a store's name of the appropriate size for the sign in front of the store may be utilized to accurately determine the position of the wireless device 101. - In another exemplary scenario, the
wireless device 101 may store textual information maps of buildings commonly visited by the user of the device 101. In yet another exemplary scenario, the wireless device 101 may download textual information maps of locations that the user of the wireless device enters or plans to enter in the future. For example, if a user has entered an address of a shopping mall or a sports arena into a navigation program, the wireless device may download one or more textual information maps associated with the destination or destinations. Similarly, if a calendar for the user has an entry for a sports event at a particular sports arena, the wireless device 101 may download textual information maps for that sports arena, including seating sections, for example, prior to the event. Or if a user has a flight scheduled, the wireless device 101 may download textual information maps, including terminal and gate numbers and/or airline names, for the airports to be utilized on the trip. - The
wireless device 101 may determine its position by using estimated orientation and (when available) distance relative to textual elements, and employing triangulation methods, as is done in orienteering, using the azimuth reading (e.g. provided by a compass) of several textual reference points to pinpoint one's location on a map. It may use more sophisticated trilateration calculations, as is done by GNSS devices, once distance to textual elements is estimated. The orientation of the captured text with respect to the wireless device 101 may be determined based on the orientation of the wireless device 101 in space as measured using internal sensors such as a compass and MEMS accelerometers, for example. - The
wireless device 101 may compare one or more captured textual elements, such as store signs, for example, to textual elements in stored or retrieved databases that are known to be in the vicinity of the wireless device when it lost GNSS lock upon entering the building 103. By comparing a captured word, phrase, or character to a known textual element, and finding a match, or a correlation above a threshold level, the wireless device may then calculate its location based on the known map of the building 103. This is shown in further detail in FIGS. 2A-2D.
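The bearing-based triangulation described above, in which azimuth readings toward two textual reference points with known map positions are intersected, may be sketched as follows. This is a minimal illustration rather than the claimed method; the sign coordinates and bearings are hypothetical, and bearings are assumed to be measured in radians clockwise from map north.

```python
import math

def triangulate(sign1, bearing1, sign2, bearing2):
    """Locate the device from compass bearings toward two signs whose
    map positions are known. The device lies on a ray running backward
    from each sign along the measured bearing; the intersection of the
    two rays is the device position."""
    (s1x, s1y), (s2x, s2y) = sign1, sign2
    a1, c1 = math.sin(bearing1), math.cos(bearing1)
    a2, c2 = math.sin(bearing2), math.cos(bearing2)
    det = a2 * c1 - a1 * c2          # equals sin(bearing2 - bearing1)
    if abs(det) < 1e-9:
        raise ValueError("bearings are parallel; a third sign is needed")
    dx, dy = s2x - s1x, s2y - s1y
    t1 = (dx * c2 - a2 * dy) / det   # range from device to sign 1
    return (s1x - t1 * a1, s1y - t1 * c1)

# A device at the origin would see a sign at (0, 10) due north (bearing 0)
# and a sign at (10, 0) due east (bearing pi/2); the intersection
# recovers a position at approximately the origin.
x, y = triangulate((0.0, 10.0), 0.0, (10.0, 0.0), math.pi / 2)
```

As the comment in the code notes, nearly parallel bearings make the intersection ill-conditioned, which is one reason the text suggests scanning several reference points.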
FIG. 2A is a block diagram illustrating an exemplary building with textual information, in accordance with an embodiment of the invention. Referring to FIG. 2A, there is shown a shopping mall 200 comprising stores and hallways with textual signs 201A-201E. There is also shown the wireless device 101, which may be located at location (x,y,z) and at an orientation (0, 0, rotation), and may be operable to determine its location based on a comparison of textual information obtained from images captured by a camera in the wireless device 101 and known textual information in a pre-stored or retrieved database. - The
wireless device 101 may be operable to download a map, or textual information database, comprising visible examples of text in the shopping mall 200 when the device enters the facility and loses satellite positioning signals. In another exemplary scenario, the wireless device 101 may download a textual information database when the user of the wireless device 101 activates a positioning function on the device and no satellite positioning signals are present. In yet another exemplary scenario, the wireless device 101 may have textual information maps stored internally for various buildings frequented by the user, or automatically downloaded whenever in the vicinity of a structure that attenuates satellite positioning signals. - In an exemplary scenario, once inside the
shopping mall 200, the wireless device 101 may capture one or more images, or video, of the textual information visible in the shopping mall 200. For example, the wireless device 101 may be used to scan from left to right, as illustrated by the curved arrow in FIG. 2A, capturing an image comprising the signs 201A-201E. The captured image or images and their orientation to textual elements may be compared to the known signs or other textual information in the shopping mall 200 from a pre-stored or downloaded textual information database. Similarly, the database may comprise a map of the stores, entrances, restrooms, and other features of the shopping mall 200 such that captured text that matches these elements may indicate a position of the wireless device 101. - The database may comprise basic elements such as the simple text itself of various recognizable features of the
shopping mall 200, such as stores, restrooms, and exits. Similarly, the database may comprise the 3D location of the textual elements; images or abstractions of the image of signs to allow for easy identification from any orientation relative to the element (including characteristics such as color, size, font, language, dialect, and patois of the text). Given images containing several possible textual elements, and given the orientation and distance of the textual elements, the wireless device 101 may find in the database the most likely textual elements that correspond to those in the images. Alternatively, the wireless device may utilize optical character recognition to determine the text captured in the image and compare directly to text stored in the database that is known to be within the shopping mall 200. Text extraction from captured images may be performed by the wireless device 101 and/or by the system comprising the database. - The
wireless device 101 may then compare the captured image to a database comprising the known text. For example, a processor in the wireless device 101 may determine a coarse location by determining that the text "GAP" from the image corresponds to the location of the Gap store, particularly since it is adjacent to the "Restrooms" text, which is known to be adjacent to the Gap in the database. In addition, given the known optical properties of the camera in the wireless device, e.g., focal length and zoom, and the size of the GAP text in the sign as stored in the database, the wireless device 101 may thus calculate its distance from the Gap store. - In another exemplary scenario, the
sign 201E may comprise a circular or other shaped spot or location on the floor of the shopping mall 200 with a textual identifier and a known position in the database. Accordingly, a user may capture an image of the sign 201E while standing above it and then determine an accurate location from the database. Similarly, the wireless device 101 may capture one or more images of signs or other text in the vicinity while standing atop the spot or location. In this manner, by comparing the captured text with text in the database in conjunction with the known position of the spot or location, the wireless device 101 may determine an accurate position and then be utilized to navigate through the shopping mall 200. Similar spots or locations (with or without identifying signs) with known locations may be placed throughout the shopping mall 200 to assist in accurate positioning. - In an exemplary scenario, the
wireless device 101 may then display a map for the user to navigate to a desired location, such as an emergency exit, handicap-accessible restroom, or other desired location in the shopping mall 200, for example. The wireless device 101 may then calculate its location on a constant or periodic basis, assuming the camera continues to take images as the user travels throughout the shopping mall 200. It should be noted that the invention is not limited to location determination in a shopping mall, but may be utilized in any location where GNSS positioning is not available and a database of known textual characteristics is available. For example, this may include sports arenas, train stations, airports, hospitals, or office buildings. Using the time elapsed between images, together with the estimated distances from signs or other text, velocity (speed and direction) can be estimated, thereby providing another input to the inertial guidance system in the wireless device 101, which may comprise a compass and inertial sensors, for example. - Similarly, the
wireless device 101 may utilize textual or structural elements for positioning purposes to supplement any positioning technology such as GPS, assisted-GPS, WiFi triangulation, inertial sensors/accelerometers, or a hybrid positioning technique comprising two or more of such techniques. - In another exemplary scenario, the
wireless device 101 may be operable to capture images of structural elements in the shopping center 220 and determine its location based on the known structures in the stored database. Use of captured images of structures for positioning and navigation is described further in related application U.S. patent application Ser. No. 13/309,081 (Attorney Docket No. 24772US01) filed on Dec. 1, 2011, which is herein incorporated by reference.
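The distance calculation from known camera optics discussed earlier (e.g., inferring range to the Gap storefront from focal length and the stored sign size) reduces to the pinhole-camera relation: object distance equals focal length times real height divided by imaged height. A minimal sketch follows; the focal length in pixels and the lettering height are hypothetical values, not parameters taken from this disclosure.

```python
def distance_from_sign(focal_length_px, sign_height_m, sign_height_px):
    """Estimate range to a sign via the pinhole-camera relation:
    distance = focal_length * real_height / imaged_height."""
    if sign_height_px <= 0:
        raise ValueError("sign not resolved in the image")
    return focal_length_px * sign_height_m / sign_height_px

# Hypothetical numbers: 0.5 m lettering imaged 100 px tall by a camera
# with an effective focal length of 2800 px yields a 14 m standoff.
standoff_m = distance_from_sign(2800.0, 0.5, 100.0)
```

The focal length in pixels would in practice be derived from the camera's physical focal length, zoom setting, and sensor pixel pitch, all of which the text assumes to be known device properties.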
FIG. 2B is a diagram illustrating an exemplary wireless device positioning performed inside a structure based on signs at entrances to the structure, in accordance with an embodiment of the invention. Referring to FIG. 2B, there is shown a shopping center 220 with various stores, the wireless device 101, and entrance/exit signs 221A-221D. - In an exemplary scenario, the
wireless device 101 may be operable to determine its position inside the shopping center 220 by determining the GNSS location of the device at a particular entrance, such as outside the entrance labeled with the entrance/exit sign 221A, for example. In other words, the device knows that it has entered the shopping center 220 at that entrance, since, for example, the textual information database for the shopping center 220 includes GNSS positions for each entrance to the building. This will enable the wireless device 101 to obtain an initial position for navigation within the shopping center 220 without the use of GNSS, in accordance with the textual information database for the shopping center 220, as described herein. - In another exemplary scenario, the
wireless device 101 may be operable to determine its initial position within the shopping center 220 without GNSS. The wireless device may have lost GNSS reception when it entered the shopping center 220, or may have had GNSS recently switched off or disabled, for example. - The
wireless device 101 may be operable to determine its location through the identification of signs posted at the nearest entrance. For example, the wireless device 101 may comprise a compass such that it can determine what direction the wireless device 101 is facing when capturing an image of an entrance/exit sign, such as the entrance/exit sign 221A. The locations and directional facings of the entrance/exit signs 221A-221D of the shopping center 220 may be known, i.e., stored in a textual information database. Thus, initial position can be determined by comparison of the text from the captured image of the entrance/exit sign with the direction of the device 101 when the image was captured, with the corresponding information stored in the textual information database (i.e., with textual information representative of each entrance such as "South Entrance", and what direction each entrance sign is facing). In the non-limiting example shown in FIG. 2B, since the database indicates that the shopping center 220 comprises a single south-facing entrance (i.e. labeled with entrance/exit sign 221A), the wireless device 101 may then determine its position by calculating a distance from the entrance/exit sign 221A using the captured image. In addition, if more than one entrance were to be located to the south, the textual information in the sign in the captured image may be utilized to narrow the location down to be near one particular entrance. - In another exemplary scenario, there may not yet be a fully developed textual information database for the
shopping center 220. The wireless device may thus be utilized to input data to a database for the shopping center 220. For example, the wireless device 101 may have established its position before entering the shopping center 220 and then acquired images and/or video of signs and other textual information once inside. Similarly, the wireless device 101 may obtain spatial data in conjunction with captured images and/or video utilizing a pedometer and an altimeter (if necessary).
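The entrance-sign scenario above, matching OCR'd sign text together with the device's compass heading against stored entrance records, can be sketched as follows. The entrance records, coordinates, angular tolerance, and similarity scoring are all hypothetical choices, not values from this disclosure.

```python
import difflib

# Hypothetical entrance records: sign text, sign facing (degrees), position
ENTRANCES = [
    {"text": "South Entrance", "facing_deg": 180.0, "pos": (33.0001, -117.0002)},
    {"text": "North Entrance", "facing_deg": 0.0,   "pos": (33.0009, -117.0002)},
]

def match_entrance(ocr_text, device_heading_deg, tolerance_deg=45.0):
    """Pick the entrance whose stored sign text best matches the OCR
    result and whose facing roughly opposes the device's heading (the
    user faces the sign, so the sign faces back toward the user)."""
    facing_seen = (device_heading_deg + 180.0) % 360.0
    best, best_score = None, 0.0
    for e in ENTRANCES:
        # smallest angular difference between stored and expected facing
        diff = abs((e["facing_deg"] - facing_seen + 180.0) % 360.0 - 180.0)
        if diff > tolerance_deg:
            continue
        score = difflib.SequenceMatcher(
            None, ocr_text.lower(), e["text"].lower()).ratio()
        if score > best_score:
            best, best_score = e, score
    return best

# A user heading north (0 degrees) photographing a south-facing sign
entrance = match_entrance("SOUTH ENTRANCE", device_heading_deg=0.0)
```

The compass gate prunes entrances the user could not plausibly be facing before any text comparison is made, mirroring the narrowing strategy the text describes.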
FIG. 2C is a diagram illustrating an exemplary wireless device positioning performed inside a structure based on a user's path inside the structure, in accordance with an embodiment of the invention. Referring to FIG. 2C, there is shown a shopping center 230 with various stores, the wireless device 101, the entrance/exit signs 221A-221D, and a user path 231. - In an exemplary scenario, the
wireless device 101 may be operable to determine its position within the shopping center 230 without GNSS. For example, the wireless device 101 may comprise a compass such that it can determine what direction the wireless device 101 is facing and may also comprise a pedometer for determining the distance the user of the wireless device 101 has traveled based on the number of steps taken since the last GNSS position was determined. The wireless device 101 may determine its position by calculating a distance traveled using a pedometer and an altimeter, if necessary (i.e., if the user has traveled to a different level or floor of the shopping center 230), in conjunction with the direction traveled as determined by a compass. The distance from the last known GNSS position may be determined by integrating the steps taken over the direction that the wireless device 101 traveled as determined by a compass, for example. - In another exemplary scenario, the
wireless device 101 may be operable to track its position via captured images and/or video. For example, images may be taken periodically such that the wireless device 101 may update its position by calculating its distance from captured images as compared to a textual information database. For example, the wireless device 101 may capture images of signs on storefronts, determine its distance from the storefronts, and compare the text to a textual information database with known store locations to accurately determine its position.
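The pedometer-and-compass tracking described above amounts to dead reckoning: each detected step advances the position estimate by one stride length along the current heading. A minimal sketch, assuming a fixed hypothetical stride of 0.7 m and one heading sample per step (real devices would also fold in altimeter readings and periodic image-based fixes to bound drift):

```python
import math

def dead_reckon(start_xy, headings, step_length_m=0.7):
    """Integrate pedometer steps over compass headings (radians,
    clockwise from map north) to track position after GNSS is lost."""
    x, y = start_xy
    for heading in headings:              # one heading sample per step
        x += step_length_m * math.sin(heading)  # east component
        y += step_length_m * math.cos(heading)  # north component
    return x, y

# Ten steps due east, then ten due north, from the last known fix
pos = dead_reckon((0.0, 0.0), [math.pi / 2] * 10 + [0.0] * 10)
```

Because stride length and compass error accumulate, the text's approach of periodically re-fixing position from captured sign text acts as the correction step for this estimator.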
FIG. 2D is a diagram illustrating an exemplary wireless device navigation, in accordance with an embodiment of the invention. Referring to FIG. 2D, there is shown a shopping center 250, the wireless device 101, various stores, a shoe store 251, and a desired store 253. - In an exemplary scenario, the
wireless device 101 may be operable to determine its position, and then may be able to navigate the user of the wireless device 101 to a desired location, such as the desired store 253, from its present position near the shoe store 251, all without GNSS. - The
wireless device 101 may determine its initial or present position from an image of the text of a nearby store sign, or any other signage or textual information, such as support column labels, in the vicinity of the wireless device 101 that corresponds to textual information in a stored structural database or map, as discussed above. The user of the device may then enter a desired location or destination, such as the desired store 253, into the wireless device 101 via textual or audio inputs, for example. - The
wireless device 101 may also be operable to determine the optimal path to reach the desired store 253, based on the stored map of the shopping center 250, and may display all or part of a map demonstrating the steps to be taken by the user to get to the destination. For example, the wireless device 101 may display a top-view surface map of the shopping center 250, and overlay a path or direction on the map for the user to follow, with instructions and/or arrows for navigation assistance. Alternatively (or additionally), the wireless device 101 may display a superimposed augmented reality, with instructions, arrows, and/or a direction overlaid on an image of the shopping center 250 in the direction in which the user should be facing to reach the desired destination. The wireless device 101 may then, using the compass and pedometer (and altimeter, if necessary), track and display the user's progress along the path to, or in the direction toward, the desired destination. - In addition, the
wireless device 101 may utilize textual features from captured images to track the progress of the user toward the desired store 253. For example, text from storefront signs may be used for accurate positioning. The updated positioning may enable an updated augmented reality display, such that the displayed image continues to match that of the surroundings when following the appropriate route. - In instances where the
wireless device 101 is no longer capable of capturing images, such as by being placed in a pocket, it may continue to track position via a pedometer, compass, and/or an altimeter, such that when the wireless device 101 is again retrieved to check the route, it may still have an accurate calculated position. The wireless device 101 may reestablish its position using a captured image of a sign or other textual information to verify its location with respect to the desired destination. - In another exemplary scenario, the
wireless device 101 may incorporate a conversational element to positioning. Accordingly, if the wireless device 101 determines a position but does not have a high accuracy probability, the wireless device 101 may ask the user questions to improve the positioning accuracy. For example, the wireless device may ask the user "Do you see a movie theater to your right?" If the answer does not confirm the estimated position, the wireless device 101 may request that the user capture one or more additional images for positioning purposes. Or the wireless device 101 may request that the user scan to the left or right to determine whether expected textual information is present. The conversational element may also be extended to assist in the navigational aspects of the present invention. - Similarly, if the user of the
wireless device 101 makes a stop along the way, as shown in FIG. 2D, the wireless device 101 may again establish a route by determining its position without the use of GNSS, and comparing the determined position to that of the desired store 253 in the stored map. This position may be reestablished by capturing an image of the exited storefront sign or other signs in the vicinity, for example, as described herein. This information can then be used to assist the user, via the map or superimposed augmented reality, back on the previous path or in the proper direction, or on a new path or in a new direction, to the desired destination (i.e., desired store 253).
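The route determination over the stored map described in the preceding paragraphs may be approximated with a breadth-first search over a graph of walkable corridors. The walkway graph below is a hypothetical stand-in for a stored mall map, chosen only to make the sketch self-contained.

```python
from collections import deque

# Hypothetical walkway graph: nodes are map waypoints, edges are corridors
MALL_GRAPH = {
    "shoe store": ["food court"],
    "food court": ["shoe store", "atrium"],
    "atrium": ["food court", "desired store"],
    "desired store": ["atrium"],
}

def route(graph, start, goal):
    """Breadth-first search for a fewest-hops path between waypoints."""
    frontier, seen = deque([[start]]), {start}
    while frontier:
        path = frontier.popleft()
        if path[-1] == goal:
            return path
        for nxt in graph[path[-1]]:
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(path + [nxt])
    return None  # destination unreachable from this waypoint

path = route(MALL_GRAPH, "shoe store", "desired store")
```

A production map would weight edges by corridor length and use a shortest-path algorithm such as Dijkstra's; the waypoint sequence returned here is what the display layer would render as overlaid arrows or an augmented-reality direction.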
FIG. 3 is a block diagram illustrating an exemplary positioning wireless device, in accordance with an embodiment of the invention. Referring to FIG. 3, there is shown the wireless device 101 comprising a global navigation satellite system (GNSS) module 301, a processor 303, an RF module 305, a memory 307, and a camera 309. There are also shown the access point or cellular tower 311 and the remote textual information database 313. - The
GNSS module 301 may comprise an RF receiver (Rx) path for receiving satellite signals for positioning functions. The GNSS module 301 may be operable to down-convert received RF signals to baseband and subsequently demodulate the baseband signals to obtain an accurate clock signal, such as a GPS clock signal. By receiving clock signals and ephemeris data from multiple satellites, the wireless device 101 may be operable to accurately determine its location. - The
RF module 305 may comprise one or more RF Rx and transmit (Tx) paths for communicating with cellular towers or wireless access points, for example. The RF module 305 may comprise one or more antennas, low-noise amplifiers (LNAs), power amplifiers, mixers, local oscillators, variable gain amplifiers, filters, and analog-to-digital converters (ADCs), for example. The RF module may thus be operable to receive RF signals, amplify the signals before down-converting to baseband, filter out noise signals, and convert the resulting filtered signals to digital signals for processing by the processor 303. Similarly, the RF module may be operable to convert digital baseband signals to analog signals, upconvert the analog baseband signals to RF, amplify the resulting RF signals and transmit the amplified signals via an antenna. - The
memory 307 may comprise a programmable memory module that may be operable to store software and data, for example, for the operation of the wireless device 101. Furthermore, the memory 307 may store downloaded textual information databases that may be utilized by the processor 303 to determine the device's location without a GNSS signal. - The
processor 303 may comprise a general purpose processor, such as a reduced instruction set computing (RISC) processor, for example, that may be operable to control the functions of the wireless device. For example, the processor 303 may enable the GNSS module 301 when a user indicates a desire to determine their location. Similarly, the processor may utilize images captured by the camera 309 to determine location when no GNSS signal is present. Accordingly, the processor 303 may be operable to perform optical character recognition (OCR) for extracting textual data from captured images. The processor may correlate a previously determined GNSS location to a stored or downloaded textual information map of a building or other structure that the user of the wireless device 101 has entered. - The
camera 309 may be operable to capture still and/or video images via a charge-coupled device (CCD) or complementary metal-oxide semiconductor (CMOS) imaging sensor and associated optical components, such as lenses and readout circuitry. The optical components may comprise one or more lenses with known focal lengths for determining the distance to an object that is in focus, for example. - The
camera 309 may be operable to obtain information on text within images including attributes such as position and orientation of the letters in three dimensions, the direction of the text, the absolute size of the letters, the fonts used, the language, dialect or patois in the text, the color and/or lighting of the text, and the presence of bar codes nearby, for example. - The
processor 303 and the camera 309 may thus be operable to search for the text in an image using any of the above attributes in combination, and use the perspective on the text (angle/skew/relative size of words/absolute size of the text where available) to determine the location of the wireless device 101. - The access point or
cellular tower 311 may be operable to provide wireless connectivity to the wireless device 101 via the RF module 305. The access point or cellular tower 311 may enable access to the remote textual information database 313 via the Internet or other network. - The remote
textual information database 313 may comprise data relating to textual information found within a building or other location where GNSS signals are not available. For example, the remote textual information database 313 may comprise text of all stores in a shopping center or seating sections of a sports arena with their associated locations. The wireless device 101 may download data from the remote textual information database 313 when entering a building or may download such data at some time prior to entering.
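The remote textual information database 313 could, for example, be realized as a simple relational table keyed by sign text, with positions and lettering sizes stored alongside for range estimation. The following sketch uses an in-memory SQLite table with hypothetical entries; the disclosure does not specify an actual schema.

```python
import sqlite3

# In-memory stand-in for the remote textual information database 313
db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE signs (
    text     TEXT NOT NULL,    -- sign wording, e.g. a store name
    x REAL, y REAL, z REAL,    -- sign position on the venue map (metres)
    height_m REAL              -- lettering height, for range estimation
)""")
db.executemany("INSERT INTO signs VALUES (?,?,?,?,?)", [
    ("GAP",       12.0, 40.0, 3.0, 0.5),
    ("Restrooms", 18.0, 40.0, 2.5, 0.2),
])

# Resolve recognized sign text to its mapped position
row = db.execute("SELECT x, y FROM signs WHERE text = ?", ("GAP",)).fetchone()
```

In a deployment, the device would download such a table for the venue (or the server would run the queries), and lookups would be fuzzy rather than exact to tolerate OCR noise.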
FIG. 4 is a block diagram illustrating exemplary steps in determining location without GNSS, in accordance with an embodiment of the invention. The exemplary method illustrated in FIG. 4 may, for example, share any or all functional aspects discussed previously with regard to FIGS. 1-3. Referring to FIG. 4, after start step 401, in step 403, the wireless device may determine its location via GNSS (e.g., GPS). - In
step 405, if GNSS signals are still available for positioning purposes, the exemplary steps may return to step 403 for continued GNSS positioning. If there are no GNSS signals, such as when the wireless device enters a building or other facility that attenuates GNSS signals below a threshold required for positioning purposes, the exemplary steps may proceed to step 407 where the wireless device may take one or more photo images or videos of the textual features of the surroundings. For example, this may include store signs, seating sections, exit/entrance signs, etc. - The text in the images may be extracted utilizing optical character recognition, for example, and characteristics of the text, such as its direction, size, font, color, language, dialect, and patois, may also be captured.
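The extraction-and-comparison approach tolerates imperfect OCR when matching is done by correlation against a threshold rather than by exact equality, as the earlier discussion of correlation levels suggests. A minimal sketch using a string-similarity ratio; the candidate sign names and the threshold are hypothetical.

```python
import difflib

def best_sign_match(ocr_text, candidate_signs, min_ratio=0.6):
    """Correlate noisy OCR output against sign text known to be in the
    current venue; return the best match above a confidence threshold."""
    scored = [
        (difflib.SequenceMatcher(None, ocr_text.lower(), s.lower()).ratio(), s)
        for s in candidate_signs
    ]
    score, sign = max(scored)
    return sign if score >= min_ratio else None

# An OCR misread of a restroom sign, matched against the venue's sign list
match = best_sign_match("RESTR0OMS", ["GAP", "Restrooms", "Food Court"])
```

Restricting `candidate_signs` to the venue the device is known to occupy implements the geographic narrowing of the database search described in the surrounding text.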
- In
step 409, the wireless device may determine its distance from the sign or other source of textual information using the known optical characteristics of the camera, for example. The extracted information may be uploaded to a server comprising a textual information database. In another exemplary scenario, the server may download a textual information database to the wireless device. - In step 411, the server, or alternatively the wireless device, may compare the text in the captured images to the stored and/or retrieved textual information database. For example, the wireless device may capture an image of a store sign that matches a store name and location stored in a textual information database in the wireless device or on a remote server, which may then calculate an accurate position based on the distance from the sign, as determined by the size of the text and its orientation with respect to the wireless device. The search of the database may be narrowed geographically when the wireless device is known to be in a specific area. For example, if a wireless device is known to be in a certain shopping center, words that correlate to the known stores or businesses in that shopping center would be searched with a higher priority since they would be more likely to be captured by the wireless device.
- The optical properties of the camera system in the wireless device may assist in determining an accurate location by determining a distance to imaged text. For example, if the wireless device extracted the text "Sections 220-240" from a sign to a section of a sports arena, the textual information database may comprise a seating map with seating section names/numbers and their position. This information may then be utilized to indicate to the user where they are with respect to their desired seats. Similarly, capturing more than one source of text in an image may provide accurate positioning based on the perspective seen between the different sources, such as one sign being closer than and at a particular angle from another sign. In another exemplary scenario, UPC bar codes or other bar or 2-dimensional codes may be utilized to determine a position. For example, a wireless device may take an image of or read one or more product UPC bar codes, or bar codes otherwise identifying a specific section, pillar, shelf, etc., in a large store where the locations of the products, sections and shelves are known. It can be determined from the captured image of the code or from a reading of the code that the user is in a certain section of the store (e.g. "tools"). If the user desires to find the section for "plumbing", the user can then enter text or speak into the wireless device (e.g., inputs indicating "plumbing") and be directed along a path toward the plumbing section (with navigation assistance as set forth further herein).
- The textual information map may be pre-stored in the wireless device, or may be downloaded at the time of entry into the building. For example, if a calendar in a wireless device indicates that the user will be at a sports event at a particular time, the wireless device may download a seating map with associated seat section names and numbers for the sports arena prior to arrival at the event. In another exemplary scenario, the wireless device may download a textual information map for the nearest structure as determined by the last GNSS determined position when GNSS signals were lost.
- The exemplary steps may end at
end step 413, or may continue back to step 405 if further positioning is desired. -
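The bar-code-based positioning described earlier, in which a UPC on a known shelf or pillar identifies the user's section, is at its core a lookup from a scanned code to a mapped location. A minimal sketch, with entirely hypothetical codes and coordinates:

```python
# Hypothetical mapping from scanned shelf bar codes to store sections
SECTION_BY_BARCODE = {
    "0123456789012": ("tools",    (10.0, 5.0)),
    "0987654321098": ("plumbing", (42.0, 5.0)),
}

def locate_by_barcode(code):
    """Resolve a scanned UPC into a (section name, map position) pair,
    or None if the code is not in the store's database."""
    return SECTION_BY_BARCODE.get(code)

# Scanning a code in the tools section fixes the user's position there
section, pos = locate_by_barcode("0123456789012")
```

Once the position is fixed this way, the destination section ("plumbing" in the example above) can be routed to with the same map-based navigation used for sign text.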
FIG. 5 is a block diagram illustrating exemplary steps for navigation without GNSS, in accordance with an embodiment of the invention. The exemplary method illustrated in FIG. 5 may, for example, share any or all functional aspects discussed previously with regard to FIGS. 1-4. Referring to FIG. 5, after start step 501, in step 503, the wireless device may determine its location without GNSS, but with knowledge that it is within a particular structure, such as a shopping center or sports arena, for example. The position may be determined by measuring a distance to a sign with textual information whose location is stored in a textual information database. The wireless device may take one or more photo images or videos of the textual features of the surroundings. For example, this may include store signs, entrance/exit signs, billboards, sports arena seating section signs, etc. Knowledge of the optical properties of the camera system in the wireless device may assist in determining an accurate location by determining a distance to imaged text. - In
step 505, if the user of the wireless device has entered a desired destination either textually or vocally, the exemplary steps may proceed to step 507, and if there is no desired destination, the exemplary steps may proceed to end step 513. - In
step 507, the wireless device may generate and display a navigation map comprising a top-view surface map with a path or direction overlaid thereon, or an augmented reality view with a path or direction overlaid on an image of the surroundings in the direction of the desired location. - In
step 509, the position of the wireless device may be tracked or monitored utilizing periodically captured images and/or video, and/or by tracking distance utilizing a pedometer in conjunction with a compass (and altimeter, if necessary). In this regard, the wireless device may track and display the user's progress along the path to, or in the direction toward, the desired destination. In step 511, if the wireless device is at the desired location, the exemplary steps may proceed to end step 513, and if not may continue back to step 507 for further navigation. -
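The pedometer-and-compass tracking of step 509 is a form of dead reckoning. A minimal sketch, assuming a fixed stride length and a compass heading measured in degrees clockwise from north:

```python
import math

def dead_reckon(x, y, steps, stride_m, heading_deg):
    """Advance an east/north position (metres) by a walked distance
    inferred from a step count, along the compass heading."""
    d = steps * stride_m
    theta = math.radians(heading_deg)
    # Heading 0 deg advances north (+y); 90 deg advances east (+x).
    return x + d * math.sin(theta), y + d * math.cos(theta)
```

Periodic image fixes against the textual information database would then correct the drift that inevitably accumulates between such updates.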
FIG. 6 is a flow diagram illustrating exemplary steps in an enhanced GNSS positioning, in accordance with an embodiment of the invention. The exemplary method illustrated in FIG. 6 may, for example, share any or all functional aspects discussed previously with regard to FIGS. 1-5. Referring to FIG. 6, after start step 601, in step 603, if the GNSS signal is weak or a minimal number of satellites are within the field of view due to the wireless device being surrounded by tall buildings or other attenuating structures, the exemplary steps may proceed to step 605 where the wireless device may take one or more images or video of surrounding sources of textual information. If the GNSS signal is strong, and thus the accuracy of the positioning is high, the exemplary steps may proceed to step 609 for further GNSS positioning followed by end step 611. - In
step 607, the text in the captured images or video may be compared to text in a database comprising text associated with known positions to determine a more accurate location. For example, if GNSS has determined that the wireless device is on a particular street with about 100 meter accuracy, the wireless device may take images of one or more street signs or building signs. The text of the signs and their locations may be stored in the database, enabling the wireless device to accurately determine its location, despite the weak GNSS signal, followed by end step 611. -
FIG. 7 is a flow diagram illustrating exemplary steps in image textual information extraction, in accordance with an embodiment of the invention. The exemplary method illustrated in FIG. 7 may, for example, share any or all functional aspects discussed previously with regard to FIGS. 1-6. Referring to FIG. 7, after start step 701, in step 703, the wireless device may determine what building or other attenuating structure it is within based on its last known position and, if available, velocity and/or elapsed time. In step 705, image processing algorithms, such as optical character recognition (OCR), may be utilized to extract text of captured signs or other sources of text within the building and the wireless device may establish the orientations and distances of the extracted text. The wireless device may also determine its distance from the text source. - In step 707, the wireless device may communicate images and/or the extracted text, orientations, and distances to a server comprising one or more textual information databases. In another exemplary scenario, the wireless device may upload raw images to the server for the extraction of text and its attributes.
- In
step 709, the server may search the database for geolocation based on the text and its attributes. - In step 711, based on knowledge of the position of the text found in the database, and the relative position of the camera, the precise position of the camera is determined by the server, or alternatively the wireless device. If the positions of the different captured text sources are known with a varying degree of certainty, estimators such as a Kalman filter can be used to obtain an optimal estimate of the camera location. The exemplary steps may then proceed to end
step 713. -
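For a camera that is stationary while the images are taken, the Kalman-filter fusion mentioned in step 711 reduces to inverse-variance weighting of the independent position estimates. A minimal sketch; the (x, y, variance) tuple layout is an assumption for illustration:

```python
def fuse_positions(estimates):
    """Combine independent (x, y, variance) position estimates by
    inverse-variance weighting; the returned variance shows how each
    added measurement tightens the overall estimate."""
    wx = wy = wsum = 0.0
    for x, y, var in estimates:
        w = 1.0 / var  # more certain measurements carry more weight
        wx += w * x
        wy += w * y
        wsum += w
    return wx / wsum, wy / wsum, 1.0 / wsum
```

Two equally uncertain sign-based fixes thus average, while a low-variance fix dominates a noisy one, matching the "varying degree of certainty" behavior described above.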
FIG. 8 is a flow diagram illustrating exemplary steps for accurate wireless device positioning, in accordance with an embodiment of the invention. The exemplary method illustrated in FIG. 8 may, for example, share any or all functional aspects discussed previously with regard to FIGS. 1-7. Referring to FIG. 8, after start step 801, in step 803, image processing algorithms may be utilized to extract textual information from one or more images or videos captured by the wireless device of a sign or other textual information source in a building or other attenuating structure. The wireless device may establish orientations and distances of the text with respect to the wireless device. - In
step 805, the wireless device may communicate extracted text, orientations, fonts, size, language, color, and distances of the elements to a server comprising one or more textual information databases. In step 807, the server may determine the best matches between received data and structural elements within its database, and may order the results by the quality, Qi, of the match. - Following
step 807, the exemplary steps may include one or both of steps 809 and 811. In step 809, the server may start with the highest quality matches, Mi, and use the orientation and distance to the wireless device to determine the estimated position of the wireless device, Pi. In step 811, the server may start with the highest quality match pairs, Mi and Mj, that have the largest orientation difference, and use them to triangulate the position Pij. - In
step 813, the server may find the matches Mi that have positions Pi and Pij that cluster most closely around a position Px, using a distance metric weighted by the quality of match and triangulation. In step 815, if there is a Px which has the largest cluster, including data from previous iterations, the exemplary steps may proceed to step 817, where the server may report Px as the likely location of the wireless device. The accuracy may be given by the variance of the positions Pi weighted by the quality of the match and triangulation. - If, in
step 815, there is no Px with a large cluster of results, the exemplary steps may proceed to step 821 where the wireless device may be requested to take one or more images at a different orientation, preferably recommending an orientation with prominent text and large orientation difference, before proceeding to step 803 for a renewed extraction process. - Following
step 817, in step 819, if the positioning accuracy meets a user or application requirement, the exemplary steps may proceed to end step 823, and if not, the exemplary steps may proceed to step 821 for further image capture. -
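The clustering of candidate positions in steps 813 through 817 can be sketched as a quality-weighted neighbourhood search: each candidate Pi or Pij carries the quality of the match that produced it, and the position whose neighbourhood accumulates the most weight is reported as Px. The tuple layout and fixed radius below are assumptions for illustration:

```python
import math

def best_cluster(candidates, radius):
    """candidates: list of (x, y, quality). Sum the quality of all
    candidates within `radius` of each candidate; return the centre
    and weight of the heaviest neighbourhood."""
    best, best_w = None, -1.0
    for cx, cy, _ in candidates:
        w = sum(q for x, y, q in candidates
                if math.hypot(x - cx, y - cy) <= radius)
        if w > best_w:
            best, best_w = (cx, cy), w
    return best, best_w
```

A returned weight no larger than any single candidate's quality would correspond to the "no large cluster" branch of step 815, prompting a capture at a new orientation.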
FIG. 9 is a diagram illustrating an exemplary wireless device positioning based on text perspective, in accordance with an embodiment of the invention. Referring to FIG. 9, there are shown three different perspectives, FIGS. 9A-9C, of the same text as captured by the wireless device 101. The perspective, or orientation, of the text in conjunction with the knowledge of the text characteristics and location may be utilized to determine an accurate position of the wireless device 101. - For example, in
FIG. 9A, the height of the letters of the text as viewed by the wireless device may appear to be the same on the left side as the right side, i.e., the “T”s are of the same height, or are of the same height ratio as expected if looking straight at the text. Similarly, the distance to the two outer letters may be measured to be the same by the wireless device based on focus settings and focal length of the optics in the camera of the wireless device. Accordingly, because the location of the text is stored in the textual information database, the wireless device may then accurately determine its location, since it is at an accurately measured distance from a known location. - In
FIG. 9B, the text characters may appear to be larger on the left side as compared to the right, indicating that the left side is closer than the right side to the wireless device. In this instance, the wireless device may accurately determine its position from a measurement of the relative heights of the letters and a single measurement of the distance to the letters. The observed relative heights of the outer letters, when compared to the actual height ratio as stored in the database, may provide a measurement of the lateral distance from looking straight on to the sign, as shown in FIG. 9A. - Similarly, the wireless device may determine distances to the edges of the left and right letters or between the first and last letters, based on focus settings and focal length of the optics in the camera of the wireless device. These distances, along with the known distance between the first and last letters or left and right edges of the text (i.e., the width of the sign) from the textual information database, give the three sides of the triangle formed between the wireless device and the outer letters.
- In
FIG. 9C, the text characters may appear to be larger on the right side as compared to the left, indicating that the right side is closer than the left side to the wireless device. In this instance, the wireless device may accurately determine its position from a measurement of the relative heights of the letters and a single measurement of the distance to the letters. - Similarly, the wireless device may determine distances to the left and right edges of the letters or between the first and last letters, based on focus settings and focal length of the optics in the camera of the wireless device. These distances, along with the known distance between the first and last letters or left and right edges of the text (i.e., the width of the sign) from the textual information database, give the three sides of the triangle formed between the wireless device and the outer letters.
- These two distance determination methods provide an accurate positioning capability to the wireless device without the need for GNSS, and are described further with respect to
FIGS. 10 and 11. -
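Both methods rest on the pinhole-camera relation: a letter of known physical height images with a height inversely proportional to its range. A minimal sketch of the range computation; the sensor pixel pitch is assumed known from the device's camera specifications:

```python
def distance_to_text(focal_length_mm, letter_height_m, letter_height_px, pixel_pitch_mm):
    """Pinhole range estimate: range = focal_length * real_height / imaged_height,
    with the imaged height converted from pixels to millimetres."""
    h_image_mm = letter_height_px * pixel_pitch_mm
    return focal_length_mm * letter_height_m / h_image_mm
```

For example, a 0.5 m letter imaged 100 px tall through a 4 mm lens with 2 µm pixels lies about 10 m away.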
FIG. 10 is a flow diagram illustrating an exemplary distance determination using perspective, in accordance with an embodiment of the invention. The exemplary method illustrated in FIG. 10 may, for example, share any or all functional aspects discussed previously with regard to FIGS. 1-9. Referring to FIG. 10, after start step 1001, in step 1003, the wireless device may determine in which building or other GNSS signal attenuating structure it is located, based upon its last known GNSS position, for example. In step 1005, the wireless device may capture images of a source of text, followed by step 1007, where the wireless device may determine the distances to the outer letters of the sign or other source of text. This may be determined using known focus settings and focal lengths of optics in the camera or cameras in the wireless device. - In
step 1009, the location of the source of the text and the distance between the outer or inner edges of the outer letters may be downloaded from a textual information database. The database may be stored in a remote server, or may be downloaded to the wireless device when entering the building. - In
step 1011, the wireless device may triangulate its position from the three sides of the triangle as defined by the wireless device and the outer letters of the text source, followed by end step 1013. -
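The step 1011 triangulation can be sketched directly: with the sign's outer letters placed at (0, 0) and (sign_width, 0), the two measured ranges and the known width fix the camera's in-plane position, up to the mirror ambiguity about the sign's face (resolved in practice by the sign being readable from only one side):

```python
import math

def locate_from_sign(d_left, d_right, sign_width):
    """Trilaterate the camera from its ranges to the sign's outer
    letters: left letter at the origin, right letter at (sign_width, 0)."""
    x = (d_left**2 - d_right**2 + sign_width**2) / (2 * sign_width)
    y = math.sqrt(max(d_left**2 - x**2, 0.0))  # clamp small negatives from noise
    return x, y
```

Equal ranges place the camera on the sign's perpendicular bisector, matching the straight-on case of FIG. 9A.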
FIG. 11 is a flow diagram illustrating an exemplary distance determination using perspective, in accordance with an embodiment of the invention. The exemplary method illustrated in FIG. 11 may, for example, share any or all functional aspects discussed previously with regard to FIGS. 1-10. Referring to FIG. 11, after start step 1101, in step 1103, the wireless device may determine in which building or other GNSS signal attenuating structure it is located, based upon its last known GNSS position, for example. In step 1105, the wireless device may capture images of a source of text, followed by step 1107, where the wireless device may determine a distance to the sign or other source of text and the height, or relative height, of outer letters in the sign. This may be determined using known focus settings and focal lengths of optics in the camera or cameras in the wireless device, although a relative height measurement would have fewer requirements. - In
step 1109, the location of the source of the text and the height and/or relative height of the outer letters may be downloaded from a textual information database. The database may be stored in a remote server, or may be downloaded to the wireless device when entering the building. - In
step 1111, the wireless device may triangulate its position from the measured distance between the sign and the wireless device in conjunction with the lateral distance as determined by the outer letter height ratio when compared to the actual height ratio, followed by end step 1113. - In an embodiment of the invention, a method and system may comprise capturing one or more images of one or more sources of
textual information 201A-201D, 221A-221D in the vicinity of a wireless communication device 101. Text may be extracted from the one or more sources of textual information 201A-201D, 221A-221D and a position of the wireless device 101 may be determined based on a comparison of the extracted text in the captured one or more images to text in a stored database. - An orientation of the text may be sensed in the captured one or more images relative to the
wireless device 101. An orientation of the wireless device 101 may be utilized in conjunction with the extracted text for the position determining. The orientation and the extracted text may be utilized in conjunction with determined distances from the one or more sources of textual information 201A-201D, 221A-221D for the position determining. Locations of the sources of textual information 201A-201D, 221A-221D and/or the captured one or more images may be stored in the database of textual information. - An instruction to capture one or more images in a different orientation may be received when the positioning does not meet an accuracy requirement. The database of
textual information may be stored in the wireless communication device 101. A distance from one or more of the sources of textual information 201A-201D, 221A-221D in the vicinity of the wireless communication device 101 may be determined based on known optical properties of a camera 309 in the wireless communication device 101. The optical properties may comprise focal length and/or focus setting. The determined distance may be used to determine an accurate location based on the captured one or more images. - Other embodiments of the invention may provide a non-transitory computer readable medium and/or storage medium, and/or a non-transitory machine readable medium and/or storage medium, having stored thereon a machine code and/or a computer program having at least one code section executable by a machine and/or a computer, thereby causing the machine and/or computer to perform the steps as described herein for location determination and navigation using textual information.
- Accordingly, aspects of the invention may be realized in hardware, software, firmware or a combination thereof. The invention may be realized in a centralized fashion in at least one computer system or in a distributed fashion where different elements are spread across several interconnected computer systems. Any kind of computer system or other apparatus adapted for carrying out the methods described herein is suited. A typical combination of hardware, software and firmware may be a general-purpose computer system with a computer program that, when being loaded and executed, controls the computer system such that it carries out the methods described herein.
- One embodiment of the present invention may be implemented as a board level product, as a single chip, application specific integrated circuit (ASIC), or with varying levels integrated on a single chip with other portions of the system as separate components. The degree of integration of the system will primarily be determined by speed and cost considerations. Because of the sophisticated nature of modern processors, it is possible to utilize a commercially available processor, which may be implemented external to an ASIC implementation of the present system. Alternatively, if the processor is available as an ASIC core or logic block, then the commercially available processor may be implemented as part of an ASIC device with various functions implemented as firmware.
- The present invention may also be embedded in a computer program product, which comprises all the features enabling the implementation of the methods described herein, and which when loaded in a computer system is able to carry out these methods. Computer program in the present context may mean, for example, any expression, in any language, code or notation, of a set of instructions intended to cause a system having an information processing capability to perform a particular function either directly or after either or both of the following: a) conversion to another language, code or notation; b) reproduction in a different material form. However, other meanings of computer program within the understanding of those skilled in the art are also contemplated by the present invention.
- While the invention has been described with reference to certain embodiments, it will be understood by those skilled in the art that various changes may be made and equivalents may be substituted without departing from the scope of the present invention. In addition, many modifications may be made to adapt a particular situation or material to the teachings of the present invention without departing from its scope. Therefore, it is intended that the present invention not be limited to the particular embodiments disclosed, but that the present invention will include all embodiments falling within the scope of the appended claims.
Claims (24)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/632,862 US20170359691A1 (en) | 2011-12-16 | 2017-06-26 | Method and system for location determination and navigation using textual information |
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/328,413 US9253607B2 (en) | 2011-12-16 | 2011-12-16 | Method and system for location determination and navigation using textual information |
US15/012,494 US9693198B2 (en) | 2011-12-16 | 2016-02-01 | Method and system for location determination and navigation using textual information |
US15/632,862 US20170359691A1 (en) | 2011-12-16 | 2017-06-26 | Method and system for location determination and navigation using textual information |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/012,494 Continuation US9693198B2 (en) | 2011-12-16 | 2016-02-01 | Method and system for location determination and navigation using textual information |
Publications (1)
Publication Number | Publication Date |
---|---|
US20170359691A1 true US20170359691A1 (en) | 2017-12-14 |
Family
ID=48610623
Family Applications (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/328,413 Expired - Fee Related US9253607B2 (en) | 2011-12-16 | 2011-12-16 | Method and system for location determination and navigation using textual information |
US15/012,494 Expired - Fee Related US9693198B2 (en) | 2011-12-16 | 2016-02-01 | Method and system for location determination and navigation using textual information |
US15/632,862 Abandoned US20170359691A1 (en) | 2011-12-16 | 2017-06-26 | Method and system for location determination and navigation using textual information |
Family Applications Before (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/328,413 Expired - Fee Related US9253607B2 (en) | 2011-12-16 | 2011-12-16 | Method and system for location determination and navigation using textual information |
US15/012,494 Expired - Fee Related US9693198B2 (en) | 2011-12-16 | 2016-02-01 | Method and system for location determination and navigation using textual information |
Country Status (1)
Country | Link |
---|---|
US (3) | US9253607B2 (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20190090095A1 (en) * | 2016-09-02 | 2019-03-21 | SK Planet Co., Ltd | Position providing method and apparatus therefor |
Families Citing this family (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9097536B2 (en) * | 2012-02-02 | 2015-08-04 | Mcube, Inc. | Indoor navigation techniques to calibrate/recalibrate inertial sensors and navigation processing |
US9471092B2 (en) | 2012-02-03 | 2016-10-18 | MCube Inc. | Distributed MEMS devices time synchronization methods and system |
US20130329061A1 (en) * | 2012-06-06 | 2013-12-12 | Samsung Electronics Co. Ltd. | Method and apparatus for storing image data |
KR101887422B1 (en) * | 2012-11-19 | 2018-09-10 | 삼성전자주식회사 | Apparatas and method for displaying a location information of established device in an electronic device |
EP2940426A1 (en) * | 2014-04-28 | 2015-11-04 | Alcatel Lucent | Process for guiding in a building a user connected through at least one mobile terminal to a network |
US20160044467A1 (en) * | 2014-07-12 | 2016-02-11 | Cartogram, Inc. | Method for improving the accuracy of an indoor positioning system with crowdsourced fingerprints |
KR102250947B1 (en) * | 2014-12-26 | 2021-05-12 | 삼성전자주식회사 | Method for identifying a location of electronic apparatus and electronic apparatus and operating method of server |
EP3054310A1 (en) * | 2015-02-03 | 2016-08-10 | Vodafone IP Licensing limited | Method for location estimation of a mobile device |
US10001376B1 (en) * | 2015-02-19 | 2018-06-19 | Rockwell Collins, Inc. | Aircraft position monitoring system and method |
US10395126B2 (en) | 2015-08-11 | 2019-08-27 | Honda Motor Co., Ltd. | Sign based localization |
US20170046891A1 (en) * | 2015-08-12 | 2017-02-16 | Tyco Fire & Security Gmbh | Systems and methods for location identification and tracking using a camera |
CN110019027B (en) * | 2017-07-28 | 2022-10-04 | 华为终端有限公司 | Folder naming method and terminal |
CN107782316B (en) * | 2017-11-01 | 2019-07-12 | 北京旷视科技有限公司 | The track of target object determines method, apparatus and system |
JP7272764B2 (en) * | 2018-08-28 | 2023-05-12 | 清水建設株式会社 | Information provision system |
US11011055B2 (en) * | 2019-03-21 | 2021-05-18 | Verizon Patent And Licensing Inc. | Collecting movement analytics using augmented reality |
CN110611884A (en) * | 2019-09-24 | 2019-12-24 | 马鞍山问鼎网络科技有限公司 | Intelligent navigation system for shopping in shopping mall |
CN112784174A (en) * | 2019-11-08 | 2021-05-11 | 华为技术有限公司 | Method, device and system for determining pose |
CN110972065B (en) * | 2019-12-02 | 2021-06-15 | 广东小天才科技有限公司 | Building entrance and exit association method and device, terminal equipment and storage medium |
US11703586B2 (en) | 2021-03-11 | 2023-07-18 | Qualcomm Incorporated | Position accuracy using sensor data |
US20230360258A1 (en) * | 2022-05-04 | 2023-11-09 | Qualcomm Incorporated | Estimating and transmitting objects captured by a camera |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7969311B2 (en) | 2005-12-15 | 2011-06-28 | Invisitrack, Inc. | Multi-path mitigation in rangefinding and tracking objects using reduced attenuation RF technology |
US8942483B2 (en) | 2009-09-14 | 2015-01-27 | Trimble Navigation Limited | Image-based georeferencing |
US8131118B1 (en) * | 2008-01-31 | 2012-03-06 | Google Inc. | Inferring locations from an image |
US8155387B2 (en) * | 2008-10-13 | 2012-04-10 | International Business Machines Corporation | Method and system for position determination using image deformation |
US8509488B1 (en) * | 2010-02-24 | 2013-08-13 | Qualcomm Incorporated | Image-aided positioning and navigation system |
US8938257B2 (en) * | 2011-08-19 | 2015-01-20 | Qualcomm, Incorporated | Logo detection for indoor positioning |
-
2011
- 2011-12-16 US US13/328,413 patent/US9253607B2/en not_active Expired - Fee Related
-
2016
- 2016-02-01 US US15/012,494 patent/US9693198B2/en not_active Expired - Fee Related
-
2017
- 2017-06-26 US US15/632,862 patent/US20170359691A1/en not_active Abandoned
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20190090095A1 (en) * | 2016-09-02 | 2019-03-21 | SK Planet Co., Ltd | Position providing method and apparatus therefor |
US10560809B2 (en) * | 2016-09-02 | 2020-02-11 | Sk Planet Co., Ltd. | Position providing method and apparatus therefor |
Also Published As
Publication number | Publication date |
---|---|
US20130157682A1 (en) | 2013-06-20 |
US9693198B2 (en) | 2017-06-27 |
US9253607B2 (en) | 2016-02-02 |
US20160150377A1 (en) | 2016-05-26 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9693198B2 (en) | Method and system for location determination and navigation using textual information | |
US9395188B2 (en) | Method and system for location determination and navigation using structural visual information | |
US9706350B2 (en) | Method and system for MAP generation for location and navigation with user sharing/social networking | |
US9234965B2 (en) | Indoor positioning using pressure sensors | |
US9476717B2 (en) | Simultaneous localization and mapping by using Earth's magnetic fields | |
KR101524395B1 (en) | Camera-based position location and navigation based on image processing | |
US20110306323A1 (en) | Acquisition of navigation assistance information for a mobile station | |
US20030008671A1 (en) | Method and apparatus for providing local orientation of a GPS capable wireless device | |
US20170059328A1 (en) | Generating Map Data | |
JP7045285B2 (en) | Information providing equipment, information providing method, and information providing program | |
KR102622585B1 (en) | Indoor navigation apparatus and method | |
US11519750B2 (en) | Estimating a device location based on direction signs and camera output | |
CN103557834A (en) | Dual-camera-based solid positioning method | |
US11160047B2 (en) | Determining motion information associated with a mobile device | |
CN105043375A (en) | Navigation method, navigation system and corresponding mobile terminal | |
Berkovich et al. | Coursa venue: Indoor navigation platform using fusion of inertial sensors with magnetic and radio fingerprinting | |
CN113739784A (en) | Positioning method, user equipment, storage medium and electronic equipment | |
KR20160119880A (en) | information service method for rock climbing root | |
US11703586B2 (en) | Position accuracy using sensor data | |
CN116931021A (en) | Positioning method and device, equipment, carrier and storage medium | |
KR20090083815A (en) | The geographical information guidance system and driving method thereof |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: MAXLINEAR, INC., CALIFORNIA Free format text: TERMINATION AND RELEASE OF SECURITY INTEREST IN CERTAIN PATENTS;ASSIGNOR:JPMORGAN CHASE BANK, N.A., AS COLLATERAL AGENT;REEL/FRAME:046737/0594 Effective date: 20180807 Owner name: ENTROPIC COMMUNICATIONS, LLC (F/K/A ENTROPIC COMMU Free format text: TERMINATION AND RELEASE OF SECURITY INTEREST IN CERTAIN PATENTS;ASSIGNOR:JPMORGAN CHASE BANK, N.A., AS COLLATERAL AGENT;REEL/FRAME:046737/0594 Effective date: 20180807 Owner name: EXAR CORPORATION, CALIFORNIA Free format text: TERMINATION AND RELEASE OF SECURITY INTEREST IN CERTAIN PATENTS;ASSIGNOR:JPMORGAN CHASE BANK, N.A., AS COLLATERAL AGENT;REEL/FRAME:046737/0594 Effective date: 20180807 |
|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO PAY ISSUE FEE |
|
AS | Assignment |
Owner name: RADIOXIO, LLC, MINNESOTA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:MAXLINEAR, INC.;REEL/FRAME:047264/0199 Effective date: 20180803 |