GB2510586A - Navigational aid for visually impaired people - Google Patents

Navigational aid for visually impaired people

Info

Publication number
GB2510586A
GB2510586A GB1302195.1A GB201302195A
Authority
GB
United Kingdom
Prior art keywords
location
user
image
stored
salient features
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB1302195.1A
Other versions
GB201302195D0 (en)
Inventor
Alastair James Buchanan
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Spiral Scratch Ltd
Original Assignee
Spiral Scratch Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Spiral Scratch Ltd filed Critical Spiral Scratch Ltd
Priority to GB1302195.1A priority Critical patent/GB2510586A/en
Publication of GB201302195D0 publication Critical patent/GB201302195D0/en
Priority to PCT/GB2014/000047 priority patent/WO2014122420A1/en
Publication of GB2510586A publication Critical patent/GB2510586A/en
Withdrawn legal-status Critical Current

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 - Scenes; Scene-specific elements
    • G06V20/20 - Scenes; Scene-specific elements in augmented reality scenes

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Navigation (AREA)

Abstract

A method for guiding a visually impaired person to a specific location comprises comparing a real-time image 12 of a location captured by a user-carried device with a stored image 11 of the location. The real-time image is processed to identify salient features 14b, 15b thereof, which are matched to features 14a, 15a in the stored image 11, and an output is provided to the user indicative of the user's location relative to the salient features. The location of the device may initially be identified via GPS, and this data may be used to select an appropriate stored image. A bank of stored images relating to a specific location or building may be accessed by the device via a local area network. The device may also provide obstacle detection, to detect obstacles that are not present in the stored image. An apparatus 13 for carrying out the method, comprising a mobile phone or tablet appropriately equipped, e.g. with a tactile map and/or audiovisual guidance provision, is also disclosed.

Description

Aid for Visually Impaired People

This invention relates to an aid for visually impaired people.
People with visual impairment have many problems navigating their environment, problems that are particularly severe when that environment is unfamiliar. GPS-based route-finding systems exist on most mobile phones today and can help guide the user to a pre-selected destination. However, GPS has an accuracy of around ten metres and does not work indoors. While these systems can be useful, their performance limitations mean that they can only guide the user to the general vicinity of the destination (the location of a building) rather than the precise location required (the door to the building). The same problem exists in large public buildings such as shopping centres, airports or railway stations.
In an unfamiliar environment a visually impaired person still requires assistance from another person to complete their journey over the last ten metres.
The present invention provides means by which visually impaired people may be more accurately guided to specific places.
The invention comprises a method for guiding a visually impaired person to a specific location comprising providing a stored image of the location and forming a real time image of the location in a user-carried device, processing the real time image to pick out salient features thereof, matching the salient features to features in the stored image, and providing an output to the user indicative of the user's location relative to the salient features.
A set of objectives, such as doors, cash machines or help desks, that a sight-impaired user would wish to navigate to may be tagged in a stored image and assigned precise, for example metre or sub-metre, coordinates. The real time image can be processed to identify the tagged objectives and locate the user in relation to them.
The images may be matched by extracting salient features from each and identifying those that are similar. Salient features may comprise edges, corners and features detected by specialist feature detectors such as the FAST corner detector, SIFT (Scale Invariant Feature Transform), SURF (Speeded Up Robust Features), and ORB (Oriented FAST and Rotated BRIEF, a very fast binary descriptor).
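By way of illustration only, the following sketch shows what such a matching step might look like using OpenCV's ORB detector, one of the detectors named above. The helper name, file paths and parameter values are assumptions for illustration, not part of the specification.

```python
# Illustrative sketch of salient-feature matching with ORB
# (assumed OpenCV realisation, not the patented code itself).
import cv2

def match_salient_features(stored_path, realtime_path, max_matches=50):
    stored = cv2.imread(stored_path, cv2.IMREAD_GRAYSCALE)
    realtime = cv2.imread(realtime_path, cv2.IMREAD_GRAYSCALE)

    orb = cv2.ORB_create(nfeatures=1000)  # FAST corners + rotated BRIEF descriptors
    kp_stored, des_stored = orb.detectAndCompute(stored, None)
    kp_real, des_real = orb.detectAndCompute(realtime, None)

    # Hamming distance suits ORB's binary descriptors; cross-checking keeps
    # only mutually best matches, discarding many false correspondences.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des_stored, des_real), key=lambda m: m.distance)
    return kp_stored, kp_real, matches[:max_matches]
```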
The stored image may be a 360° panorama.
Once a real time image is matched to a stored image, it may be scaled and overlaid whereby to allow the user to align the real time image with the stored image by orientating the device, whereupon instructions may be given as a direction relative to the device orientation. The instructions may also comprise distance information.
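One possible realisation of this scale-and-overlay step, sketched below under the assumption that the match_salient_features helper above is available: a RANSAC homography maps the matched points between the two views, and its rotation component indicates how far the user must turn the device to bring the images into alignment.

```python
# Sketch: estimate the rotation and scale between the real time and stored
# views from matched features (assumed approach, illustrative only).
import math
import cv2
import numpy as np

def alignment_offset(kp_stored, kp_real, matches):
    # Needs at least four matched point pairs for a homography.
    src = np.float32([kp_real[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp_stored[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)

    # For a roughly fronto-parallel scene the upper-left 2x2 block of H is
    # approximately a scaled rotation; atan2 of its first column recovers the
    # in-plane rotation, and its norm the relative scale.
    angle_deg = math.degrees(math.atan2(H[1, 0], H[0, 0]))
    scale = math.hypot(H[0, 0], H[1, 0])
    return angle_deg, scale
```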
Tagged objectives may be provided in a list from which a desired objective may be selected. The list may be provided as a visual display on a screen, which may be a touch screen, the desired location being selected by tapping the screen; as a spoken list, from which a selection may be made by a spoken command or by a keyboard or keypad button; or as a tactile display, for example a Braille screen.
The method may be carried out using a mobile communication device such as a mobile phone or tablet. The device may be GPS-enabled. The GPS facility may assist user guidance to the general location and may select an appropriate stored image from a bank of stored images. The selection may be based on the GPS location. The bank of stored images may comprise an existing geo-referenced image database such as Google Street View, or may comprise a specially assembled bank that may be made available over a network in a location such as a shopping mall or a factory, warehouse or office building.
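A minimal sketch of how the GPS-based selection might work, assuming each stored image carries the coordinates of its capture point; the ImageRecord structure is a hypothetical illustration, not part of the specification.

```python
# Sketch: choose the stored image captured nearest the device's GPS fix.
import math
from dataclasses import dataclass

@dataclass
class ImageRecord:          # hypothetical record in the image bank
    path: str
    lat: float              # capture latitude, degrees
    lon: float              # capture longitude, degrees

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two WGS84 points."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlam / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def nearest_stored_image(bank, fix_lat, fix_lon):
    return min(bank, key=lambda rec: haversine_m(rec.lat, rec.lon, fix_lat, fix_lon))
```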
For a limited site, the database may be stored on the device itself.
When a match is made, the position and orientation of the device, and hence of the user, is calculated, and a set of instructions generated to guide the user towards a chosen destination.
The instructions may be presented with reference to a tactile map, which may be configured in a Braille reader; by screen magnification, for the partially sighted; or by voice instructions and sonic pitch and/or volume indications, which may be binaural. These, or at least one of them, may be supported on existing commercial mobile devices.
Provision may be made for obstacle detection, to warn of obstacles not present in the stored images.
Depth perception may be provided to determine distances. This may be effected by a 'depth from defocus' system or by comparing the size of a real time image to the size of the stored image.
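The size-comparison variant admits a one-line pinhole-camera estimate. The sketch below assumes each tagged feature is stored with its apparent pixel size and the distance at which the stored image was taken, and that the same camera and zoom are used for both images; these are illustrative assumptions.

```python
# Sketch: under a pinhole model, apparent size is inversely proportional to
# distance, so the size ratio between stored and real time views gives range.
def distance_from_size(stored_px, stored_dist_m, realtime_px):
    # A feature stored_px pixels tall at stored_dist_m metres now appears
    # realtime_px pixels tall; size * distance is constant, so solve for range.
    return stored_dist_m * stored_px / realtime_px

# e.g. a doorway that spanned 200 px at 5 m and now spans 400 px
# is about distance_from_size(200, 5.0, 400) = 2.5 m away.
```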
The method may be enabled on a mobile communications device by downloading an app.
An auxiliary mirror may be attached to the communications device so that its camera lens images the scene ahead of the user when the device is held horizontally for orientation purposes.
Where an accelerometer, gyro-sensor and/or digital compass are present on the device, their functions may be integrated into the method to improve accuracy.
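The specification leaves the integration method open. One common possibility, sketched here purely as an assumption, is a complementary filter that steadies the heading estimate between image matches by trusting the smooth but drifting gyro over short intervals and the noisy but absolute compass over long ones.

```python
# Sketch: complementary filter fusing gyro rate and compass heading
# (assumed technique; the specification does not prescribe one).
def fuse_heading(prev_heading_deg, gyro_rate_dps, compass_deg, dt_s, gain=0.02):
    predicted = prev_heading_deg + gyro_rate_dps * dt_s   # short-term gyro estimate
    # Wrap the compass correction into [-180, 180) before blending, so a
    # heading near the 0/360 seam is not pulled the long way round.
    error = (compass_deg - predicted + 180.0) % 360.0 - 180.0
    return (predicted + gain * error) % 360.0
```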
The invention also comprises apparatus for guiding a visually impaired person to a specific location comprising a user-carried device containing a stored image of the location and an imaging device capturing a real time image of the location, and processing means programmed to process the real time image to pick out salient features thereof, to match the salient features to features in the stored image, and to provide an output to the user indicative of the user's location relative to the salient features.
The apparatus may comprise a communications device such as a mobile phone or tablet having a camera, an area of memory assigned to holding a stored image of a location, and processing instructions to compare the stored image with a real time image generated by the camera, and output means to provide an output indicative of the user's location relative to the stored image location.
The output means may comprise a Braille reader producing a tactile map. A program module stored in device memory may provide screen magnification for the partially sighted. A program module stored in memory may provide voice instructions and/or sonic pitch and/or volume indications, which may be binaural.
A method and apparatus for guiding a visually impaired person according to the invention will now be described with reference to the accompanying drawings, in which:
Figure 1 is a flowchart of steps in navigating to a location;
Figure 2 shows a stored image and a real-time image; and
Figure 3 is a view of a mirror attachment for a mobile device.
The drawings illustrate a method for guiding a visually impaired person to a specific location comprising providing a stored image 11 (Figure 2) of the location and forming a real time image 12 of the location in a user-carried device 13, processing the real time image 12 to pick out salient features 14a, 15a thereof, matching the salient features 14a, 15a to features 14b, 15b in the stored image 11, and providing an output to the user indicative of the user's location relative to the salient features 14a, 15a. Salient features may include corners, lines and so forth, as may be recognised by software image recognition routines. The real-time image and the stored image may be virtual images, one or other or neither of which may be actually displayed on a screen of the device 13.
As illustrated, the images 11, 12 are not identical, but are clearly of the same location, taken from different viewpoints. Software can calculate the precise position of the device from known coordinates of the salient features, and can align and resize the images.
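Where the tagged salient features carry the precise metre-scale coordinates mentioned earlier, this position calculation can be posed as a perspective-n-point problem. The sketch below uses OpenCV's solver and assumes the camera intrinsic matrix K is known; neither the formulation nor the intrinsics are prescribed by the specification.

```python
# Sketch: recover the device position from matched features with known
# world coordinates (assumed PnP formulation, illustrative only).
import cv2
import numpy as np

def locate_device(world_pts, image_pts, K):
    """world_pts: Nx3 tagged feature coordinates in metres;
    image_pts: Nx2 pixel positions of the same features in the real time image;
    K: 3x3 camera intrinsic matrix."""
    ok, rvec, tvec = cv2.solvePnP(
        np.asarray(world_pts, dtype=np.float64),
        np.asarray(image_pts, dtype=np.float64),
        K, None)
    if not ok:
        return None
    R, _ = cv2.Rodrigues(rvec)
    return (-R.T @ tvec).ravel()  # camera (device) centre in world coordinates
```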
The flowchart of Figure 1 illustrates the principal steps of a method in accordance with the invention. At Step I, the user will navigate to a target area using a GPS facility on a mobile phone serving as the device 13, or in any other convenient way, as simply as by asking a taxi driver to take him there.
At Step II, he will use the camera on his mobile phone to image the area, and may rotate so as to form a 360° panoramic image.
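Assembling the rotating captures into a panorama could, for example, use OpenCV's high-level stitcher; a minimal sketch under that assumption (frame capture itself is omitted):

```python
# Sketch: assemble a panorama from frames captured while the user rotates
# (assumed OpenCV stitcher; not specified by the patent).
import cv2

def build_panorama(frames):
    """frames: list of BGR images captured as the user rotates in place."""
    stitcher = cv2.Stitcher.create(cv2.Stitcher_PANORAMA)
    status, panorama = stitcher.stitch(frames)
    if status != cv2.Stitcher_OK:
        return None   # not enough overlap or features to stitch
    return panorama
```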
At Step III, the user will call up a stored image of the target location, also a 360° panorama. This may be called up from a cloud-based database in accordance with the GPS location as known to the device, or it may be an image already stored in the device as being one of a location already, or perhaps frequently, visited by the user.
At Step IV, software in the device processes the real time image to pick out salient features, compares them with salient features of the stored image, and resizes and aligns the images. This fixes the precise location and orientation of the device.
At Step V, the user can call up a menu of objectives in the location, such as a cash machine, a bank door, an estate agent's or a restaurant, and select one. If the user is partially sighted, the menu may be displayed on the device's screen and selected by a touch screen facility, or it may be a Braille screen, or the menu may be an audio list, from which an item may be selected by a verbal instruction, or the name of an item may be spoken to the device if there is a speech recognition facility.
At Step VI, the device displays a direction and distance to the selected item, enabling the user to navigate precisely to it. The direction and distance may, again, be a visual display, perhaps magnified and brightened for a partially sighted user, or may be a tactile display with a direction arrow standing out in relief and a Braille number of metres, or it may be an audio display, as appropriate to the user's needs. The display may be updated as the user makes his way to the target.
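Given the position and orientation fixed at Step IV and the tagged coordinates of the chosen objective, the Step VI output reduces to a range and a relative bearing. The planar local frame below is an assumption for illustration only.

```python
# Sketch: direction and distance from the located user to a tagged objective
# (assumed planar local frame, x east / y north, heading clockwise from north).
import math

def guidance(user_xy, user_heading_deg, target_xy):
    dx = target_xy[0] - user_xy[0]
    dy = target_xy[1] - user_xy[1]
    distance_m = math.hypot(dx, dy)
    bearing_deg = math.degrees(math.atan2(dx, dy))               # absolute bearing
    relative_deg = (bearing_deg - user_heading_deg + 180.0) % 360.0 - 180.0
    return distance_m, relative_deg   # negative = turn left, positive = turn right

# e.g. a cash machine 4 m to the north-east of a user facing north:
# guidance((0.0, 0.0), 0.0, (2.8, 2.8)) -> (about 3.96 m, +45 degrees)
```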
Figure 3 illustrates a screen display of a menu of objectives, for Step V. Figure 4 illustrates a direction arrow on the screen as at Step VI.
As it will be desirable to have the screen horizontal for a screen display, a mirror 51, as seen in Figure 5, may be clipped to the device 13 to direct the forward scene into the camera 52.
A 'smart' mobile phone that is equipped with a camera (particularly a high-resolution camera, as some are), GPS and web access may thus be transformed, by downloading software, into a micronavigation instrument enabling the visually impaired confidently to navigate, say, the last ten metres of a journey once other services such as GPS have got them to within that distance.
Where GPS may be unavailable, for example inside buildings, a local navigation system may be provided, for example by the managers of a shopping mall. This could include a local street level view such as is provided by Google Street View.
While the arrangements may be kept simple, so as to be inexpensive, various refinements may be added, such for example as obstacle detection, in which a temporary obstacle, such as a waste bin or chair, may be identified by comparing its appearance in the real time image against a database of such objects, and tagged as such in the real time image or indicated by an audio identifying signal.
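One simple way to realise this refinement, sketched under the assumption of a small grayscale template database; the labels and the 0.8 threshold are illustrative, and the specification does not prescribe a matching method.

```python
# Sketch: flag temporary obstacles by normalised template matching against a
# small database of known objects (assumed approach, illustrative only).
import cv2

def detect_obstacles(realtime_gray, templates, threshold=0.8):
    """templates: dict mapping a label ('waste bin', 'chair', ...) to a
    grayscale template image no larger than realtime_gray."""
    hits = []
    for label, tmpl in templates.items():
        scores = cv2.matchTemplate(realtime_gray, tmpl, cv2.TM_CCOEFF_NORMED)
        _, best, _, top_left = cv2.minMaxLoc(scores)
        if best >= threshold:
            hits.append((label, top_left, best))  # where and how confidently
    return hits
```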

Claims (27)

  1. A method for guiding a visually impaired person to a specific location comprising providing a stored image of the location and forming a real time image of the location in a user-carried device, processing the real time image to pick out salient features thereof, matching the salient features to features in the stored image, and providing an output to the user indicative of the user's location relative to the salient features.
  2. A method according to claim 1, carried out using a mobile communication device such as a mobile phone or tablet.
  3. A method according to claim 1 or claim 2, in which the device is GPS-enabled.
  4. A method according to claim 3, in which the GPS facility assists user guidance to the general location and selects an appropriate stored image from a bank of stored images.
  5. A method according to claim 4, in which the bank of stored images comprises an existing geo-referenced image database such as Google Street View.
  6. A method according to claim 4, in which the bank of stored images comprises a specially assembled bank made available for a location such as a shopping mall or a factory, warehouse or office building.
  7. A method according to claim 6, in which the bank is made available over a local area network.
  8. A method according to claim 6, in which, for a limited site, an image or a database of images is stored on the device itself.
  9. A method according to any one of claims 1 to 8, in which, when a match is made, the position and orientation of the device, and hence of the user, is calculated, and a set of instructions generated to guide the user towards a chosen destination.
  10. A method according to claim 9, in which the instructions are displayed with reference to a tactile map.
  12. A method according to claim 9, in which the tactile map is displayed on a Braille reader.
  13. A method according to claim 9, in which the instructions are displayed as a simplified, magnified image on a video screen.
  14. A method according to claim 8, in which the instructions comprise voice instructions.
  15. A method according to any one of claims 1 to 14, comprising provision for obstacle detection, to warn of obstacles not present in the stored images.
  16. A method according to any one of claims 1 to 15, in which depth perception is provided to determine distances.
  17. A method according to claim 16, in which depth perception is effected by a depth from defocus system.
  18. A method according to claim 16, in which depth perception is effected by comparing the size of a real time image to the size of the stored image.
  19. A method according to any one of claims 1 to 18, enabled on a mobile communications device by downloading an 'app'.
  20. Apparatus for guiding a visually impaired person to a specific location comprising a user-carried device containing a stored image of the location and an imaging device capturing a real time image of the location and processing means programmed to process the real time image to pick out salient features thereof, to match the salient features to features in the stored image, and to provide an output to the user indicative of the user's location relative to the salient features.
  21. Apparatus according to claim 20, in which the user-carried device comprises a communications device such as a mobile phone or tablet having a camera.
  22. Apparatus according to claim 21, in which an area of memory is assigned to holding a stored image of a location and processing instructions to compare the stored image with a real time image generated by the camera.
  23. Apparatus according to any one of claims 20 to 22, comprising output means comprising a tactile map.
  24. Apparatus according to claim 23, in which the tactile map is comprised in a Braille reader.
  25. Apparatus according to any one of claims 20 to 24, comprising a program module stored in device memory.
  26. Apparatus according to claim 25, in which the program module provides screen magnification for the partially sighted.
  27. Apparatus according to claim 25, in which the program module provides voice instructions.
GB1302195.1A 2013-02-07 2013-02-07 Navigational aid for visually impaired people Withdrawn GB2510586A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
GB1302195.1A GB2510586A (en) 2013-02-07 2013-02-07 Navigational aid for visually impaired people
PCT/GB2014/000047 WO2014122420A1 (en) 2013-02-07 2014-02-07 Aid for visually impaired people

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB1302195.1A GB2510586A (en) 2013-02-07 2013-02-07 Navigational aid for visually impaired people

Publications (2)

Publication Number Publication Date
GB201302195D0 GB201302195D0 (en) 2013-03-27
GB2510586A true GB2510586A (en) 2014-08-13

Family

ID=47998775

Family Applications (1)

Application Number Title Priority Date Filing Date
GB1302195.1A Withdrawn GB2510586A (en) 2013-02-07 2013-02-07 Navigational aid for visually impaired people

Country Status (2)

Country Link
GB (1) GB2510586A (en)
WO (1) WO2014122420A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2021179036A1 (en) * 2020-03-10 2021-09-16 Sensen Networks Group Pty Ltd Systems and methods for image-based location determination

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8131118B1 (en) * 2008-01-31 2012-03-06 Google Inc. Inferring locations from an image
US20120130762A1 (en) * 2010-11-18 2012-05-24 Navteq North America, Llc Building directory aided navigation

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20120140486A (en) * 2011-06-21 2012-12-31 삼성전자주식회사 Apparatus and method for providing guiding service in portable terminal


Also Published As

Publication number Publication date
GB201302195D0 (en) 2013-03-27
WO2014122420A1 (en) 2014-08-14

Similar Documents

Publication Publication Date Title
US11692842B2 (en) Augmented reality maps
JP6763448B2 (en) Visually enhanced navigation
EP2672232B1 (en) Method for Providing Navigation Information and Server
CN104848863B (en) Generate the amplification view of location of interest
US8526677B1 (en) Stereoscopic camera with haptic feedback for object and location detection
CN117029868A (en) System and method for using visual landmarks in initial navigation
JP2007263835A (en) Car navigation apparatus
CN102915310A (en) Generating method of electronic maps, navigation method and navigation device
EP3116212B1 (en) Transition from display of first camera information to display of second camera information
EP3827222B1 (en) Method and device for navigating two or more users to a meeting location
KR102222102B1 (en) An augment reality navigation system and method of route guidance of an augment reality navigation system
US9418284B1 (en) Method, system and computer program for locating mobile devices based on imaging
KR101568741B1 (en) Information System based on mobile augmented reality
Fernandes et al. Integrating computer vision object recognition with location based services for the blind
JP2006285546A (en) Information providing system, database server, portable communication terminal
JP2008157820A (en) Device for providing sign information
GB2510586A (en) Navigational aid for visually impaired people
KR20180002249A (en) Intersection guidance method using the crossroads facilities in a navigation
KR20050013000A (en) Apparatus and method for guiding route of vehicle using three-dimensional information
JP5460252B2 (en) Navigation device and navigation method
US20230050644A1 (en) Generating Computer Augmented Maps from Physical Maps
KR102336775B1 (en) Apparatus and method for generating thema route
KR20110135516A (en) Navigation apparatus
KR20160060278A (en) Apparatus and method for displaying buliding data around the junction when driving in an alley
KR20180019860A (en) Navigation supplying apparatus and method based on street view for mobile

Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)