WO2023247984A1 - Device and method for assisting visually impaired persons in public spaces
Classifications
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B21/00—Teaching, or communicating with, the blind, deaf or mute
- G09B21/001—Teaching or communicating with blind persons
- G09B21/006—Teaching or communicating with blind persons using audible presentation of the information
Abstract
A device, implemented as a pair of glasses, combines spatial navigation with the recognition and evaluation of objects, symbols and labels and with their three-dimensional acoustic output.
Description
Device and method for assisting visually impaired persons in public spaces.
BACKGROUND OF THE INVENTION
Several mobility aids already exist that enable people with visual impairments to move more safely in their environment. Such mobility aids include canes for the blind, visual aids, guide dogs and the like.
A cane for the blind, for example, provides tactile or auditory feedback but demands considerable attention from the user. Spatial location systems can indicate the location of obstacles via acoustic or haptic signals.
Recognition of lettering, whether printed text or even Braille, is virtually impossible for many persons with visual impairments.
OBJECT OF THE INVENTION / PROBLEM TO BE SOLVED
The object of the invention is to improve the spatial orientation of visually impaired persons without requiring the use of their arms or hands.
A further object of the invention is to recognize, evaluate and output objects, symbols and inscriptions at both close and long range on a case-by-case basis.
SOLUTION
These objects are achieved by smart glasses equipped with sensors for distance, infrared, acceleration, compass direction and satellite navigation signals.
In addition, the glasses carry microphones and loudspeakers, cameras (daylight and infrared), infrared LEDs, text and symbol recognition, and a signal and data processing system.
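As a purely illustrative sketch of how the inputs listed above could be bundled per time step, the following data structure groups the sensor readings; all field names and units are assumptions made for illustration and are not taken from the specification.

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class SensorFrame:
    """One time-stamped bundle of raw inputs from the glasses (illustrative only)."""
    timestamp_s: float                     # seconds since epoch
    distances_m: List[float]               # readings from the distance sensors
    accel_ms2: Tuple[float, float, float]  # acceleration sensor, x/y/z
    compass_deg: float                     # magnetic heading of the head, 0..360
    gps: Optional[Tuple[float, float]]     # (latitude, longitude), None indoors
    camera_frame: Optional[bytes]          # daylight camera image, if available
    ir_frame: Optional[bytes]              # infrared detector image, if available
    audio_chunk: Optional[bytes]           # microphone samples for ambient-noise checks
```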
The data from the depth sensors are used to generate a spatial map, the correctness of which is verified using the compass, character recognition and, outdoors, position data and the position of the sun.
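One conceivable way to implement this verification step is to compare the heading implied by the spatial map against each independently available reference: the compass, a sun-derived heading and a heading implied by a recognized sign. The following is a minimal sketch under that assumption; the function names and the tolerance value are illustrative and not part of the specification.

```python
from typing import Optional

def circular_diff_deg(a: float, b: float) -> float:
    """Smallest absolute difference between two compass headings, in degrees."""
    d = abs(a - b) % 360.0
    return min(d, 360.0 - d)

def map_orientation_plausible(map_heading_deg: float,
                              compass_deg: float,
                              sun_heading_deg: Optional[float] = None,
                              sign_heading_deg: Optional[float] = None,
                              tolerance_deg: float = 15.0) -> bool:
    """Accept the spatial map only if every available heading reference agrees within tolerance."""
    references = [compass_deg]
    if sun_heading_deg is not None:    # outdoors: heading derived from the sun position
        references.append(sun_heading_deg)
    if sign_heading_deg is not None:   # heading implied by a recognized street sign
        references.append(sign_heading_deg)
    return all(circular_diff_deg(map_heading_deg, r) <= tolerance_deg for r in references)
```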
An acoustic map is generated from these data and is output via loudspeakers (usually headphones) according to the orientation of the head.
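A minimal sketch of the head-relative output follows, assuming simple constant-power stereo panning toward the left or right earpiece; the actual device could use richer spatial audio, so this is an illustrative assumption only.

```python
import math

def relative_bearing_deg(object_bearing_deg: float, head_yaw_deg: float) -> float:
    """Bearing of an object relative to the direction the head points, in (-180, 180]."""
    return (object_bearing_deg - head_yaw_deg + 180.0) % 360.0 - 180.0

def stereo_gains(rel_bearing_deg: float):
    """Constant-power left/right gains: an object to the right sounds louder in the right ear."""
    pan = max(-1.0, min(1.0, rel_bearing_deg / 90.0))  # clamp to +/- 90 degrees
    angle = (pan + 1.0) * math.pi / 4.0                # map to 0 .. pi/2
    return math.cos(angle), math.sin(angle)            # (left_gain, right_gain)
```

For example, an object 20 degrees to the right of the gaze direction would be rendered somewhat louder on the right earpiece, and the balance shifts continuously as the head turns.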
Objects that carry a label or are recognized as standardized symbols are stored in the acoustic map and read out if this serves guidance or orientation. This applies in particular to street signs, warning signs, target objects or information signs pointing to them, traffic lights, and points of interest (POIs).
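The rule of reading an object out only if this serves guidance or orientation could, for instance, be approximated by a category-and-range filter such as the one below; the category identifiers and the range threshold are assumptions for illustration.

```python
# Categories the description names as guidance-relevant (illustrative identifiers).
GUIDANCE_CATEGORIES = {
    "street_sign", "warning_sign", "traffic_light",
    "target_object", "information_sign", "poi",
}

def should_announce(category: str, distance_m: float, max_range_m: float = 50.0) -> bool:
    """Announce a recognized label or symbol only if it belongs to a
    guidance-relevant category and lies within the announcement range."""
    return category in GUIDANCE_CATEGORIES and distance_m <= max_range_m
```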
PRIOR ART
A variety of spectacle-like devices, and methods incorporating them, already exist that are designed to improve the spatial orientation of the visually impaired.
Some use sensors to detect the environment and signal the wearer, in some cases converting the detected obstacles or clearances into acoustic signals, for example US 8797386, JP6765306B2, US 9140554 B2, US20130216093A1, EP1293184B1, US8594935B2.
Furthermore, the global positioning system (GPS) and cellular networks (3G) are also taken into account, for example in CN102186141B.
Systems with tactile signaling are also known, for example US20070016425A1.
None of the known systems integrates the recognition of labels or symbols, such as road signs, and uses them both to verify the position and to improve the orientation of the person using the system.
DETAILED DESCRIPTION WITH REFERENCE TO THE DRAWINGS
Fig. 1 shows the device, consisting of a spectacle frame (1), left spectacle temple (2), right spectacle temple (3) and bridge plate (4) with two batteries (5).
Each of the two spectacle temples (2, 3) contains a battery (5), a processing, communication, signaling and storage unit (6), an acceleration sensor (7), an infrared detector (8) and, attached to the temple, a headphone (9).
In the frame of the glasses there are six infrared LEDs (10), a camera (11), an infrared detector (8), distance sensors (12) and microphones (13).
Fig. 2 describes the method: a three-dimensional acoustic map (14) is formed from the data of the distance sensors (12), the exactly aligned position (15) and recognized symbols and labels (16), and is output via the headphones (9).
The exactly aligned position (15) results from the orientation of the head combined with map data (20), the sensor data of the compass (17), the GPS data (18) and the position of the sun (22), the latter detected by the infrared detector (8) and calculated from known orbital data (23), GPS position data (18) and the time (19).
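To make the sun-based verification concrete, the following sketch computes the solar azimuth from GPS position data (18) and time (19) using standard low-precision solar-position formulas; comparing this value with the sun direction observed by the infrared detector (8) then yields an independent heading reference. The implementation details are assumptions chosen for illustration and are not prescribed by the specification.

```python
import math
from datetime import datetime, timezone

def solar_azimuth_deg(lat_deg: float, lon_deg: float, when_utc: datetime) -> float:
    """Approximate solar azimuth (degrees clockwise from true north) from GPS
    position and a timezone-aware UTC datetime, using low-precision formulas
    of the type found in the Astronomical Almanac (ample accuracy for a heading check)."""
    # Days since J2000.0 (2000-01-01 12:00 UTC), including the fractional day.
    j2000 = datetime(2000, 1, 1, 12, tzinfo=timezone.utc)
    d = (when_utc - j2000).total_seconds() / 86400.0

    # Sun's ecliptic position.
    g = math.radians((357.529 + 0.98560028 * d) % 360.0)           # mean anomaly
    q = (280.459 + 0.98564736 * d) % 360.0                         # mean longitude, degrees
    lam = math.radians(q + 1.915 * math.sin(g) + 0.020 * math.sin(2 * g))
    eps = math.radians(23.439 - 0.00000036 * d)                    # obliquity of the ecliptic

    # Equatorial coordinates.
    ra = math.atan2(math.cos(eps) * math.sin(lam), math.cos(lam))  # right ascension
    dec = math.asin(math.sin(eps) * math.sin(lam))                 # declination

    # Local hour angle from sidereal time.
    gmst_hours = (18.697374558 + 24.06570982441908 * d) % 24.0
    lst_deg = (gmst_hours * 15.0 + lon_deg) % 360.0                # east longitude positive
    h = math.radians(lst_deg) - ra                                 # hour angle

    # Azimuth measured clockwise from true north.
    lat = math.radians(lat_deg)
    az = math.atan2(math.sin(h),
                    math.cos(h) * math.sin(lat) - math.tan(dec) * math.cos(lat))
    return (math.degrees(az) + 180.0) % 360.0

# Example call (hypothetical coordinates and time):
# solar_azimuth_deg(48.2, 16.4, datetime(2022, 6, 20, 10, 0, tzinfo=timezone.utc))
```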
A plausibility check is additionally performed by detecting ambient noise (21) with the microphones (13).
Claims
1. Device and method with guidance for assisting visually impaired persons, wherein labels as well as standardized and non-standardized symbols are detected, processed together with head orientation, position data, ambient noise and map data, and output in a spatial acoustic map based on the orientation of the head.
2. Device and method according to claim 1, wherein the orientation of the head is calculated using data from the acceleration sensors (7), the compass (6), the infrared detectors (8) and the microphones (13).
3. Device and method according to claim 1, wherein the acoustic output indicates detected labels and standardized as well as non-standardized symbols by sounds and outputs them verbally depending on the orientation of the head.
4. Device and method according to claim 1, wherein the acoustic output is overlaid with spoken guidance instructions depending on the orientation of the head.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/IB2022/000299 WO2023247984A1 (en) | 2022-06-20 | 2022-06-20 | Device and method for assisting visually impaired persons in public spaces |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2023247984A1 (en) | 2023-12-28 |
Family
ID=82404005
Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0465306B2 (en) | 1985-01-29 | 1992-10-19 | Matsushita Electric Ind Co Ltd | |
EP1293184B1 (en) | 2001-09-17 | 2005-07-27 | Seiko Epson Corporation | Walking auxiliary for person with dysopia |
US20070016425A1 (en) | 2005-07-12 | 2007-01-18 | Koren Ward | Device for providing perception of the physical environment |
CN102186141A (en) | 2011-05-06 | 2011-09-14 | 中国华录集团有限公司 | Blind user navigation service system based on global positioning system (GPS) and 3rd-generation (3G) network |
US20130216093A1 (en) | 2012-02-21 | 2013-08-22 | Hon Hai Precision Industry Co., Ltd. | Walking assistance system and method |
US8594935B2 (en) | 2009-03-18 | 2013-11-26 | Intouch Graphics, Inc. | Systems, methods, and software for providing wayfinding orientation and wayfinding data to blind travelers |
US8797386B2 (en) | 2011-04-22 | 2014-08-05 | Microsoft Corporation | Augmented auditory perception for the visually impaired |
US9140554B2 (en) | 2014-01-24 | 2015-09-22 | Microsoft Technology Licensing, Llc | Audio navigation assistance |
US20210056866A1 (en) * | 2019-08-21 | 2021-02-25 | Seungoh Ryu | Portable Reading, Multi-sensory Scan and Vehicle-generated Motion Input |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
 | 121 | Ep: the EPO has been informed by WIPO that EP was designated in this application | Ref document number: 22737990; Country of ref document: EP; Kind code of ref document: A1 |