WO2023222951A1 - Apparatus and method for impaired visibility perception - Google Patents


Info

Publication number
WO2023222951A1
Authority
WO
WIPO (PCT)
Prior art keywords
surrounding environment
points
information
detection means
position detection
Prior art date
Application number
PCT/FI2023/050277
Other languages
French (fr)
Inventor
Albert MANNINEN
Priit JAANSON
Jari Takala
Original Assignee
Manninen Albert
Priority date
Filing date
Publication date
Application filed by Manninen Albert
Publication of WO2023222951A1

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61H PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
    • A61H3/00 Appliances for aiding patients or disabled persons to walk about
    • A61H3/06 Walking aids for blind persons
    • A61H3/061 Walking aids for blind persons with electronic detecting or guiding means
    • A61H3/068 Sticks for blind persons
    • A61H2003/063 Walking aids for blind persons with electronic detecting or guiding means with tactile perception
    • A61H2003/065 Walking aids for blind persons with electronic detecting or guiding means with tactile perception in the form of braille
    • A61H2201/00 Characteristics of apparatus not provided for in the preceding codes
    • A61H2201/01 Constructive details
    • A61H2201/0157 Constructive details portable
    • A61H2201/50 Control means thereof
    • A61H2201/5007 Control means thereof computer controlled
    • A61H2201/5058 Sensors or detectors
    • A61H2201/5092 Optical sensor
    • A61H2201/5097 Control means thereof wireless

Definitions

  • Figs. 1A and 1B schematically show a principle view of an apparatus according to an example embodiment.
  • Figs. 1A and 1B show an environment in which the apparatus is operated, comprising ground, or floor, 110, which is uneven, i.e., as is common in the real world, the ground comprises portions of various elevations, for example steps and curbs.
  • The environment comprises obstacles, such as walls 120. Operating in such an environment with impaired visibility requires identification of the obstacles and elevations.
  • Figs. 1A and 1B show the apparatus according to an example embodiment comprising at least one position detection means 10 and at least one feedback means 20.
  • The position detection means 10 and the feedback means 20 are wearable, i.e., they are configured to be worn or carried by a user of the apparatus.
  • The position detection means 10 and the feedback means 20 are provided separately from each other and can be positioned in different positions; for example, the position detection means 10 are worn by the user and the feedback means 20 are provided on the handle of a cane.
  • The position detection means 10 are positioned on a hat, on glasses, on a further piece of clothing, or on or about a body part of a user.
  • The position detection means 10 and the feedback means 20 are integrated into the same entity, i.e., into a single housing.
  • The position detection means 10 and the feedback means 20 are configured to communicate with each other.
  • The position detection means 10 and the feedback means 20 are positioned together, for example in a single housing, and communicate over a wired and/or wireless connection.
  • The position detection means 10 and the feedback means 20 are provided separately and the communication is provided via a wireless connection.
  • The position detection means 10 are configured to scan, or probe, the surrounding environment in a contactless manner and convey the position information to the feedback means 20.
  • The position detection means are configured to detect the position of a plurality of points or patterns in the surrounding environment, i.e., the position of a plurality of points on surfaces of the surrounding environment as schematically shown in Figs. 1A and 1B.
  • The position detection means are configured to determine for each point a distance and a direction from the position detection means 10.
  • Figs. 1A and 1B show a field of vision 14 of the position detection means and the detected distance and direction 12.
  • The position detection means comprise passive or active detection means such as structured light detection means, a sonic or ultrasonic radar or sonar, a lidar, a depth camera, or 3D imaging means.
  • The detection of the position of a plurality of points or patterns on the surfaces is carried out by triangulation.
  • The detection of the position of a plurality of points or patterns on the surfaces is carried out by time of flight, amplitude-modulated continuous wave, frequency-modulated continuous wave, and/or a further distance or coordinate measurement technique.
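As a rough illustration of the two measurement principles named above, the following sketch computes depth by triangulation from a structured-light spot's pixel disparity and, alternatively, from a time-of-flight round trip. The focal length, baseline, and timing values used below are illustrative assumptions, not taken from the disclosure.

```python
def depth_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Structured-light triangulation: the projector and camera are separated
    by a known baseline, and a spot's pixel offset (disparity) encodes its
    depth as z = f * b / d."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px


def depth_from_time_of_flight(round_trip_s: float) -> float:
    """Time-of-flight ranging: the pulse travels to the surface and back,
    so the one-way distance is c * t / 2."""
    c = 299_792_458.0  # speed of light in m/s
    return c * round_trip_s / 2.0
```

For example, with an assumed 600 px focal length and a 5 cm baseline, a 15 px disparity corresponds to a depth of 2 m.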
  • The position detection means 10 comprise structured light detection means comprising illuminating means and imaging means.
  • The illuminating means provide illumination with a laser light that is split into multiple beams or modified into a pattern using a suitable optical element such as a diffractive optical element.
  • The split or modified beams are directed in different directions, forming points or patterns on the surfaces of the surrounding environment to be probed.
  • The imaging means, such as a camera, are used to detect the positions upon which the beams fall on the surfaces of the surrounding environment.
  • The laser light comprises light with a wavelength in the infrared region, for example 800-1000 nm, that can be imaged with a Si-based detector.
  • The illuminating means provide a laser power well within the eye safety limits, such as a combined power of 100-200 mW with a single beam power in the 1 mW range.
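The cited power budget fixes an upper bound on how many beams a diffractive splitter can produce. A trivial sketch of that arithmetic, assuming (as an illustration only) an even, lossless split:

```python
def max_beam_count(combined_power_mw: float, per_beam_mw: float) -> int:
    """Upper bound on the number of spots for a given eye-safe power budget,
    assuming the splitter divides the power evenly and without optical losses."""
    return int(combined_power_mw // per_beam_mw)

# A 100-200 mW budget at about 1 mW per beam allows on the order of
# 100-200 structured-light spots.
```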
  • The apparatus comprises means for determining a geographical location of the apparatus and, through it, of the user thereof.
  • The means for determining a geographical location are integrated with the positioning means.
  • The means for determining a geographical location comprise a satellite-based navigation system, such as the Global Positioning System (GPS), the Global Navigation Satellite System (GLONASS), BeiDou, or Galileo.
  • The means for determining a geographical location comprise a system not dependent on satellite navigation, for example simultaneous localization and mapping (SLAM) or wireless local area network (WLAN)-based navigation.
  • The positioning means comprise means for measuring orientation, such as an accelerometer, a gyroscope, and/or point-cloud localization means.
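One reason for measuring orientation is that the detected points are expressed relative to the sensor, which tilts as the user moves. A hypothetical correction using the pitch reported by an accelerometer might look as follows; the planar model and angle conventions are assumptions for illustration, not taken from the disclosure:

```python
import math


def world_point(distance_m: float, beam_elevation_rad: float, pitch_rad: float):
    """Convert a (distance, beam elevation) reading into horizontal range and
    height relative to the sensor, compensating for the sensor's pitch."""
    angle = beam_elevation_rad + pitch_rad  # beam direction in the world frame
    return distance_m * math.cos(angle), distance_m * math.sin(angle)
```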
  • The position detection means and/or the means for determining a geographical location are integrated with another user device, such as a smartphone or a smartwatch.
  • The position detection means are configured to convey the position information to the feedback means 20.
  • Figs. 2A and 2B schematically show a top view and a side view of the feedback means 20.
  • The feedback means 20 comprise an actuated two-dimensional array comprising a plurality of elements 22-11 to 22-nn.
  • The two-dimensional array is irregularly shaped, i.e., some rows or columns may have a different number of elements.
  • The actuated two-dimensional array comprises a binary Braille actuator, a multistep array actuator, an array of vibrating actuator elements, an array of temperature-changing elements, an array of elements providing electrical stimulation, an array of elements producing ultrasound, or a combination of any of the foregoing.
  • Fig. 2A shows example feedback means being actuated to provide the user with information on the example surrounding environment shown in Figs. 1A and 1B.
  • The actuator elements depicted in black have been raised to indicate the position of a wall 120.
  • The actuator elements depicted in black and white have been lowered to indicate an area of lower elevation on the ground.
  • The position of the raised and lowered actuator elements conveys information on the type of surroundings and their direction and distance with respect to the user.
  • An area of lower elevation is indicated by repeatedly raising and lowering the actuator elements in question.
  • Vibration and/or temperature change and/or electrical or ultrasound stimuli of the elements are used to convey the information.
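The raise/lower behaviour described above can be sketched as a mapping from a grid of surface heights (relative to the ground plane in front of the user) to per-element states. The thresholds and state names below are illustrative assumptions, not values from the disclosure:

```python
def actuator_states(height_map_m, obstacle_m=0.3, drop_m=-0.1):
    """Map surface heights to tactile-array states: raise an element for an
    obstacle, lower it for an area of lower elevation, keep it neutral
    otherwise."""
    states = []
    for row in height_map_m:
        states.append([
            "raised" if h >= obstacle_m else
            "lowered" if h <= drop_m else
            "neutral"
            for h in row
        ])
    return states
```

For instance, `actuator_states([[1.2, 0.0], [-0.2, 0.05]])` yields `[["raised", "neutral"], ["lowered", "neutral"]]`: a wall-height reading raises an element, a drop below the ground plane lowers one.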
  • The feedback means 20 are configured to provide the user with information about special objects in the surrounding environment.
  • The special objects, in an embodiment, comprise stairs, curbs, and steps up or down.
  • The information about special objects in the surrounding environment is provided using spatio-temporal patterns.
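A spatio-temporal pattern might, for instance, sweep a raised band across the array to signal "stairs ahead". The following frame generator is a purely hypothetical encoding of such a cue:

```python
def stairs_frame(rows: int, cols: int, frame: int):
    """One frame of a sweeping 'stairs' cue: the raised row advances by one
    position per frame and wraps around, suggesting successive steps."""
    active = frame % rows
    return [["raised" if r == active else "neutral" for _ in range(cols)]
            for r in range(rows)]
```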
  • The feedback means 20 and/or the position detection means 10 are configured to adapt to the surrounding environment by modifying or adapting the position detection and/or feedback, for example by adapting the algorithms used.
  • Ground detection and feedback thereon may be adapted or switched off in case the user is walking in tall grass.
  • The user is able to select the functions used, for example by switching off certain features, i.e., the user may choose to use only obstacle detection.
  • The feedback means 20 are configured to provide the user with tactile information on the surrounding environment, i.e., the user is able to feel the actuator array.
  • The feedback means are, in an embodiment, positioned in such a way that they are convenient for the user to touch, for example releasably attached to, or integrated with, the handle of a white cane used by the visually impaired user.
  • The feedback means are positioned against the skin of the user on a different body part, for example around the arm.
  • The feedback means 20 are configured to provide the user with navigational data, i.e., directions, for example navigational instructions akin to a navigation system.
  • The feedback means are configured to provide the user with information including text and numbers, for example using the braille tactile writing system.
  • The feedback means 20 are configured to provide haptic and/or audio guidance, for example for describing special objects in the surrounding environment, such as store names, street names, house numbers, signs, or information at a public transport station.
  • The audio guidance is provided via headphones. In an embodiment, such information is recognized using imaging.
  • The feedback means are integrated with, or attached to, another user device, such as a smartphone or a smartwatch.
  • Fig. 3 shows a block diagram of an apparatus 300 according to an example embodiment comprising the position detection means 10 and the feedback means 20.
  • The apparatus 300 comprises a communication interface 310; a processor 320; a user interface 330; and a memory 340.
  • All the elements of the apparatus are integrated with the position detection means and the feedback means, i.e., there is no need for extra housing or equipment.
  • The communication interface 310 comprises, in an embodiment, wired and/or wireless communication circuitry, such as Ethernet; wireless LAN; Bluetooth; GSM; CDMA; WCDMA; LTE; and/or 5G circuitry.
  • The communication interface can be integrated in the apparatus 300 or provided as part of an adapter, card, or the like that is attachable to the apparatus 300, or as part of a charging station.
  • The communication interface 310 may support one or more different communication technologies.
  • The apparatus 300 may also or alternatively comprise more than one of the communication interfaces 310.
  • A processor may refer to a central processing unit (CPU); a microprocessor; a digital signal processor (DSP); a graphics processing unit; an application-specific integrated circuit (ASIC); a field-programmable gate array; a microcontroller; or a combination of such elements.
  • The processor 320 is configured to control the apparatus and to cause it to carry out the methods according to example embodiments.
  • The user interface may comprise circuitry for receiving input from a user of the apparatus 300, e.g., via a keyboard; speech recognition circuitry; or an accessory device, such as a smartphone, smartwatch or headset; and for providing output to the user.
  • The feedback means 20 are further configured to operate as user interface means, for example by providing information on the on/off status of the apparatus.
  • The memory 340 comprises a work memory 342 and a persistent memory 344 configured to store computer program code 346 and data 348.
  • The memory 340 may comprise any one or more of: a read-only memory (ROM); a programmable read-only memory (PROM); an erasable programmable read-only memory (EPROM); a random-access memory (RAM); a flash memory; a data disk; an optical storage; a magnetic storage; a smart card; a solid-state drive (SSD); or the like.
  • The apparatus 300 may comprise a plurality of the memories 340.
  • The memory 340 may be constructed as part of the apparatus 300 or as an attachment to be inserted into a slot; port; or the like of the apparatus 300 by a user or by another person or by a robot.
  • The memory 340 may serve the sole purpose of storing data, or be constructed as part of an apparatus 300 serving other purposes, such as processing data.
  • The apparatus 300 may comprise other elements, such as microphones; displays; as well as additional circuitry such as input/output (I/O) circuitry; memory chips; application-specific integrated circuits (ASIC); processing circuitry for specific purposes such as source coding/decoding circuitry; channel coding/decoding circuitry; ciphering/deciphering circuitry; and the like. Additionally, the apparatus 300 may comprise a disposable or rechargeable battery (not shown) for powering the apparatus 300 if an external power supply is not available.
  • Fig. 4 shows a flow chart according to an example embodiment. Fig. 4 illustrates a process comprising various possible steps; further steps can also be included and/or some of the steps can be performed more than once:
  • The position detection means 10 is used to probe the environment, in an example embodiment by using structured light to direct multiple beams at the surrounding environment.
  • 420. The position detection means 10 is used to detect a plurality of positions in the surrounding environment, in an embodiment the direction and distance of the positions on which the multiple beams of light fall on the surfaces of the environment.
  • The feedback means 20 are used to provide the user with tactile information on the surrounding environment by actuating an array of actuator elements.
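The steps of the flow chart amount to a probe, detect, feedback loop. In this sketch the three hardware stages are stand-in callables, since the actual interfaces are not specified in the disclosure:

```python
def perception_cycle(probe, detect, actuate, frames=1):
    """Run the probe -> detect -> feedback loop for a number of frames:
    illuminate the scene, extract point positions from the raw reading,
    then drive the tactile array. Returns the number of points per frame."""
    counts = []
    for _ in range(frames):
        raw = probe()            # e.g. project structured light and image it
        points = detect(raw)     # e.g. distance and direction per detected spot
        actuate(points)          # e.g. raise/lower the actuator elements
        counts.append(len(points))
    return counts
```

A caller would supply device-specific implementations of the three stages; the loop itself stays the same whether the ranging is structured light, sonar, or lidar.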
  • The apparatus and method according to example embodiments are configured to enable a blind or visually impaired user to navigate and move in an unfamiliar environment, or in a familiar environment, taking into account changes such as other pedestrians, closed doors, or parked cars.
  • A technical effect of example embodiments is giving a visually impaired person more freedom of movement without overly intrusive equipment.
  • A further technical effect of the example embodiments is giving a visually impaired person more freedom of movement in a discreet manner.
  • A still further technical effect of the example embodiments is providing information on the surroundings with a wide and variable field of view.
  • A still further technical effect of the example embodiments is providing information from a variable range and field of view, for example a shorter range and wider field of view indoors and a longer range and narrower field of view outdoors, or depending on the size of the user.
  • Any of the afore-described methods, method steps, or combinations thereof may be controlled or performed using hardware; software; firmware; or any combination thereof.
  • The software and/or hardware may be local; distributed; centralised; or any combination thereof.
  • Any form of computing, including computational intelligence, may be used for controlling or performing any of the afore-described methods, method steps, or combinations thereof.
  • Computational intelligence may refer to, for example, any of artificial intelligence; neural networks; fuzzy logics; machine learning; genetic algorithms; evolutionary computation; or any combination thereof.
  • The words comprise, include, and contain are each used as open-ended expressions with no intended exclusivity.


Abstract

An apparatus comprising at least one position detection means (10) configured to detect the position of a plurality of points on the surfaces of the surrounding environment; and at least one feedback means (20) configured to provide information on the position of the plurality of points on the surfaces of the surrounding environment; wherein the at least one feedback means (20) comprise a two-dimensional array of elements configured to provide tactile information on the position of the plurality of points on the surfaces of the surrounding environment.

Description

APPARATUS AND METHOD FOR IMPAIRED VISIBILITY PERCEPTION
TECHNICAL FIELD
The present disclosure generally relates to aiding perception while operating with limited visibility. The disclosure relates particularly, though not exclusively, to aiding perception while operating with limited visibility by providing non-visual feedback on the surroundings. The disclosure relates particularly, though not exclusively, to aiding perception while operating with limited visibility by providing tactile feedback on the surroundings.
BACKGROUND
This section illustrates useful background information without admission that any technique described herein is representative of the state of the art.
Blind and visually impaired persons might use a white cane for probing their surroundings while moving around. The cane is swayed laterally by hand in the direction of propagation. If the tip of the cane touches an obstacle, the person can feel it by hand at the other end of the cane. By probing the obstacle, a person can distinguish between a doorway, a step, or a fallen tree. In addition, the cane sometimes touches the ground to verify that the ground is solid and there are no openings or steps downwards. Similarly, firefighters, for example, can move in low-visibility conditions by probing the surroundings with their hands or any object.
Previously, systems have been disclosed that allow movement and a type of navigation for blind and visually impaired persons by using a proximity sensor. Furthermore, different types of electronic travel aids have been disclosed, among them laser-scanning and tactile-feedback-based systems. Such systems, however, suffer from various disadvantages, for example being unable to distinguish overhanging or ground objects, or both simultaneously, or to identify special objects such as stairs or doorways, or requiring complicated wearable equipment, or not providing any mapping or object identification.
It is the object of the present disclosure to provide an arrangement for mitigating the problems of the prior art.
SUMMARY
The appended claims define the scope of protection. Any examples and technical descriptions of apparatuses, products and/or methods in the description and/or drawings not covered by the claims are presented not as embodiments of the invention but as background art or examples useful for understanding the invention. According to a first example aspect there is provided an apparatus comprising at least one position detection means configured to detect the position of a plurality of points on the surfaces of the surrounding environment; and at least one feedback means configured to provide information on the position of the plurality of points on the surfaces of the surrounding environment; wherein the at least one feedback means comprise a two-dimensional array of elements configured to provide tactile information on the position of the plurality of points on the surfaces of the surrounding environment.
The position detection means may comprise at least one illuminating means for illuminating the surrounding environment with structured light by splitting a laser light into multiple beams pointing in different directions and falling upon the surface of the surrounding environment or modifying the laser light into a pattern falling upon the surface of the surrounding environment; and at least one imaging means for detecting the positions on which the multiple beams or the pattern fall.
The position detection means may comprise a sonic or ultrasonic radar or sonar, a lidar, a depth camera and/or 3D imaging means.
The two-dimensional array of elements may comprise actuator elements that are configured to be raised and/or lowered and/or vibrated and/or the temperature of which is configured to be changed and/or elements which are configured to provide electrical stimuli for providing the tactile information.
The position detection means may be configured to be wearable or releasably attached to, or integrated with, a white cane used by visually impaired people.
The feedback means may be configured to be wearable or releasably attached to, or integrated with, a white cane used by visually impaired people.
According to a second example aspect there is provided a method, comprising detecting the position of a plurality of points on the surfaces of the surrounding environment; and providing information on the position of the plurality of points on the surfaces of the surrounding environment; wherein information on the position of the plurality of points on the surfaces of the surrounding environment is provided as tactile information. Detecting the position of a plurality of points on the surfaces of the surrounding environment may comprise illuminating the surrounding environment with structured light by splitting a laser light into multiple beams pointing in different directions and falling upon the surface of the surrounding environment; and detecting the positions on which the multiple beams fall by imaging.
Providing information on the position of the plurality of points on the surfaces of the surrounding environment may comprise operating a two-dimensional array of elements that are configured to be raised and/or lowered and/or vibrated and/or the temperature of which is configured to be changed and/or which are configured to provide electrical stimuli for providing the tactile information.
According to a third example aspect there is provided a computer program comprising code for causing the method of the second example aspect to be carried out when the code is executed by a processor comprised in the apparatus of the first example aspect.
According to a fourth example aspect there is provided a non-transitory memory medium, comprising the computer program of the third example aspect.
Any foregoing memory medium may comprise a digital data storage such as a data disc or diskette; optical storage; magnetic storage; holographic storage; opto-magnetic storage; phase-change memory; resistive random-access memory; magnetic random-access memory; solid-electrolyte memory; ferroelectric random-access memory; organic memory; or polymer memory. The memory medium may be formed into a device without other substantial functions than storing data, or it may be formed as part of a device with other functions, including but not limited to a memory of a computer; a chip set; and a sub-assembly of an electronic device.
Different non-binding example aspects and embodiments have been illustrated in the foregoing. The embodiments in the foregoing are used merely to explain selected aspects or steps that may be utilized in different implementations. Some embodiments may be presented only with reference to certain example aspects. It should be appreciated that corresponding embodiments may apply to other example aspects as well.
BRIEF DESCRIPTION OF THE FIGURES
Some example embodiments will be described with reference to the accompanying figures, in which:
Fig. 1A schematically shows a principle view of an apparatus according to an example embodiment;
Fig. 1B schematically shows a principle view of an apparatus according to an example embodiment;
Fig. 2A schematically shows a top view of a part of an apparatus according to an example embodiment;
Fig. 2B schematically shows a side view of a part of an apparatus according to an example embodiment;
Fig. 3 shows a block diagram of an apparatus according to an example embodiment; and
Fig. 4 shows a flow chart according to an example embodiment.
DETAILED DESCRIPTION
In the following description, like reference signs denote like elements or steps.
Figs. 1A and 1B schematically show a principle view of an apparatus according to an example embodiment. Figs. 1A and 1B show an environment in which the apparatus is operated, comprising a ground, or floor, 110 which is uneven, i.e., as is common in the real world, the ground comprises portions of various elevations, for example steps and curbs. Furthermore, the environment comprises obstacles, such as walls 120. Operating in such an environment with impaired visibility requires identification of the obstacles and elevations.
Figs. 1A and 1B show the apparatus according to an example embodiment comprising at least one position detection means 10 and at least one feedback means 20. In an embodiment, the position detection means 10 and the feedback means 20 are wearable, i.e. configured to be worn or carried by a user of the apparatus. In an embodiment, the position detection means 10 and the feedback means 20 are provided separately from each other and can be positioned in different positions, for example the position detection means 10 are worn by the user and the feedback means 20 are provided on the handle of a cane. In a further example embodiment, the position detection means 10 are positioned on a hat, on glasses or on a further piece of clothing, or on or about a body part of the user. In a still further example embodiment, the position detection means 10 and the feedback means 20 are integrated into the same entity, i.e. into a single housing.
The position detection means 10 and the feedback means 20 are configured to communicate with each other. In an embodiment, the position detection means 10 and the feedback means 20 are positioned together, for example in a single housing, and communicate via a wired and/or wireless connection. In a further example embodiment, the position detection means 10 and the feedback means 20 are provided separately and the communication is provided via a wireless connection. The position detection means 10 is configured to scan, or probe, the surrounding environment in a contactless manner and to convey the position information to the feedback means 20. In an embodiment, the position detection means is configured to detect the position of a plurality of points or patterns in the surrounding environment, i.e. the position of a plurality of points on surfaces of the surrounding environment, as schematically shown in Figs. 1A and 1B. In an embodiment, the position detection means is configured to determine for each point a distance and a direction from the position detection means 10. Figs. 1A and 1B show a field of vision 14 of the position detection means and the detected distance and direction 12.
In an embodiment, the position detection means comprise passive or active detection means such as a structured light detection means, sonic or ultrasonic radar or sonar, a lidar, a depth camera or 3D imaging means. In an embodiment, the detection of the position of a plurality of points or patterns on the surfaces is carried out by triangulation. In a further example embodiment, the detection of the position of a plurality of points or patterns on the surfaces is carried out by time of flight, amplitude modulated continuous wave, frequency modulated continuous wave and/or a further distance or coordinate measurement technique.
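The time-of-flight principle mentioned above can be illustrated with a short sketch: the distance to a surface follows from the round-trip time of a light pulse. The function name and the example timing below are illustrative assumptions, not details taken from the application.

```python
# Illustrative sketch of the time-of-flight distance principle.
SPEED_OF_LIGHT_M_S = 299_792_458.0  # speed of light in vacuum, m/s

def tof_distance_m(round_trip_time_s: float) -> float:
    """Distance to a reflecting surface from the round-trip time of a pulse.

    The pulse travels to the surface and back, hence the division by two.
    """
    return SPEED_OF_LIGHT_M_S * round_trip_time_s / 2.0
```

For example, a pulse returning after about 20 ns corresponds to a surface roughly 3 m away.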
In an example embodiment, the position detection means 10 comprise structured light detection means comprising illuminating means and imaging means. The illuminating means provide illumination with a laser light that is split into multiple beams or modified into a pattern using a suitable optical element, such as a diffractive optical element. The split or modified beams are directed into different directions, forming points or patterns on the surfaces of the surrounding environment to be probed. The imaging means, such as a camera, is used to detect the positions upon which the beams fall on the surfaces of the surrounding environment. In an embodiment, the laser light comprises light with a wavelength in the infrared region, for example 800-1000 nm, that can be imaged with a Si-based detector. In an embodiment, the illuminating means provide a laser power well within the eye safety limits, such as a combined power of 100-200 mW with a single beam power in the 1 mW range.
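The triangulation mentioned earlier can be sketched for a single beam spot observed by a camera displaced from the laser by a known baseline. This is a hypothetical illustration: the geometry, function names and example angles are assumptions, not details of the application.

```python
import math

def triangulate_distance_m(baseline_m: float, laser_angle_rad: float,
                           camera_angle_rad: float) -> float:
    """Perpendicular distance from the laser-camera baseline to the point
    where a beam falls, given the angles (measured from the baseline) at
    which the laser emits and the camera observes the same illuminated spot.
    """
    # The three angles of the laser-spot-camera triangle sum to pi.
    spot_angle_rad = math.pi - laser_angle_rad - camera_angle_rad
    # Sine rule: camera-to-spot range / sin(laser angle) = baseline / sin(spot angle).
    camera_range_m = baseline_m * math.sin(laser_angle_rad) / math.sin(spot_angle_rad)
    # Project the camera-to-spot range perpendicular to the baseline.
    return camera_range_m * math.sin(camera_angle_rad)
```

With a 0.1 m baseline and both angles at 60 degrees the triangle is equilateral, so the spot lies about 8.7 cm from the baseline.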
In an example embodiment, the apparatus comprises means for determining a geographical location of the apparatus, and thereby of the user thereof. In an embodiment, the means for determining a geographical location are integrated with the positioning means. In an embodiment, the means for determining a geographical location comprise a satellite-based navigation system, such as the global positioning system (GPS), the global navigation satellite system (GLONASS), BeiDou or Galileo. In a further embodiment, the means for determining a geographical location comprise a system not dependent on satellite navigation, for example simultaneous localization and mapping (SLAM) or wireless local area network (WLAN)-based navigation.
In an example embodiment, the positioning means comprise means for measuring orientation, such as an accelerometer, a gyroscope and/or pointcloud localization means. In a further embodiment, the position detection means and/or the means for determining a geographical location are integrated with another user device, such as a smartphone or a smartwatch.
The position detection means are configured to convey the position information to the feedback means 20. Figs. 2A and 2B schematically show a top view and a side view of the feedback means 20. In the example shown in Figs. 2A and 2B, the feedback means 20 comprises an actuated two-dimensional array comprising a plurality of elements 22-11 to 22-nn. In a further embodiment, the two-dimensional array is irregularly shaped, i.e. some rows or columns may have a different number of elements.
In an embodiment, the actuated two-dimensional array comprises a binary Braille actuator, a multistep array actuator, an array of vibrating actuator elements, an array of temperature changing elements, an array of elements providing electrical stimulation, an array of elements producing ultrasound, or a combination of any of the foregoing.
Fig. 2A shows example feedback means being actuated to provide the user thereof with information on the example surrounding environment shown in Figs. 1A and 1B. As seen in Fig. 2A, the actuator elements depicted in black have been raised to indicate the position of a wall 120. Furthermore, the actuator elements depicted in black and white have been lowered to indicate an area of lower elevation on the ground.
The position of the raised and lowered actuator elements conveys information on the type of surroundings and their direction and distance with respect to the user. In a further embodiment, in which binary actuator elements are used, an area of lower elevation is indicated by repeatedly raising and lowering the actuator elements in question. In a still further embodiment, vibration and/or temperature change and/or electrical or ultrasound stimuli of the elements are used to convey the information.
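One way the detected points could drive a binary actuator array such as that of Figs. 2A and 2B is sketched below. The grid size, coordinate convention and elevation thresholds are illustrative assumptions, not details of the application: each point's lateral and forward position selects a cell, which is raised for obstacles above ground level or lowered for areas below it.

```python
# Hypothetical mapping of detected points onto a binary tactile array.
ROWS, COLS = 8, 8
MAX_RANGE_M = 4.0       # forward extent mapped onto the array (assumed)
FIELD_OF_VIEW_M = 2.0   # lateral extent mapped onto the array (assumed)

def points_to_array(points):
    """points: iterable of (lateral_m, forward_m, elevation_m) tuples.

    Returns a ROWS x COLS grid: +1 for a raised element (obstacle),
    -1 for a lowered element (area of lower elevation), 0 for neutral.
    """
    grid = [[0] * COLS for _ in range(ROWS)]
    for lateral, forward, elevation in points:
        if not (0.0 <= forward < MAX_RANGE_M and abs(lateral) < FIELD_OF_VIEW_M / 2):
            continue  # point lies outside the probed region
        row = int(forward / MAX_RANGE_M * ROWS)
        col = int((lateral / FIELD_OF_VIEW_M + 0.5) * COLS)
        if elevation > 0.05:      # obstacle, e.g. a wall: raise the element
            grid[row][col] = 1
        elif elevation < -0.05:   # lower ground, e.g. a step down: lower it
            grid[row][col] = -1
    return grid
```

A wall segment half a metre high one metre ahead would then raise the element in row 2 of the central column, while a step down three metres ahead and slightly to the right would lower an element near the far edge of the array.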
In an embodiment, the feedback means 20 are configured to provide the user with information about special objects in the surrounding environment. The special objects, in an embodiment, comprise, for example, stairs, curbs, and steps up or down. In an embodiment, the information about special objects in the surrounding environment is provided using spatio-temporal patterns.
In an embodiment, the feedback means 20 and/or the position detection means 10 are configured to adapt to the surrounding environment by modifying or adapting the position detection and/or feedback, for example by adapting the algorithms used. As an example, ground detection and feedback thereon may be adapted or switched off in case the user is walking in tall grass.
Furthermore, in an embodiment, the user is able to select the functions used, for example by switching off certain features, i.e. the user may choose for example to use only obstacle detection.
The feedback means 20 are configured to provide the user with tactile information on the surrounding environment, i.e. the user is able to feel the actuator array. The feedback means is, in an embodiment, positioned in such a way that it is convenient for the user to touch, for example releasably attached to, or integrated with, the handle of a white cane used by the visually impaired user. In a further embodiment, the feedback means are positioned against the skin of the user on a different body part, for example around the arm.
In an embodiment, the feedback means 20 are configured to provide the user with navigational data, i.e. directions, for example navigational instructions akin to those of a navigation system. In an embodiment, the feedback means are configured to provide the user with information including text and numbers, for example using the braille tactile writing system. In an embodiment, the feedback means 20 are configured to provide haptic and/or audio guidance, for example for describing special objects in the surrounding environment, such as store names, street names, house numbers, signs, or information at a public transport station. In an example embodiment, the audio guidance is provided via headphones. In an embodiment, such information is recognized using imaging.
In a further embodiment, the feedback means are integrated with, or attached to, another user device, such as a smartphone or a smartwatch.
Fig. 3 shows a block diagram of an apparatus 300 according to an example embodiment comprising the position detection means 10 and the feedback means 20. The apparatus 300 comprises a communication interface 310; a processor 320; a user interface 330; and a memory 340. In an embodiment, all the elements of the apparatus are integrated with the position detection means and the feedback means, i.e. there is no need for an extra housing or equipment.
The communication interface 310 comprises in an embodiment a wired and/or wireless communication circuitry, such as Ethernet; Wireless LAN; Bluetooth; GSM; CDMA; WCDMA; LTE; and/or 5G circuitry. The communication interface can be integrated in the apparatus 300 or provided as a part of an adapter, card or the like, that is attachable to the apparatus 300, or as a part of a charging station. The communication interface 310 may support one or more different communication technologies. The apparatus 300 may also or alternatively comprise more than one of the communication interfaces 310.
In this document, a processor may refer to a central processing unit (CPU); a microprocessor; a digital signal processor (DSP); a graphics processing unit; an application specific integrated circuit (ASIC); a field programmable gate array; a microcontroller; or a combination of such elements. The processor 320 is configured to control the apparatus and to cause it to carry out the methods according to example embodiments.
The user interface may comprise circuitry for receiving input from a user of the apparatus 300, e.g. via a keyboard, speech recognition circuitry, or an accessory device such as a smartphone, smartwatch or headset, and for providing output to the user. In an example embodiment, the feedback means 20 are further configured to operate as user interface means, for example by providing information on the on/off status of the apparatus.
The memory 340 comprises a work memory 342 and a persistent memory 344 configured to store computer program code 346 and data 348. The memory 340 may comprise any one or more of: a read-only memory (ROM); a programmable read-only memory (PROM); an erasable programmable read-only memory (EPROM); a random-access memory (RAM); a flash memory; a data disk; an optical storage; a magnetic storage; a smart card; a solid-state drive (SSD); or the like. The apparatus 300 may comprise a plurality of the memories 340. The memory 340 may be constructed as a part of the apparatus 300 or as an attachment to be inserted into a slot; port; or the like of the apparatus 300 by a user or by another person or by a robot. The memory 340 may serve the sole purpose of storing data, or be constructed as a part of an apparatus 300 serving other purposes, such as processing data.
A skilled person appreciates that in addition to the elements shown in Figure 3, the apparatus 300 may comprise other elements, such as microphones; displays; as well as additional circuitry such as input/output (I/O) circuitry; memory chips; application-specific integrated circuits (ASIC); processing circuitry for specific purposes such as source coding/decoding circuitry; channel coding/decoding circuitry; ciphering/deciphering circuitry; and the like. Additionally, the apparatus 300 may comprise a disposable or rechargeable battery (not shown) for powering the apparatus 300 if an external power supply is not available.

Fig. 4 shows a flow chart according to an example embodiment. Fig. 4 illustrates a process comprising various possible steps, while further steps can also be included and/or some of the steps can be performed more than once:
410. The position detection means 10 is used to probe the environment, in an example embodiment using structured light to direct multiple beams at the surrounding environment.

420. The position detection means 10 is used to detect a plurality of positions in the surrounding environment, in an embodiment the direction and distance of the positions on which the multiple beams of light fall on the surfaces of the environment.
430. The feedback means 20 are used to provide the user with tactile information on the surrounding environment by actuating an array of actuator elements.
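Steps 410-430 above can be tied together as a minimal control loop. The function names below are hypothetical stand-ins for the illuminating, imaging and actuator hardware interfaces, not interfaces defined by the application.

```python
# Minimal, hypothetical sketch of one cycle of the process of Fig. 4.
def run_cycle(probe, detect, actuate):
    """Run one probe-detect-feedback cycle.

    probe():    step 410 - project structured light at the environment.
    detect(x):  step 420 - return positions where the beams fall.
    actuate(p): step 430 - drive the tactile actuator array.
    """
    illumination = probe()           # 410
    points = detect(illumination)    # 420
    actuate(points)                  # 430
    return points
```

In a real device the loop would repeat continuously so that the tactile array tracks the user's movement through the environment.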
Without in any way limiting the scope of the appended claims, some technical effects of the apparatus and method according to example embodiments are explained in the following.
The apparatus and method according to example embodiments are configured to enable a blind or visually impaired user to navigate and move in an unfamiliar environment, or in a familiar environment, taking into account changes such as other pedestrians, closed doors or parked cars.
Accordingly, a technical effect of example embodiments is affording a visually impaired person more freedom of movement without overly intrusive equipment. A further technical effect of the example embodiments is affording a visually impaired person more freedom of movement in a discreet manner. A still further technical effect of the example embodiments is providing information on the surroundings with a wide and variable field of view. A still further technical effect of the example embodiments is providing information from a variable range and field of view, for example a shorter range and wider field of view indoors and a longer range and narrower field of view outdoors, or depending on the size of the user.
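The variable range and field of view described above could, for example, be parameterised per environment as sketched below. The profile names and numeric values are illustrative assumptions only, not figures from the application.

```python
# Hypothetical scan profiles: shorter range with a wider field of view
# indoors, and a longer range with a narrower field of view outdoors.
SCAN_PROFILES = {
    "indoor":  {"range_m": 3.0,  "fov_deg": 120.0},
    "outdoor": {"range_m": 10.0, "fov_deg": 60.0},
}

def select_scan_profile(environment: str) -> dict:
    """Pick a range/field-of-view profile for the detected environment."""
    return SCAN_PROFILES[environment]
```

A device could switch profiles automatically, for instance when its geolocation means detect a transition between indoor and outdoor positioning.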
Any of the afore described methods, method steps, or combinations thereof, may be controlled or performed using hardware; software; firmware; or any combination thereof. The software and/or hardware may be local; distributed; centralised; or any combination thereof. Moreover, any form of computing, including computational intelligence, may be used for controlling or performing any of the afore described methods, method steps, or combinations thereof. Computational intelligence may refer to, for example, any of artificial intelligence; neural networks; fuzzy logics; machine learning; genetic algorithms; evolutionary computation; or any combination thereof.
Various embodiments have been presented. It should be appreciated that in this document, the words "comprise", "include" and "contain" are each used as open-ended expressions with no intended exclusivity.
The foregoing description has provided by way of non-limiting examples of particular implementations and embodiments a full and informative description of the best mode presently contemplated by the inventors for carrying out the invention. It is however clear to a person skilled in the art that the invention is not restricted to details of the embodiments presented in the foregoing, but that it can be implemented in other embodiments using equivalent means or in different combinations of embodiments without deviating from the characteristics of the invention.
Furthermore, some of the features of the afore-disclosed example embodiments may be used to advantage without the corresponding use of other features. As such, the foregoing description shall be considered as merely illustrative of the principles of the present invention, and not in limitation thereof. Hence, the scope of the invention is only restricted by the appended patent claims.

Claims

1. An apparatus comprising at least one position detection means (10) configured to detect the position of a plurality of points on the surfaces of the surrounding environment; and at least one feedback means (20) configured to provide information on the position of the plurality of points on the surfaces of the surrounding environment; characterized in that the at least one feedback means (20) comprise a two-dimensional array of elements configured to provide tactile information on the position of the plurality of points on the surfaces of the surrounding environment.
2. The apparatus according to claim 1, wherein the position detection means (10) comprise at least one illuminating means for illuminating the surrounding environment with structured light by splitting a laser light into multiple beams pointing in different directions and falling upon the surface of the surrounding environment or modifying the laser light into a pattern falling upon the surface of the surrounding environment; and at least one imaging means for detecting the positions on which the multiple beams or the pattern fall.
3. The apparatus according to claim 1 or 2, wherein the position detection means (10) comprise a sonic or ultrasonic radar or sonar, a lidar, a depth camera and/or 3D imaging means.
4. The apparatus according to any preceding claim, wherein the two-dimensional array of elements comprises actuator elements that are configured to be raised and/or lowered and/or vibrated and/or the temperature of which is configured to be changed and/or elements which are configured to provide electrical stimuli for providing the tactile information.
5. The apparatus according to any preceding claim, wherein the position detection means (10) are configured to be wearable or releasably attached to, or integrated with, a white cane used by visually impaired people.
6. The apparatus according to any preceding claim, wherein the feedback means (20) are configured to be wearable or releasably attached to, or integrated with, a white cane used by visually impaired people.
7. A method, comprising detecting the position of a plurality of points on the surfaces of the surrounding environment; and providing information on the position of the plurality of points on the surfaces of the surrounding environment; wherein information on the position of the plurality of points on the surfaces of the surrounding environment is provided as tactile information.
8. The method according to claim 7, wherein detecting the position of a plurality of points on the surfaces of the surrounding environment comprises illuminating the surrounding environment with structured light by splitting a laser light into multiple beams pointing in different directions and falling upon the surface of the surrounding environment; and detecting the positions on which the multiple beams fall by imaging.
9. The method according to claim 7 or 8, wherein providing information on the position of the plurality of points on the surfaces of the surrounding environment comprises operating a two-dimensional array of elements that are configured to be raised and/or lowered and/or vibrated and/or the temperature of which is configured to be changed and/or which are configured to provide electrical stimuli for providing the tactile information.
10. A computer program, comprising code for causing the method of any of claims 7 to 9 to be carried out, when executed by a processor comprised in the apparatus of any of claims 1 to 6.
11. A non-transitory memory medium, comprising the computer program of claim 10.
PCT/FI2023/050277 2022-05-18 2023-05-17 Apparatus and method for impaired visibility perception WO2023222951A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US202263343106P 2022-05-18 2022-05-18
US63/343,106 2022-05-18

Publications (1)

Publication Number Publication Date
WO2023222951A1 true WO2023222951A1 (en) 2023-11-23

Family

ID=86604642


Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7855657B2 (en) * 2005-01-13 2010-12-21 Siemens Aktiengesellschaft Device for communicating environmental information to a visually impaired person
US20130093852A1 (en) * 2011-10-12 2013-04-18 Board Of Trustees Of The University Of Arkansas Portable robotic device
US8922759B2 (en) * 2010-09-24 2014-12-30 Mesa Imaging Ag White cane with integrated electronic travel aid using 3D TOF sensor
JP2019117684A (en) * 2017-10-04 2019-07-18 ピクシーダストテクノロジーズ株式会社 Method and system for generating interactive aerial volumetric image and spatial audio using femtosecond laser

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
IKEYA D ET AL: "Research and development of a hand-held vision system for the visually impaired", ROBOT AND HUMAN INTERACTION, 1999. RO-MAN '99. 8TH IEEE INTERNATIONAL WORKSHOP ON PISA, ITALY 27-29 SEPT. 1999, PISCATAWAY, NJ, USA,IEEE, US, 27 September 1999 (1999-09-27), pages 13 - 17, XP010530421, ISBN: 978-0-7803-5841-6, DOI: 10.1109/ROMAN.1999.900303 *
YELAMARTHI KUMAR ET AL: "Navigation assistive system for the blind using a portable depth sensor", 2015 IEEE INTERNATIONAL CONFERENCE ON ELECTRO/INFORMATION TECHNOLOGY (EIT), IEEE, 21 May 2015 (2015-05-21), pages 112 - 116, XP032790339, DOI: 10.1109/EIT.2015.7293328 *

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23726542

Country of ref document: EP

Kind code of ref document: A1