GB2487672A - Active sensory augmentation device - Google Patents

Active sensory augmentation device

Info

Publication number
GB2487672A
GB2487672A
Authority
GB
United Kingdom
Prior art keywords
user
sensory
objects
augmentation device
time
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB201201654A
Other versions
GB201201654D0 (en)
Inventor
Anthony Jason Prescott
Thomas Benjamin Mitchinson
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
University of Sheffield
Original Assignee
University of Sheffield
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by University of Sheffield filed Critical University of Sheffield
Publication of GB201201654D0
Publication of GB2487672A

Classifications

    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B21/00 Teaching, or communicating with, the blind, deaf or mute
    • G09B21/001 Teaching or communicating with blind persons
    • G09B21/003 Teaching or communicating with blind persons using tactile presentation of the information, e.g. Braille displays
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/74 Details of notification to user or communication with user or patient; user input means
    • A61B5/7455 Details of notification to user or communication with user or patient; user input means characterised by tactile indication, e.g. vibration or electrical stimulation
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61F FILTERS IMPLANTABLE INTO BLOOD VESSELS; PROSTHESES; DEVICES PROVIDING PATENCY TO, OR PREVENTING COLLAPSING OF, TUBULAR STRUCTURES OF THE BODY, e.g. STENTS; ORTHOPAEDIC, NURSING OR CONTRACEPTIVE DEVICES; FOMENTATION; TREATMENT OR PROTECTION OF EYES OR EARS; BANDAGES, DRESSINGS OR ABSORBENT PADS; FIRST-AID KITS
    • A61F9/00 Methods or devices for treatment of the eyes; Devices for putting-in contact lenses; Devices to correct squinting; Apparatus to guide the blind; Protective devices for the eyes, carried on the body or in the hand
    • A61F9/08 Devices or methods enabling eye-patients to replace direct visual perception by another kind of perception
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61H PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
    • A61H3/00 Appliances for aiding patients or disabled persons to walk about
    • A61H3/06 Walking aids for blind persons
    • A61H3/061 Walking aids for blind persons with electronic detecting or guiding means
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S15/00 Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
    • G01S15/02 Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems using reflection of acoustic waves
    • G01S15/06 Systems determining the position data of a target
    • G01S15/08 Systems for measuring distance only
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S15/00 Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
    • G01S15/88 Sonar systems specially adapted for specific applications
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00 Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16 Constructional details or arrangements
    • G06F1/1613 Constructional details or arrangements for portable computers
    • G06F1/163 Wearable computers, e.g. on a belt
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/012 Head tracking input arrangements
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/015 Input arrangements based on nervous system activity detection, e.g. brain waves [EEG] detection, electromyograms [EMG] detection, electrodermal response detection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/016 Input arrangements with force or tactile feedback as computer generated output to the user
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61H PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
    • A61H3/00 Appliances for aiding patients or disabled persons to walk about
    • A61H3/06 Walking aids for blind persons
    • A61H3/061 Walking aids for blind persons with electronic detecting or guiding means
    • A61H2003/063 Walking aids for blind persons with electronic detecting or guiding means with tactile perception

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Human Computer Interaction (AREA)
  • Veterinary Medicine (AREA)
  • Animal Behavior & Ethology (AREA)
  • Remote Sensing (AREA)
  • Public Health (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Biophysics (AREA)
  • Physical Education & Sports Medicine (AREA)
  • Pain & Pain Management (AREA)
  • Epidemiology (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Acoustics & Sound (AREA)
  • Rehabilitation Therapy (AREA)
  • Computer Hardware Design (AREA)
  • Surgery (AREA)
  • Molecular Biology (AREA)
  • Pathology (AREA)
  • Neurosurgery (AREA)
  • Business, Economics & Management (AREA)
  • Dentistry (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Educational Technology (AREA)
  • Educational Administration (AREA)
  • User Interface Of Digital Computer (AREA)
  • Physiology (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • Neurology (AREA)
  • Vascular Medicine (AREA)
  • Ophthalmology & Optometry (AREA)

Abstract

A sensory augmentation device to be worn by a user, such as a fireman or a blind person, comprises a sensory array that detects objects in the environment surrounding the user when the device is being worn, and a tactile display that physically engages the user's body to provide salient information about said environment. An algorithm controls one or both of the sensory array and the tactile display so that the presentation of an object is adapted to circumstances other than the physical parameter of the object's proximity to the user. The output can therefore be tuned to provide salient feedback based on: the rate of change of position of the object with respect to the user; the physical attributes of the object; the time since first detection of the object by the electromagnetic or sonic sensory array; and/or the current situation or objectives of the user. This ensures that the user is not given so many distracting tactile responses that they cannot understand their environment from the vibrotactile stimulation of the skin.

Description

Active Sensory Augmentation Device [0001] The present invention relates to sensory augmentation devices and systems therefor.
BACKGROUND
[0002] Sensory augmentation devices, as known, have the purpose of reporting information about an environment or an object to a person beyond what their normal senses are able to provide. They have shown potential for use by the visually impaired, for example, to provide them with sensory information to aid movement.
[0003] Augmenting spatial awareness with Haptic Radar (Tenth International Symposium on Wearable Computers (ISWC)), Cassinelli, A., Reynolds, C. and Ishikawa, M., discloses a modular electronic device to allow users to perceive and respond simultaneously to multiple spatial information sources using haptic stimulus. Each module of this wearable "haptic radar" acts as a narrow-field range detector capable of sensing obstacles, measuring their approximate distance from the user and transducing this information as a vibro-tactile cue on the skin directly beneath the module.
[0004] However, information imparted to the user by such a system is not attuned to the user's needs at the point when the information is imparted, or the information is not specific or informative enough when it is needed or desired. At times, the system may also overload the user with excessive or irrelevant information that could serve to distract the user and make it harder to achieve their goals.
[0005] Personnel working in restricted environments may benefit from sensory assistance. For example, firemen in smoke-filled buildings may not be able to locate objects such as furniture, people, doorways, etc., visually and rely on the capacity to touch and feel their way. Indeed, the best-practice method employed by firemen in traversing smoke-filled environments has not changed in over 50 years, and involves the person staying in contact with a wall of the room and waving their hand in front of their face and body as they progress. Such personnel may benefit from sensory augmentation.
However, in challenging environments, the information imparted to the user must not inhibit their goals. An object of the present invention is to provide a system that overcomes these issues, or at least mitigates their undesirable effects.
BRIEF SUMMARY OF THE DISCLOSURE
[0006] In accordance with the present invention there is provided a sensory augmentation device for wear by a user, comprising a sensory array to detect objects in an environment surrounding the user when the device is being worn, a tactile display to physically engage the user's body and provide information about said environment, and computing means between the sensory array and tactile display that includes an algorithm controlling one or both of the sensory array and tactile display whereby the presentation of information to the user about an object is adapted to circumstances beyond the physical parameter of the object's proximity to the user, such circumstances including: rate of change of position of the object with respect to the user; the physical attributes of the object; the time since first detection of the object by the sensory array; and the current situation or objectives of the user.
[0007] Active sensing means not merely receiving input passively as a sensory device happens to encounter a stimulus but instead involves influencing the receipt of inputs.
This can be performed using a passive sensory device (for example when a blind person explores an environment with their walking stick). However, active sensing can also mean controlling the perception of inputs from a sensor, or even a combination of direct control of a sensor to control inputs received and to control how they are perceived.
[0008] The design of the device and of its processing algorithms takes inspiration from active proximal sensing systems observed in the natural world. For instance, rodents, such as mice and rats, possess long facial whiskers that are moved back-and-forth in order to explore the proximal environment surrounding the head. Studies of whisker motion have shown that these direct the whisker-tips towards surfaces of interest in the environment, causing increased contacts with such surfaces. Whisker motion is also controlled so that contacts (whisker bending) occur within a limited dynamic range. In the design of the present invention these principles have been generalised to non-contact proximity sensors, and to aspects of active sensing control (described hitherto) not observed (or not confirmed) in the natural world.
[0009] In a preferred embodiment, said sensory array comprises one or more electromechanical range sensors comprising one or more emitting modules and one or more receiving modules, wherein the receiving modules are configured to receive and detect signals emitted by the emitting modules and reflected by an object in the environment. Preferably, the sensory array comprises sensors with parameters set to be sensitive to the user's objectives as determined from the measurement of the user's acceleration, velocity, gyroscopic motion or position. Furthermore, the sensory array may be controlled by said algorithm such that the sensory array is active, through filtering sensory inputs according to the user's objectives. Preferably, said tactile display is controlled by said algorithm such that said tactile display is active, through modifying said tactile display according to the user's objectives.
[0010] In another preferred embodiment, the sensory array further comprises environment sensors with parameters set to be sensitive to environmental properties selected from temperature, humidity, light, sound or position or physical attributes of objects. Preferably, said sensory array is controlled by said algorithm such that the sensory array actively changes said sensing parameters in response to changes in the environment.
[0011] In a preferred embodiment, said computing means further comprises a timer and is adapted to determine the rate of change of the position of the object with respect to the user from a measurement of velocity change of the object position over a period of time.
[0012] In another preferred embodiment, said sensory array has means for detecting the physical attributes of the object and said algorithm comprises a library of physical attributes wherein said algorithm is adaptable to match said detected physical attributes to an item in said library. Said library may contain a corresponding vibrotactile symbol for said item and said tactile display is configurable to display said corresponding vibrotactile symbol. Preferably, the algorithm comprises means to store in a memory of said computing means, said detected physical attributes of the objects that have been detected by said sensors and whereby said presentation is adapted to display the time since first detection of the object, wherein said adapted display is pulsed and the amplitude of each pulse relating to the object decreases according to the time since it was first detected.
[0013] In a further preferred embodiment, said computing means further comprises a graphical user interface. In certain embodiments said algorithm may be programmable by said graphical user interface, to enable the objectives of the user to be entered into the algorithm.
[0014] In another preferred embodiment, the computing means may be adapted to control the emitting module to emit signals in a specific time frame, direction or form.
[0015] Preferably, said tactile display is controlled by said algorithm to present information to the user within a comfortable dynamic range. Said comfortable dynamic range may be user specific and comprise a lower limit below which the user cannot feel the tactile display and an upper limit above which the tactile display is perceived as painful by the user. Preferably the computing means is adapted to adjust said comfortable dynamic range in real time through feedback of the user's reaction to the tactile display.
Furthermore, said graphical user interface may be adapted to enter the comfortable dynamic range of the user.
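By way of illustration only, the following Python sketch shows one way such a user-specific comfortable dynamic range could be represented and adjusted at run time. The class name, the limit values and the adaptation rule are assumptions made for this sketch and are not taken from the embodiments described above.

```python
# Illustrative sketch only: maps a normalised salience value (0..1) onto a
# drive amplitude bounded by a user-specific "comfortable dynamic range".

class ComfortableRange:
    def __init__(self, lower=0.15, upper=0.80):
        self.lower = lower   # below this the user cannot feel the display
        self.upper = upper   # above this the display is perceived as painful

    def to_drive_amplitude(self, salience):
        """Linearly map salience in [0, 1] into [lower, upper]."""
        salience = min(max(salience, 0.0), 1.0)
        return self.lower + salience * (self.upper - self.lower)

    def adapt(self, user_flinched=False, user_missed=False, step=0.05):
        """Crude run-time feedback: lower the ceiling if the user reacts as
        though a cue was painful, raise the floor if cues go unnoticed."""
        if user_flinched:
            self.upper = max(self.lower + step, self.upper - step)
        if user_missed:
            self.lower = min(self.upper - step, self.lower + step)

if __name__ == "__main__":
    rng = ComfortableRange()
    print(rng.to_drive_amplitude(0.5))   # mid-salience cue
    rng.adapt(user_flinched=True)
    print(rng.to_drive_amplitude(1.0))   # ceiling has been lowered
```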
[0016] In a preferred embodiment said tactile display comprises a plurality of vibrotactile elements configured to present a vibrotactile language to the user. Preferably, said plurality of vibrotactile elements are actuators. In certain preferred embodiments, said vibrotactile language is built from a combination of drive parameters of the tactile display, comprising: amplitude, frequency, pulse width and pulse frequency. Said vibrotactile language may further comprise ascending or descending frequency pulses. Preferably, the tactile display is positionable over an area of the body of the user to engage with the cognitive core of the user, wherein the cognitive core of the user is the set of skills, habits and abilities, in perception and action, that a person can rely on without effortful deliberation and that may be interfaced to through vibrotactile stimulation of body surfaces. Preferably, said area of the body is locatable on a skin position that is well served by nerve endings, or, said area of the body is locatable on a skin position in close proximity to a trigeminal nerve of the user. Said tactile display may be locatable about the temples of the user's head.
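As a hedged illustration of how a vibrotactile symbol might be synthesised from the drive parameters named above (amplitude, carrier frequency, pulse width, pulse frequency, plus an optional ascending or descending frequency sweep), the Python sketch below generates one candidate waveform. The sample rate and the example parameter values are assumptions.

```python
import math

def vibrotactile_symbol(amplitude, freq_hz, pulse_width_s, pulse_period_s,
                        duration_s, sweep_hz_per_s=0.0, sample_rate=8000):
    """Build one drive waveform from amplitude, carrier frequency, pulse
    width, pulse frequency (1 / pulse_period_s) and an optional linear
    frequency sweep (ascending if positive, descending if negative)."""
    samples = []
    for n in range(int(duration_s * sample_rate)):
        t = n / sample_rate
        in_pulse = (t % pulse_period_s) < pulse_width_s   # gate the carrier on/off
        phase = 2 * math.pi * (freq_hz * t + 0.5 * sweep_hz_per_s * t * t)
        samples.append(amplitude * math.sin(phase) if in_pulse else 0.0)
    return samples

# e.g. a 250 Hz carrier pulsed at 4 Hz with 50 ms pulses, rising by 100 Hz/s
wave = vibrotactile_symbol(0.6, 250.0, 0.05, 0.25, 1.0, sweep_hz_per_s=100.0)
```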
[0017] Preferably, the tactile display promotes a user reaction which is used to feedback to the control algorithm, and may directly affect the sensory array.
[0018] In accordance with the present invention, there is also provided a method for sensory augmentation, wherein information about an environment comprising objects surrounding a user is presented to the user, said method comprising the steps of: a) sensing the presence of said objects to generate output data about said objects; and using an algorithm in computing means, either i. to modify said sensing to affect said output data; and/or ii. to process what is sensed to generate said output data; and either or both: in response to circumstances of the user, said circumstances selected from the group comprising: * a rate of change of position of the object with respect to the user; * physical attributes of the object; * time since first detection of the object by the sensory array; * objectives of the user; and, * environmental parameters, and b) presenting said output data to the user in a tactile display, such that the user is presented with information about objects in the environment beyond the physical presence of the object.
[0019] In a preferred embodiment, step a) further comprises emitting a signal into said environment using emitters and receiving in receivers a signal reflected from said objects in said environment. Preferably, said one or more objectives of the user comprises their orientation, heading, velocity, acceleration. Preferably, said environmental parameters comprise temperature, humidity, light levels, noise levels.
[0020] In a preferred embodiment, the method for sensory augmentation further comprises the steps of: a) sensing the velocity and/or acceleration of the user and the presence of said objects in said environment; b) determining the direction of heading of the user; c) applying a filter to increase sensory information processed for objects in the direction of heading and reduce sensory information processed for objects not in the direction of heading; whereby, said output data set is modified.
[0021] In another preferred embodiment, the method for sensory augmentation further comprises the steps of: a) retaining sensory information about said objects as obtained at a first time; b) matching said sensory information about said objects as obtained at a later second time to said first time; c) applying a time-dependent reduction to matched sensory information as obtained at said second time to produce said output data, whereby, said presentation of said output data is time-dependent.
[0022] In a preferred embodiment, the method for sensory augmentation further comprises the steps of: a) retaining sensory information about said objects as obtained at a first time; b) matching said sensory information about said objects as obtained at a later second time to said first time; c) determining the velocity of said one or more objects; d) comparing the velocity of said user to the velocity of said objects to produce said filtered information; whereby, said presentation of said output data is velocity dependent. Preferably, said velocity dependent presentation of output data changes frequency over time according to the velocity, whereby: * the frequency of the presentation of objects towards a user varies with time in a first way; and * the frequency of the presentation of objects moving away from the user varies with time in a second way; wherein, said first way is opposite to said second way.
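A minimal sketch of one possible velocity-dependent mapping is given below. It assumes that the 'first way' is a pulse rate that rises over time for approaching objects and the 'second way' a rate that falls for receding objects; the constants are illustrative only.

```python
# Sketch (assumed mapping): approaching objects are presented with a pulse
# rate that rises, receding objects with one that falls.

def pulse_rate_hz(range_m, prev_range_m, dt_s, base_rate=2.0, gain=1.5,
                  min_rate=0.5, max_rate=10.0):
    closing_speed = (prev_range_m - range_m) / dt_s   # positive if approaching
    rate = base_rate + gain * closing_speed           # rises when approaching
    return min(max(rate, min_rate), max_rate)

print(pulse_rate_hz(1.8, 2.0, 0.5))   # object closing: faster pulsing
print(pulse_rate_hz(2.2, 2.0, 0.5))   # object receding: slower pulsing
```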
[0023] In a preferred embodiment, said presentation of said output data is pulsed.
[0024] In a preferred embodiment, the method for sensory augmentation further comprises the step of: comparing, within said algorithm, said sensory information to a library of sensory signatures, whereby said comparison is used to classify a property of the object wherein said classification of said object is presented to said user through a tactile presentation specific to said classification.
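The sketch below assumes, purely for illustration, that each stored sensory signature is a short feature vector and that classification is by nearest neighbour; the library entries and the symbol names are invented.

```python
# Minimal sketch: match a detected signature against a small library and
# return the object class together with its vibrotactile symbol.

LIBRARY = {
    "wall":    ([0.9, 0.8, 0.8, 0.7], "long_low_buzz"),
    "doorway": ([0.2, 0.9, 0.9, 0.2], "double_pulse"),
    "person":  ([0.5, 0.6, 0.4, 0.5], "rising_chirp"),
}

def classify(signature):
    """Return (object class, vibrotactile symbol) of the closest library entry."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    label = min(LIBRARY, key=lambda k: dist(LIBRARY[k][0], signature))
    return label, LIBRARY[label][1]

print(classify([0.25, 0.85, 0.9, 0.3]))   # -> ('doorway', 'double_pulse')
```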
[0025] In a preferred embodiment said tactile display presents a vibrotactile language which is programmable or learnable by said user. Preferably said method for sensory augmentation is performed using the apparatus as hereinbefore described.
[0026] In accordance with the invention, there is further provided a sensory augmentation device configured to implement the method as hereinbefore described.
BRIEF DESCRIPTION OF THE DRAWINGS
[0027] Embodiments of the invention are further described hereinafter with reference to the accompanying drawings, in which:
Figure 1 shows a user wearing a sensory augmentation device according to the prior art;
Figure 2 shows a schematic representation of a sensory augmentation device, in use, according to an embodiment of the present invention;
Figure 3 is a circuit diagram of a sensory augmentation device according to an embodiment of the present invention;
Figure 4 is a schematic of sensitivity zones according to an embodiment of the present invention;
Figure 5 is an illustration of time-dependent vibrotactile cues in the presence of multiple objects according to an embodiment of the present invention;
Figure 6 is a schematic illustrating velocity-dependent vibrotactile cues according to an embodiment of the present invention; and,
Figure 7 is an illustration of selection of vibrotactile symbols according to an embodiment of the present invention.
DETAILED DESCRIPTION
[0028] Figure 1 illustrates the prior art. It shows a user 1 in an environment defined by the objects 2 it contains. Light beams 3 are used to sense the objects and the information is relayed to the user by vibration motors mounted on a head piece 4.
[0029] Figure 2 illustrates data pathways between components of a system according to the present invention. The figure shows a sensory augmentation apparatus 100 in an environment 108 which is defined by objects 107 therein. The sensory augmentation device comprises a sensory array 101 in communication with a control algorithm 103 which in turn communicates with a tactile display 102. The tactile display is connected 204 to the user 1 as per the head piece 4 shown in Figure 1.
[0030] Returning to Figure 2, the sensory array 101 further comprises one or more emitting modules 105 and one or more receiving modules 106. The emitting modules emit a signal 208 into the environment 108 and the receiving modules receive a signal 209 reflected from an object 107 or physical property, such as temperature. The sensory array is controlled through instruction 202 by the control algorithm 103 and returns information 201 to the control algorithm.
[0031] The information returned 201 by the sensory array 101 comprises information about the environment. The information which is returned is dependent on the number of emitting 105 and receiving 106 modules, their type and the instructions 202 which they operate under. In a preferred embodiment, there is a variety of sensor types in the array 101 so that the array is sensitive to conditions such as: the position of objects 107 in relation to the user 1; the temperature levels in areas of the environment; light levels in the environment; the position and orientation of the user; and, the direction of movement of the user.
Through this, the sensory array receives a sufficiently complete description of the environment 108.
[0032] The instructions 202 which are sent to the sensory array have the purpose of enhancing information gathering. The loop between the sensory array 101 and the control algorithm 103 through lines 201 and 202 is such that the sensory array is continuously adapting to the environment 108 and the objects 107 which it detects.
[0033] Signals 201 fed into the control algorithm 103 from the sensory array are filtered by the algorithm, which interprets the information to build a 'picture' of the environment. The algorithm uses this information both to instruct 202 the sensory array and further to operate 203 the tactile display.
[0034] The tactile display consists of one or more vibrotactile elements which are placed on the user 1. The vibrotactile elements are used to present 204 a vibrotactile language to the user. The vibrotactile language is made up of a combination of vibrotactile parameters comprising amplitude, frequency, pulse width and pulse frequency.
[0035] The algorithm 103 operates the tactile display using the vibrotactile language through interpretation of the sensory information 201 provided by the sensory array 101.
The interpretation by the algorithm of the environment prevents a direct representation of the information received by the sensory array and allows for determination of salient information, which the algorithm decides on a run-time basis. Once the algorithm has processed the information it operates 203 the tactile display through implementation of the vibrotactile language.
[0036] Placement of the tactile display 102 will facilitate integration with the cognitive core of the user. In a preferred embodiment, this is performed by placing the vibrotactile elements on sensitive areas of the user's body, that is areas which have a high number of nerve endings.
[0037] Once the tactile display 102 presents 204 a stimulus to the cognitive core 104 a user reaction 105 is promoted. The user reaction is indirectly fed (206) back into the control algorithm 103 via sensors 205 which are sensitive to the position and orientation of the user. Such sensors may include but are not limited to acceleration and orientation (verticality, compass direction) sensors. The information is passed 201 to the control algorithm which makes run-time adjustments of the tactile display 102 to prevent the user 1 from feeling discomfort and to ensure that the user remains aware of the tactile display.
[0038] The component parts of the device 100 are interactive such that the device can react to changes in the environment 108, actively sensing these and presenting them to the user 1. A suitable application for this device may be for those that are visually impaired or subject to an occluded environment.
[0039] Thus the present invention provides a device which is controlled to present information about a surrounding environment to a user. The device is to be worn, and provides 'sensory augmentation' (or 'sensory substitution', depending on application domain). Sensory information is presented to the user by means of a sensor-to-wearer mapping algorithm provided by the computing means.
[0040] The device preferably uses "actively" controlled electromechanical sensors to gather information about the physical geometry of the wearer's proximal environment. "Salient" aspects of this information may be conveyed to the wearer through a 'vibrotactile' display (an array of vibrotactile elements). Intrinsic to this process is a computational algorithm that transforms the raw sensory information into optimal signals to deliver the salient information about the environment to the user in an intuitive way. This algorithm may be application-specific (or, at least, tuned to the application).
[0041] Sensory Array and Tactile Display [0042] An example of the system of the current invention is configured as shown in Figure 3. Here, sensors 301, 302 are placed on a firefighter's helmet 303. The sensors are connected to a sensor-actuator bridge 304 via an interconnect bus 305. The sensor-actuator bridge is programmed using a computation unit 306. It will be appreciated that the computation unit 306 can be directly attached to the sensor-actuator bridge, incorporated therein, or used to program the sensor-actuator bridge remotely. A second interconnect 307, connects the sensor-actuator bridge to an actuator driver board 308.
[0043] The actuator-driver board 308 is connected to the tactile display 311 which, for example, includes actuators 309 which are disposed on an elastic headband 310 as worn under the helmet 303 by the user. It will be appreciated that alternative arrangements of tactile display are in the scope of this invention.
[0044] The sensor-actuator bridge 304 is a microcontroller based board that acts as a bridge between the computation unit 306 and the sensors and actuators, reads the sensor values and sends them to the computation unit.
[0045] The computation unit 306 receives the sensor-data, executes the algorithm, generates the actuation commands and sends them back to the bridge 304 for onward transmission to the actuators. The bridge 304 relays the actuation commands to the actuator driver board 308. The actuator driver board interprets the commands, prepares the waveform and drives the actuators through amplifiers.
[0046] Example 1: Physical system.
[0047] The example physical system contains:
i. Computation Unit 306
ii. Sensory Array 312
iii. Ultrasound modules 301
iv. Actuators 309
v. Digital Triaxial Accelerometer 302
vi. Sensor-Actuator Control Bridge 304
vii. Actuator Driver Board 308
viii. Interconnects 305, 307
[0048] i. Computation Unit. The computation unit is the controller of the whole system: it reads the sensor data, executes the algorithm, and sends the actuation commands to the actuators. In this example, it is a small netbook running Matlab in a Windows 7 environment. This unit is portable.
[0049] ii. Sensory Array. The sensors scan the proximity of the wearer and send the information to the computation unit 306 through a bridge 304 on an I2C (Inter-Integrated Circuit) interconnect (bus) 305. The helmet shell 303 is instrumented with a ring of ultrasonic range finders 301. All sensors share the same serial I2C bus 305, so that a single four-core umbilical carries power and the bus lines back from the computation unit 306.
[0050] iii. Ultrasound modules. The ultrasound modules 301 are part of the sensory array 312. In the present example, each sensor module 301 is either an SRF08 or an SRF10 (Devantech, http://robot-electronics.co.uk/acatalog/UltrasonicRangers.html) having characteristics according to Table 1.
Metric           SRF08                    SRF10
Connectivity     0V, +5V, I2C-I, I2C-C    As SRF08
Max Current      15 mA                    As SRF08
Frequency (US)   40 kHz                   As SRF08
Range            3-600 cm                 As SRF08
Weight           12 g                     14 g
Size             43 x 20 x 17 mm          32 x 15 x 10 mm

Table 1
[0051] iv. Actuators. An example actuator could be the TactileLabs haptuator model TL002-14-A, displaying the specification as shown in Table 2 (http://www.tactilelabs.com/main/products/haptuator). The haptuators 309 receive the commands from the control unit 306 through the bridge 304.
Model Number                           TL002-14-A
Outer Dimension (diameter x length)    14 x 29 mm
Weight                                 ~30 grams
Acceleration @ 3 V input, 125 Hz       3.0 G (29.4 m/s²)
Rated Bandwidth                        50-500 Hz
Typical Impedance                      6.0 Ohms
Maximum Input Voltage                  3.0 V
Maximum Input Current                  0.5 A

Table 2
[0052] v. Digital Triaxial Accelerometer. The accelerometer 302 provides high resolution (14 bit) measurements of acceleration along three perpendicular axes, enabling measurement of tilt and motion along these axes. An example accelerometer is the BMA180 by Bosch (BMA180 datasheet, http://www.bosch-sensortec.com). It is mounted on a breakout board from Sparkfun Electronics (http://www.sparkfun.com/products/9723). The accelerometer is connected to the same I2C bus 305. It receives its commands from the computation unit via the bridge 304 and sends the measurements back via the same route.
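As an aside, the sketch below shows one way tilt could be derived from such triaxial readings. The scaling assumes a signed 14-bit output over a ±2 g measurement range, which may differ from the range configured on a real part.

```python
import math

def raw_to_g(raw, full_scale_g=2.0, bits=14):
    """Convert a signed 14-bit reading to g (assumed scaling; the real
    conversion depends on the configured measurement range)."""
    return raw * full_scale_g / (2 ** (bits - 1))

def tilt_degrees(ax_g, ay_g, az_g):
    """Pitch and roll estimated from the static (gravity) components."""
    pitch = math.degrees(math.atan2(ax_g, math.hypot(ay_g, az_g)))
    roll = math.degrees(math.atan2(ay_g, math.hypot(ax_g, az_g)))
    return pitch, roll

print(tilt_degrees(raw_to_g(1200), raw_to_g(-300), raw_to_g(7900)))
```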
[0053] vi. Sensor-Actuator Control Bridge. The example uses a board with a microcontroller running at 16 MHz. The bridge 304 acts as the I2C bus 305 master and as an interface between the ultrasonic sensors 301, the digital accelerometer 302, the actuator driver board 308, and the computation unit 306.
[0054] vii. Actuator Driver Board. The actuators 309 in this example use a 3 V RMS (Root Mean Square) (peak) signal to drive them. To achieve the required power the example features a separate board with a PIC (Peripheral Interface Controller) microcontroller, one or more amplifiers and other necessary circuitry. This board is connected to the bridge 304 through a UART (Universal Asynchronous Receiver/Transmitter) bus 307. The onboard PIC receives the actuation commands, interprets the commands, prepares the output waveform and drives the amplifiers. The amplifiers then send their output to the actuators 309.
[0055] viii. Interconnects. Three main interconnects are used in the current example:
1) Computation Unit to Sensor-Actuator Bridge. This is a standard USB (Universal Serial Bus) 2.0 cable 313 with Type-A and Type-B plugs.
2) Sensor-Actuator Bridge to Sensors. This is an I2C bus cable 305 connecting the sensors to the bridge. In the current example, it is driven at 100 kHz by the board, which acts as an I2C master and controls the sensors 301, 302. This interconnect 305 is coupled with power lines that supply all the sensors 301, 302 on the shell 303.
3) Sensor-Actuator Bridge to Actuator Driver Board. This is a UART connection between the bridge 304 and the PIC microcontroller on the actuator driver board 308. It is unidirectional as the PIC receives the actuation commands and does not send any responses upstream (to the computation unit 306). It is appreciated that a bidirectional cable may be required for alternative embodiments.
[0056] It is appreciated that the equipment described in Example 1 may be replaced or augmented by alternative apparatus in order to achieve the required algorithm inputs. For example, the control of an ultrasonic sensor may be supplemented by automatic temperature compensation using information from temperature sensors. By means of a further example, pulse shaping techniques can preferably be used to distinguish between pulses emitted by other sensors and pulses emitted in other time windows, allowing a substantial increase in sample rate at the cost of more advanced signal processing.
[0057] Another strategy may be to use continuous emission (rather than a series of pulses) to maximise the information gathered and, effectively, provide an unlimited sample rate. This approach is also much more robust to noise, though it does require more energy (therefore its applicability may depend on application domain). Through using continuous emission, a discrete number of sensors positioned on the helmet can be mapped to a continuous circle using Gaussian mapping. A discrete number of actuators are then mapped to the circle using the same technique. By using this technique, the number of sensors and actuators can vary. For example, eight sensors can be mapped to four actuators. An example protocol for adopting this strategy consists of the following steps:
a) Emit signals into the environment
b) Wait for new sensor data: if new data is available, proceed; if no new data is available, wait
c) Read the sensor data
d) Calculate the values for individual sensors
e) Map the sensors to a circle using Gaussian mapping
f) Map the circle back to the actuators using reverse Gaussian mapping
g) Prepare the response for the corresponding actuators
h) Send the response to the actuators
i) Return to step b)
[0058] This approach has some overlap with the algorithms as exemplified below. It is understood that the algorithm can interpret sensor data in this way and that its interpretation varies according to the physical arrangement that is used. As such, the algorithm is programmed according to the physical arrangement.
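The Python sketch below illustrates one way the Gaussian mapping of steps e) and f) could be realised for eight sensors and four actuators. The angles, kernel width and normalisation are assumptions chosen for this sketch rather than values taken from the description.

```python
import math

SENSOR_ANGLES = [i * 360 / 8 for i in range(8)]     # 8 range finders on the helmet
ACTUATOR_ANGLES = [i * 360 / 4 for i in range(4)]   # 4 vibrotactile elements
SIGMA = 30.0                                        # kernel width in degrees (assumed)

def ang_diff(a, b):
    d = abs(a - b) % 360
    return min(d, 360 - d)

def gaussian(d, sigma=SIGMA):
    return math.exp(-0.5 * (d / sigma) ** 2)

def sensors_to_actuators(proximity):
    """proximity[i] in [0, 1]: 1 = object at minimum range on sensor i.
    Spread each sensor onto the circle with a Gaussian, then sample the
    circle at each actuator angle with the same kernel (steps e and f)."""
    out = []
    for a in ACTUATOR_ANGLES:
        num = sum(p * gaussian(ang_diff(a, s)) for p, s in zip(proximity, SENSOR_ANGLES))
        den = sum(gaussian(ang_diff(a, s)) for s in SENSOR_ANGLES)
        out.append(num / den)
    return out

# An object seen mainly by sensor 7 drives the two nearest actuators most.
print(sensors_to_actuators([0.1, 0.0, 0.0, 0.0, 0.0, 0.0, 0.2, 0.9]))
```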
[0059] A further strategy is to use a small number of emitting modules (as few as one, possibly) for a large number of receiving modules. This removes ambiguity over the source of an emitted pulse whilst reducing energy requirements.
[0060] In one specific embodiment of the invention, ultrasonic (US) range finders are used in the sensory array to detect the position of objects relative to the user. These are effective in a visually occluded environment (including a particulate suspension). In an alternative arrangement, electromagnetic range finders may be used.
[0061] The Environment [0062] The device may be worn for a variety of uses and exposed to a variety of environments. Some examples of use include but are not limited to:
* Everyday use. Visually impaired persons might make use of the device for sensory substitution, allowing them to navigate in the world with greater confidence and/or negate the need for other sensory aids (e.g. the long cane). For this application, the device might take the form of a hat.
* Emergency use. In some emergency environments, vision is occluded, and non-impaired persons might make use of the device, also for sensory substitution. For example, fire-fighters often work in a smoke-filled environment: currently, they rely heavily on 'arm waving' to ensure the space in front of them is clear before moving into it.
* Safety equipment. Even those with unimpaired vision are blind to the sides, above, and behind. Incorporation of the device into, e.g., a construction worker's helmet might provide forewarning of danger. This is thus a sensory augmentation rather than a sensory substitution.
* Miscellaneous uses. There may be a variety of other domains where sensory augmentation would be of benefit, for example in some sports such as basketball or American football, or in pot-holing, diving and mountain biking.
[0063] Application in such environments demands that the algorithm and indeed the sensory device are adapted to the environment and the needs of the user. Salient features of the environment such as the walls, furniture, and exit points (i.e. doorways) are important features that should be presented to the user.
[0064] The Algorithm [0065] The algorithm is performed by the computation unit 306 and is used to control the sensory array, analyse data from it and supply data to the tactile display. It processes raw sensor data (multi-directional range information) into drive signals for the tactile display (which generates mechanical stimulations that are, in turn, sensed by the wearer through the skin) as well as providing control signals for the sensors themselves. The algorithm provides for active sensing and has as its focus: a high information bandwidth; minimisation of the processing the brain has to do (low cognitive load); and avoidance of distraction or discomfort for the user. [0066] The terms 'active' and 'active sensing' will be understood to refer to run-time changes in the sensing parameters and in sensory processing, depending on environmental changes, that serve to boost salient information. That is, sensor behaviour may be dependent on the current knowledge of the environment (e.g. range to local objects may affect the working range of the sensor, sampling rate, emissive power and/or emissive angle), and on the setting in which the device is being used. For example, a collision-avoidance device may be tuned to detect and emphasize objects immediately in front of the head and in the direction of travel, and at a range which is modified according to speed of travel (so that there is adequate time to initiate an avoidance response).
[0067] The algorithm is therefore adapted to circumstances beyond the physical parameter of the object's proximity to the user. Such circumstances may comprise: the rate of change of position of the object with respect to the user; the physical attributes of the object; and the time since first detection of the object by the sensory array.
[0068] This can be performed by the distinction between what sensor information is salient and what can be discarded. For example, once a user has entered a room and the device has displayed this information to the user, the information becomes less salient and a new object entering the room becomes more salient and is therefore presented to the user. Such information may be gathered by active sensing as herein described.
[0069] For example, the algorithm comprises the steps of: i. Processing raw sensor data to generate a data set that the device will convey to the user. This processing, for instance, may heighten the representation of more salient data (e.g. a nearby obstacle or a moving object) and lessen the representation of less salient data (e.g. stationary objects or objects far from the user) including discarding some data entirely (e.g. the location of distant objects).
ii. Conveying the processed data to the wearer through the tactile display. The tactile display cannot directly represent "range" or "obstacle" (because its language is "amplitude" and "pulse frequency"). The algorithm represents the former in the language of the latter, conveying the salient information to the wearer in an optimal way.
iii. Guiding the "active sensing" of the environment - that is, to control active features of the sensors, and sensory processing, to improve their performance at data gathering under the local and current conditions.
iv. Measuring the human involvement through the ongoing response to tactile stimulation and the impressions of the environment that this affords. By moving, the wearer may also contribute deliberately (or not) to the repositioning of the sensory apparatus in a manner that boosts information acquisition.
v. Modifying the parameters of the sensing device through controllable user-interfaces.
[0070] It will be appreciated that these steps are not necessarily chronological and are not entirely independent. Furthermore, the chronology of the algorithm is dependent on both design-time and run-time feedbacks between these processes. An example of design-time feedback is that the nature of perceptions generated in the wearer iv) by the tactile display ii) will impact upon the representation used in i). An example of run-time feedback is that changes in the operation of the sensors due to active sensing iii) will affect both the nature and the salience of representations i) as well as requiring that a different set of information be conveyed ii).
[0071] The data processing step may be best understood by means of an example.
Consider a hypothetical situation where the wearer is navigating an office. The low level data represents range in each direction to the nearest obstacle. Whilst this could be conveyed directly to the user for their brain to make sense of, if the information is first "sorted" (or "filtered") such that only the most salient information is conveyed, more advantage can be taken of the limited bandwidth and the user's cognitive load.
[0072] If the user begins to move towards one of these obstacles, the presence of the obstacle becomes salient again (the user may have forgotten the presence of the obstacle), and the representation thus becomes stronger, warning the user of the danger of collision. If someone were to enter the office (a moving obstacle), this would represent a salient feature and would be highlighted to the user. The symbolic representation of a moving object might differ from that of a stationary object, helping the user to distinguish between the two. If the user passes near a doorway, a further representation can be used to indicate the affordance offered by the doorway - that is, the opportunity to pass through it.
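A toy salience score of the kind this example describes might look like the following sketch; the weights, thresholds and inputs are illustrative assumptions rather than values from the patent.

```python
# Sketch of salience weighting: nearer, approaching, moving and affordance-like
# objects score higher; distant objects are discarded entirely.

def salience(range_m, speed_towards_user, is_moving, is_doorway,
             max_range_m=2.0):
    if range_m > max_range_m:
        return 0.0                       # distant objects are discarded
    s = 1.0 - range_m / max_range_m      # nearer -> more salient
    if speed_towards_user > 0:
        s += 0.5 * speed_towards_user    # collision risk boosts salience
    if is_moving:
        s += 0.3                         # e.g. someone entering the office
    if is_doorway:
        s += 0.2                         # affordance worth highlighting
    return min(s, 1.0)

print(salience(1.5, 0.0, False, False))  # static desk across the room
print(salience(1.5, 0.8, True, False))   # person walking towards the user
```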
[0073] The algorithm's control over the tactile display conveys information to the user through the vibrotactile language, which converts the salient information identified in the data processing step into physical parameters that are felt by the user. The salient data (hazards, affordances) is mapped into the vibrotactile language, in a way that is high bandwidth and intuitive. By intuitive, it is meant that the user will require little training to understand and interpret the vibrotactile language. The algorithm therefore takes advantage of the wearer's intuitive perceptions of drive signals, such that the need for user training is mitigated or eliminated entirely.
[0074] An example system of this sort integrates algorithmic ideas with array configuration and with "active sensing" which, as above, can be defined as modifying the behaviour of the sensor during use in a way that improves its performance in the given context. To illustrate the definition, specific examples of active sensing would include purely electrical changes such as changing the sampling rate of a range sensor as the range varies or turning off a sensor when the information it is collecting is not going to be used, as well as electromechanical changes such as physically redirecting a sensor at a salient part of the environment. Active sensing may operate to boost the quality, quantity or "salience" of the information collected or to reduce energy requirements.
[0075] The "saliency" of information or of aspects of information depends on the current motivations and past experience of the user. The salience varies from situation to situation and time to time, and with the task that the user is pursuing. However, it will be understood in the context of the present invention that "salient" information is that which the device determines is useful to the user in certain situations beyond the raw information that the sensors detect and could pass on to the user. Examples 2 to 5 indicate how salient information can be filtered. These aspects may be defined in advance of use or at run-time.
[0076] The algorithm informs active sensing at some or multiple levels. A low level operation might include changing the sampling frequency. For example, when an obstacle is detected nearby, a higher sampling frequency is used (due to the relationship of the sample time with object range through the speed of sound). A high level operation, by contrast, might include changing sensor direction. For example, locomotion may be sensed such that when the user is walking through the environment (as indicated by supplemental sensors), sensors towards the front may be redirected downwards to better detect trip hazards, or sensors towards the side redirected somewhat forwards. Note that this does not necessarily imply movement of the sensors through some actuation mechanism, although this is possible. Instead it can relate to increasing sampling density or sampling frequency, using fixed sensors such as those described above, in the direction(s) of interest. Aspects such as this are best illustrated in the following examples.
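A minimal sketch of the low-level operation mentioned here, raising the sampling rate as the working range shrinks given the two-way travel time of an ultrasonic pulse, is shown below; the settling margin and range limits are assumptions.

```python
SPEED_OF_SOUND = 343.0   # m/s at roughly 20 degrees C

def max_sample_rate_hz(working_range_m, settle_margin=1.5):
    """A ping cannot be repeated until the echo from the far edge of the
    working range has returned; the margin is an assumed settling allowance."""
    round_trip_s = 2 * working_range_m / SPEED_OF_SOUND
    return 1.0 / (round_trip_s * settle_margin)

def adapt_working_range(nearest_m, floor_m=0.3, headroom=1.5, ceiling_m=6.0):
    """Shrink the working range towards the nearest detected obstacle so the
    sensor can be sampled more often (values are illustrative)."""
    return min(max(nearest_m * headroom, floor_m), ceiling_m)

r = adapt_working_range(nearest_m=0.8)
print(r, max_sample_rate_hz(r))   # ~1.2 m range allows roughly 95 Hz sampling
```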
[0077] Example 2
[0078] With reference to Figures 4(a) to (e), these examples demonstrate how the sensor range changes as the user moves about the environment.
[0079] Referring first to Figure 4(a) a user 400 is located with respect to an object 401.
Sensors on the user's helmet (neither shown), capable of detecting the object, are active over a full rotation about the user and this is illustrated by the hatched region 404 of the drawing. The response of the sensors to the presence of the object 401 is a spike 406 in a circular base line around the user 400. Haptoactuators 403,408 are disposed around the head of the user at four locations around the user's horizon. The degree to which each actuator is actuated depends on the presence of a spike 406 and the activation 404 of the sensors. That degree of actuation (of haptoactuator 403) is illustrated in bar chart 410, and varies between zero and 100% actuation. In this instance, the presentation of a stimulus (actuation of the haptoactuators) is conditional upon the orientation and movement of the user and in turn, the goals and intentions of the user.
[0080] The goals, intentions and demands of a user frequently will correlate with the user's movement. If the user is moving forwards, it may be more useful to receive information from further ahead of the user than behind. An accelerometer (or a similar device, e.g. gyroscope, compass, etc) can be used to measure the user's movement. This movement information can be used to change the sensory array, or the signals transmitted to the user, to reflect the demands of the user thereby making the sensory array active.
[0081] Figures 4(a) to (e) show the effect of changing the active sensor range, dependent on user movement (measured with an accelerometer (not shown)). In each of Figures 4a-e, the user is moving in a different way. In Figure 4a, the user is stationary but is facing forwards (upwardly in the drawing); in Figure 4b, the user is moving forwards in the direction of the arrow 411b; in Figure 4c, the user is moving backwards (while facing forwards) in the direction of the arrow 411c; in Figure 4d, the user is rotating clockwise in the direction of the arrow 411d; in Figure 4e, the user is rotating anti-clockwise in the direction of the arrow 411e.
[0082] The object 401 appears at the left front side of the user relative to their direction of heading in all examples. This causes a change in the sensor readings 402, shown as the spike 406 in the sensor readings in that part of the field. Actuators 403, 408 are placed on the user as hereinbefore described at four locations. Because of the relative position of the object, only one actuator is activated by the algorithm following the change in the readings. In each case, the output of the activated actuator 403, 408 is displayed by a bar 410a-e. It should be appreciated that the actuator output is inversely proportional to the object distance over the sensor range. The sensor range 404 is modified according to the user's motion. The modified sensor range directly affects the actuation. For example, if the modified sensor range is less than the distance from the object, the actuation reduces.
[0083] In Figure 4(a), the user is stationary and therefore the sensor range is in its default setting meaning that the sensor range is equal in all directions. It is appreciated that the range of the sensor in this example is shown to be maximum in all directions, but it may reduce when the user is stationary or in a confined environment, as discussed below.
As the object 401 enters the sensor range, the sensor input 402 changes in the direction from which the object enters, as shown by the spike 406. The algorithm interprets this input and passes a signal to the actuator control for the actuator 403 at or near to the direction of the object 401. The output 410a of the actuator is shown to be 100% of the possible output. The remaining actuators 408 are not activated because there is no input in those directions and the actuator 403 is precisely on line with the object 401 (clearly, objects at different angles may cause actuation of two actuators with the relative value of the actuation being indicative of the angle of direction of the object).
[0084] Figure 4(b) shows the range 404b as the user is moving forwards (arrow 411b).
Here the range is modified such that the sensors in the forward direction have a long range and those in the reverse direction have a shorter range. This is performed by either computationally filtering out any input from those sensors over a certain range or reducing the range of those sensors physically as hereinbefore described. In the examples shown, it is the former of these two approaches that is applied (because the spike 406 is still evident in full). The object 401 is presented by the actuator 403 in the same way as in Figure 4(a).
[0085] Figure 4(c) shows the user moving backwards (arrow 411c), while facing forwards. The sensory range is modified accordingly such that the signal from the forwards-facing sensors is filtered and that from the backwards-facing sensors is not.
The object 401 is in the same place as before and causes the spike in the sensor input 402. However, due to the filtered sensor range, the object's appearance at the actuator 403 is minimised (see output 410c). It should be noted that this is preferably not zero, as the sensory range is not completely absent in the forward direction and, as shown by 412c, has a minimum range.
[0086] In Figures 4(b) and (c), a threshold is applied such that only when the forward or reverse acceleration is greater than the threshold is the sensory range modified in the way shown. It is appreciated that there may be multiple threshold levels which, when overcome, extend the range of the sensors in the direction of motion at defined levels. It should also be understood that the movement sensor may be velocity sensitive such that continued movement (without acceleration) in a particular direction retains the sensor range in that direction.
[0087] Figure 4(d) shows how the sensory range is modified when the user is rotating clockwise (arrow 411d), or moving to the right (in the drawing). The sensor range in this example is modified such that the sensors facing the right side of the user are receptive (or at least the interpretation of their signals is not filtered), and those on the left side are restricted or filtered. An object 401 entering the field causes the spike 406 in the front left part of the sensor input. However, due to the range of the sensors, as per the example in Figure 4(c), the spike lies outside the sensitive range and the actuator 403 signal 410d is minimised.
[0088] Figure 4(e) shows the user rotating anticlockwise (arrow 411e), or moving left (in the drawing). The sensor range is modified such that the sensors on the left-hand side of the user are given a larger range 404e. The presentation of the object 401 in the field causes the spike 406 in the sensor input 402, and it is inside the sensor range. Therefore, the presentation 410e of the object through the actuator is maximised.
[0089] As in Figures 4(b) and (c), rotational movement may also have a threshold acceleration or velocity value, such that the user's rotation must exceed this value before the sensor range is modified.
[0090] Additional parameter changes can be programmed. For example, increased movement speed could also change the extent of the sensory asymmetry, with the sensor range ahead of the user increasing in proportion to forward velocity.
[0091] As briefly discussed above, the range adaptation can be achieved in at least three ways. The first is to adjust the range of the ultrasonic sensor 'on the fly' during operation, which is possible with some sensors. The second is to filter the input from the sensors according to the motion of the user. The third is to adjust the range of signals that are transmitted to the user through the tactile display. The result of the second and third methods is that the sensors could be programmed to sense over a range of 2 metres, for example, but only objects within the closest 1 metre of the direction of motion elicit a vibration in the tactile display.
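The second method (software filtering) could be sketched as follows; the list-based representation of per-direction readings and ranges is an assumption made for illustration.

```python
def filter_readings(readings, ranges):
    """Keep the physical sensor range fixed (e.g. 2 m) but discard, in
    software, any reading that lies beyond the motion-dependent range
    assigned to that sensor's direction.

    `readings` and `ranges` are lists indexed by sensor direction; a reading
    of None means nothing was detected by that sensor.
    """
    filtered = []
    for distance, effective_range in zip(readings, ranges):
        if distance is not None and distance <= effective_range:
            filtered.append(distance)   # within the direction's range: keep
        else:
            filtered.append(None)       # out of range (or no echo): suppress
    return filtered

# Index 0 is forward, index 2 rearward.  The user is moving forwards, so the
# 1.5 m echo ahead survives while the 1.5 m echo behind is suppressed.
print(filter_readings([1.5, None, 1.5, None], [2.0, 2.0, 1.0, 1.0]))
# -> [1.5, None, None, None]
```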
[0092] The sensor array could also be adapted for a particular task, through simple re-programming by the user or using a graphical user interface (GUI). The purpose is to allow the user to choose, for example, to detect only moving objects or only distant objects. The sensors could also be programmed to change their range depending on experience. For example, the sensitive range could reduce with the time that the user spends in a small environment, such as a narrow corridor or lift.
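As one possible sketch of this experience-dependent adaptation, the sensitive range might decay with the time spent in a confined space; the decay rate and minimum range below are hypothetical values chosen for illustration.

```python
def experience_adjusted_range(base_range, seconds_confined,
                              decay_per_second=0.02, min_range=0.5):
    """Shrink the sensitive range gradually while the user remains confined
    (e.g. in a narrow corridor or lift), never dropping below a minimum; the
    base range would be restored once the confinement timer is reset."""
    reduction = decay_per_second * seconds_confined * base_range
    return max(min_range, base_range - reduction)

print(experience_adjusted_range(2.0, seconds_confined=30))   # -> 0.8
```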
[0093] Example 3
[0094] The salience of objects can depend on parameters other than their distance from the user. As briefly mentioned above, the time for which an object has been in the sensor range may directly relate to the presentation of that object, through the actuators, to the user. Figures 5(a) to (f) demonstrate the salience of objects with time.
[0095] A continuous signal can be overbearing to the user if the device is used for extended periods of time. Continuous signals can also reduce a person's responsiveness to a sensory signal; for example, listening to a constant loud noise will reduce a person's sensitivity to quieter noises (this is known as adaptation). One way to avoid these problems is to present information in a series of pulses. The pulses could be presented at a consistent interval, or could be triggered by head movement (using accelerometer information). Pulsed signals can also be useful for transmitting additional information about objects. The duration of a pulse could signify the object's speed, for example, and the length of time between pulses could signify the object's proximity.
[0096] By using decaying pulses, triggered at the detection of a particular object, it is possible to convey the duration of time since first detecting that object. In Figures 5(a) to (e), the user wearing the device 501 according to the present invention moves in a straight line through the environment in the direction of the arrow 502. In that environment, two objects 503, 504 are encountered, which can be identified by the device 501 using the method described in Example 5. Upon detection of a particular object, the presentation of that object (a vibrotactile symbol; here a unique waveform) is delivered to the user through the tactile display as a series of pulses. Over time the amplitude of each pulse is reduced, allowing new information to be presented without overloading the user. This is in addition to the active sensing described above, the present example describing active display of information depending on experience.
[0097] At Time 1 (Figure 5(a) and Time 1 in Figure 5(f)) no objects are within the sensor range. No new information is presented to the user.
[0098] At Time 2 (Figure 5(b) and Time 2 in Figure 5(f)) the first object 503 appears within the sensor range. The symbol for object 1 is presented to the user with large amplitude.
[0099] At Time 3 (Figure 5(c) and Time 3 in Figure 5(f)) the first object 503 is still within the sensor range, but no new objects have appeared. The first symbol 507 for the first object continues to be presented to the user but with reduced amplitude.
[00100] At Time 4 (Figure 5(d) and Time 4 in Figure 5(f)) the first object 503 is still within the sensor range and a second object 504 also appears within the sensor range. The first symbol 507 for the first object continues to be presented to the user but with further reduced amplitude, and the second symbol 508 for the second object is presented with large amplitude.
[00101] At Time 5 (Figure 5(e) and Time 5 in Figure 5(f)) the first object 503 has left the sensor range but the second object 504 remains within it. The symbol for the first object is now absent and the symbol 508 for the second object continues with reduced amplitude.
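A minimal sketch of the decaying-pulse scheme walked through above is given below; the class name, the linear decay rate and the use of object identifiers are assumptions for illustration, not details taken from the patent.

```python
import time

class DecayingSymbolScheduler:
    """Present each detected object's vibrotactile symbol as a pulse train
    whose amplitude decays with the time since that object was first seen,
    as in Figures 5(a)-(f)."""

    def __init__(self, decay_per_second=0.25, floor=0.0):
        self.first_seen = {}                  # object id -> time of first detection
        self.decay_per_second = decay_per_second
        self.floor = floor

    def update(self, visible_object_ids, now=None):
        """Return {object id: pulse amplitude in 0..1} for this display cycle."""
        now = time.monotonic() if now is None else now

        # Forget objects that have left the sensor range (e.g. object 503 at Time 5).
        self.first_seen = {oid: t0 for oid, t0 in self.first_seen.items()
                           if oid in visible_object_ids}

        amplitudes = {}
        for oid in visible_object_ids:
            t0 = self.first_seen.setdefault(oid, now)   # newly seen: full amplitude
            age = now - t0
            amplitudes[oid] = max(self.floor, 1.0 - self.decay_per_second * age)
        return amplitudes

sched = DecayingSymbolScheduler()
print(sched.update(["object_503"], now=0.0))                  # Time 2: 503 at 1.0
print(sched.update(["object_503", "object_504"], now=2.0))    # Time 4: 503 decayed to 0.5, 504 at 1.0
```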
[00102] Example 4
[00103] It may also be important to the user to display other salient features. Example 4 illustrates how the algorithm can be configured to display the rate of change of object position in the sensor range.
[00104] The example, as shown in Figures 6(a) to (c), uses 'chirps' in the vibrotactile language. A chirp is a signal in which the frequency increases with time (an 'up-chirp' 610, Figure 6(c)) or decreases with time (a 'down-chirp' 620, Figure 6(b)).
[00105] Referring to Figure 6(a), accelerometer information 601 and object (602) movement velocity 603 (determined by computing the rate of change of object range, dr/dt, with axes reflecting the heading of the user i.e. velocities towards the user are negative) are recorded by the sensory augmentation device 604 which is worn by the user (not shown). As the object enters the range 605 of the sensors which may or may not be modified by the accelerometer information according to Example 2, the velocity 603 of the object is measured.
[00106] The algorithm uses this information to determine the relative object movement velocity with respect to the user. The relative velocity of the object (V_relative) is calculated as the difference between the velocity of the user (V_user 601) and that of the object (V_object 603); see Equation 1.
V_relative = V_user - V_object    (Equation 1)

[00107] Following the calculation, the user is presented with a tactile stimulus which describes the relative velocity of the object (Figures 6(b) and (c)). In the present example, a vibrotactile up-chirp 610 is used to signify an object moving towards the user, and a down-chirp 620 to signify an object moving away from the user. The set of conditions applied by the algorithm to determine the output chirp is therefore:
* If V_relative > 0, output an ascending chirp.
* If V_relative < 0, output a descending chirp.
* If V_relative = 0, output no chirp (stationary objects).

[00108] The up-chirp 610, as shown in Figure 6(c), is a tactile vibration, presented using the aforementioned apparatus, whose frequency of vibration increases with time. For example, at time 611 the vibration could have a frequency of 1Hz which rises to 100Hz at a later time 612. These values vary according to the user and the environment for reasons discussed further below.
[00109] The down-chirp 620, as shown in Figure 6(b), is a tactile vibration, presented using the aforementioned apparatus, whose frequency of vibration decreases with time. For example, at time 621 the vibration could have a frequency of 100Hz which falls to 1Hz at a later time 622. These values vary according to the user and the environment for reasons discussed further below.
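The chirp selection of Equation 1 and paragraphs [00107] to [00109] could be sketched as follows; the chirp duration, sample rate and the linear frequency sweep are illustrative choices (the patent gives 1Hz and 100Hz only as example end-points).

```python
import numpy as np

def select_chirp(v_user, v_object, duration_s=0.5, sample_rate=1000,
                 f_low=1.0, f_high=100.0):
    """Choose and synthesise a vibrotactile chirp from Equation 1.

    Velocities follow the convention of paragraph [00105]: motion towards
    the user is negative.  V_relative > 0 gives an up-chirp (approaching
    object), V_relative < 0 a down-chirp, and 0 no output.
    """
    v_relative = v_user - v_object                      # Equation 1
    if v_relative == 0:
        return None                                     # stationary object: no chirp

    t = np.linspace(0.0, duration_s, int(duration_s * sample_rate), endpoint=False)
    if v_relative > 0:
        freq = np.linspace(f_low, f_high, t.size)       # up-chirp: 1 Hz -> 100 Hz
    else:
        freq = np.linspace(f_high, f_low, t.size)       # down-chirp: 100 Hz -> 1 Hz

    phase = 2.0 * np.pi * np.cumsum(freq) / sample_rate # integrate instantaneous frequency
    return np.sin(phase)                                # drive waveform for the actuator

waveform = select_chirp(v_user=0.0, v_object=-1.0)      # object approaching: up-chirp
```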
[00110] Example 5
[00111] Ultrasonic sensors can be used to determine the shapes or textures of objects.
Classically, this approach has been used on submarines to distinguish objects such as land masses and battleships. Ultrasonic sensors have also been used for building maps on board autonomous robots. Using symbols from the vibrotactile language, the algorithm in the current example is able to inform the user of the shape and texture of the sensed object. Figure 7 illustrates the organisation of the algorithm according to this example.
[00112] Sensor measurements are taken by ultrasonic sensors 702 as in previous examples. An object classification 703 is performed on the data (using standard object recognition software such as a neural network or template matching). When an object 707 is detected a specific vibrotactile symbol is selected from a library 705. The symbols for each object 707 are set by or learnt by the user before use (the sensor measurement for each object 707 varies). The waveform of the selected vibrotactile symbol is then sent to the actuators (vibrotactile display) 706 through the sensor-actuator board and actuator driver board.
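A minimal sketch of the Figure 7 pipeline is given below; the classifier, symbol library and display objects are placeholders standing in for the object recognition software, the symbol library 705 and the actuator driver boards, and their interfaces are assumptions made for illustration.

```python
def present_classified_object(sensor_measurement, classifier, symbol_library,
                              display):
    """Classify the echo data, look up the object's vibrotactile symbol, and
    send its waveform to the tactile display."""
    label = classifier(sensor_measurement)       # e.g. "doorway", "wall", "unknown"
    if label not in symbol_library:
        return False                             # no symbol learnt for this object
    waveform = symbol_library[label]             # unique waveform for this object class
    display.play(waveform)                       # via sensor-actuator / driver boards
    return True

class PrintDisplay:
    """Stand-in for the vibrotactile display 706."""
    def play(self, waveform):
        print("playing symbol of", len(waveform), "samples")

presented = present_classified_object(
    sensor_measurement=[0.40, 0.42, 0.39],
    classifier=lambda m: "doorway",
    symbol_library={"doorway": [0.0, 0.5, 1.0, 0.5, 0.0]},
    display=PrintDisplay(),
)
```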
[00113] Vibrotactile Language
[00114] Examples 4 and 5 show how vibrotactile symbols can be used to relay specific information about the environment to the user. The vibrotactile symbols are used to create the vibrotactile language, which is a set of conventions or rules mapping raw tactile information (e.g. range) into a series of symbols to be presented to a user through the tactile display. This provides a rich encoding of the world, allowing the user to "see" objects, as opposed to merely way-finding by being given a heading.
[00115] The tactile display may be controlled by the algorithm such that the information presented to the user comprises the vibrotactile language. It is understood that the vibrotactile language defines the presentation of information through the tactile display in such a way that it is intuitive to the user and requires little training.
The vibrotactile language may be built from different drive parameters. These may include, but are not limited to, amplitude, frequency, pulse width and pulse frequency. Such parameters may form a vibrotactile stimulus alphabet which is used to construct a language.
[00116] The vibrotactile language can be set according to the user's needs through modification of the algorithm's parameters, or learnt by a user through experience with the device. Example features relayed by the vibrotactile language include:
* Identity: An affordance, perhaps a doorway, might be represented by one particular symbol (e.g. a gentle rising-then-falling flutter); a hazard, such as a hole in the floor, by another (perhaps a hard-edged and insistent buzz).
* Proximity: In the same symbol space (language), proximity might be indicated by the frequency of the recurring symbols (with the symbols coming faster as the object is approached).
* Direction: The direction of the object might be indicated by the set of locations on the skin at which the symbol is presented.
* Parameters: In the case of an aperture (such as a doorway or a hole), its size might be indicated by the carrier frequency of the symbol (such a mapping of parameters may be entirely different for different symbols).
* Salience: Salience of any object might, at simplest, be mapped just into the amplitude of the vibration symbol. These mappings from sensory space to symbol space would likely be non-linear, reflecting the non-linear relationships between dimensions of raw tactile space (such as range) and salience to the user.
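The drive-parameter alphabet of paragraph [00115] and the example mappings above might be represented as follows; the particular parameter values, symbol names and the linear modulation rules are illustrative assumptions, and a deployed device could use the non-linear mappings noted above.

```python
from dataclasses import dataclass

@dataclass
class VibrotactileSymbol:
    """One 'letter' of the vibrotactile alphabet, built from the drive
    parameters listed in paragraph [00115]."""
    amplitude: float        # 0..1, modulated by salience
    frequency_hz: float     # carrier frequency, e.g. mapped to aperture size
    pulse_width_s: float    # duration of each pulse
    pulse_rate_hz: float    # repetition rate, e.g. mapped to proximity

# Illustrative identity symbols (not taken from the patent):
SYMBOLS = {
    "doorway": VibrotactileSymbol(0.4, 60.0, 0.15, 2.0),   # gentle flutter
    "hazard":  VibrotactileSymbol(0.9, 150.0, 0.05, 8.0),  # insistent buzz
}

def apply_context(symbol, proximity, salience):
    """Modulate a base symbol: closer objects pulse faster, more salient
    objects are stronger (a non-linear mapping could be substituted here)."""
    return VibrotactileSymbol(
        amplitude=min(1.0, symbol.amplitude * salience),
        frequency_hz=symbol.frequency_hz,
        pulse_width_s=symbol.pulse_width_s,
        pulse_rate_hz=symbol.pulse_rate_hz * (1.0 + proximity),
    )

print(apply_context(SYMBOLS["doorway"], proximity=0.8, salience=1.5))
```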
[00117] It should be understood that the vibrotactile language is not restricted to a particular set of rules and these may be adapted according to the environment where the system is in use or the user for which the device is intended.
[00118] The user
[00119] In embodiments of the present invention, the algorithm is adaptive to the user's reaction and learns from that reaction to display information in a different manner. The user's reaction may include a movement, for example, which is recorded by the accelerometer or other suitable sensor. As such, the algorithm is sensitive to user (human) feedback. For example, as the wearer gets used to the device, they will, without prompting, begin to adopt strategies for its best use. Sensors may be placed to read the user's reactive movements using an accelerometer, so the device can also respond at run-time and modulate its behaviour accordingly. Active training improves learning.
[00120] The current invention also permits increased information transfer by taking the logarithmic Weber-Fechner law of human sensory perception (that the just-noticeable difference between two stimuli is proportional to the magnitude of the stimuli) into consideration. This can be achieved using a hyperbolic transfer function (Equation 2).
f(x) = ((MAX_DIST - x) / MAX_DIST)^3    (Equation 2)

[00121] Where f(x) indicates the output function, x is the input (e.g. reported distance), and MAX_DIST indicates the maximum range of the sensor. Literature suggests that the most sensitive frequency range for a human is 150Hz (Morley and Rowe 1990), where at this frequency "the different mechanosensory afferents are differentially recruited as amplitude is varied". An example protocol for achieving this information transfer comprises the following steps:
a) Emit signals into the environment.
b) Wait for new sensor data. If new sensor data is available, continue; if no new sensor data is available, wait.
c) Read the sensor data.
d) Calculate the values for individual sensors.
e) Prepare the response for the corresponding actuators using the hyperbolic function (Equation 2).
f) Send the responses to the actuators.
g) Return to step b).

[00122] User comfort is important to ensure practical use over long periods of time.
Certain low-frequency vibrations (around 80Hz) are less comfortable when used consistently over long periods of time. This is perhaps because the haptuator stimulators used have their largest amplitude response at this frequency. However, different people have slightly different comfortable frequency ranges, or prefer certain sensory information/vibrotactile mappings. The current invention provides for simple adjustment of the parameter ranges, or of which mappings to use in a particular task, through a GUI.
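A minimal sketch of the protocol of paragraph [00121] is given below, assuming the reconstruction of Equation 2 shown above and treating signal emission, sensor reading and actuator output as placeholder callbacks; the cycle time and range value are illustrative.

```python
import time

MAX_DIST = 2.0   # maximum sensor range in metres (illustrative)

def transfer(x):
    """Equation 2 as reconstructed above: a transfer function that shapes the
    drive signal to compensate for Weber-Fechner perception of amplitude."""
    x = min(max(x, 0.0), MAX_DIST)
    return ((MAX_DIST - x) / MAX_DIST) ** 3

def sensing_loop(emit, read_sensors, send_to_actuators, cycle_s=0.05):
    """Steps a)-g) of the protocol; `read_sensors` is assumed to return None
    when no new data is available."""
    while True:
        emit()                                   # a) emit signals into the environment
        data = read_sensors()                    # b)/c) wait for and read new sensor data
        if data is None:
            time.sleep(cycle_s)                  # b) no new data yet: wait
            continue
        distances = [d if d is not None else MAX_DIST for d in data]  # d) per-sensor values
        drives = [transfer(d) for d in distances]                     # e) Equation 2 response
        send_to_actuators(drives)                # f) send responses, then g) repeat
```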
[00123] By utilising active sensory approaches it is possible to use the sensory apparatus more efficiently (focusing the sensory elements on regions and parameters of interest), and to streamline the presentation of information to the user (only providing the most salient or useful information from moment to moment).
[00124] In a further example, the tactile display is auto-regulated so that the parameters of skin stimulation (e.g. amplitude) stay within a comfortable dynamic range. The comfortable dynamic range is user specific, with an upper limit set by the pain threshold of the user, above which the user would be uncomfortable. Thus, whilst confinement in a small space must feel distinct from being in an open space, there is not a simple or monotonic mapping from physical range to any particular drive signal parameter.
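Auto-regulation of the drive amplitude might be sketched as a simple clamp into the user-specific comfortable dynamic range; the limit values below are hypothetical.

```python
def auto_regulate(amplitude, lower=0.05, upper=0.8):
    """Clamp a requested drive amplitude into the user's comfortable dynamic
    range: below `lower` the stimulus would not be felt, above `upper` it
    would approach the user's pain threshold.  Both limits are user specific
    and could be set through the GUI or adapted from feedback at run-time."""
    if amplitude <= 0.0:
        return 0.0                      # no stimulus requested
    return min(upper, max(lower, amplitude))

print([auto_regulate(a) for a in (0.0, 0.02, 0.5, 1.3)])   # -> [0.0, 0.05, 0.5, 0.8]
```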
[00125] The present invention therefore provides a device that is integrated with the human psychology, which integration may be adapted in several different ways that will be apparent to the skilled person whereby the precise form of the algorithm will vary from application to application and will be subject to personal choice of the designer.
[00126] Throughout the description and claims of this specification, the words "comprise" and "contain" and variations of them mean "including but not limited to", and they are not intended to (and do not) exclude other moieties, additives, components, integers or steps.
Throughout the description and claims of this specification, the singular encompasses the plural unless the context otherwise requires. In particular, where the indefinite article is used, the specification is to be understood as contemplating plurality as well as singularity, unless the context requires otherwise.
[00127] Features, integers, characteristics, or groups described in conjunction with a particular aspect, embodiment or example of the invention are to be understood to be applicable to any other aspect, embodiment or example described herein unless incompatible therewith. All of the features disclosed in this specification (including any accompanying claims, abstract and drawings), and/or all of the steps of any method or process so disclosed, may be combined in any combination, except combinations where at least some of such features and/or steps are mutually exclusive. The invention is not restricted to the details of any foregoing embodiments. The invention extends to any novel one, or any novel combination, of the features disclosed in this specification (including any accompanying claims, abstract and drawings), or to any novel one, or any novel combination, of the steps of any method or process so disclosed.

Claims (42)

  1. 1. A sensory augmentation device (100) for wear by a user, comprising a sensory array (101) to detect physical parameters of objects (107) in an environment (108) surrounding the user when the device is being worn, a tactile display (102) to physically engage (204) the user's body and provide information about said environment, and computing means between the sensory array and tactile display that includes an algorithm (103) controlling one or both of the sensory array (101) and the tactile display (102), whereby the presentation of an object (107) is adapted to circumstances beyond the physical presence of the object, selected from the group comprising: * rate of change of position of the object with respect to the user; * physical attributes of the object; * time since first detection of the object by the sensory array; * objectives of the user; and, * environmental changes.
  2. 2. A sensory augmentation device as claimed in claim 1, wherein the sensory array (101) comprises one or more electromechanical range sensors comprising one or more emitting modules (105) and one or more receiving modules (106), wherein the receiving modules (106) are configured to receive and detect signals (209) emitted (208) by the emitting modules (105) and reflected by an object (107) in the environment (108).
  3. 3. A sensory augmentation device as claimed in claim 1 or 2 wherein the sensory array (101) comprises sensors with parameters set to be sensitive to the user's objectives as determined from the measurement of the user's acceleration, velocity, gyroscopic motion or position.
  4. 4. A sensory augmentation device as claimed in claim 3 wherein said sensory array (101) is controlled (202) by said algorithm (103) such that the sensory array is active, through filtering sensory inputs according to the user's objectives.
  5. 5. A sensory augmentation device as claimed in any of claims 3 or 4, wherein said tactile display (102) is controlled by said algorithm such that said tactile display is active, through modifying said tactile display according to the user's objectives.
  6. 6. A sensory augmentation device as claimed in any preceding claim wherein the sensory array (101) further comprises environment sensors with parameters set to be sensitive to environmental properties selected from temperature, humidity, light, sound or position or physical attributes of objects.
  7. 7. A sensory augmentation device as claimed in claim 6 wherein said sensory array (101) is controlled (202) by said algorithm (103) such that the sensory array actively changes said sensing parameters in response to changes in the environment.
  8. 8. A sensory augmentation device as claimed in any preceding claim wherein said computing means further comprises a timer and is adapted to determine the rate of change of the position of the object with respect to the user from a measurement of velocity change of the object position over a period of time.
  9. 9. A sensory augmentation device as claimed in any preceding claim wherein the sensory array has means for detecting the physical attributes of the object and said algorithm comprises a library of physical attributes wherein said algorithm is adaptable to match said detected physical attributes to an item in said library.
  10. 10. A sensory augmentation device as claimed in claim 9 wherein said library contains a corresponding vibrotactile symbol for said item and said tactile display is configurable to display said corresponding vibrotactile symbol.
  11. 11. A sensory augmentation device as claimed in claim 9 or 10 wherein the algorithm comprises means to store in a memory of said computing means, said detected physical attributes of the objects that have been detected by said sensors and whereby said presentation is adapted to display the time since first detection of the object, wherein said adapted display is pulsed and the amplitude of each pulse relating to the object decreases according to the time since it was first detected.
  12. 12. A sensory augmentation device as claimed in any preceding claim wherein said computing means further comprises a graphical user interface.
  13. 13. A sensory augmentation device as claimed in claim 12 when dependent on claim 3, wherein the algorithm is programmable by said graphical user interface, to enable the objectives of the user to be entered into the algorithm.
  14. 14. A sensory augmentation device as claimed in claim 2, wherein the computing means is adapted to control the emitting module to emit signals in a specific time frame, direction or form.
  15. 15. A sensory augmentation device as claimed in any preceding claim wherein said tactile display (102) is controlled (203) by said algorithm (103) to present (204) information to the user within a comfortable dynamic range, preferably user specific.
  16. 16. A sensory augmentation device as claimed in claim 15 wherein said comfortable dynamic range is user specific and comprises a lower limit where the user cannot feel the tactile display and an upper limit where the tactile display is perceived as painful by the user.
  17. 17. A sensory augmentation device as claimed in claim 16 wherein the computing means is adapted to adapt said comfortable dynamic range in real time through feedback (205, 206) of the user's reaction (105) to the tactile display.
  18. 18. A sensory augmentation device as claimed in claim 15 when dependent on claim 12, wherein the graphical user interface 306 is adapted to enter the comfortable dynamic range of the user.
  19. 19. A sensory augmentation device as claimed in any preceding claim wherein said tactile display (102) comprises a plurality of vibrotactile elements configured to present (204) a vibrotactile language to the user.
  20. 20. A sensory augmentation device as claimed in claim 19 wherein said plurality of vibrotactile elements are actuators.
  21. 21. A sensory augmentation device as claimed in claim 19 wherein said vibrotactile language is built from a combination of drive parameters of the tactile display, comprising; amplitude, frequency, pulse width, pulse frequency.
  22. 22. A sensory augmentation device as claimed in claim 19 wherein said vibrotactile language further comprises ascending or descending frequency pulses.
  23. 23. A sensory augmentation device as claimed in any preceding claim wherein the tactile display (102) is positionable over an area of the body of the user to engage with the cognitive core (104) of the user, wherein the cognitive core of the user is the set of skills, habits and abilities, in perception and action, that a person can rely on without effortful deliberation and that may be interfaced to through vibrotactile stimulation of body surfaces.
  24. 24. A sensory augmentation device as claimed in claim 23 wherein said area of the body is locatable on a skin position that is well served by nerve endings.
  25. 25. A sensory augmentation device as claimed in claim 23 wherein said area of the body is locatable on a skin position in close proximity to a trigeminal nerve of the user.
  26. 26. A sensory augmentation device as claimed in claim 24 wherein the tactile display is locatable about the temples of the user's (1) head.
  27. 27. A sensory augmentation device according to any preceding claim wherein the tactile display promotes a user reaction (105) which is used to feedback (207) to the control algorithm, and may directly affect (206) the sensory array (101).
  28. 28. A sensory augmentation device substantially as hereinbefore described with reference to the accompanying drawings.
  29. 29. A method for sensory augmentation, wherein information about an environment comprising objects surrounding a user is presented to the user, said method comprising the steps of: a) sensing the presence of said objects to generate output data about said objects; and using an algorithm in computing means, either i. to modify said sensing to affect said output data; and/or ii. to process what is sensed to generate said output data; and either or both: in response to circumstances of the user, said circumstances selected from the group comprising: * a rate of change of position of the object with respect to the user; * physical attributes of the object; * time since first detection of the object by the sensory array; * objectives of the user; and, * environmental parameters, and b) presenting said output data to the user in a tactile display, such that the user is presented with information about objects in the environment beyond the physical presence of the object.
  30. 30. A method for sensory augmentation as claimed in claim 29, wherein step a) further comprises emitting a signal into said environment using emitters and receiving in receivers a signal reflected from said objects in said environment.
  31. 31. A method for sensory augmentation as claimed in claim 29 or 30, wherein said one or more objectives of the user comprise their orientation, heading, velocity, acceleration.
  32. 32. A method for sensory augmentation as claimed in claim 29, 30 or 31, wherein said environmental parameters comprise temperature, humidity, light levels, noise levels.
  33. 33. A method for sensory augmentation as claimed in any of claims 29 to 32 further comprising the steps of: a) sensing the velocity and/or acceleration of the user and the presence of said objects in said environment; b) determining the direction of heading of the user; c) applying a filter to increase sensory information processed for objects in the direction of heading and reduce sensory information processed for objects not in the direction of heading; whereby, said output data set is modified.
  34. 34. A method for sensory augmentation as claimed in any of claims 29 to 33 further comprising the steps of: a) retaining sensory information about said objects as obtained at a first time; b) matching said sensory information about said objects as obtained at a later second time to said first time; c) applying a time-dependent reduction to matched sensory information as obtained at said second time to produce said output data, whereby, said presentation of said output data is time-dependent.
  35. 35. A method for sensory augmentation as claimed in any of claims 29 to 34 further comprising the steps of: a) retaining sensory information about said objects as obtained at a first time; b) matching said sensory information about said objects as obtained at a later second time to said first time; c) determining the velocity of said one or more objects; d) comparing the velocity of said user to the velocity of said objects to produce said filtered information; whereby, said presentation of said output data is velocity dependent.
  36. 36. A method for sensory augmentation as claimed in claim 35 wherein said velocity dependent presentation of output data changes frequency over time according to the velocity, whereby: * the frequency of the presentation of objects towards a user varies with time in a first way; and * the frequency of the presentation of objects moving away from the user varies with time in a second way; wherein, said first way is opposite to said second way.
  37. 37. A method for sensory augmentation as claimed in any of claims 29 to 36 wherein said presentation of said output data is pulsed.
  38. 38. A method for sensory augmentation as claimed in any of claims 29 to 37 further comprising the step of: comparing, within said algorithm, said sensory information to a library of sensory signatures, whereby said comparison is used to classify a property of the object wherein said classification of said object is presented to said user through a tactile presentation specific to said classification.
  39. 39. A method for sensory augmentation as claimed in any of claims 29 to 38 wherein said tactile display presents a vibrotactile language which is programmable or learnable by said user.
  40. 40. A method for sensory augmentation as claimed in any of claims 29 to 39, as performed using the apparatus as claimed in any of claims 1 to 28.
  41. 41. A method for sensory augmentation as herein before described with reference to the accompanying drawings.
  42. 42. A sensory augmentation device configured to implement the method of any of claims 29 to 40.
GB201201654A 2011-01-31 2012-01-31 Active sensory augmentation device Withdrawn GB2487672A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB201101618A GB201101618D0 (en) 2011-01-31 2011-01-31 Active sensory augmentation device

Publications (2)

Publication Number Publication Date
GB201201654D0 GB201201654D0 (en) 2012-03-14
GB2487672A true GB2487672A (en) 2012-08-01

Family

ID=43824844

Family Applications (2)

Application Number Title Priority Date Filing Date
GB201101618A Ceased GB201101618D0 (en) 2011-01-31 2011-01-31 Active sensory augmentation device
GB201201654A Withdrawn GB2487672A (en) 2011-01-31 2012-01-31 Active sensory augmentation device

Family Applications Before (1)

Application Number Title Priority Date Filing Date
GB201101618A Ceased GB201101618D0 (en) 2011-01-31 2011-01-31 Active sensory augmentation device

Country Status (2)

Country Link
GB (2) GB201101618D0 (en)
WO (1) WO2012104626A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2015159237A1 (en) * 2014-04-16 2015-10-22 Fondazione Istituto Italiano Di Tecnologia Wearable sensory substitution system, in particular for blind or visually impaired people
EP2778854A3 (en) * 2013-03-15 2017-01-18 Immersion Corporation Wearable device and augmented reality device

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9646514B2 (en) * 2012-10-23 2017-05-09 New York University Somatosensory feedback wearable object
WO2016015099A1 (en) * 2014-07-28 2016-02-04 National Ict Australia Limited Determination of parameter values for sensory substitution devices
US10217379B2 (en) 2015-01-30 2019-02-26 Toyota Motor Engineering & Manufacturing North America, Inc. Modifying vision-assist device parameters based on an environment classification
US9914218B2 (en) 2015-01-30 2018-03-13 Toyota Motor Engineering & Manufacturing North America, Inc. Methods and apparatuses for responding to a detected event by a robot
US10037712B2 (en) 2015-01-30 2018-07-31 Toyota Motor Engineering & Manufacturing North America, Inc. Vision-assist devices and methods of detecting a classification of an object
WO2024018310A1 (en) * 2023-07-03 2024-01-25 Bajnaid Mohammadfawzi WISE-i: AN ELECTRONIC TRAVEL AND COMMUNICATION AID DEVICE FOR THE VISUALLY IMPAIRED

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080120029A1 (en) * 2006-02-16 2008-05-22 Zelek John S Wearable tactile navigation system
WO2010011045A2 (en) * 2008-07-24 2010-01-28 Park Sun Ho Apparatus and method for converting video information into a tactile sensitive signal
WO2011117794A1 (en) * 2010-03-21 2011-09-29 Ariel - University Research And Development Company, Ltd. Methods and devices for tactilely imparting information

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090312817A1 (en) * 2003-11-26 2009-12-17 Wicab, Inc. Systems and methods for altering brain and body functions and for treating conditions and diseases of the same

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080120029A1 (en) * 2006-02-16 2008-05-22 Zelek John S Wearable tactile navigation system
WO2010011045A2 (en) * 2008-07-24 2010-01-28 Park Sun Ho Apparatus and method for converting video information into a tactile sensitive signal
WO2011117794A1 (en) * 2010-03-21 2011-09-29 Ariel - University Research And Development Company, Ltd. Methods and devices for tactilely imparting information

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Kris Walters; Seungyon Lee; Starner T; Rüdiger Leibrandt; Lawo M, "Touchfire: Towards a glove-mounted tactile display for rendering temperature readings for firefighters" Wearable Computers (ISWC), 2010, pages 1-4, ISBN 978-1-4244-9046-2 *
Pradeep V; Medioni G; Weiland J, "A wearable system for the visually impaired", 2010, 2010 annual international conference of the IEEE Engineering in Medicine and Biology Society : (EMBC 2010) ; Buenos Aires, Argentina, 31 August - 4 September 2010, pages 6233-6236, ISBN: 978-1-4244-4123-5 *
Schwerdt H N; Tapson J; Etienne-Cummings R, "A color detection glove with haptic feedback for the visually disabled", Information Sciences and Systems, 2009. CISS 2009. 43rd Annual Conference, IEEE, Piscataway, NJ, USA, pages 681-686, ISBN 978-1-4244-2733-8 *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2778854A3 (en) * 2013-03-15 2017-01-18 Immersion Corporation Wearable device and augmented reality device
US10269222B2 (en) 2013-03-15 2019-04-23 Immersion Corporation System with wearable device and haptic output device
WO2015159237A1 (en) * 2014-04-16 2015-10-22 Fondazione Istituto Italiano Di Tecnologia Wearable sensory substitution system, in particular for blind or visually impaired people

Also Published As

Publication number Publication date
WO2012104626A1 (en) 2012-08-09
GB201101618D0 (en) 2011-03-16
GB201201654D0 (en) 2012-03-14

Similar Documents

Publication Publication Date Title
GB2487672A (en) Active sensory augmentation device
Katzschmann et al. Safe local navigation for visually impaired users with a time-of-flight and haptic feedback device
US20140184384A1 (en) Wearable navigation assistance for the vision-impaired
US10528815B2 (en) Method and device for visually impaired assistance
US10119807B2 (en) Thermal sensor position detecting device
EP2629737B1 (en) White cane with integrated electronic travel aid using 3d tof sensor
US11801194B2 (en) Information processing apparatus and information processing method
US20160321955A1 (en) Wearable navigation assistance for the vision-impaired
WO2015116640A1 (en) Eye and head tracking device
KR101343860B1 (en) Robot avatar system using hybrid interface and command server, learning server, and sensory server therefor
US11567581B2 (en) Systems and methods for position-based gesture control
WO2015083183A1 (en) Hand wearable haptic feedback based navigation device
Liu et al. Electronic travel aids for the blind based on sensory substitution
Bertram et al. Sensory augmentation with distal touch: The tactile helmet project
Sharma et al. Design of micro controller Based Virtual Eye for the Blind
WO2023019376A1 (en) Tactile sensing system and method for using same
WO2022014445A1 (en) Detecting device, and detecting method
Patil et al. Environment sniffing smart portable assistive device for visually impaired individuals
JP2024045110A (en) Robots and charging stations and landmark devices for robots
Riener et al. “Personal Radar”: A self-governed support system to enhance environmental perception
JP2018192559A (en) Autonomous mobile robot detecting touch to body of curved surface shape
KR102313973B1 (en) Walking aid system for the visually impaired
Lawson Tactile displays for cueing self-motion and looming: What would Gibson think
Manukova et al. Concept for Building an Electronic Cane Control System for Blind People
US20200281771A1 (en) Movement Aid for the Visually Impaired

Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)