GB201201654D0 - Active sensory augmentation device - Google Patents
Active sensory augmentation device
- Publication number
- GB201201654D0, GB201201654A
- Authority
- GB
- United Kingdom
- Prior art keywords
- user
- environment
- sensory array
- augmentation
- sensory
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Links
Classifications
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B21/00—Teaching, or communicating with, the blind, deaf or mute
- G09B21/001—Teaching or communicating with blind persons
- G09B21/003—Teaching or communicating with blind persons using tactile presentation of the information, e.g. Braille displays
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/74—Details of notification to user or communication with user or patient ; user input means
- A61B5/7455—Details of notification to user or communication with user or patient ; user input means characterised by tactile indication, e.g. vibration or electrical stimulation
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61F—FILTERS IMPLANTABLE INTO BLOOD VESSELS; PROSTHESES; DEVICES PROVIDING PATENCY TO, OR PREVENTING COLLAPSING OF, TUBULAR STRUCTURES OF THE BODY, e.g. STENTS; ORTHOPAEDIC, NURSING OR CONTRACEPTIVE DEVICES; FOMENTATION; TREATMENT OR PROTECTION OF EYES OR EARS; BANDAGES, DRESSINGS OR ABSORBENT PADS; FIRST-AID KITS
- A61F9/00—Methods or devices for treatment of the eyes; Devices for putting-in contact lenses; Devices to correct squinting; Apparatus to guide the blind; Protective devices for the eyes, carried on the body or in the hand
- A61F9/08—Devices or methods enabling eye-patients to replace direct visual perception by another kind of perception
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61H—PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
- A61H3/00—Appliances for aiding patients or disabled persons to walk about
- A61H3/06—Walking aids for blind persons
- A61H3/061—Walking aids for blind persons with electronic detecting or guiding means
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S15/00—Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
- G01S15/02—Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems using reflection of acoustic waves
- G01S15/06—Systems determining the position data of a target
- G01S15/08—Systems for measuring distance only
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S15/00—Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
- G01S15/88—Sonar systems specially adapted for specific applications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/163—Wearable computers, e.g. on a belt
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/012—Head tracking input arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/015—Input arrangements based on nervous system activity detection, e.g. brain waves [EEG] detection, electromyograms [EMG] detection, electrodermal response detection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/016—Input arrangements with force or tactile feedback as computer generated output to the user
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61H—PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
- A61H3/00—Appliances for aiding patients or disabled persons to walk about
- A61H3/06—Walking aids for blind persons
- A61H3/061—Walking aids for blind persons with electronic detecting or guiding means
- A61H2003/063—Walking aids for blind persons with electronic detecting or guiding means with tactile perception
Landscapes
- Engineering & Computer Science (AREA)
- Health & Medical Sciences (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Life Sciences & Earth Sciences (AREA)
- General Health & Medical Sciences (AREA)
- Human Computer Interaction (AREA)
- Remote Sensing (AREA)
- Veterinary Medicine (AREA)
- Radar, Positioning & Navigation (AREA)
- Animal Behavior & Ethology (AREA)
- Public Health (AREA)
- Biomedical Technology (AREA)
- Heart & Thoracic Surgery (AREA)
- Rehabilitation Therapy (AREA)
- Pain & Pain Management (AREA)
- Physical Education & Sports Medicine (AREA)
- Biophysics (AREA)
- Epidemiology (AREA)
- Computer Networks & Wireless Communication (AREA)
- Acoustics & Sound (AREA)
- Computer Hardware Design (AREA)
- Surgery (AREA)
- Molecular Biology (AREA)
- Medical Informatics (AREA)
- Pathology (AREA)
- Dermatology (AREA)
- Dentistry (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Physiology (AREA)
- Educational Technology (AREA)
- Educational Administration (AREA)
- Business, Economics & Management (AREA)
- Audiology, Speech & Language Pathology (AREA)
- Neurosurgery (AREA)
- Neurology (AREA)
- Vascular Medicine (AREA)
- Ophthalmology & Optometry (AREA)
Abstract
A sensory array detects objects in the environment surrounding the user, such as a firefighter or a blind person, when the device is worn, and a tactile display physically engages the user's body to provide salient information about that environment. An algorithm controls one or both of the sensory array and the tactile display so that the presentation of an object is adapted to circumstances other than the physical parameter of the object's proximity to the user. The output can therefore be tuned to provide salient feedback based on: the rate of change of the object's position with respect to the user; the physical attributes of the object; the time since first detection of the object by the electromagnetic or sonic sensory array; and/or the current situation or objectives of the user. This ensures that the user does not receive so many distracting tactile responses that they cannot understand their environment from vibrotactile stimulation of the skin.
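The adaptation described in the abstract can be sketched as a salience-scoring step between sensing and tactile output. The following is a minimal illustrative sketch only: the weighting scheme, the specific cue normalizations, and all names and thresholds are assumptions for exposition, not taken from the patent text.

```python
# Illustrative sketch: tactile output modulated by cues beyond raw
# proximity (approach rate, novelty since first detection), with
# low-salience objects suppressed so the user is not overwhelmed.
# All weights and thresholds below are hypothetical.

def salience(distance_m, approach_rate_mps, seconds_since_detection,
             w_proximity=0.5, w_rate=0.3, w_novelty=0.2):
    """Combine cues into a 0..1 salience score for one detected object."""
    proximity = max(0.0, 1.0 - distance_m / 5.0)        # closer -> higher
    rate = min(1.0, max(0.0, approach_rate_mps / 2.0))  # approaching fast -> higher
    novelty = max(0.0, 1.0 - seconds_since_detection / 10.0)  # newly seen -> higher
    return w_proximity * proximity + w_rate * rate + w_novelty * novelty

def tactile_levels(objects, threshold=0.3, max_active=3):
    """Return drive levels for the tactile display, keeping only the few
    most salient objects above a threshold (limits distracting stimuli)."""
    scores = sorted((salience(*obj) for obj in objects), reverse=True)
    return [s for s in scores[:max_active] if s >= threshold]

# A fast-approaching nearby object is presented; a slow distant one
# detected long ago is suppressed.
levels = tactile_levels([(1.0, 1.0, 0.0), (4.5, 0.0, 30.0)])
```

The design choice this illustrates is the abstract's central claim: presentation is gated by situational salience rather than proximity alone.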
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GB201101618A GB201101618D0 (en) | 2011-01-31 | 2011-01-31 | Active sensory augmentation device |
Publications (2)
Publication Number | Publication Date |
---|---|
GB201201654D0 true GB201201654D0 (en) | 2012-03-14 |
GB2487672A GB2487672A (en) | 2012-08-01 |
Family
ID=43824844
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
GB201101618A Ceased GB201101618D0 (en) | 2011-01-31 | 2011-01-31 | Active sensory augmentation device |
GB201201654A Withdrawn GB2487672A (en) | 2011-01-31 | 2012-01-31 | Active sensory augmentation device |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
GB201101618A Ceased GB201101618D0 (en) | 2011-01-31 | 2011-01-31 | Active sensory augmentation device |
Country Status (2)
Country | Link |
---|---|
GB (2) | GB201101618D0 (en) |
WO (1) | WO2012104626A1 (en) |
Families Citing this family (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2014066516A1 (en) | 2012-10-23 | 2014-05-01 | New York University | Somatosensory feedback wearable object |
US9092954B2 (en) | 2013-03-15 | 2015-07-28 | Immersion Corporation | Wearable haptic device |
WO2015159237A1 (en) * | 2014-04-16 | 2015-10-22 | Fondazione Istituto Italiano Di Tecnologia | Wearable sensory substitution system, in particular for blind or visually impaired people |
JP6725133B2 (en) * | 2014-07-28 | 2020-07-15 | ナショナル・アイシーティ・オーストラリア・リミテッド | Determining parameter values for sensory replacement devices |
US10217379B2 (en) | 2015-01-30 | 2019-02-26 | Toyota Motor Engineering & Manufacturing North America, Inc. | Modifying vision-assist device parameters based on an environment classification |
US10037712B2 (en) | 2015-01-30 | 2018-07-31 | Toyota Motor Engineering & Manufacturing North America, Inc. | Vision-assist devices and methods of detecting a classification of an object |
US9914218B2 (en) | 2015-01-30 | 2018-03-13 | Toyota Motor Engineering & Manufacturing North America, Inc. | Methods and apparatuses for responding to a detected event by a robot |
WO2024018310A1 (en) * | 2023-07-03 | 2024-01-25 | Bajnaid Mohammadfawzi | WISE-i: AN ELECTRONIC TRAVEL AND COMMUNICATION AID DEVICE FOR THE VISUALLY IMPAIRED |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20090312817A1 (en) * | 2003-11-26 | 2009-12-17 | Wicab, Inc. | Systems and methods for altering brain and body functions and for treating conditions and diseases of the same |
US20080120029A1 (en) * | 2006-02-16 | 2008-05-22 | Zelek John S | Wearable tactile navigation system |
KR20100010981A (en) * | 2008-07-24 | 2010-02-03 | 박선호 | Apparatus and method for converting image information into haptic sensible signal |
WO2011117794A1 (en) * | 2010-03-21 | 2011-09-29 | Ariel - University Research And Development Company, Ltd. | Methods and devices for tactilely imparting information |
- 2011-01-31: GB GB201101618A patent/GB201101618D0/en not_active Ceased
- 2012-01-31: GB GB201201654A patent/GB2487672A/en not_active Withdrawn
- 2012-01-31: WO PCT/GB2012/050205 patent/WO2012104626A1/en active Application Filing
Also Published As
Publication number | Publication date |
---|---|
GB2487672A (en) | 2012-08-01 |
GB201101618D0 (en) | 2011-03-16 |
WO2012104626A1 (en) | 2012-08-09 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
GB201201654D0 (en) | Active sensory augmentation device | |
JP6955603B2 (en) | Providing priming cues for electronic device users | |
TW201612724A (en) | Electronic device and display interface adjusting method thereof | |
AU2019268179A1 (en) | Method and device for processing motion events | |
USD720360S1 (en) | Display screen or portion thereof with graphical user interface | |
IN2015KO00053A (en) | ||
EP2770404A3 (en) | Mobile device with instinctive haptic alerts | |
PH12016500524A1 (en) | Display of a visual event notification | |
GB2533520A (en) | Gaze-controlled interface method and system | |
EP2827225A3 (en) | System and method for activating a hidden control | |
GB2497383A (en) | Alert display on a portable electronic device | |
EP4252646A3 (en) | Wearable electronic device and method for controlling the same | |
MX353242B (en) | Method for controlling display of multiple objects depending on input related to operation of mobile terminal, and mobile terminal therefor. | |
GB2515436A (en) | Virtual hand based on combined data | |
GB2514971A (en) | A method, apparatus, and system for distributed pre-processing of touch data and display region control | |
EP3001650A3 (en) | Portable electronic device and method of controling the same | |
MX352773B (en) | Adaptive event recognition. | |
MX2016013152A (en) | Medical device placement system and a method for its use. | |
WO2013130203A3 (en) | Methods and apparatuses for operating a display in a wearable electronic device | |
WO2013130202A3 (en) | Methods and apparatuses for operating a display in an electronic device | |
EP2863276A3 (en) | Wearable device and method for controlling the same | |
MX2016000664A (en) | Object based contextual menu controls. | |
USD769937S1 (en) | Display screen with graphical alarm icon | |
EP2871590A3 (en) | User interface control in portable system | |
EP2778854A3 (en) | Wearable device and augmented reality device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WAP | Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1) |