WO2013018090A1 - System and method for non-visual sensory enhancement - Google Patents
- Publication number: WO2013018090A1 (application PCT/IL2012/050280)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- stimulation
- data
- user
- sensory
- visual
- Prior art date
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61F—FILTERS IMPLANTABLE INTO BLOOD VESSELS; PROSTHESES; DEVICES PROVIDING PATENCY TO, OR PREVENTING COLLAPSING OF, TUBULAR STRUCTURES OF THE BODY, e.g. STENTS; ORTHOPAEDIC, NURSING OR CONTRACEPTIVE DEVICES; FOMENTATION; TREATMENT OR PROTECTION OF EYES OR EARS; BANDAGES, DRESSINGS OR ABSORBENT PADS; FIRST-AID KITS
- A61F9/00—Methods or devices for treatment of the eyes; Devices for putting-in contact lenses; Devices to correct squinting; Apparatus to guide the blind; Protective devices for the eyes, carried on the body or in the hand
- A61F9/08—Devices or methods enabling eye-patients to replace direct visual perception by another kind of perception
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61H—PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
- A61H3/00—Appliances for aiding patients or disabled persons to walk about
- A61H3/06—Walking aids for blind persons
- A61H3/061—Walking aids for blind persons with electronic detecting or guiding means
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B21/00—Teaching, or communicating with, the blind, deaf or mute
- G09B21/001—Teaching or communicating with blind persons
- G09B21/003—Teaching or communicating with blind persons using tactile presentation of the information, e.g. Braille displays
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61H—PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
- A61H3/00—Appliances for aiding patients or disabled persons to walk about
- A61H3/06—Walking aids for blind persons
- A61H3/061—Walking aids for blind persons with electronic detecting or guiding means
- A61H2003/063—Walking aids for blind persons with electronic detecting or guiding means with tactile perception
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61H—PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
- A61H2201/00—Characteristics of apparatus not provided for in the preceding codes
- A61H2201/16—Physical interface with patient
- A61H2201/1602—Physical interface with patient kind of interface, e.g. head rest, knee support or lumbar support
- A61H2201/165—Wearable interfaces
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61H—PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
- A61H2201/00—Characteristics of apparatus not provided for in the preceding codes
- A61H2201/50—Control means thereof
- A61H2201/5002—Means for controlling a set of similar massage devices acting in sequence at different locations on a patient
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61H—PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
- A61H2201/00—Characteristics of apparatus not provided for in the preceding codes
- A61H2201/50—Control means thereof
- A61H2201/5058—Sensors or detectors
- A61H2201/5092—Optical sensor
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61H—PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
- A61H2201/00—Characteristics of apparatus not provided for in the preceding codes
- A61H2201/50—Control means thereof
- A61H2201/5097—Control means thereof wireless
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61H—PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
- A61H2205/00—Devices for specific parts of the body
- A61H2205/08—Trunk
- A61H2205/081—Back
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61N—ELECTROTHERAPY; MAGNETOTHERAPY; RADIATION THERAPY; ULTRASOUND THERAPY
- A61N1/00—Electrotherapy; Circuits therefor
- A61N1/02—Details
- A61N1/04—Electrodes
- A61N1/0404—Electrodes for external use
- A61N1/0472—Structure-related aspects
- A61N1/0476—Array electrodes (including any electrode arrangement with more than one electrode for at least one of the polarities)
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10L—SPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
- G10L21/00—Speech or voice signal processing techniques to produce another audible or non-audible signal, e.g. visual or tactile, in order to modify its quality or its intelligibility
- G10L21/06—Transformation of speech into a non-audible representation, e.g. speech visualisation or speech processing for tactile aids
Definitions
- the present invention generally relates to systems and methods for sensory enhancement and more particularly to systems and methods for sensory enhancement through conversion of visual input into non-visual sensory stimuli.
- a sensory enhancement system for providing non-visual sensory enhancement to a user, where the system includes a 3D data device for producing 3D data representative of a 3D environment; at least one stimulation device for applying non-visual sensory stimuli to the user; and a control and processing unit for conversion of the 3D data into non-visual sensory stimulation, wherein the control and processing unit receives 3D data from the 3D data device, calculates at least one stimulation pattern according to the received 3D data and operates the at least one stimulation device according to the stimulation pattern.
- the sensory stimulation applied to the user allows the user to perceive at least part of the 3D environment through the non-visual sensory stimuli.
- the 3D data represents a real environment surrounding the user.
- the 3D data device comprises a computer that generates 3D data representative of a virtual 3D environment.
- the 3D data device comprises at least one of: a 3D camera; a 3D scanner; a computer module for generating virtual 3D data; and/or a combination thereof.
- the stimulation device is configured for applying at least one of the following stimulation types: tactile stimuli; auditory stimuli; olfactory stimuli.
- the stimulation device comprises a multiplicity of non-visual stimulation output devices.
- the output devices may comprise a multiplicity of tactile stimulation output devices each separately controlled by said control and processing unit.
- the tactile output devices comprise electrodes for applying tactile stimuli by applying an electric pulse, wherein the control and processing unit separately controls intensity and operation of each respective electrode.
- Any type of tactile stimulation output devices can be used such as: electrodes, vibrating devices and/or pressure devices.
- the 3D data device and the control and processing unit are portable and designed to be carried by the user.
- the 3D data device creates 3D images, each 3D image including multiple arrays of points; each point represented by 3D coordinates in respect to at least one known reference 3D point.
- the sensory enhancement system is configured for allowing real time or near real time conversion of the respective 3D data into non-visual sensory stimuli.
- the stimulation device comprises at least two sections, each section comprising at least one group of output devices, wherein each section and each group is configured for applying non-visual stimuli over a different area of the user's body, wherein pairs of groups located at different sections of the stimulation device are associated by the control and processing unit for allowing associated operation thereof for representing 3D data.
- the stimulation pattern defines operational characteristics of non-visual stimulation output devices of the stimulation device, wherein the operational characteristics comprise at least one of: intensity of stimuli applied by each respective output device; sequential order for operating said output devices; timing of operating the respective output devices; duration of stimulation.
- the stimulation device comprises a combination of output devices enabling applying different types of stimulation.
- a sensory enhancement system for non-visual sensory enhancement comprising: at least one stimulation device for applying non-visual stimuli to a user; and at least one control and processing unit, which receives 3D data and operates the stimulation device according to the received 3D data to allow the respective user to perceive at least part of a 3D environment associated with the 3D data through non-visual sensory stimuli.
- a method for non-visual sensory enhancement comprising receiving 3D data from at least one 3D data device, wherein the 3D data is associated with a 3D environment; and applying non-visual stimulation to a user according to the received 3D data, using at least one stimulation device, for allowing the user to perceive the 3D environment through the non-visual stimulation.
- the method further comprises acquiring 3D data and transmitting the acquired 3D data to a control and processing unit for operating the at least one stimulation device according to the received 3D data.
- the method may additionally or alternatively comprise calculating at least one stimulation pattern according to the received 3D data and to the configuration and definitions of the stimulation device, wherein the stimulation pattern comprises a blueprint for operating the stimulation device accordingly, to simulate the 3D data through non-visual sensory stimulation.
- the 3D data is transmitted from a computer game module to a control and processing unit for translating the respective 3D data into non-visual stimulation and controlling the stimulation device accordingly.
- the stimulation device comprises a multiplicity of output devices, wherein the stimulation is applied by defining characteristics of the sensory stimulation outputted through the output devices according to the received 3D data, and the output devices are operated according to these characteristics.
- the characteristics may comprise at least one of: operation mode of each output device; intensity of stimulation applied through each output device that is operated; stimulation timing characteristics.
- the 3D data is represented through 3D points each represented by 3D coordinates indicative of the respective point's distance from at least one known reference 3D point.
- the reference point is optionally a point located in the vicinity of the user's location and changes along with location changes of the user.
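The convention above, of 3D points expressed relative to a moving reference point near the user, can be sketched in Python; the function names and the numeric example are illustrative, not taken from the patent:

```python
import math

def to_user_frame(points, reference):
    """Express absolute 3D points relative to a moving reference point
    (e.g. the user's current location), as the text describes."""
    rx, ry, rz = reference
    return [(x - rx, y - ry, z - rz) for (x, y, z) in points]

def distance(rel_point):
    """Euclidean distance of a user-relative point from the reference."""
    return math.sqrt(sum(c * c for c in rel_point))

# A point 3 m ahead and 4 m to the left of a user standing at (10, 0, 0):
rel = to_user_frame([(13.0, 4.0, 0.0)], (10.0, 0.0, 0.0))
print(rel[0], distance(rel[0]))  # (3.0, 4.0, 0.0) 5.0
```

When the user moves, only `reference` changes, so each new 3D data pack automatically encodes updated user-relative distances.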
- Fig. 1 is a block diagram, schematically illustrating a sensory enhancement system for converting 3D visual data into non-visual stimulation, according to some embodiments of the present invention.
- Fig. 2 is a block diagram, schematically illustrating a sensory enhancement system for converting 3D visual data into non-visual stimulation, using a 3D camera for measuring the 3D surrounding environment of a user, according to other embodiments of the present invention.
- Fig. 3A shows an example of a first stimulation pattern representation via a matrix indicative of positions of output devices of a first section of a stimulation device and operational data thereof, according to one embodiment of the present invention.
- Fig. 3B shows an example of a second stimulation pattern representation via a matrix indicative of positions of output devices of a second section of the same stimulation device as in Fig. 3A and operational data thereof, according to one embodiment of the present invention.
- Fig. 3C shows an example of a third stimulation pattern representation via a matrix indicative of positions of output devices of a third section of the same stimulation device as in Fig. 3A and 3B and operational data thereof, according to one embodiment of the present invention.
- Fig. 4 schematically illustrates a user wearing the sensory enhancement system having a human positioned at a left side thereof and the operational pattern characteristics of the stimulation device of the system according thereto.
- Fig.5A schematically illustrates a user wearing the sensory enhancement system having a wall with an opening at a right side thereof and the operational pattern characteristics of the stimulation device of the system according thereto.
- Fig. 5B schematically illustrates a user wearing another type of a sensory enhancement system, according to another embodiment of the invention, having a wall with an opening in front of the user and the operational pattern characteristics of a stimulation device of the sensory enhancement system according thereto.
- Fig. 6A schematically illustrates a user wearing the sensory enhancement system having a wall with an opening in front of the user and the operational pattern characteristics of the stimulation device of the system according thereto.
- Fig. 6B schematically illustrates a user wearing the sensory enhancement system having a wall with an opening in front of the user and the operational pattern characteristics of the stimulation device of the system according thereto.
- Fig. 7A schematically illustrates a user wearing the sensory enhancement system having a person at a left side of the user and the operational pattern characteristics of the stimulation device of the system according thereto.
- Fig. 7B schematically illustrates a user wearing the sensory enhancement system having a person at a right side of the user and the operational pattern characteristics of the stimulation device of the system according thereto.
- Fig. 8 shows changes in operational characteristics of the stimulation device, when an object such as a person moves in respect to the location of the user over time, according to some embodiments of the present invention.
- Fig. 9 is a flowchart, very generally illustrating a process for representing 3D data through non-visual stimulation, according to some embodiments of the present invention.
- the present invention in some embodiments thereof, provides systems and methods for converting three-dimensional (3D) input data representing visual information of a real or virtual environment of a user, into non-visual sensory stimulation through stimulation patterns representing/simulating this 3D data to allow the user, such as a blind person or a video game player, to perceive the visual environment (e.g. nearby still objects, landscapes, approaching people and objects and the like) through other senses thereof.
- the sensory enhancement system allows applying sensory stimuli to the user (e.g. through tactile sensory stimulation means) according to the sensory stimulation pattern representing the real or virtual 3D surrounding environment of the user.
- the systems and methods of the present invention use one or more stimulation devices that can apply the non-visual stimuli to the user and include control and processing means for translating the 3D data into the sensory stimuli in real time or near real time by creating stimuli patterns corresponding to the received 3D data.
- the objective of the present invention is, inter alia, to provide a complex non-visual sensory translation of the 3D surrounding environment of the user, i.e. the visual world, simulating the complex 3D experience of the visual sense.
- the 3D data, indicative of the visual surrounding environment of the user, should be translated into a complex non-visual stimulation that may use the physical space of the user (felt by the user through other senses thereof) to indicate the complex 3D data.
- the system includes the means for producing the 3D data such as one or more 3D cameras or a computer game system producing virtual 3D data of a real or virtual surrounding environment of the user, respectively.
- the 3D data producing means are external to the system.
- the system is configured to allow retrieval/receiving of 3D data from various 3D data devices that can represent a 3D environment through 3D data, such as 3D cameras, 3D software products such as 3D models and/or 3D-graphics-based video/computer games etc.
- a 3D camera of the system is either portable (worn by the user) or located remotely from the user.
- the 3D camera produces 3D images of the surrounding environment of the user in real time/near real time, wherein the images change according to changes in the surrounding environment.
- the 3D images are then translated into a stimulation pattern by a control and processing unit(s) of the sensory enhancement system, creating a stimulation pattern for each 3D image received from the 3D camera.
- the system can then apply non-visual stimulation (such as tactile stimulation) to the user, using the system's stimulation device(s) according to the respective stimulation pattern of the respective 3D image.
- This allows the user to perceive the surrounding 3D space through non-visual senses thereof, through a learning process in which the brain (possibly first through the touch center and from there through the visual cortex) learns how to perceive 3D images and space through other senses, specifically through the stimulation patterns of the system.
- the stimulation device of the system can be a sleeve or sleeves configured to be worn around the arms or torso of the user, and/or any other body part.
- Each sleeve may have electrodes attached thereto for producing light electric signals (pulses) for producing tactile stimuli applied to the user's body in accordance with electric current stimuli patterns received from the control and processing unit of the sensory enhancement system.
- the electric current patterns are unique; each electric current pattern represents features of the surrounding space such as object surfaces, their location in respect to the user, and/or additional selected identifiable input patterns such as color, speed of objects, etc.
- any device(s) that can apply non-visual stimulation may be used, applying any type of stimuli such as tactile stimuli, auditory stimuli, taste and/or olfactory stimuli and the like, and the stimulation patterns produced may be adapted to the specific one or more device types used.
- the stimulation device may include, for example, one or more stimulation output devices (shortly referred to hereinafter also as "output device(s)") such as: electrodes, vibrating devices, pressure devices, devices that can apply heat over the user's skin, speakers for applying auditory stimuli, or a combination thereof.
- some tactile stimulation output devices may be used for the main purpose of the sensory enhancement system such as electrodes while one or more other output devices may be used for allowing the user to perceive other aspects of the environment for improving visual perception thereof.
- the system 100 includes a control and processing unit 120, a 3D data device 130 and a sensory stimulation device 110.
- the 3D data device 130 may be either included as part of the sensory enhancement system 100 or optionally be external thereto.
- the control and processing unit 120 receives 3D data in real time or near real time from the 3D data device 130, for example, including 3D images of the surrounding area; translates each such 3D data (e.g. image) into a stimulation pattern (e.g. calculating the respective stimulation pattern); and controls the stimulation device 110 for applying non-visual sensory stimuli to the user at each given timeframe according to the stimulation pattern.
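The receive-translate-operate flow of the control and processing unit 120 can be sketched as a minimal loop; `translate` and `apply_stimuli` are hypothetical placeholders for the unit's translation algorithms and the device interface, not names from the patent:

```python
def run_pipeline(frames, translate, apply_stimuli):
    """Minimal receive-translate-operate loop sketched from the text:
    for each incoming 3D frame, compute a stimulation pattern and
    drive the stimulation device with it."""
    for frame in frames:
        pattern = translate(frame)   # 3D data -> stimulation pattern
        apply_stimuli(pattern)       # operate the output devices

# Toy run: one frame, a translation stub, and a recording "device".
applied = []
run_pipeline(
    frames=[{"points": [(1.0, 2.0, 3.0)]}],
    translate=lambda frame: [("electrode", 0, 5)],  # hypothetical pattern
    apply_stimuli=applied.append,
)
print(applied)  # [[('electrode', 0, 5)]]
```

In a real device the loop would run once per acquired 3D image, at the system's frame rate.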
- the stimulation may be carried out by transmitting signals to the stimulation device 110 via a communication link such as link 92, which may be wireless or wired.
- the stimulation device 110 includes multiple arrays of tactile stimuli output devices such as electrodes 50 arranged in groups.
- the stimulation device 110 can be designed for wear as a cape having a head opening 112.
- two groups of output devices are formed: a first group 111a for being positioned over the frontal side of the user when worn; and a second group 111b for being positioned over the back side of the user when worn.
- the user receives tactile stimuli over the front and back sides of his/her torso, and the stimulation locations thereover are coordinated and can be synchronized by the control and processing unit 120.
- the 3D data device 130 transmits 3D data to the control and processing unit 120 via a communication link.
- the 3D data device 130 may be any device that can sense the 3D surrounding space of the user, or a device that produces virtual-3D-image-related data, such as a 3D camera producing 3D images of the area in real time, or a computer game module producing virtual 3D data.
- the 3D data device 130 includes a 3D camera or a 3D scanner enabling production of 3D images or models of the surrounding environment that has been photographed/scanned at each given timeframe, allowing transmission of 3D image(s), or of data that allows building of the environmental 3D models, to the control and processing unit 120.
- the control and processing unit 120 calculates a corresponding sensory stimulation pattern for each received 3D image/model, which is a blueprint for operating the sensory stimulation device 110.
- the stimulation pattern may include a signal pattern that includes machine- operable signals/data/commands for controlling and operating the stimulation device 110 to apply stimuli over specific areas of the user's body according to the 3D data received.
- the operation and controlling may include controlling operational characteristics such as: (i) stimulation intensity (signal intensity); (ii) stimulation timing (e.g. sequence timing between multiple output devices and/or duration of stimulation applied, intensity fading effects and the like); (iii) stimulation mode of each output device, which determines the number and location of output devices operated (turned on), and the like.
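The three characteristic groups (i)-(iii) above could be carried by a simple per-device command record; a minimal sketch in Python, where all field names and the 1-10 intensity scale are illustrative assumptions:

```python
from dataclasses import dataclass

@dataclass
class OutputCommand:
    """One command in a stimulation pattern, covering the three
    characteristic groups named in the text (field names are
    illustrative, not taken from the patent)."""
    device_id: int        # which output device (e.g. electrode) to drive
    intensity: int        # (i) signal intensity, e.g. on a 1-10 scale
    start_ms: int         # (ii) timing: when the stimulus begins
    duration_ms: int      # (ii) timing: how long it lasts
    enabled: bool = True  # (iii) mode: whether the device is turned on

pattern = [
    OutputCommand(device_id=0, intensity=2, start_ms=0, duration_ms=100),
    OutputCommand(device_id=7, intensity=0, start_ms=0, duration_ms=0, enabled=False),
]
active = [c for c in pattern if c.enabled]
print(len(active))  # 1
```

A stimulation pattern is then just the list of such commands for one 3D image, and sequential order falls out of the `start_ms` values.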
- the control and processing unit 120 uses translation algorithms, which may include one or more mathematical operators, for converting the 3D data to stimulation patterns according to predefined logics and methodology as well as according to the characteristics and abilities of the stimulation device 110.
- distance scales may be represented by signal intensity scales, where the intensity of the stimuli corresponds to the distance between the object/object part or area in the 3D image/model and the user.
- Speeds may be represented by decreasing/increasing intensities of the stimulation applied where, for instance, a decrease in intensity represents that the object is moving away from the user (the distance increases) and vice versa.
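A minimal sketch of this distance-to-intensity mapping, assuming a linear scale, a 1-10 intensity range and a 10 m sensing range (all three are illustrative choices, not values from the patent):

```python
def intensity_for_distance(distance_m, max_range_m=10.0, scale_max=10):
    """Map a distance to a stimulation intensity on a 1..scale_max scale:
    nearer objects produce stronger stimuli; objects beyond the sensing
    range produce no stimulus (0)."""
    if distance_m >= max_range_m:
        return 0
    nearness = 1.0 - distance_m / max_range_m  # 1.0 at the user, 0.0 at range edge
    return max(1, round(nearness * scale_max))

# An approaching object yields a rising intensity over successive frames,
# which is exactly how the text suggests speed is perceived:
frames = [9.0, 6.0, 3.0, 1.0]  # distances in consecutive 3D images
print([intensity_for_distance(d) for d in frames])  # [1, 4, 7, 9]
```

With this convention, a decreasing sequence of intensities over frames signals an object moving away, and an increasing one signals approach.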
- the methodology for calculating the respective stimulation pattern for each 3D image/model may correspond to the manner in which the visual cortex perceives visual input and therefore may be based on studies that investigate these neurological processes.
- distance can be simulated by having pairs of stimulation output devices 50 located at different locations over the stimulation device 110 and therefore over the user's body, such as pairs of electrodes 50, one of each located at the front of the user and the other at the back.
- the pair of electrodes 50 forms a distance unit identifier and is used for simulating the distance between an object/object's area and the user.
- the wearable stimulation device 110 may have as many pairs of electrodes (back and front) as the respective user can distinguish.
- the stimulation device 110 may have different resolutions to allow each person to have as many pairs of electrodes as he/she feels comfortable having, both for distance range and for how detailed a representation of other identifiable patterns is wished for. Distances from areas/spots in the surrounding environment to the user may ultimately allow the user to perceive 3D shapes over time. This means that the number of stimulation devices may increase as the user becomes more skillful and quicker in perceiving the 3D data.
- each electrode 50 of the frontal first group 111a may be associated with an electrode 50 in the back second group 111b for creating sensory stimulation patterns of pairs of electrodes 50, allowing translation of distances into differences between intensities of electrode pairs 50, as explained above.
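One possible reading of this pair-based "distance unit identifier" is to encode distance as the intensity *difference* between a front electrode and its associated back electrode; the concrete mapping below (front fixed at full scale, difference growing with distance) is a sketch under that assumption, not the patent's specified scheme:

```python
def pair_intensities(distance_m, max_range_m=10.0, scale_max=10):
    """Encode a distance as an intensity difference between a front
    electrode and its paired back electrode: the farther the object,
    the weaker the back electrode relative to the front one."""
    distance_m = min(max(distance_m, 0.0), max_range_m)  # clamp to range
    diff = round((distance_m / max_range_m) * (scale_max - 1))
    front = scale_max        # front electrode always at full scale
    back = scale_max - diff  # back electrode weaker for farther objects
    return front, back

print(pair_intensities(0.0))   # (10, 10): object at the user, no difference
print(pair_intensities(10.0))  # (10, 1): far object, maximal difference
```

Each front/back pair then reports one distance, and a grid of pairs reports a depth map over the body surface.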
- Fig. 2 schematically illustrates a user 10 using a sensory enhancement system 200, according to some embodiments of the present invention.
- the user 10 wears a portable 3D camera 230 over his head for allowing optimal proximity to the user's 10 eyes area and optimal comfort of wear.
- the 3D camera 230 acquires 3D data of the frontal environment of the user 10 at given timeframes (e.g. every few fractions of a second a 3D image is acquired) using depth-detection sensors, for instance, and transmits the 3D images (one at a time) to a control and processing unit 220 of the sensory enhancement system 200.
- the control and processing unit 220 operates a stimulation device 210 of the system 200 that includes multiple groups of tactile stimulation output devices (shortly referred to as output devices) such as electrodes 50'.
- the 3D data of the camera 230 may include lists of 3D points (each point represented by its respective x, y and z coordinate values), each point given in respect to a reference point that is related to the user's 10 location (e.g. referring to the user's head).
- the 3D data represents a still 3D image of the surrounding space of the user 10, where a new 3D data pack is transmitted each given timeframe for updating changes in the environment in near real time, depending on the speed of data acquisition and transmission abilities of the system 200.
- the timeframe and other such features of the system 200 may be changeable through the control and processing unit 220 providing, for instance, a designated user interface that allows input and display options for setting these features.
- the control and processing unit may be any device that allows such processing and control; for example, the control and processing unit 220 may be a mobile phone such as a smartphone, or a tablet device, having a designated application installed therein or operated thereby in any other configuration.
- the control and processing unit 220 receives each such 3D data and translates it to a corresponding stimulation pattern.
- This pattern includes a set of matrices, each representing a different group of electrodes, arranged according to the configuration (number of rows and columns) of the electrodes in each group. A number is given to each component of each matrix, representing the intensity of the signal to be transmitted thereto and therefore indicative of the intensity of the stimuli applied thereby.
- Figures 3A-3C represent an example of a stimulation pattern for an electrode-group-based stimulation device having five columns and five rows of electrodes in each group. No signal is represented by the number "0" and the signal intensity is represented by a non-zero number indicative of the signal intensity according to a predefined scale that can optionally be changed according to environmental conditions.
- a first matrix 301a represents the first group 201a of electrodes.
- only electrode 4A is operated to an intensity of 2 according to a scale of 1-10, where 1 represents the lowest intensity and 10 represents the highest intensity.
- a second matrix 301b represents the second group 201b of electrodes, in which only electrode 4A is operated to an intensity level of 3.
- a third matrix 301c represents the third group 201c of electrodes, in which only electrode 4A is operated to an intensity level of 1.
- the first group is located highest on the user's 10 back (see Fig. 2); the second group is located in a middle section of his back and the third at the lowest part thereof.
- This exemplary stimulation pattern may be associated with specific 3D information such as (as shown in Fig. 2) a human figure 20 positioned at the left of the user 10.
- [0058] According to this example, as illustrated in Fig. 2 and Figs. 3A-3C, control and processing unit 220 operates the electrodes 50 of the stimulation device 210 that correspond to the 4A positions in the first, second and third groups 201a, 201b and 201c of electrodes, respectively.
- If we take, for instance, three points over a human object 20 in the nearby environment of the user 10, as illustrated in Fig. 2, these three points 21a, 21b and 21c, having 3D coordinates relative to the current location of one or more reference points, are measured by the 3D camera 230 and the 3D data 24 thereof is transmitted to the control and processing unit 220, which in turn creates stimulation patterns 301a-301c for the three electrode groups 201a-201c respectively, operating each electrode separately by transmitting operational signals 23a, 23b and 23c to the three electrodes 50, respectively, for operating them according to their respective patterns 301a-301c (i.e. according to the intensities determined for each electrode that is to be "turned on").
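A minimal sketch of how such per-group stimulation matrices could be represented in software is given below. The 5×5 grid, the 0–10 intensity convention and the "4A" addressing follow the example of Figs. 3A-3C; the function and variable names are assumptions for illustration, not part of the patent's specification:

```python
# Stimulation pattern sketch: one 5x5 intensity matrix per electrode group,
# where 0 means "no signal" and 1-10 is the stimulus intensity scale.
ROWS, COLS = 5, 5
COL_NAMES = "ABCDE"  # assumed column naming so "4A" means row 4, column A

def empty_pattern(num_groups=3):
    """One zeroed ROWS x COLS intensity matrix per electrode group."""
    return [[[0] * COLS for _ in range(ROWS)] for _ in range(num_groups)]

def set_electrode(pattern, group, row, col_name, intensity):
    """Turn on one electrode (e.g. row 4, column 'A') at a 1-10 intensity."""
    if not 1 <= intensity <= 10:
        raise ValueError("intensity must be on the 1-10 scale")
    pattern[group][row - 1][COL_NAMES.index(col_name)] = intensity

# Reproduce the example of Figs. 3A-3C: electrode 4A in groups
# 201a/201b/201c operated at intensities 2, 3 and 1 respectively.
pattern = empty_pattern()
set_electrode(pattern, 0, 4, "A", 2)  # first group 201a
set_electrode(pattern, 1, 4, "A", 3)  # second group 201b
set_electrode(pattern, 2, 4, "A", 1)  # third group 201c
```

Each matrix would then be serialized into the operational signals (23a-23c) driving the respective electrode group.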
- the coordinates representing the 3D data are given in respect to a known reference point such as the current location of the 3D camera 230 and therefore the change in distances depends on changes in the surrounding environment as well as changes in the camera 230 location.
- the point of reference is changing in time and therefore each 3D data pack (image or frame) represents the related distances between the surrounding environmental surfaces and the user/camera 10/230.
- the sensory stimulation may be designed to physically imitate the orientation of each 3D point in respect to the user 10 as much as possible to allow users to quickly and effectively perceive the sensory enhancement through their sense of direction/orientation.
- the lower first group 201a of electrodes 50 is dedicated to represent a lower section of the nearby environment
- the middle second group 201b of electrodes 50 is dedicated to represent a middle section of the environment
- the upper third group 201c of electrodes 50 is dedicated to represent an upper section of the environment.
- the exact electrodes 50 being turned on represent the distance as well as the angular direction between the reference point (the camera/user's head) and the respective 3D point.
- the orientation includes information relating to at least four directions: up, down, left and right, where the actual distance can be represented through the stimulation intensity, for instance. This allows real time or near real time sensing of objects moving in respect to the user, where perceiving an object moving from the left to the right of the user 10 is done by gradually shifting the sensory stimuli from the left to the right side of the user 10, and similarly for descending/ascending movement of objects, a combination of directional movements and the like.
- the scales of intensities as well as the 3D points' locations can change according to changes in the environment and/or changes in the location and posture of the user 10. For example, if in a first timeframe the closest 3D point to the user's 10 head was "D1" and the farthest was "D2", and a timeframe or a number of timeframes later the closest distance was "d1" and the farthest was "d2", the intensity scaling and optionally also the selection of electrodes may be adjusted accordingly, automatically, by the control and processing unit 220 of the sensory enhancement system 200, to simulate the automatic adjustment of the eye cornea, pupil and/or lens, for instance.
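The automatic rescaling described above can be sketched as a simple mapping from the current frame's distance range (D1/D2 or d1/d2) onto the intensity scale. This is an illustrative scheme, not the patent's specific operator; the near-is-intense convention and the linear fit are assumptions:

```python
def intensity_for_distance(d, d_min, d_max, scale=(1, 10)):
    """Map a distance onto the stimulation intensity scale, re-fitted each
    timeframe to the frame's closest (d_min) and farthest (d_max) points,
    analogous to the eye's automatic pupil/lens adjustment.
    Assumed convention: nearer points receive stronger stimuli."""
    lo, hi = scale
    if d_max == d_min:
        return hi  # degenerate frame: everything at one distance
    t = (d - d_min) / (d_max - d_min)  # normalize 0..1 over this frame's range
    return round(hi - t * (hi - lo))   # invert: near -> intense
```

With this, the same physical distance may map to different intensities in different timeframes, exactly because the scale tracks the changing environment.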
- the angular and distance ranges defined by the operational characteristics of the stimulation output devices can also be adjusted according to environmental and/or selected view perspective features such as according to the viewing aperture or span defined automatically by the control and processing unit 220 or selected by the user 10.
- five electrodes of the same column may represent a length of 1 m in one case, while in another case the same column of five electrodes may represent a length of 30 m.
- the user may be able to control the viewing span and therefore the scaling of the stimulation output devices (calibration thereof) through various control options provided to the user through the control and processing unit, for example, through zooming options and the like determining resolution.
- the output devices operation characteristics also include operational sequence, where a sequence defines operation timing of the output devices. For example, to represent a certain distance a few consecutively arranged electrodes are turned on at the same time or alternatively one after another where the time differences between the operation of each two consecutive electrodes may be determined according to the distance, angular positioning and/or any other visual features of the respective 3D data. Accordingly, simultaneous operation of multiple electrodes may have a different visual-related interpretation than sequential operation.
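One hedged way to realize such an operational sequence is a schedule of firing times in which the gap between consecutive electrodes depends on the represented distance. The timing constants and the intensity-to-distance convention below are illustrative assumptions, not values from the patent:

```python
def firing_schedule(intensities, base_delay_ms=40):
    """Build a (time_ms, electrode_index, intensity) schedule for one column
    of electrodes, operated one after another rather than simultaneously.
    Assumption for illustration: weaker intensity represents a farther point,
    so the gap after it is longer (farther points are spaced out more)."""
    schedule, t = [], 0
    for idx, level in enumerate(intensities):
        if level == 0:
            continue  # electrode stays off in this pattern
        schedule.append((t, idx, level))
        t += base_delay_ms * (11 - level)
    return schedule
```

Simultaneous operation would correspond to all entries sharing time 0; the sequential schedule above carries the extra, distance-related information the paragraph describes.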
- the sensory enhancement system 100/200 further includes a language input device used for converting lingual input into commands that allow the control and processing unit 120/220 to adapt the stimulation to instructions from the user inputted by the language input device.
- the lingual input can be textual (e.g. inputted through typing through a keyboard of the control and processing unit 120/220), voice recognition or any other lingual input source and technique.
- Such a system can be used to coordinate between soldiers during combat (virtual or real). For example, a commander sitting in front of a screen can send electric pulses understood by the soldiers as specific commands.
- Fig. 5A schematically illustrates a scenario in which the environment includes a wall 30 with an opening 31 located to the right of the user 10, wearing the sensory enhancement system 200, and the operational pattern characteristics of the stimulation device 210 in response to the environmental surfaces it detects.
- Fig. 5B schematically illustrates the user 10 wearing another type of a sensory enhancement system 300, having a reduced number of output devices 51, for allowing the user 10 to practice the non-visual 3D sensory stimulation and how the stimulation can be translated into a 3D understanding of the surrounding environment, according to another embodiment of the invention.
- the system 300 includes a 3D camera 230 and a control and processing unit 320 similar to those of system 200, with a shirt-design stimulation device 210 including one column of tactile stimulation output devices 51, such as electrodes, aligned in the area of the user's spine.
- five electrodes may be turned on at either the same stimulus intensity or varying intensities to indicate the length of the opening in respect to a reference point at the user's head.
- the five electrodes may be operated in a sequential manner in which each electrode is turned on at a different time according to a calculated sequence (frequency) to indicate 3D features thereby such as the distances between the reference point and each respective 3D point in the environment and the like.
- Fig. 6A shows the wall 30 in front of the user 10, and the operational pattern characteristics of the stimulation device 210 are indicative of the distances between the reference point at the user's head and other 3D points over the surface of the wall 30.
- Fig. 6A also shows that the electrodes indicating the distances to the wall 30 (dark lines and dark colored electrodes 50) are operated differently than those representing the open space of the opening 31 therein (showing no stimuli in the electrodes representing the opening's 31 3D points, indicated in a light yellow color).
- a person 20 is positioned at a left-central side of the user 10, and the operational pattern characteristics of the stimulation device 210 of the system 200 show how mainly the left electrodes 50 are operated.
- in Fig. 7B, the person 20 is located at a right side of the user 10 and therefore mainly the right and central electrodes are operated.
- Fig. 8 shows changes in operational characteristics of the stimulation device 210 in response to movement of a person 20 in relation to the user 10, according to some embodiments of the present invention.
- the user 10 remains in the same location throughout the movement of the other person 20, and the reference point thereof therefore remains static.
- both the reference point (the user 10) and the other person(s) change their location, and only the momentary respective distances and angular positioning of one in respect to the other are measured.
- in a first timeframe, represented by part A of Fig. 8, the person 20 is very distant from the user 10 and at a left side thereof.
- only three electrodes 50 are operated at the left bottom side of each of the three sections of the stimulation device 210, indicating the respective distance by the signal (stimuli) intensity and the respective dimensions of the figure (person 20) by the distance between the operated electrodes, for instance.
- Fig. 9 is a flowchart, very generally illustrating a method for representing 3D data through non-visual stimulation, using a sensory enhancement system, according to some embodiments of the present invention.
- the method includes: (i) receiving 3D data of each specific timeframe 81 from one or more 3D data devices, such as a computer game console virtually producing 3D surroundings or a 3D camera/scanner acquiring 3D images from a real surrounding environment of the user; (ii) calculating a stimulation pattern of the respective received 3D data 82 according to the 3D data and according to the configuration of a stimulation device of the sensory enhancement system that is being used, using one or more predefined conversion algorithms/operators; and (iii) stimulating the user by operating the stimulation device of the sensory enhancement system according to the calculated stimulation pattern of the respective timeframe 83.
- steps 81-83 are recursively repeated for each 3D data pack transmitted from the 3D data device at each timeframe, where, in case the time interval between consecutive timeframes (transmissions) is small enough, real time or near real time acquisition and/or conversion is achieved.
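The recursive three-step loop of Fig. 9 can be sketched as follows, with all device-specific callables left as placeholders; the function names and the frame interval are assumptions for illustration:

```python
import time

def run_sensory_loop(get_3d_frame, compute_pattern, apply_stimulation,
                     frame_interval_s=0.05, stop=lambda: False):
    """Recurring loop over the three steps of Fig. 9: receive a 3D data pack
    (step 81), calculate its stimulation pattern (step 82), and operate the
    stimulation device (step 83). With a small enough interval between
    timeframes the conversion is effectively real time. The three callables
    stand in for the 3D data device, the conversion operator(s) and the
    stimulation device driver respectively."""
    while not stop():
        frame = get_3d_frame()            # step 81: 3D image/model for this timeframe
        pattern = compute_pattern(frame)  # step 82: predefined conversion algorithm
        apply_stimulation(pattern)        # step 83: drive the output devices
        time.sleep(frame_interval_s)
```

In a real system, `get_3d_frame` would block on the camera or game console link, so the loop rate would be governed by the 3D data device's transmission rate rather than a fixed sleep.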
Abstract
A sensory enhancement system for providing non-visual sensory enhancement to a user, where the system includes a 3D data device for producing 3D data representative of a 3D environment; one or more stimulation devices for applying non-visual sensory stimuli to the user; and a control and processing unit for conversion of the 3D data into non-visual sensory stimulation. The control and processing unit receives 3D data from the 3D data device, calculates at least one stimulation pattern according to the received 3D data and operates the stimulation device according to the stimulation pattern. The sensory stimulation applied to the user allows the user to perceive at least part of the 3D environment through the non-visual sensory stimuli.
Description
SYSTEM AND METHOD FOR NON-VISUAL SENSORY
ENHANCEMENT
CROSS-REFERENCE TO RELATED PATENT APPLICATIONS
[0001] This application claims priority to US Provisional Patent Application No. 61/513,922 filed on August 1, 2011, which is incorporated herein by reference in its entirety.
FIELD OF THE INVENTION
[0002] The present invention generally relates to systems and methods for sensory enhancements, and more particularly to systems and methods for sensory enhancement through conversion of visual input into non-visual sensory stimuli.
BACKGROUND OF THE INVENTION
[0003] Currently available devices for blind and poorly sighted people, such as audio indication systems, walking canes and the like, offer very limited help and in any case do not simulate the 3D surrounding world, but give limited indication of much less complex information relating to the person's surrounding environment, such as the most recommended path (direction) through which the blind person should walk, a red light indication when arriving at a pedestrian crossing, and the like.
BRIEF SUMMARY OF INVENTION
[0004] According to some aspects of the invention, there is provided a sensory enhancement system for providing non-visual sensory enhancement to a user, where the system includes a 3D data device for producing 3D data representative of a 3D environment; at least one stimulation device for applying non-visual sensory stimuli to the user; and a control and processing unit for conversion of the 3D data into non-visual sensory stimulation, wherein the control and processing unit receives 3D data from the 3D data device, calculates at least one stimulation pattern according to the received 3D data and operates the at least one stimulation device according to the stimulation pattern. The sensory stimulation applied to the user allows the user to perceive at least part of the 3D environment through the non-visual sensory stimuli.
[0005] Optionally, the 3D data represents a real environment surrounding the user.
[0006] According to other embodiments, the 3D data device comprises a computer that generates 3D data representative of a virtual 3D environment.
[0007] Optionally, the 3D data device comprises at least one of: a 3D camera; a 3D scanner; a computer module for generating virtual 3D data; and/or a combination thereof.
[0008] According to some embodiments, the stimulation device is configured for applying at least one of the following stimulation types: tactile stimuli; auditory stimuli; olfactory stimuli.
[0009] Optionally, the stimulation device comprises a multiplicity of non-visual stimulation output devices. The output devices may comprise a multiplicity of tactile stimulation output devices each separately controlled by said control and processing unit. For example, the tactile output devices comprise electrodes for applying tactile stimuli by applying an electric pulse, wherein the control and processing unit separately controls intensity and operation of each respective electrode. Any type of tactile stimulation output devices can be used such as: electrodes, vibrating devices and/or pressure devices.
[0010] According to some embodiments, the 3D data device and the control and processing unit are portable and designed to be carried by the user.
[0011] Additionally or alternatively, the 3D data device creates 3D images, each 3D image including multiple arrays of points; each point represented by 3D coordinates in respect to at least one known reference 3D point.
[0012] Optionally, the sensory enhancement system is configured for allowing real time or near real time conversion of the respective 3D data into non-visual sensory stimuli.
[0013] According to some embodiments, the stimulation device comprises at least two sections, each section comprising at least one group of output devices, wherein each section and each group is configured for applying non-visual stimuli over a different area of the user's body, wherein pairs of groups located at different sections of the stimulation device are associated by the control and processing unit for allowing associated operation thereof for representing 3D data.
[0014] Optionally, the stimulation pattern defines operational characteristics of non-visual stimulation output devices of the stimulation device, wherein the operational characteristics comprise at least one of: intensity of stimuli applied by each respective
output device; sequential order for operating said output devices; timing of operating the respective output devices; duration of stimulation.
[0015] Optionally, the stimulation device comprises a combination of output devices enabling applying different types of stimulation.
[0016] According to other aspects of the invention, there is provided a sensory enhancement system for non-visual sensory enhancement comprising: at least one stimulation device for applying non-visual stimuli to a user; and at least one control and processing unit, which receives 3D data and operates the stimulation device according to the received 3D data to allow the respective user to perceive at least part of a 3D environment associated with the 3D data through non-visual sensory stimuli.
[0017] According to yet another aspect of the invention, there is provided a method for non-visual sensory enhancement comprising: receiving 3D data from at least one 3D data device, wherein the 3D data is associated with a 3D environment; and applying non-visual stimulation to a user according to the received 3D data, using at least one stimulation device, for allowing the user to perceive the 3D environment through the non-visual stimulation.
[0018] According to some embodiments, the method further comprises acquiring 3D data and transmitting the acquired 3D data to a control and processing unit for operating the at least one stimulation device according to the received 3D data.
[0019] The method may additionally or alternatively also comprise calculating at least one stimulation pattern according to the received 3D data and according to the configuration and definitions of the stimulation device, wherein the stimulation pattern comprises a blueprint for operating the stimulation device accordingly, to simulate the 3D data through non-visual sensory stimulation.
[0020] Optionally, the 3D data is transmitted from a computer game module to a control and processing unit for translating the respective 3D data into non-visual stimulation and controlling the stimulation device accordingly.
[0021] Additionally or alternatively, the stimulation device comprises a multiplicity of output devices, wherein the stimulation is applied by defining characteristics of the sensory stimulation outputted through the output devices according to the received 3D data, and the output devices are operated according to these characteristics. The characteristics may comprise at least one of: operation mode of each output device;
intensity of stimulation applied through each output device that is operated; stimulation timing characteristics.
[0022] According to some embodiments, the 3D data is represented through 3D points, each represented by 3D coordinates indicative of the respective point's distance from at least one known reference 3D point. The reference point is optionally a point located in the vicinity of the user's location and changes along with location changes of the user.
BRIEF DESCRIPTION OF THE DRAWINGS
[0023] Fig. 1 is a block diagram, schematically illustrating a sensory enhancement system for converting 3D visual data into non-visual stimulation, according to some embodiments of the present invention.
[0024] Fig. 2 is a block diagram, schematically illustrating a sensory enhancement system for converting 3D visual data into non-visual stimulation, using a 3D camera for measuring the 3D surrounding environment of a user, according to other embodiments of the present invention.
[0025] Fig. 3A shows an example of a first stimulation pattern representation via a matrix indicative of positions of output devices of a first section of a stimulation device and operational data thereof, according to one embodiment of the present invention.
[0026] Fig. 3B shows an example of a second stimulation pattern representation via a matrix indicative of positions of output devices of a second section of the same stimulation device as in Fig. 3A and operational data thereof, according to one embodiment of the present invention.
[0027] Fig. 3C shows an example of a third stimulation pattern representation via a matrix indicative of positions of output devices of a third section of the same stimulation device as in Fig. 3A and 3B and operational data thereof, according to one embodiment of the present invention.
[0028] Fig. 4 schematically illustrates a user wearing the sensory enhancement system having a human positioned at a left side thereof and the operational pattern characteristics of the stimulation device of the system according thereto.
[0029] Fig. 5A schematically illustrates a user wearing the sensory enhancement system having a wall with an opening at a right side thereof and the operational pattern characteristics of the stimulation device of the system according thereto.
[0030] Fig. 5B schematically illustrates a user wearing another type of a sensory enhancement system, according to another embodiment of the invention, having a wall with an opening in front of the user and the operational pattern characteristics of a stimulation device of the sensory enhancement system according thereto.
[0031] Fig. 6A schematically illustrates a user wearing the sensory enhancement system having a wall with an opening in front of the user and the operational pattern characteristics of the stimulation device of the system according thereto.
[0032] Fig. 6B schematically illustrates a user wearing the sensory enhancement system having a wall with an opening in front of the user and the operational pattern characteristics of the stimulation device of the system according thereto.
[0033] Fig. 7A schematically illustrates a user wearing the sensory enhancement system having a person at a left side of the user and the operational pattern characteristics of the stimulation device of the system according thereto.
[0034] Fig. 7B schematically illustrates a user wearing the sensory enhancement system having a person at a right side of the user and the operational pattern characteristics of the stimulation device of the system according thereto.
[0035] Fig. 8 shows changes in operational characteristics of the stimulation device, when an object such as a person moves in respect to the location of the user over time, according to some embodiments of the present invention.
[0036] Fig. 9 is a flowchart, very generally illustrating a process for representing 3D data through non-visual stimulation, according to some embodiments of the present invention.
DETAILED DESCRIPTION OF SOME EMBODIMENTS OF THE INVENTION
[0037] The present invention, in some embodiments thereof, provides systems and methods for converting three-dimensional (3D) input data representing visual information of a real or virtual environment of a user, into non-visual sensory stimulation through stimulation patterns representing/simulating this 3D data to allow the user, such as a blind person or a video game player, to perceive the visual environment (e.g. nearby still objects, landscapes, approaching people and objects and the like) through other senses thereof. The sensory enhancement system allows applying sensory stimuli to the user (e.g. through tactile sensory stimulation means)
according to the sensory stimulation pattern representing the real or virtual 3D surrounding environment of the user.
[0038] According to some embodiments, the systems and methods of the present invention use one or more stimulation devices that can apply the non-visual stimuli to the user, and include control and processing means for translating the 3D data into the sensory stimuli in real time or near real time by creating stimuli patterns corresponding to the received 3D data.
[0039] The objective of the present invention is, inter alia, to provide a complex non-visual sensory translation of the 3D surrounding environment of the user, i.e. of the visual world, to simulate the complex 3D experience of the visual sense. To do so, the 3D data, indicative of the visual surrounding environment of the user, should be translated into a complex non-visual stimulation that may use the physical space of the user (felt by the user through other senses thereof) to indicate the complex 3D data.
[0040] According to some embodiments of the present invention, the system includes the means for producing the 3D data such as one or more 3D cameras or a computer game system producing virtual 3D data of a real or virtual surrounding environment of the user, respectively. Alternatively, the 3D data producing means are external to the system. In any case, the system is configured to allow retrieval/receiving of 3D data from various 3D data devices that can represent a 3D environment through 3D data, such as 3D cameras, 3D software products such as 3D models and/or 3D graphics based video/computer games, etc.
[0041] According to some embodiments of the present invention, in which the system is configured for simulating a real 3D environment of a user, a 3D camera of the system is either portable (worn by the user) or located remotely from the user. The 3D camera produces 3D images of the surrounding environment of the user in real time/near real time, wherein the images change according to changes in the
surrounding environment and/or according to changes in the user's location and/or bodily positioning. These 3D images are then translated into a stimulation pattern by a control and processing unit(s) of the sensory enhancement system, creating a stimulation pattern for each 3D image received from the 3D camera. According to these embodiments, the system can then apply non-visual stimulation (such as tactile stimulation) to the user, using the system's stimulation device(s) according to the respective stimulation pattern of the respective 3D image. This allows the user to
perceive the surrounding 3D space through non-visual senses thereof, through a learning process in which the brain (possibly first through the touch center and from there through the visual cortex) learns how to perceive 3D images and space through other senses, specifically through the stimulation patterns of the system.
[0042] For example, the stimulation device of the system can be a sleeve or sleeves configured to be worn around the arms or torso of the user, and/or any other body part. Each sleeve may have electrodes attached thereto for producing light electric signals (pulses) that apply tactile stimuli to the user's body in accordance with electric current stimuli patterns received from the control and processing unit of the sensory enhancement system. The electric current patterns are unique; each electric current pattern represents features of the surrounding space such as object surfaces, their location in respect to the user, and/or additional selected identifiable input patterns such as color, speed of objects, etc.
[0043] Any device(s) that can apply non-visual stimulation may be used, applying any type of stimuli such as tactile stimuli, auditory stimuli, taste and/or olfactory stimuli and the like, and the stimulation patterns produced may be adapted to the specific one or more device types used. The stimulation device may include, for example, one or more stimulation output devices (shortly referred to hereinafter also as "output device(s)") such as: electrodes, vibrating devices, pressure devices, devices that can apply heat over the user's skin, speakers for applying auditory stimuli, or a combination thereof.
[0044] According to some embodiments, some tactile stimulation output devices, such as electrodes, may be used for the main purpose of the sensory enhancement system, while one or more other output devices may be used for allowing the user to perceive other aspects of the environment for improving visual perception thereof.
[0045] Reference is now made to Fig. 1, schematically illustrating a sensory enhancement system 100, according to some embodiments of the present invention. The system 100 includes a control and processing unit 120, a 3D data device 130 and a sensory stimulation device 110. The data device 130 may be either included as part of the sensory enhancement system 100 or optionally be external thereto. The control and processing unit 120 receives 3D data in real time or near real time from the 3D data device 130, for example, including 3D images of the surrounding area; translates each such 3D data (e.g. image) into a stimulation pattern (e.g. calculating the respective
stimulation pattern); and controls the stimulation device 110 for applying non-visual sensory stimuli to the user at each given timeframe according to the stimulation pattern. The stimulation may be carried out by transmitting signals to the stimulation device 110 via a communication link such as link 92, which could be wireless or non-wireless.
[0046] According to some embodiments of the present invention, as illustrated in Fig. 1, the stimulation device 110 includes multiple arrays of tactile stimuli output devices, such as electrodes 50, arranged in groups. The stimulation device 110 can be designed to be worn as a cape having a head opening 112. In this configuration, two groups of output devices (electrodes 50) are formed: a first group 111a for being positioned over the frontal side of the user when worn; and a second group 111b for being positioned over the back side of the user when worn. In this way, the user receives tactile stimuli over the front and back sides of his/her torso and the
stimulations locations thereover are coordinated and can be synchronized by the control and processing unit 120.
[0047] According to some embodiments of the present invention, the 3D data device 130 transmits 3D data to the control and processing unit 120 via a
communication line such as link 91, which can be a wireless or non-wireless link. The 3D data device 130 may be any device that can sense the 3D surrounding space of the user or a device that produces virtual 3D images related data, such as a 3D camera producing 3D images in real time of the area, or a computer game module
(computer/video game console) and the like, depending on the purpose and
requirements of the sensory enhancement system 100. For example, if the sensory enhancement system 100 is designed for blind people, the 3D data device 130 includes a 3D camera or a 3D scanner enabling to produce 3D images or models of the surrounding environment that has been photographed/scanned at each given timeframe, allowing transmitting 3D image(s) or data that allows building of the environmental 3D models to the control and processing unit 120. The control and processing unit 120 then calculates a corresponding sensory stimulation pattern for each received 3D image/model, which is a blueprint for operating the sensory stimulation device 110. The stimulation pattern may include a signal pattern that includes machine-operable signals/data/commands for controlling and operating the stimulation device 110 to apply stimuli over specific areas of the user's body according to the 3D data received. The operation and controlling may include controlling operational characteristics such
as: (i) stimulation intensity (signal intensity); (ii) stimulation timing (e.g. sequence timing between multiple output devices and/or duration of stimulation applied, intensity fading effects and the like); (iii) stimulation mode of each output device, which determines the number and location of output devices operated (turned on), and the like.
[0048] According to some embodiments of the present invention, the control and processing unit 120 uses translation algorithms, which may include one or more mathematical operators, for converting the 3D data to stimulation patterns according to predefined logic and methodology, as well as according to the characteristics and abilities of the stimulation device 110. For example, distance scales may be represented by signal intensity scales, where the intensity of the stimuli corresponds to the distance between the object/object part or area in the 3D image/model and the user. Speeds may be represented by decreasing/increasing intensities of the stimulation applied where, for instance, a decrease in intensity represents that the object is moving away from the user (the distance increases) and vice versa.
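The speed-to-intensity-trend idea in this paragraph can be sketched as a per-timeframe update rule: an approaching object strengthens the stimulus, a receding one fades it. The gain constant, the clamping and the function name are illustrative assumptions, not the patent's operator:

```python
def update_intensity(prev_intensity, prev_dist, curr_dist, gain=2.0, scale=(1, 10)):
    """Encode approach/retreat as an intensity trend between two timeframes:
    a closing distance (object approaching) raises the stimulus intensity,
    an opening distance (object moving away) lowers it, clamped to the
    1-10 scale. The gain constant is illustrative."""
    lo, hi = scale
    delta = gain * (prev_dist - curr_dist)  # positive when the object approaches
    return max(lo, min(hi, prev_intensity + delta))
```

Applied frame after frame, the resulting rising or fading stimulus conveys the object's radial speed without any explicit numeric readout.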
[0049] This allows the user to learn to perceive the visual surroundings through the non-visual sensory stimulation and, with time and practice, to quickly translate in his/her brain what a specific stimulus means visually, in terms of the shapes of objects, their distances from the user, their speeds of movement, changes in location and the like, given a predefined set of rules representing the logic of the algorithm(s).
[0050] According to some embodiments, the methodology for calculating the respective stimulation pattern for each 3D image/model may correspond to the manner in which the visual cortex perceives visual input, and may therefore be based on studies investigating these neurological processes.
[0051] According to some embodiments, distance can be simulated by having pairs of stimulation output devices 50 located at different locations over the stimulation device 110, and therefore over the user's body, such as pairs of electrodes 50, one located at the front of the user and the other at the back. The pair of electrodes 50 forms a distance unit identifier and is used for simulating the distance between an object/object's area and the user.
[0052] For example, assuming a 1-10 range of electric signal intensities for the electrodes 50, a very nearby object area would be represented by an intensity of 9 in the front electrode and an intensity of 1 in the correlating (sister) back electrode. The intensity difference between front and back will be translated by the brain into distance. As the person becomes familiar with the system, this translation will become automatic.
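A minimal sketch of this front/back pair encoding (illustrative only; the 9/1 split follows the example above, and the linear mapping in between is an assumption):

```python
def pair_intensities(distance_frac: float) -> tuple[int, int]:
    """Encode a normalized distance (0.0 = touching the user, 1.0 = edge of
    the sensed range) as complementary (front, back) intensities, so the
    front/back difference carries the distance cue."""
    f = min(max(distance_frac, 0.0), 1.0)
    front = round(9 - 8 * f)   # very near -> 9 at the front electrode
    back = round(1 + 8 * f)    # ... and 1 at the sister back electrode
    return front, back
```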
The wearable stimulation device 110 may have as many pairs of electrodes (back and front) as the respective user can distinguish. The stimulation device 110 may have different resolutions to allow each person to have as many pairs of electrodes as he or she feels comfortable having, both for distance range and for how detailed the user wishes other identifiable patterns to be. Distances from areas/spots in the surrounding environment to the user may ultimately allow the user to perceive 3D shapes over time. This means that the number of stimulation devices may increase as the user becomes more skillful and quicker in perceiving the 3D data.
[0053] According to embodiments, each electrode 50 of the frontal first group 111a may be associated with an electrode 50 in the back second group 111b for creating sensory stimulation patterns of pairs of electrodes 50, allowing distances to be translated into intensity differences between electrode pairs 50, as explained above.
[0054] Reference is now made to Fig. 2, schematically illustrating a user 10 using a sensory enhancement system 200, according to some embodiments of the present invention. The user 10 wears a portable 3D camera 230 over his head, allowing optimal proximity to the user's 10 eye area and optimal comfort of wear. The 3D camera 230 acquires 3D data of the frontal environment of the user 10 at given timeframes (e.g. a 3D image is acquired every few fractions of a second) using depth-detection sensors, for instance, and transmits the 3D images (one at a time) to a control and processing unit 220 of the sensory enhancement system 200. The control and processing unit 220 operates a stimulation device 210 of the system 200 that includes multiple groups of tactile stimulation output devices (referred to in short as output devices) such as electrodes 50'.
[0055] The 3D data of the camera 230 may include lists of 3D points (each point represented by its respective x, y and z coordinate values), each point given in respect to a reference point related to the user's 10 location (e.g. referring to the user's head). At each given timeframe the 3D data represents a still 3D image of the space surrounding the user 10, where a new 3D data pack is transmitted each given timeframe for updating changes in the environment in near real time, depending on the speed of data acquisition and the transmission abilities of the system 200. The timeframe and other such features of the system 200 may be changeable through the control and processing unit 220, which may provide, for instance, a designated user interface that allows input and display options for setting these features.
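The per-point structure described above can be sketched as follows (a simplified assumption of this edit: head rotation is ignored and only translation relative to the reference point is applied):

```python
def to_reference_frame(point, reference):
    """Express a world-frame (x, y, z) point relative to the user-related
    reference point (e.g. the head-mounted camera), by simple subtraction."""
    return tuple(p - r for p, r in zip(point, reference))

# One timeframe's "3D data pack": a list of points relative to the head.
reference = (0.0, 1.5, 0.0)                      # assumed camera position
frame = [to_reference_frame(p, reference)
         for p in [(1.0, 1.0, 3.0), (-0.5, 1.5, 2.0)]]
```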
[0056] The control and processing unit may be any device that allows electronically controlling operations of the system 200 and processing data, and optionally allows input and display. For example, the control and processing unit 220 may be a mobile phone such as a smartphone or a tablet device having a designated application installed therein, or may be operated in any other configuration.
[0057] The control and processing unit 220 receives each such 3D data pack and translates it into a corresponding stimulation pattern. This pattern includes a set of matrices, each representing a different group of electrodes and arranged according to the configuration (number of rows and columns) of the electrodes in each group. A number is given to each component of each matrix representing the intensity of the signal to be transmitted to the corresponding electrode and therefore indicative of the intensity of the stimulus applied thereby. Figures 3A-3B represent an example of a stimulation pattern for an electrode-group based stimulation device having five columns and five rows of electrodes in each group. No signal is represented by the number "0" and the signal intensity is represented by a non-zero number indicative of the signal intensity according to a predefined scale that can optionally be changed according to environmental conditions. A first matrix 301a represents the first group 201a of electrodes. In this example, only electrode 4A is operated, at an intensity of 2 according to a scale of 1-10, where 1 represents the lowest intensity and 10 represents the highest intensity. Similarly, a second matrix 301b represents the second group 201b of electrodes, in which only electrode 4A is operated at an intensity level of 3, and a third matrix 301c represents the third group 201c of electrodes, in which only electrode 4A is operated at an intensity level of 1. The first group is located highest on the user's 10 back (see Fig. 2); the second group is located in a middle section of his back and the third at the lowest part thereof. This means that the user 10 will feel simultaneous signals applied at three points over his back with different intensities, where the highest intensity will be felt at the lowest left point of the middle second group 201b, the medium signal at the lowest left point of the highest first group 201a and the lowest signal at the lowest left point of the lowest third group 201c.
This exemplary stimulation pattern may be associated with specific 3D information, such as (as shown in Fig. 2) a human figure 20 positioned to the left of the user 10.
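The pattern of Figs. 3A-3B can be sketched as a set of 5x5 intensity matrices, one per electrode group (0 = off, 1-10 = signal intensity; the row/column indexing used here is an illustrative assumption):

```python
def empty_group(rows: int = 5, cols: int = 5) -> list[list[int]]:
    """A matrix of zeros: no electrode in the group is operated."""
    return [[0] * cols for _ in range(rows)]

# Electrode 4A (row 4, column A) operated in each group, as in the example.
pattern = {"301a": empty_group(), "301b": empty_group(), "301c": empty_group()}
pattern["301a"][3][0] = 2   # first group:  intensity 2
pattern["301b"][3][0] = 3   # second group: intensity 3 (felt strongest)
pattern["301c"][3][0] = 1   # third group:  intensity 1 (felt weakest)
```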
[0058] According to this example, as illustrated in Fig. 2 and Fig. 4, combined with the stimulation patterns described in Figures 3A-3B, the control and processing unit 220 operates the electrodes 50 of the stimulation device 210 that correspond to the A4 positions in the first, second and third groups 201a, 201b and 201c of electrodes, respectively. If we take, for instance, three points over a human object 20 in the nearby environment of the user 10, as illustrated in Fig. 2, these three points having 3D coordinates relative to the current location of the one or more reference points: points 21a, 21b and 21c are measured by the 3D camera 230 and 3D data 24 thereof is transmitted to the control and processing unit 220, which in turn creates stimulation patterns 301a-301c for the three electrode groups 201a-201c, respectively, operating each electrode separately by transmitting operational signals 23a, 23b and 23c to each of the three electrodes 50, respectively, for operating them according to their respective patterns 301a-301c (i.e. according to the intensities determined for each electrode that is to be "turned on").
[0059] As mentioned above, the coordinates representing the 3D data are given in respect to a known reference point, such as the current location of the 3D camera 230, and therefore the change in distances depends on changes in the surrounding environment as well as on changes in the camera 230 location. In this exemplary case, in which the 3D camera 230 is portably carried by the user 10 (worn on his/her head), the point of reference changes in time and therefore each 3D data pack (image or frame) represents the relative distances between the surrounding environmental surfaces and the user/camera 10/230.
[0060] According to some embodiments of the present invention, the sensory stimulation may be designed to physically imitate the orientation of each 3D point in respect to the user 10 as much as possible, to allow users to quickly and effectively perceive the sensory enhancement through their sense of direction/orientation. For example, in the case of multiple arrays of stimulation output devices and groups thereof, the lower first group 201a of electrodes 50 is dedicated to representing a lower section of the nearby environment; the middle second group 201b of electrodes 50 is dedicated to representing a middle section of the environment; and the upper third group 201c of electrodes 50 is dedicated to representing an upper section of the environment. The exact electrodes 50 being turned on represent the distance as well as the angular direction between the reference point (the camera/user's head) and the respective 3D
point. The orientation includes information relating to at least four directions: up, down, left and right, where the actual distance can be represented through the stimulation intensity, for instance. This allows real time or near real time sensing of objects moving in respect to the user, where an object moving from the user's 10 left to right is perceived by gradually shifting the sensory stimuli from the left to the right side of the user 10, and similarly for descending/ascending movement of an object, a combination of directional movements and the like.
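One way to sketch this orientation mapping (an illustrative assumption of this edit; the height thresholds and the atan2-based column choice are not from the disclosure) is to pick the electrode group from the point's height and the column from its horizontal angle:

```python
import math

def select_electrode(x: float, y: float, z: float) -> tuple[int, int]:
    """x: left(-)/right(+), y: down(-)/up(+), z: forward distance (metres).
    Returns (group, column) for a 3-group, 5-column device: group 0 = lower
    section, 2 = upper section; column 0 = leftmost, 4 = rightmost."""
    group = 0 if y < -0.3 else (2 if y > 0.3 else 1)
    angle = math.atan2(x, z)                     # horizontal angle off-centre
    # Spread a +/- 45 degree field of view over the 5 columns, clamped.
    col = min(4, max(0, int((angle + math.pi / 4) / (math.pi / 2) * 5)))
    return group, col
```

The distance itself would then set the intensity of the selected electrode, as in the earlier examples.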
[0061] Additionally or alternatively, the scales of intensities as well as the 3D points' locations can change according to changes in the environment and/or changes in the location and posture of the user 10. For example, if in a first timeframe the closest 3D point to the user's 10 head was "D1" and the farthest was "D2", and a timeframe or a number of timeframes later the closest distance was "d1" and the farthest was "d2", the intensity scaling and optionally also the selection of electrodes may be adjusted accordingly, automatically, by the control and processing unit 220 of the sensory enhancement system 200, to simulate the automatic adjustment of the eye's cornea, pupil and/or lens, for instance.
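A sketch of this per-frame re-anchoring (illustrative; the 1-10 scale and the linear mapping are assumptions): the intensity scale is recomputed each timeframe from the frame's own nearest (d1) and farthest (d2) points.

```python
def adaptive_intensities(distances, low=1, high=10):
    """Rescale this frame's distances onto [high..low], nearest = strongest,
    re-anchoring to the frame's own min/max like an adapting eye."""
    d1, d2 = min(distances), max(distances)
    if d1 == d2:
        return [high] * len(distances)
    return [round(high - (d - d1) / (d2 - d1) * (high - low)) for d in distances]
```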
[0062] According to some embodiments of the present invention, the angular and distance ranges defined by the operational characteristics of the stimulation output devices can also be adjusted according to environmental and/or selected view perspective features, such as according to the viewing aperture or span defined automatically by the control and processing unit 220 or selected by the user 10. For example, in one case five electrodes of the same column may represent a length of 1 m, while in another case the same column of five electrodes may represent a length of 30 m. The user may be able to control the viewing span, and therefore the scaling (calibration) of the stimulation output devices, through various control options provided through the control and processing unit, for example, through zooming options and the like determining resolution.
[0063] Additionally or alternatively, the output devices' operation characteristics also include an operational sequence, where a sequence defines the operation timing of the output devices. For example, to represent a certain distance, a few consecutively arranged electrodes are turned on at the same time, or alternatively one after another, where the time differences between the operation of each two consecutive electrodes may be determined according to the distance, angular positioning and/or any other visual features of the respective 3D data. Accordingly, simultaneous operation of multiple electrodes may have a different visual-related interpretation than sequential operation.
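The sequential mode can be sketched as a sweep schedule (illustrative; the "farther = slower sweep" rule is invented here for concreteness, not taken from the disclosure):

```python
def sweep_schedule(electrode_ids, distance_m, base_delay_s=0.02):
    """Fire consecutive electrodes one after another; the inter-electrode
    delay grows with the encoded distance. Returns (id, start_time_s) pairs.
    With distance_m == 0 the schedule degenerates to simultaneous operation."""
    delay = base_delay_s * distance_m
    return [(e, i * delay) for i, e in enumerate(electrode_ids)]
```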
[0064] In some embodiments, the sensory enhancement system 100/200 further includes a language input device used for converting lingual input into commands that allow the control and processing unit 120/220 to adapt the stimulation to instructions from the user inputted via the language input device. The lingual input can be textual (e.g. inputted by typing on a keyboard of the control and processing unit 120/220), voice recognition or any other lingual input source and technique. Such a system can be used to coordinate between soldiers during combat (virtual or real). For example, a commander sitting in front of a screen can send electric pulses understood by the soldiers as specific commands.
[0065] Fig. 5A schematically illustrates a scenario in which the environment includes a wall 30 with an opening 31 located to the right of the user 10, who is wearing the sensory enhancement system 200, and the operational pattern characteristics of the stimulation device 210 in response to the environmental surfaces it detects.
[0066] Fig. 5B schematically illustrates the user 10 wearing another type of sensory enhancement system 300, having a reduced number of output devices 51 for allowing the user 10 to practice the non-visual 3D sensory stimulation and how the stimulation can be translated into a 3D understanding of the surrounding environment, according to another embodiment of the invention. In this case, the system 300 includes a 3D camera 230 and a control and processing unit 320 similar to those of system 200, with a shirt-design stimulation device 210 including one column of tactile stimulation output devices 51, such as electrodes, aligned along the area of the user's spine. In this example, in which the surroundings include the wall 30 and the opening 31 therein, five electrodes may be turned on, either at the same stimulus intensity or at varying intensities, to indicate the length of the opening in respect to a reference point at the user's head. The five electrodes may be operated in a sequential manner in which each electrode is turned on at a different time according to a calculated sequence (frequency) to indicate 3D features thereby, such as the distances between the reference point and each respective 3D point in the environment and the like.
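The single-column encoding can be sketched as follows (illustrative; the normalized-height model and the 10-electrode column are assumptions of this edit):

```python
def column_pattern(bottom_frac: float, top_frac: float, n: int = 10):
    """One spine column of n electrodes, bottom-up. An electrode is on when
    its band centre falls inside the opening's normalized vertical extent."""
    return [bottom_frac <= (i + 0.5) / n <= top_frac for i in range(n)]
```

For an opening spanning 20% to 70% of the sensed height, exactly five of the ten electrodes switch on, matching the five-electrode example above.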
[0067] In Figures 6A and 6B the wall 30 is in front of the user 10 and the operational pattern characteristics of the stimulation device 210 are indicative of the distances between the reference point at the user's head and various 3D points over the surface of the wall 30. Fig. 6A also shows that the electrodes indicating the distances to the wall 30 (dark lines and dark colored electrodes 50) are operated differently than those for the open space of the opening 31 therein (showing no stimuli in electrodes representing those opening's 31 3D points, indicated in light yellow).
[0068] In Fig. 7A a person 20 is positioned at the left-central side of the user 10 and the operational pattern characteristics of the stimulation device 210 of the system 200 show how mainly the left electrodes 50 are operated. In Fig. 7B the person 20 is located at the right side of the user 10 and therefore mainly the right and central electrodes are operated.
[0069] Fig. 8 shows changes in the operational characteristics of the stimulation device 210 in response to movement of a person 20 in relation to the user 10, according to some embodiments of the present invention. In this case, the user 10 remains in the same location throughout the movement of the other person 20, and the reference point is therefore static. Of course, in most cases both the reference point (the user 10) and the other person(s) change their locations, and only the momentary relative distances and angular positioning of one in respect to the other are measured.
[0070] In a first timeframe, represented by part A of Fig. 8, the person 20 is very distant from the user 10 and at the left side thereof. In this case only three electrodes 50 are operated, at the left bottom side of each of the three sections of the stimulation device 210, indicating the respective distance by the signal (stimulus) intensity and the respective dimensions of the figure (person 20) by the distance between the operated electrodes, for instance.
[0071] In the next timeframes, represented by parts B, C and D of Fig. 8, the person 20 approaches the user 10 a little from the left side thereof. In this case two more electrodes 50 in the upper section of the stimulation device are turned on (operated).
[0072] In the last timeframe, represented by part E of Fig. 8, the person 20 is at the left side of the user 10 and at a different distance therefrom, indicated by a different pattern of operated electrodes 50 of the stimulation device 210.
[0073] Reference is now made to Fig. 9, a flowchart very generally illustrating a method for representing 3D data through non-visual stimulation, using a sensory enhancement system, according to some embodiments of the present invention. The method includes: (i) receiving 3D data of each specific timeframe 81 from one or more 3D data devices, such as a computer game console virtually producing 3D surroundings or a 3D camera/scanner acquiring 3D images from the real environment surrounding the user; (ii) calculating a stimulation pattern from the respective received 3D data 82, according to the 3D data and according to the configuration of the stimulation device of the sensory enhancement system being used, using one or more predefined conversion algorithms/operators; and (iii) stimulating the user by operating the stimulation device of the sensory enhancement system according to the calculated stimulation pattern of the respective timeframe 83.
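The three-step loop can be sketched as one processing cycle (the device interfaces below are stand-in stubs, not part of the original disclosure):

```python
def process_timeframe(acquire_3d, calculate_pattern, stimulate):
    """One receive (81) -> calculate (82) -> stimulate (83) cycle."""
    data = acquire_3d()                  # step 81: 3D data for this frame
    pattern = calculate_pattern(data)    # step 82: apply conversion operators
    stimulate(pattern)                   # step 83: drive the stimulation device
    return pattern
```

Calling this once per timeframe, with a short enough interval between frames, gives the near-real-time behaviour described in the following paragraph.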
[0074] According to some embodiments of the present invention, steps 81-83 are recursively repeated for each 3D data pack transmitted from the 3D data device at each timeframe, where, in case the time interval between consecutive timeframes (transmissions) is small enough, real time or near real time acquisition and/or conversion is achieved.
[0075] In the case of a sensory enhancement system designed for computer/video games, since most parts of the visual scenarios are known, the data transmission and conversion can be very fast, especially for games requiring very low 3D visual resolution for identification of moving and still objects in the environment. Of course, the user's (player's) ability to interpret the stimuli into the visual information they indicate plays a large part in the efficiency of the sensory enhancement.
[0076] In the case of simulating a real environment, the speed of conversion depends both on the technical abilities of the system and on the skills of the user in interpreting the stimuli into visual information.
[0077] Many alterations and modifications may be made by those having ordinary skill in the art without departing from the spirit and scope of the invention. Therefore, it must be understood that the illustrated embodiment has been set forth only for the purposes of example and that it should not be taken as limiting the invention as defined by the following claims and its various embodiments.
[0078] Therefore, it must be understood that the illustrated embodiment has been set forth only for the purposes of example and that it should not be taken as limiting the invention as defined by the following claims. For example, notwithstanding the fact that the elements of a claim are set forth below in a certain combination, it must be expressly understood that the invention includes other combinations of fewer, more or
different elements, which are disclosed above even when not initially claimed in such combinations. A teaching that two elements are combined in a claimed
combination is further to be understood as also allowing for a claimed combination in which the two elements are not combined with each other, but may be used alone or combined in other combinations. The excision of any disclosed element of the invention is explicitly contemplated as within the scope of the invention.
[0079] The words used in this specification to describe the invention and its various embodiments are to be understood not only in the sense of their commonly defined meanings, but to include by special definition in this specification structure, material or acts beyond the scope of the commonly defined meanings. Thus if an element can be understood in the context of this specification as including more than one meaning, then its use in a claim must be understood as being generic to all possible meanings supported by the specification and by the word itself.
[0080] The definitions of the words or elements of the following claims are, therefore, defined in this specification to include not only the combination of elements which are literally set forth, but all equivalent structure, material or acts for performing substantially the same function in substantially the same way to obtain substantially the same result. In this sense it is therefore contemplated that an equivalent substitution of two or more elements may be made for any one of the elements in the claims below or that a single element may be substituted for two or more elements in a claim. Although elements may be described above as acting in certain combinations and even initially claimed as such, it is to be expressly understood that one or more elements from a claimed combination can in some cases be excised from the combination and that the claimed combination may be directed to a sub-combination or variation of a sub-combination.
[0081] Insubstantial changes from the claimed subject matter as viewed by a person with ordinary skill in the art, now known or later devised, are expressly contemplated as being equivalently within the scope of the claims. Therefore, obvious substitutions now or later known to one with ordinary skill in the art are defined to be within the scope of the defined elements.
[0082] The claims are thus to be understood to include what is specifically illustrated and described above, what is conceptually equivalent, what can be obviously substituted and also what essentially incorporates the essential idea of the invention.
Claims
1. A sensory enhancement system for providing non-visual sensory enhancement to a user, said system comprising:
(i) a three-dimensional (3D) data device for producing 3D data representative of a 3D environment;
(ii) at least one stimulation device for applying non-visual sensory stimuli to the user; and
(iii) a control and processing unit for conversion of 3D data into non-visual sensory stimulation, wherein said control and processing unit receives 3D data from said 3D data device, calculates at least one stimulation pattern according to the received 3D data and operates said at least one stimulation device according to said stimulation pattern, wherein said sensory stimulation applied to the user allows the user to perceive at least part of the 3D environment through non-visual sensory stimuli.
2. The sensory enhancement system according to claim 1, wherein said 3D data represents a real environment surrounding the user.
3. The sensory enhancement system according to claim 1, wherein said 3D data device comprises a computer that generates 3D data representative of a virtual 3D environment.
4. The sensory enhancement system according to claim 1, wherein said 3D data device comprises at least one of: a 3D camera; a 3D scanner; a computer module for generating virtual 3D data; or a combination thereof.
5. The sensory enhancement system according to claim 1, wherein said stimulation device is configured for applying at least one of the following stimulation types: tactile stimuli; auditory stimuli; olfactory stimuli.
6. The sensory enhancement system according to claim 1, wherein said stimulation device comprises a multiplicity of stimulation output devices.
7. The sensory enhancement system according to claim 6, wherein said multiplicity of output devices comprise a multiplicity of tactile stimulation output devices each separately controlled by said control and processing unit.
8. The sensory enhancement system according to claim 7, wherein said multiplicity of tactile stimulation output devices comprise electrodes for applying tactile stimuli by applying an electric pulse, said control and processing unit separately controls intensity and operation of each respective said electrode.
9. The sensory enhancement system according to claim 7, wherein said multiplicity of tactile stimulation output devices comprises: electrodes, vibrating devices and/or pressure devices.
10. The sensory enhancement system according to claim 1, wherein said 3D data device and said control and processing unit are portable and designed to be carried by the user.
11. The sensory enhancement system according to claim 1, wherein said 3D data device creates 3D images, each 3D image including multiple arrays of points, each point represented by 3D coordinates in respect to at least one known reference 3D point.
12. The sensory enhancement system according to claim 1, configured for allowing real time or near real time conversion of the respective 3D data into non-visual sensory stimuli.
13. The sensory enhancement system according to claim 1, wherein said stimulation device comprises at least two sections, each section comprising at least one group of output devices, wherein each section and each group is configured for applying non-visual stimuli over a different area of the user's body, wherein pairs of groups located at different sections of said stimulation device are associated by the control and processing unit for allowing associated operation thereof for representing 3D data.
14. The sensory enhancement system according to claim 1, wherein said stimulation pattern defines operational characteristics of non-visual stimulation output devices of said stimulation device, said operational characteristics comprise at least one of: intensity of stimuli applied by each respective output device; sequential order for operating said output devices; timing of operating the respective output devices; duration of stimulation.
15. The sensory enhancement system according to claim 1, wherein said stimulation device comprises a combination of output devices enabling applying different types of stimulation.
16. A sensory enhancement system for non-visual sensory enhancement, said system comprising: a) at least one stimulation device for applying non-visual stimuli to a user; and b) at least one control and processing unit, which receives 3D data and operates said at least one stimulation device according to the received 3D data to allow the respective said user to perceive at least part of a 3D environment associated with said 3D data through non-visual sensory stimuli.
17. A method for non-visual sensory enhancement, said method comprising: a) receiving 3D data from at least one 3D data device, said 3D data being associated with a 3D environment; b) applying non-visual stimulation to a user according to the received 3D data, using at least one stimulation device, for allowing said user to perceive said 3D environment through the non-visual stimulation.
18. The method according to claim 17 further comprising acquiring 3D data and transmitting said acquired 3D data to a control and processing unit for operating said at least one stimulation device according to the received 3D data.
19. The method according to claim 18 further comprising calculating at least one stimulation pattern, according to said received 3D data and the configuration and definitions of said stimulation device, said stimulation pattern comprises a blueprint for operating said stimulation device accordingly, to simulate the 3D data through non-visual sensory stimulation.
20. The method according to claim 17, wherein said 3D environment is a virtual environment, wherein said 3D data is transmitted from a computer game module to a control and processing unit for translating the respective said 3D data into non-visual stimulation and controlling said stimulation device accordingly.
21. The method according to claim 17, wherein said stimulation device comprises a multiplicity of output devices, wherein said stimulation is applied by defining characteristics of the sensory stimulation outputted through the output devices according to the received 3D data, and operating the output devices according to said characteristics.
22. The method according to claim 21, wherein said characteristics comprise at least one of: a) operation mode of each output device; b) intensity of stimulation applied through each output device that is operated; and c) stimulation timing characteristics.
23. The method according to claim 17, wherein said 3D data is represented through 3D points each represented by 3D coordinates indicative of the respective point's distance from at least one known reference 3D point.
24. The method according to claim 23, wherein said reference point is a point located at a vicinity of the user and changes along with location changes of the user.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201161513922P | 2011-08-01 | 2011-08-01 | |
US61/513,922 | 2011-08-01 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2013018090A1 true WO2013018090A1 (en) | 2013-02-07 |
Family
ID=47628696
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/IL2012/050280 WO2013018090A1 (en) | 2011-08-01 | 2012-07-31 | System and method for non-visual sensory enhancement |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2013018090A1 (en) |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6055048A (en) * | 1998-08-07 | 2000-04-25 | The United States Of America As Represented By The United States National Aeronautics And Space Administration | Optical-to-tactile translator |
US20030151519A1 (en) * | 2002-02-14 | 2003-08-14 | Lin Maw Gwo | Guide assembly for helping and guiding blind persons |
GB2409798A (en) * | 2004-01-12 | 2005-07-13 | Graeme Donald Robertson | A garment that provides a tactile in response to a computer signal |
US20070016425A1 (en) * | 2005-07-12 | 2007-01-18 | Koren Ward | Device for providing perception of the physical environment |
- 2012-07-31: PCT application PCT/IL2012/050280 filed, published as WO2013018090A1 (en), active Application Filing
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6055048A (en) * | 1998-08-07 | 2000-04-25 | The United States Of America As Represented By The United States National Aeronautics And Space Administration | Optical-to-tactile translator |
US20030151519A1 (en) * | 2002-02-14 | 2003-08-14 | Lin Maw Gwo | Guide assembly for helping and guiding blind persons |
GB2409798A (en) * | 2004-01-12 | 2005-07-13 | Graeme Donald Robertson | A garment that provides a tactile response to a computer signal |
US20070016425A1 (en) * | 2005-07-12 | 2007-01-18 | Koren Ward | Device for providing perception of the physical environment |
Non-Patent Citations (1)
Title |
---|
S. Meers et al.: "A vision system for providing 3D perception of the environment via transcutaneous electro-neural stimulation", 14 July 2004 (2004-07-14) * |
Cited By (24)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP3195164A4 (en) * | 2014-07-28 | 2018-04-11 | National Ict Australia Pty Ltd | Determination of parameter values for sensory substitution devices |
US10441500B2 (en) | 2014-07-28 | 2019-10-15 | National Ict Australia Limited | Determination of parameter values for sensory substitution devices |
CN107708624B (en) * | 2015-06-12 | 2021-12-14 | 智能眼睛有限公司 | Portable system allowing blind or visually impaired people to understand the surroundings acoustically or by touch |
KR20180018587A (en) * | 2015-06-12 | 2018-02-21 | 아이신쓰, 에스.엘. | Portable system that allows the blind or visually impaired to understand the environment by sound or touch |
CN107708624A (en) * | 2015-06-12 | 2018-02-16 | 智能眼睛有限公司 | Portable system that allows blind or visually impaired people to understand the surrounding environment through sound or touch |
KR102615844B1 (en) | 2015-06-12 | 2023-12-21 | 아이신쓰, 에스.엘. | A portable system that allows blind or visually impaired people to understand their surroundings through sound or touch. |
EP3308759A4 (en) * | 2015-06-12 | 2019-02-27 | Eyesynth, S.L. | Portable system that allows blind or visually impaired persons to interpret the surrounding environment by sound or touch |
RU2719025C2 (en) * | 2015-06-12 | 2020-04-16 | Айсинт, С.Л. | Portable system that allows blind or visually impaired persons to interpret the surrounding environment by sound or touch |
WO2016198721A1 (en) * | 2015-06-12 | 2016-12-15 | Eyesynth, S.L. | Portable system that allows blind or visually impaired persons to interpret the surrounding environment by sound or touch |
US11185445B2 (en) | 2015-06-12 | 2021-11-30 | Eyesynth, S.L. | Portable system that allows blind or visually impaired persons to interpret the surrounding environment by sound and touch |
AU2016275789B2 (en) * | 2015-06-12 | 2021-03-11 | Eyesynth, S.L. | Portable system that allows blind or visually impaired persons to interpret the surrounding environment by sound or touch |
CN105250119A (en) * | 2015-11-16 | 2016-01-20 | 深圳前海达闼云端智能科技有限公司 | Blind guiding method, device and equipment |
GB2554117A (en) * | 2016-07-05 | 2018-03-28 | Pawan Shyam Kaura Lakshya | An alerting system for a visually challenged pedestrian |
FR3060297A1 (en) * | 2016-12-20 | 2018-06-22 | Universite Pierre Et Marie Curie (Paris 6) | ASYNCHRONOUS TOUCH STIMULATION SENSORY SUBSTITUTION SYSTEM |
JP2020501747A (en) * | 2016-12-20 | 2020-01-23 | ソルボンヌ・ユニヴェルシテSorbonne Universite | Sensory substitute system using asynchronous tactile stimulation |
CN110300562A (en) * | 2016-12-20 | 2019-10-01 | 索邦大学 | Use the sense organ alternative system of asynchronous haptic stimulus |
US11654055B2 (en) | 2016-12-20 | 2023-05-23 | Sorbonne Universite | Sensory substitution system using asynchronous tactile stimulation |
WO2018115627A1 (en) * | 2016-12-20 | 2018-06-28 | Universite Pierre Et Marie Curie (Paris 6) | System for sensory substitution by asynchronous tactile stimulation |
CN110869095A (en) * | 2017-06-27 | 2020-03-06 | 森沃克斯有限公司 | Interactive entertainment device |
FR3089785A1 (en) * | 2018-12-17 | 2020-06-19 | Pierre Briand | Medical device to aid in the perception of the environment for blind or visually impaired users |
WO2020128173A1 (en) * | 2018-12-17 | 2020-06-25 | Pierre Briand | Medical device for improving environmental perception for blind or visually impaired users |
US11684517B2 (en) | 2018-12-17 | 2023-06-27 | Pierre Briand | Medical device for improving environmental perception for blind or visually-impaired users |
GB2622184A (en) * | 2022-05-04 | 2024-03-13 | Kp Enview Ltd | Personal assistance systems and methods |
US11928981B2 (en) | 2022-06-22 | 2024-03-12 | Kevin Fan | Tactile vision |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2013018090A1 (en) | System and method for non-visual sensory enhancement | |
EP2482760B1 (en) | Object tracking for artificial vision | |
US9690376B2 (en) | Wireless wrist computing and control device and method for 3D imaging, mapping, networking and interfacing | |
US20140184384A1 (en) | Wearable navigation assistance for the vision-impaired | |
KR101343860B1 (en) | Robot avatar system using hybrid interface and command server, learning server, and sensory server therefor | |
WO2007013833A1 (en) | Method and system for visualising virtual three-dimensional objects | |
TWI496027B (en) | Motion guidance prompting method, system thereof and motion guiding prompting device | |
US20160321955A1 (en) | Wearable navigation assistance for the vision-impaired | |
KR20150092444A (en) | A wireless exoskeleton haptic interface device for simultaneously delivering tactile and joint resistance and the method for comprising the same | |
JP7373800B2 (en) | Calibration system and interpupillary calibration method | |
KR102051946B1 (en) | Apparatus and method for controlling smart wear | |
Kerdegari et al. | Head-mounted sensory augmentation device: Designing a tactile language | |
Hu et al. | Stereopilot: A wearable target location system for blind and visually impaired using spatial audio rendering | |
JP2023501079A (en) | Co-located Pose Estimation in a Shared Artificial Reality Environment | |
CN106358024A (en) | Stroke monitoring system and stroke monitoring method | |
CN108961893A (en) | Virtual reality interactive simulation experience system based on VR equipment |
US20240302908A1 (en) | Virtual, Augmented and Mixed Reality Systems with Physical Feedback | |
KR20220058941A (en) | direction assistance system | |
KR20180034278A (en) | Visual perception training device, method and program for visual perception training using head mounted device | |
CN113035000A (en) | Virtual reality training system for central integrated rehabilitation therapy technology | |
KR102183398B1 (en) | Ocular muscle training method and system | |
Kálmán et al. | Wearable technology to help with visual challenges–two case studies | |
US20200146618A1 (en) | Device with a detection unit for the position and orientation of a first limb of a user | |
Sessner et al. | Multimodal feedback to support the navigation of visually impaired people | |
Velázquez et al. | Usability evaluation of foot-based interfaces for blind travelers |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | EP: the EPO has been informed by WIPO that EP was designated in this application | Ref document number: 12819584; Country of ref document: EP; Kind code of ref document: A1 |
| NENP | Non-entry into the national phase | Ref country code: DE |
| 32PN | EP: public notification in the EP bulletin as the address of the addressee cannot be established | Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 23/09/2014) |
| 122 | EP: PCT application non-entry in the European phase | Ref document number: 12819584; Country of ref document: EP; Kind code of ref document: A1 |