WO2015159237A1 - Wearable sensory substitution system, in particular for blind or visually impaired people - Google Patents

Info

Publication number
WO2015159237A1
WO2015159237A1 (PCT application PCT/IB2015/052749)
Authority
WO
WIPO (PCT)
Prior art keywords
system
estimation
control unit
according
apparatus
Prior art date
Application number
PCT/IB2015/052749
Other languages
French (fr)
Inventor
Monica GORI
Giulio Sandini
David Charles BURR
Antonio MAVIGLIA
Tiziana VERCILLO
Gabriel BAUD-BOVY
Original Assignee
Fondazione Istituto Italiano Di Tecnologia
Priority date
Filing date
Publication date
Priority to ITTO2014A000323 priority Critical
Priority to ITTO20140323 priority
Application filed by Fondazione Istituto Italiano Di Tecnologia filed Critical Fondazione Istituto Italiano Di Tecnologia
Publication of WO2015159237A1 publication Critical patent/WO2015159237A1/en

Classifications

    • A61F9/08: Devices or methods enabling eye-patients to replace direct visual perception by another kind of perception
    • A61H3/061: Walking aids for blind persons with electronic detecting or guiding means
    • A61H2003/063: Walking aids for blind persons with electronic detecting or guiding means with tactile perception
    • A61H2201/1635: Physical interface with patient; hand or arm, e.g. handle
    • A61H2201/165: Wearable interfaces
    • A61H2201/501: Control means computer controlled, connected to external computer devices or networks
    • A61H2201/5048: Audio interfaces, e.g. voice or music controlled
    • A61H2201/5058: Sensors or detectors
    • A61H2201/5084: Acceleration sensors
    • A61H2201/5097: Control means thereof wireless

Abstract

The system comprises at least one module (11; 111) comprising, in turn: - a sensor apparatus (12), suited to be worn by a user and designed to detect data representing the arrangement assumed within the space by the part of the user's body on which said sensor apparatus (12) is worn, - a signaling apparatus (14), for providing a signal that can be perceived by said user, and - a control unit (16; 116), connected to said sensor apparatus (12) and to said signaling apparatus (14) and configured to receive said data from said sensor apparatus (12) and to control said signaling apparatus (14) as a function of said data. The control unit (16; 116) is configured to: - calculate or receive an estimation of kinematic parameters referring to the movement performed by said sensor apparatus (12) supported by said part of the user's body as a function of said data, and - control the emission of said signal by said signaling apparatus (14), when said estimation of kinematic parameters fulfills criteria that are predetermined or can be determined discretionally.

Description

TITLE: "WEARABLE SENSORY SUBSTITUTION SYSTEM, IN PARTICULAR FOR BLIND OR VISUALLY IMPAIRED PEOPLE"

* * *

DESCRIPTION

Technical field

The present invention relates to a wearable sensory substitution system, in particular for blind or visually impaired people.

Technological background

Sensory substitution consists in turning features of sensory stimuli that cannot be completely or effectively perceived by a user into different sensory stimuli that, on the contrary, can be perceived by the same user. In this regard, we can mention, by way of example, an application designed for blind or visually impaired people, in which visual stimuli are translated into stimuli that can be perceived by them, such as vibrations or sounds. In this particular field, numerous sensory substitution systems have been developed.

One of the first sensory substitution devices for blind people (typically shortened to the acronym SSD, for "Sensory Substitution Device") was designed by neuroscientist Paul Bach-y-Rita, who created the system called Tactile Visual Sensory Substitution (TVSS). This system converts the signals coming from a video camera into tactile stimulations applied to the back of the subject.

Subsequently, thanks to technological progress, further technical solutions for sensory substitution devices were developed, which are based on small-sized electric and vibrotactile stimulators, which are located in different parts of the body of a user. Many of these systems are portable and wearable by the user and were developed based on the different needs to be fulfilled and on the different functions to be carried out.

Researchers at the University of British Columbia (Canada) developed two prototypes of vibrotactile systems, which can be worn on the forearm and wrist. These systems are made up of two DC motors arranged 60 mm apart, which are able to generate vibrations at approximately 140 Hz. These tactile devices transmit information by means of intermittent warnings and were initially used during surgeries to warn surgeons of changes in the heart rate of a patient, without distracting their attention with acoustic alarms. Subsequently, besides this initial application, these devices were used to transmit simple warning information to blind subjects, for example when they are getting close to an obstacle.

Researchers at Carnegie Mellon University (USA) designed a system called Kahru Tactile Outdoor Navigator. This system consists of a wearable tactile harness-vest, which provides simple directional navigation instructions. A set of six motors, able to vibrate, generates tactile messages that can be recognized by the user as instructions of the type "forward", "backward", "to the left", "to the right", "accelerate" and "slow down" to guide the user through an environment. Communication with the harness-vest is ensured by an infrared receiver, which is worn at belt height.

Company TNO Human Factors, a research institute located in the Netherlands, developed a tactile system consisting of 128 elements, which are able to vibrate and are applied to a vest. Vibrations, which occur at a frequency of 160 Hz, provide the user with three-dimensional space information. Even though this system was initially designed to communicate information to an aircraft pilot in an intuitive manner, it can be used by blind or visually impaired people to provide them with information on a path to follow.

Researchers at MIT (USA) developed a tactile system integrated in a vest that is fitted to the lower part of the torso of a user. This system is made up of a 4x4 matrix of motors, which are able to vibrate and are controlled independently by an electronic control unit. The electronic control unit receives commands from a remote computer by means of a wireless communication. This system can be used by blind subjects as a navigator for outdoor environments, as, during a series of experiments, researchers were able to prove that the eight vibrotactile stimulation schemes provided can be interpreted as direction instructions (for example "stop", "turn to the left", "walk faster", "walk slower") and as gestural instructions (for example "raise the arm horizontally", "raise the arm vertically") with utmost precision.

The University of Michigan (USA) designed a wearable system called NavBelt, which is able to provide acoustic feedback from a matrix of ultrasound sensors mounted around the abdomen of a user, at belt height. The sensors provide information on the presence of nearby obstacles within a 120° sector located in front of the user.

Researchers at Keio University (Japan) created a system called ActiveBelt, which consists of a tactile device mounted on a belt and able to work as a directional navigator. The ActiveBelt system is made up of a GPS sensor, a geomagnetic sensor, and eight vibrators arranged at regular intervals around the torso. Vibrations, which occur at a frequency ranging from 33 to 77 Hz, are provided by the ActiveBelt system in order to indicate to the user the direction to follow. A series of experiments confirmed that, through the use of the ActiveBelt system, users were able to identify eight direction indications along a path.

The University of Osnabrück (Germany) developed a prototype system integrated in a belt and comprising an electronic compass with thirteen vibrators arranged around the abdomen. This system allows the user to continuously perceive his/her orientation in space through vibrotactile stimulation. The accuracy of navigation and the long-term use of the belt are still under evaluation.

Furthermore, there are also three-dimensional tracking systems for determining the position and orientation of a movable object (US 5,819,206, of October 6, 1998).

More precisely, the present invention relates to a wearable sensory substitution system for blind or visually impaired people according to the preamble of the appended independent claim.

However, systems of the type described above suffer from some drawbacks.

One drawback lies in the fact that, in all the systems mentioned above, users always need to take part in numerous training sessions in order to understand the signals and how to adjust their behavior based on the stimuli provided. This makes it difficult for these systems to be used by children and, especially, newborn babies. Taking into account that the first years of life of a baby are crucial for the creation and development of sensory and motor abilities, a delayed use of these systems leads to results that are less significant than those that could be obtained by using them at a very early stage of growth.

Summary of the invention

An object of the present invention is to provide a system that is able to help people, especially blind and visually impaired people, develop a perception of the surrounding space in a direct and intuitive manner from early childhood, when sensory and motor learning is particularly effective.

According to the present invention, this and other objects are reached by means of a wearable sensory substitution system, in particular for blind or visually impaired people, according to the appended independent claim.

As a matter of fact, the use of the system according to the present invention does not force users to learn new "languages" concerning the sensory stimuli provided by the system; therefore, the rehabilitation of space perception can be obtained in a natural manner from the first months of life.

Furthermore, the use of the system according to the present invention does not force users to focus their attention on stimuli that are different or particular compared to the ones that they are normally able to receive in a passive fashion. In this way, users avoid a "cognitive overload", which, on the other hand, would make the system hard to integrate with the normal connections between action and perception that are developed by a blind or visually impaired person to make up for his/her visual deficit.

Furthermore, in some contexts, different systems according to the present invention can be worn by different people in the same environment, thus increasing the opportunities for relationship and communication between the blind or visually impaired person and the people around him/her (e.g. parents or friends). By so doing, the feeling of "inclusion" in the environment on the part of the blind or visually impaired person is increased, thus improving social interaction.

Finally, the system according to the present invention can be manufactured with low-cost components, thus offering a solution that is simple and cheap.

The appended claims are an integral part of the technical teachings provided in the following detailed description concerning the present invention. In particular, the appended dependent claims define some preferred embodiments of the present invention and describe optional technical features.

Further features and advantages of the present invention will be best understood upon perusal of the following detailed description, which is provided by way of example and is not limiting, with reference, in particular, to the accompanying drawings, which are briefly described below:

Brief description of the drawings

Figures 1, 1a and 1b are views that show different contexts in which different people are provided with a wearable sensory substitution system for blind or visually impaired people according to an explanatory embodiment of the present invention. In these figures the system is shown in a schematic manner.

Figure 2 is a block diagram concerning the functional representation of the system shown in the previous figures.

Figure 3 is a view that shows different contexts in which a person is provided with a wearable sensory substitution system for blind or visually impaired people according to an explanatory embodiment of the present invention.

Figure 4 is a block diagram concerning the functional representation of a possible variant of the system shown in figure 3.

Detailed description of the invention

Figures 1, 1a and 1b show different contexts in which different people are provided with a wearable sensory substitution system, in particular for blind or visually impaired people, according to an explanatory embodiment of the present invention. The aforesaid system is indicated, as a whole, with number 10.

As can be seen, system 10 - advantageously, though not necessarily - comprises a bracelet, which can be worn around the user's wrist, or a band, which can be worn around the user's ankle. This bracelet or band supports sensor apparatus 12 and signaling apparatus 14. Preferably, the bracelet or band also supports control unit 16. However, in other embodiments, control unit 16 could be located in a remote position relative to the bracelet or band.

With reference, in particular, to the block diagram shown in figure 2, system 10 comprises at least one module 11, which comprises, in turn:

- a sensor apparatus 12, which is suited to be worn by a user and is designed to detect data representing the arrangement assumed within the space by the part of the user's body on which sensor apparatus 12 is worn by the user,

- a signaling apparatus 14, which is suited to provide a signal that can be perceived by the user, and

- a control unit 16, which is connected to sensor apparatus 12 and to signaling apparatus 14 and is configured to receive the aforesaid data from sensor apparatus 12 and to control signaling apparatus 14 based on the data.

Said data represent the position and/or the acceleration assumed by said part of the user's body and control unit 16 is configured to:

- calculate or receive an estimation of the acceleration assumed by the part of the user's body as a function of said data, and

- control the emission of said signal by signaling apparatus 14, when the estimation exceeds a signaling threshold value.
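
This threshold logic can be sketched in a few lines of code. The following is a minimal, hypothetical illustration (the function names, the magnitude-based estimation and the threshold value are assumptions for clarity, not part of the claims):

```python
import math

SIGNALING_THRESHOLD = 1.5  # hypothetical signaling threshold, in m/s^2

def acceleration_magnitude(ax, ay, az):
    """Estimate the overall acceleration from the three sensed components."""
    return math.sqrt(ax * ax + ay * ay + az * az)

def control_step(ax, ay, az, emit_signal):
    """One control-unit cycle: estimate the acceleration from the sensed data
    and trigger the signaling apparatus when the estimation exceeds the
    signaling threshold. Returns True when the signal was emitted."""
    estimation = acceleration_magnitude(ax, ay, az)
    if estimation > SIGNALING_THRESHOLD:
        emit_signal()
        return True
    return False
```

The same structure applies whether the estimation is computed locally or received from elsewhere: only the comparison against the (predetermined or customized) threshold gates the signal emission.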

In the embodiment shown, sensor apparatus 12 is provided with a measuring chain, in which the electric parameter, typically a voltage, produced by a transducer (sensitive to the movement, speed or acceleration of sensor apparatus 12) is subjected to filtering, for example by means of a band-pass filter (details not shown).

Therefore, the signal of signaling apparatus 14 is subordinate to the operations carried out by control unit 16, so as to provide a signal that can be perceived by the user only after the signaling threshold, which is predetermined or can be predetermined in a customized manner, has been exceeded, in order to make up for a visual sensory perception deficit. For example, this signal can be a vibration, a sound or, if necessary, even a light signal (in particular, if the user is a visually impaired subject, a light signal can be anyway perceived and used as a reference to make up for a visual deficit), as well as any combination of these types of signals.

As a person skilled in the art can easily understand, system 10 is not exclusively suited to be used by a blind or visually impaired person: it can also be used by other people standing in the same environment as said person. In this way, the subject affected by a visual deficit will be able to perceive in a natural and direct manner the movement of the other people sharing the same space, thus increasing the possibilities of interaction with the people surrounding him/her. In this case, it is convenient for system 10 to be able to (also) emit signals other than vibrations or, in any case, other than stimuli that can only be perceived by the person wearing system 10. These signals, if necessary, can be customized based on the user wearing the system.

By way of example, figure 1a shows a blind or visually impaired baby together with the father. In this case, the father of the blind user can wear a system 10 as well. Advantageously, this system 10 can be configured so as to be able to emit a signal that is different from the one associated with the system worn by the baby (such as a different sound and/or a light of a different color, should the user be a visually impaired baby). By so doing, the baby will be able to feel the difference between the signals referring to his/her own movements and the signals due to the movements of the father, and he/she will do so in a natural manner, without the need to take part in "learning" sessions.

By way of example, again, figure 1b shows a group of systems 10 assigned to some children who share the same space. In this case, each one of the systems worn by the children can be programmed so as to cause the signaling apparatus to emit a different, customized signal (for example, a different sound and/or a different light). By so doing, each child will be able to naturally feel the difference between the signals referring to his/her own movements and the signals of the other children, thus improving mutual interaction and social skills.

In the embodiment shown in figure 2, sensor apparatus 12 comprises a transducer (not shown), which is sensitive to at least one among the movement, the speed and the acceleration of the sensor. Furthermore, sensor apparatus 12 is designed to produce an electric parameter representing at least one among said movement, said speed and said acceleration. This electric parameter is then processed, for example by filtering it (in particular, through a band-pass filter), so that sensor apparatus 12 can produce the aforesaid data representing the arrangement assumed within the space by sensor apparatus 12. In particular, said data can be available in the form of an electric signal, whose voltage (or current) is a function of at least one of the position and the acceleration assumed by sensor apparatus 12.
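
The patent does not specify how the band-pass filter is implemented; a simple discrete sketch, with hypothetical filter coefficients, cascades a first-order high-pass stage (removing slowly varying offsets such as gravity and sensor bias) with a first-order low-pass stage (attenuating high-frequency noise):

```python
class BandPassFilter:
    """First-order high-pass followed by first-order low-pass: a simple
    discrete approximation of the band-pass stage of the measuring chain.
    The coefficients alpha_hp and alpha_lp are illustrative assumptions."""

    def __init__(self, alpha_hp=0.9, alpha_lp=0.3):
        self.alpha_hp = alpha_hp
        self.alpha_lp = alpha_lp
        self.prev_x = 0.0
        self.hp = 0.0
        self.lp = 0.0

    def step(self, x):
        # High-pass: y[n] = a * (y[n-1] + x[n] - x[n-1]) removes the DC offset
        self.hp = self.alpha_hp * (self.hp + x - self.prev_x)
        self.prev_x = x
        # Low-pass: exponential smoothing attenuates high-frequency noise
        self.lp += self.alpha_lp * (self.hp - self.lp)
        return self.lp
```

A constant input (a stationary sensor) is filtered out over time, while sudden changes pass through, which is the behavior the measuring chain needs before the threshold comparison.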

Preferably, sensor apparatus 12 comprises an accelerometer, for example a three-axis, solid-state accelerometer. In particular, the estimation of the acceleration can be obtained by calculating its three components relative to the three space axes. In this case, the emission of the signal by the signaling apparatus can be controlled when at least one of the aforesaid three calculated acceleration components exceeds the associated predeterminable threshold. In this case, the three thresholds can be adjusted independently and have different values, according to criteria that can be predetermined or changed in a customized manner based on the specific needs of the user.
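
The per-axis variant of the threshold test can be sketched as follows (the threshold values and names are hypothetical; the point is that each axis carries its own independently adjustable threshold):

```python
# Hypothetical per-axis thresholds (m/s^2), independently adjustable per user
THRESHOLDS = {"x": 1.0, "y": 1.5, "z": 2.0}

def should_signal(ax, ay, az, thresholds=THRESHOLDS):
    """Signal when at least one acceleration component exceeds the
    threshold associated with its own axis."""
    components = {"x": ax, "y": ay, "z": az}
    return any(abs(components[axis]) > thresholds[axis] for axis in thresholds)
```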

In an advantageous embodiment, sensor apparatus 12 comprises the aforesaid accelerometer, a gyroscope and a magnetometer.

According to an explanatory embodiment, control unit 16 is configured to obtain an estimation of the dynamic acceleration deriving from the measurements of sensor apparatus 12, and to control the emission of said signal by signaling apparatus 14, when the estimation of said dynamic acceleration (or of the components thereof, referring to the three space axes) exceeds the signaling threshold value.
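
One common way to estimate the dynamic acceleration from raw accelerometer readings - presented here only as a hypothetical sketch, since the patent does not prescribe a method - is to track the slowly varying gravity component with a low-pass filter on each axis and take the residual as the dynamic part:

```python
class DynamicAccelerationEstimator:
    """Separate the static (gravity) component from the dynamic component
    by low-pass filtering each axis; the residual is the dynamic
    acceleration. The smoothing coefficient alpha is an assumption."""

    def __init__(self, alpha=0.1):
        self.alpha = alpha
        self.gravity = [0.0, 0.0, 0.0]

    def step(self, accel):
        dynamic = []
        for i, a in enumerate(accel):
            # Exponential moving average tracks the static component
            self.gravity[i] += self.alpha * (a - self.gravity[i])
            # Residual is the dynamic acceleration on this axis
            dynamic.append(a - self.gravity[i])
        return dynamic
```

With the wearer at rest, the estimate converges to zero; a sudden movement produces a dynamic component that can then be compared against the signaling threshold.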

According to a further explanatory embodiment, control unit 16 is configured to obtain an estimation of the static acceleration deriving from the measurements of sensor apparatus 12, and to control the emission of the signal by signaling apparatus 14, when the estimation exceeds the signaling threshold value. In this way, when the variation of the angle of inclination (or the three components of said variation, referring to the three space axes) assumed by sensor apparatus 12 relative to the reference of the gravity acceleration direction exceeds the aforesaid threshold value, control unit 16 controls signaling apparatus 14 accordingly.
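
The inclination estimate from the static acceleration can be sketched as follows, assuming (hypothetically) that the low-pass-filtered accelerometer output is dominated by gravity and that the tilt is measured relative to the z axis:

```python
import math

TILT_THRESHOLD_DEG = 20.0  # hypothetical threshold on tilt variation, degrees

def tilt_angle_deg(ax, ay, az):
    """Angle between the sensed static acceleration vector and the z axis,
    taken as the reference of the gravity acceleration direction."""
    g = math.sqrt(ax * ax + ay * ay + az * az)
    ratio = max(-1.0, min(1.0, az / g))  # clamp against rounding errors
    return math.degrees(math.acos(ratio))

def tilt_variation_exceeds(prev_accel, curr_accel, threshold=TILT_THRESHOLD_DEG):
    """True when the variation of the inclination angle between two static
    acceleration readings exceeds the threshold."""
    return abs(tilt_angle_deg(*curr_accel) - tilt_angle_deg(*prev_accel)) > threshold
```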

In the embodiment shown in figure 2, control unit 16 comprises a processing unit 16a and a memory 16b.

Preferably, processing unit 16a is designed to receive the data provided by sensor apparatus 12 and to calculate, based on said data, the aforesaid estimation of the acceleration assumed by the part of the user's body on which system 10 is worn. Furthermore, processing unit 16a is designed to control signaling apparatus 14 based on the estimation calculated.

In the embodiment shown in figure 2, processing unit 16a comprises a low-power, high-speed microcontroller unit. In particular, this microcontroller unit is designed to remain in stand-by when it is inactive, so as to reduce the energy consumption of system 10.

Memory 16b can be integrated in processing unit 16a or it can be remote relative to the rest of control unit 16 - and, therefore, relative to processing unit 16a. If memory 16b is remote, it can belong to an external source provided with a data connection with processing unit 16a. For example, said data connection can be a wireless connection.

Preferably, memory 16b contains information suited to be used by processing unit 16a, in particular in order to calculate the aforesaid estimation of the acceleration and/or to control signaling apparatus 14. In particular, memory 16b contains information concerning at least one digital coding of the signal to be emitted by signaling apparatus 14.

For example, one can consider a configuration in which signaling apparatus 14 is configured to emit sounds. In this configuration, control unit 16 can be configured to carry out at least one of the following operations:

- synthesizing sounds that are then supplied to the signaling apparatus, according to criteria that are predetermined or can be programmed based on the needs;

- reproducing sounds that are stored, for example, in memory 16b.

In an embodiment of the present invention, memory 16b can contain at least one piece of audio information (in particular, at least one audio file), adapted to be transmitted to signaling apparatus 14, so that the sound coded with this piece of information can be reproduced, when processing unit 16a has calculated the estimation of the acceleration and has evaluated that the aforesaid estimation has exceeded the threshold value provided. This piece of audio information can correspond to a synthesized or recorded sound.

In particular, signaling apparatus 14 is configured to reproduce, select or modulate sounds (if necessary, based on the piece of audio information stored in memory 16b) and, in doing so, it is controlled by control unit 16 according to the movement detected by sensor apparatus 12.

If the piece of audio information corresponds to a synthesized sound, control unit 16 can be configured to arithmetically process a sequence of samples of acoustic waves (for example, contained in memory 16b) in real time. The arithmetic processing can be obtained, for example, through the use of sinusoidal functions or random noise generators or, if necessary, by means of a combination thereof. In particular, the parameters of the sinusoidal functions and/or of the random noise generators are adjusted based on the detections of sensor apparatus 12. If the piece of audio information corresponds to a recorded sound, control unit 16 can be configured - for example - to adjust the volume or to apply filters to the stored piece of audio information, regardless of the detections obtained by sensor apparatus 12, thus controlling signaling apparatus 14.
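
A minimal sketch of such a synthesis step is shown below. The mapping from acceleration to pitch and level, the sample rate and the noise mix are all hypothetical choices made for illustration; the patent only requires that the sinusoid and noise parameters track the sensor detections:

```python
import math
import random

SAMPLE_RATE = 8000  # hypothetical sample rate, in Hz

def synthesize_block(acceleration, n_samples=80, base_freq=440.0, noise_mix=0.1):
    """Synthesize one block of audio samples mixing a sinusoid and random
    noise, with pitch and amplitude driven by the estimated acceleration
    (an assumed mapping, for illustration only)."""
    freq = base_freq * (1.0 + acceleration)   # pitch rises with movement
    amplitude = min(1.0, abs(acceleration))   # louder for stronger movement
    rng = random.Random(0)                    # deterministic noise source
    samples = []
    for n in range(n_samples):
        tone = math.sin(2 * math.pi * freq * n / SAMPLE_RATE)
        noise = rng.uniform(-1.0, 1.0)
        samples.append(amplitude * ((1 - noise_mix) * tone + noise_mix * noise))
    return samples
```

With no movement the block is silent; stronger movement raises both the pitch and the level, so the sound perceived by the user directly reflects the motion detected by sensor apparatus 12.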

In the embodiment shown in figure 2, signaling apparatus 14 comprises a plurality of components, each suited to provide a different signal to the user. In particular, signaling apparatus 14 comprises a sound emitting device 18, a tactile stimulator (e.g. a vibrator 20) and a light emitting device 22, which are controlled by control unit 16 based on the data received from sensor apparatus 12.

In this embodiment, control unit 16 can control a coordinated activation (for example, a simultaneous or - alternatively - a sequential activation) of sound emitting device 18, of the tactile stimulator (e.g. vibrator 20) and of light emitting device 22, when the estimation of the acceleration calculated by control unit 16 exceeds the threshold value. Preferably, control unit 16 can be configured so as to select an operating mode of system 10, in which at least one among sound emitting device 18, the tactile stimulator (e.g. vibrator 20) and light emitting device 22 is activated, when the aforesaid estimation exceeds the threshold value.
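
The operating-mode selection can be sketched as a configurable, ordered subset of signaling components (names and structure are hypothetical illustrations, not claim language):

```python
def activate(estimation, threshold, mode, devices):
    """Activate the configured subset of signaling components, in order,
    when the estimation exceeds the threshold. 'mode' is an ordered list of
    component names, e.g. ["sound", "light"]; 'devices' maps each name to
    a callable driving that component. Returns the names activated."""
    if estimation <= threshold:
        return []
    activated = []
    for name in mode:
        devices[name]()
        activated.append(name)
    return activated
```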

In particular, as already mentioned above with reference to figures 1, la and lb, control unit 16 can be configured so as to cause signaling apparatus 14 to emit a perceivable signal that is customized according to the preferences and the needs of the user.

For example, control unit 16 can be connected to an external computer, especially a personal computer or a portable device, such as a smartphone or a tablet (details without numerical references), which is able to configure the customized perceivable signal (for example, the color - or even the blinking frequency - of the light emitted by light emitting device 22 and/or the sound emitted by sound emitting device 18). In particular, based on the needs of the user, one can configure not only the type of signal to be emitted, but also which components are to be activated among sound emitting device 18, the tactile stimulator (e.g. vibrator 20) and light emitting device 22 (for example, by setting an activation sequence for these components or by ordering the activation of only some of these elements), after the activation threshold has been exceeded.

Preferably, the communication with the external computer can take place through a communication interface 24.

By way of example, the communication interface can be bidirectional, wired or wireless. According to an embodiment of the invention, communication interface 24 can involve the use of a serial communication mode, for example compliant with the RS-232 or RS-485 standard or the like. Furthermore, the communication channel can be multi-drop for a chain connection, on the same line, to a personal computer.

Clearly, in simpler versions (which are not shown) of system 10 one can find a signaling apparatus 14 provided with at least one among sound emitting device 18, the tactile stimulator (e.g. vibrator 20) and light emitting device 22. In particular, signaling apparatus 14 can be provided with the sole sound emitting device 18.

Preferably, sound emitting device 18 comprises a loudspeaker, which, if necessary, is provided with an audio amplifier. In particular, the loudspeaker is able to reproduce a sound corresponding to a piece of audio information contained in memory 16b, according to criteria established by processing unit 16a based on the data provided by sensor apparatus 12. Furthermore, in a preferred embodiment, the volume at which the loudspeaker - if necessary combined with the audio amplifier - reproduces the sound can be adjusted, for example during the configuration of system 10 through an external computer (not shown), such as a personal computer.

In particular, the piece of audio information stored in memory 16b can include a coding in digital form concerning a sound emission, at audible frequencies, of at least one of the following types: pink noise, white noise, melodies and buzzers. Preferably, memory 16b can include a plurality of pieces of audio information and control unit 16 can be configured so as to select at least one desired piece of audio information according to the preferences and the needs of the user. In particular, should there be provided the use of a plurality of systems 10, each one worn by a different person, each control unit 16 can be programmed so that the piece of audio information selected, adapted to be reproduced by sound emitting device 18, is different for each one of the users. In this way the interaction among the different users wearing system 10 becomes easier, thus simplifying mutual recognition, thanks to the setting of a differentiation in the perceivable signals adapted to be emitted by each system worn by a different user.

According to a preferred embodiment, control unit 16 can be designed to control signaling apparatus 14 in such a way that it can emit a perceivable signal with features that vary as a function of the representing data provided by sensor apparatus 12. In particular, the features of the perceivable signal can vary as a function of at least one among:

- the estimation of the acceleration calculated by control unit 16 as a function of said representing data,

- the position assumed by system 10 within the space (which can be deduced from the representing data detected by sensor apparatus 12), and

- the path followed by system 10 within the space (which can be deduced from the representing data detected by sensor apparatus 12).

Let's consider, for example, the case in which processing unit 16a is suited to cause sound emitting device 18 to emit an acoustic wave having features or parameters (especially, in terms of at least one parameter among waveform, amplitude and frequency) that vary as a function of the representing data provided by sensor apparatus 12. Preferably, processing unit 16a can be designed to cause sound emitting device 18 to emit an acoustic wave having a predetermined waveform (if necessary, associated with a digital coding contained in memory 16b), but with at least one between the frequency and the amplitude that varies as a function of the calculated acceleration. In particular, by mere way of example, given instantaneous acceleration estimations with a greater absolute value, processing unit 16a can be designed to cause sound emitting device 18 to emit an acoustic wave having a greater frequency; in other words, processing unit 16a can be configured to cause sound emitting device 18 to emit an acoustic wave having a frequency that increases as the estimation of the calculated acceleration increases.
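
A minimal sketch of such an acceleration-to-frequency mapping is given below; the patent only requires the frequency to grow with the acceleration estimate, so the linear curve, the frequency range and the saturation point are illustrative choices:

```python
def tone_frequency(accel_magnitude, f_min=220.0, f_max=880.0, a_max=10.0):
    """Map |acceleration| to a tone frequency in Hz that increases monotonically.

    A linear map clipped at a_max (m/s^2); any monotonically increasing
    curve would satisfy the behavior described in the text.
    """
    a = min(abs(accel_magnitude), a_max)
    return f_min + (f_max - f_min) * a / a_max
```

The amplitude could be modulated with an analogous function, since the text allows either the frequency or the amplitude (or both) to track the acceleration.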

Preferably, vibrator 20 comprises a vibromotor. In particular, the vibromotor is a low-voltage, low-consumption vibromotor, for example not larger than 190 mm³.

Let's consider, furthermore, the case in which signaling apparatus 14 comprises only - or at least - light emitting device 22. Similarly to what has been described for sound emitting device 18, control unit 16 can be configured so as to cause light emitting device 22 to emit a desired piece of light information (for example, a selectable color) according to the preferences and the needs of the user. In particular, should there be provided the use of a plurality of systems 10, each one worn by a different person, each control unit 16 can be programmed so that the piece of light information selected and emitted is different for each one of the users (for example, by associating each user with a light emission of a different color). In this way the interaction among the different users wearing system 10 becomes easier, thus simplifying mutual recognition, thanks to the setting of a differentiation in the perceivable signals that are emitted by each system worn by a different user.

Preferably, light emitting device 22 comprises an LED. In particular, the LED is a high-efficiency, low-consumption LED. Furthermore, the LED is conveniently provided with a suitable optical diffuser.

According to an advantageous embodiment, should there be provided a plurality of systems, each one associated with a different person, light emitting device 22 comprises a plurality of LEDs. Each one of the aforesaid LEDs is able to emit a light with a different color and can be activated by control unit 16 in a customized manner according to the needs of the user (for example, by associating a different color with each one of the users sharing the same environment).

Alternatively, control unit 16 can be located in a remote position relative to sensor apparatus 12 and/or to signaling apparatus 14. In particular, control unit 16 is a personal computer or a portable device (such as a mobile phone, for example a smartphone or a tablet), which can be connected, for example through communication interface 24 described above, to sensor apparatus 12 and to signaling apparatus 14. In this case, control unit 16 and communication interface 24 can be conveniently connected to one another through a wireless connection.

In the embodiment shown in figure 2, module 11 comprises, furthermore, a power supply device 26 to supply power to the different components of system 10 described above. In particular, power supply device 26 is a battery power supply device. Alternatively, power supply device 26 can be a cable power supply device.

In the embodiment shown, the architecture of system 10 involves the transmission of data, in particular among sensor apparatus 12, signaling apparatus 14 and control unit 16, through a serial bus communication channel.

As a person skilled in the art can understand, there can be different ways to use the information provided by sensor apparatus 12.

For example, as partially mentioned before, the kinematic information obtained by sensor apparatus 12 can be used by control unit 16 to start, modulate or stop the signal emitted by signaling apparatus 14. In particular, should signaling apparatus 14 be designed to emit sounds, the control unit can be suited to establish at least one among the following parameters: the beginning, the end, the modulation and the selection of the sound emission to be transmitted by signaling apparatus 14.

For example, the magnitude or module of the acceleration vector obtained by sensor apparatus 12 (in particular by the accelerometer or by the gyroscope) can be used to assess the beginning of the movement of system 10 and, hence, to start the process that leads to the emission of the signal (in particular, of the sound). In particular, the signaling threshold value mentioned above can be adjusted in a controlled manner by the user or can be adjusted according to other detections obtained by sensor apparatus 12 (if necessary, even not closely linked to the detection of the movement).

In an advantageous embodiment, the way in which to process the kinematic movement information of system 10 is determined by the use of a sensor apparatus 12 that is simultaneously provided with an accelerometer, a gyroscope and a magnetometer. In this configuration of sensor apparatus 12, control unit 16 is suited to obtain a piece of six-dimensional vector information including linear acceleration data (coming from the accelerometer), rotational speed data (coming from the gyroscope) and inclination data (coming from the magnetometer). Said piece of vector information describes the instantaneous motion of system 10 within the space.
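
As a non-limiting sketch, one fused reading of the three sensors mentioned above could be represented as follows; the container name, the field layout and the units are illustrative assumptions, since the patent does not prescribe a data format:

```python
from dataclasses import dataclass

@dataclass
class MotionSample:
    """One fused reading of the sensor apparatus: linear acceleration from
    the accelerometer, rotational speed from the gyroscope, inclination
    from the magnetometer (layout and units are assumptions)."""
    linear_acceleration: tuple  # (ax, ay, az) in m/s^2
    rotational_speed: tuple     # (wx, wy, wz) in rad/s
    inclination: tuple          # orientation angles derived from the magnetometer

def magnitude(v):
    """Euclidean magnitude (the 'module' the text compares to thresholds)."""
    return sum(c * c for c in v) ** 0.5
```

The `magnitude` helper corresponds to the "magnitude or module" that the control logics below compare against the signaling and interruption thresholds.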

Preferably, the aforesaid piece of vector information (or a part thereof, or a derivative processed starting from said piece of information - such as tangential and rotational speeds, and linear and angular movements) can be used to control signaling apparatus 14 by means of different logics and criteria, some examples and options of which are given below; they should be understood as non-limiting of the scope of protection of the present invention. Said examples and options should not be considered as mutually exclusive: they can be used singly or in combination with one another (or a part of them).

Signal activation and deactivation control: this is a simple implementation in which the sound (or, more generically, any acoustic, visual or tactile signal) is produced when the magnitude or module of the acceleration data or of the rotational speed data exceeds the signaling threshold value, whereas said sound is interrupted when the aforesaid magnitude or module is below an interruption threshold value (which can be identical to or different from the signaling threshold value).
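
A minimal sketch of this activation/deactivation control, under the assumption of distinct thresholds, could look as follows (the class name is illustrative; choosing an activation threshold above the interruption threshold adds hysteresis, which avoids rapid on/off chatter near a single threshold):

```python
class SignalGate:
    """Start/stop logic with separate signaling and interruption thresholds,
    which the text allows to be identical or different."""

    def __init__(self, on_threshold, off_threshold):
        self.on_threshold = on_threshold    # signaling threshold value
        self.off_threshold = off_threshold  # interruption threshold value
        self.active = False

    def update(self, magnitude):
        """Feed one magnitude/module sample; return whether the signal is on."""
        if not self.active and magnitude > self.on_threshold:
            self.active = True
        elif self.active and magnitude < self.off_threshold:
            self.active = False
        return self.active
```

With `on_threshold == off_threshold` this degenerates to the single-threshold case also contemplated by the text.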

Signal selection: a different sound (or, more generically, any acoustic, visual or tactile signal) is produced when control unit 16 detects, from the aforesaid piece of vector information, that system 10 has been moved according to specific movements or gestures. For example, a specific type of sound can be produced when system 10 is moved in a direction, and a different type of sound can be produced when control unit 16 detects, from the aforesaid piece of vector information, that system 10 has been moved in a different direction.

On-line signal modulation: the different parameters of a synthesized sound (or, more generically, of any acoustic, visual or tactile signal) are controlled by control unit 16 as a function of the aforesaid piece of vector information or of a part thereof. For example, these parameters can be the frequency, the pitch, the beginning or the end, or the relative phases of different harmonics. In particular, control unit 16 can control the emission of a respective specific sound when the aforesaid piece of vector information (or part thereof) corresponds to a movement or a rotation in a respective specific direction, in which the volume of the sound can depend on the speed of the movement associated with the aforesaid piece of vector information (for example, the quicker the detected movement is, the higher the volume).
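
The speed-to-volume dependence mentioned in this modulation logic can be sketched as below; the reference speed and the saturation are illustrative choices, since the text only requires the volume to grow with the detected movement speed:

```python
def sound_volume(speed, v_ref=2.0, max_volume=1.0):
    """Volume grows with movement speed ('the quicker the movement, the
    higher the volume'), saturating at max_volume once |speed| reaches
    v_ref (an assumed reference speed, e.g. in m/s)."""
    return max_volume * min(abs(speed) / v_ref, 1.0)
```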

Besides these examples, there can be further additional solutions that can be used in the logics and processes carried out by control unit 16. For example, the sound can be influenced by different sensors that sensor apparatus 12 can optionally be provided with.

In particular, the volume of the acoustic signal can be adjusted as a function of the noise in the environment, which is measured by a microphone, with which sensor apparatus 12 can be provided.
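
A simple sketch of such an ambient-noise compensation is shown below; the quiet baseline and the gain per decibel are illustrative assumptions, as the text does not specify the adjustment law:

```python
def adjusted_volume(base_volume, ambient_db, quiet_db=40.0, gain_per_db=0.02):
    """Raise the output volume as the microphone's ambient level (dB SPL,
    assumed) rises above a quiet baseline, clipping at full volume."""
    boost = max(ambient_db - quiet_db, 0.0) * gain_per_db
    return min(base_volume + boost, 1.0)
```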

Furthermore, the sound can also be influenced as a function of information or detections received from devices that are external to sensor apparatus 12, for example connected to it directly or through a central device (for example, a smartphone, a tablet or a personal computer), to which system 10 can be connected, even with a wireless connection. For example, system 10 can be programmed to start emitting sounds when another system is close to system 10 or when said other system moves in a certain manner. The movement information resulting from a plurality of systems 10 located in the same environment can be used to select which systems emit sounds, so as to be able to cause systems 10 to interact with one another in a controlled manner, if necessary according to the principles of selection, modulation, etc. described above. The control of this particular feature can be carried out by means of radio signals or by means of other suitable sensors, for example through the central device.

With reference, in particular, to figures 3 and 4, number 110 indicates a system according to a further explanatory embodiment of the present invention.

System 110 is substantially similar to system 10 shown in figures 1 and 2 and, therefore, for the preferred technical features and the optional manufacturing details of this embodiment reference is made to the ones shown and described above. On the other hand, below we will describe some additional features that distinguish system 110 from the system indicated with number 10 in figures 1 and 2.

With reference, in particular, to figures 3 and 4, system 110 comprises a plurality of modules 111, each suited to perform a sensory substitution in a preferably independent manner. In the embodiment shown, system 110 comprises a bracelet, which is designed to be worn by the user and in which modules 111 are arranged one next to the other.

According to a possible variant of the explanatory embodiment (not shown in the figures), each one of modules 111 is substantially manufactured according to what we showed and described for module 11 concerning the embodiment shown in figures 1 and 2. In particular, each one of modules 111 is able to operate in an autonomous manner, as it is provided with a respective sensor apparatus, a respective control unit and a respective signaling apparatus (without numerical references), which interact with one another as described above for module 11 concerning the embodiment shown in figures 1 and 2.

On the other hand, in the embodiment shown in figure 4, system 110 comprises one (single) control unit 116, which is shared by a series of modules 111, preferably by all modules 111. This control unit 116 is able to receive data from the sensor apparatuses of each module 111 and to control the signaling apparatuses of each module 111 as a function of said data.

For example, in this case, control unit 116 can comprise a personal computer, which is remotely connected to the sensor apparatuses and to the signaling apparatuses of each module 111, so as to interact with them through suitable software. By way of example, in order to ensure a suitable acquisition and reaction speed of the system, the piece of information is transferred from the personal computer to the signaling apparatus in a time frame not greater than 120 µs, and the command coming from the personal computer is executed not later than 20 µs after it has been received.

In particular, in the variant with a control unit 116 shared by a series of modules 111, in order to connect said modules 111 to the shared control unit 116, a serial communication channel is preferably adopted. In this way, all sensor apparatuses and all signaling apparatuses are able to communicate with control unit 116 through the aforesaid communication channel.

The technical solution of sharing a single control unit 116, adapted to control all apparatuses (sensors and signaling apparatuses) of all modules, has the following advantages:

- possibility to decide when to turn on and off the different devices in a smart manner, so that they interfere/do not interfere with one another;

- possibility to activate the tactile and/or visual component based on the needs; and

- possibility to adjust the features of the sound signal (for example intensity or frequency) according to the needs.

Naturally, the principle of the present invention being set forth, embodiments and implementation details can be widely changed relative to what described above and shown in the drawings as a mere way of non-limiting example, without in this way going beyond the scope of protection provided by the accompanying claims.

Claims

1. Wearable sensory substitution system (10; 110), in particular for blind or visually impaired people; said system comprising at least one module (11; 111) comprising: - a sensor apparatus (12), for being worn by a user and designed to detect data representing the arrangement assumed within the space by the part of the user's body on which said sensor apparatus (12) is worn by the user,
- a signaling apparatus (14), for providing a signal that can be perceived by said user, and
- a control unit (16; 116), connected to said sensor apparatus (12) and to said signaling apparatus (14) and configured to receive said data from said sensor apparatus (12) and to control said signaling apparatus (14) as a function of said data;
said system being characterized in that said control unit (16; 116) is configured to:
- calculate or receive an estimation of kinematic parameters referring to the movement performed by said sensor apparatus (12) supported by said part of the user's body as a function of said data, and
- control the emission of said signal by said signaling apparatus (14), when said estimation of said kinematic parameters fulfills criteria that are predetermined or can be determined discretionally.
2. System (10; 110) according to claim 1, wherein the estimation of said kinematic parameters comprises at least one among the estimation of the acceleration assumed by said part of the body, the estimation of the rotational speed assumed by said part of the body, and the estimation of the inclination assumed by said part of the body.
3. System (10; 110) according to any of the previous claims, wherein the control unit (16; 116) is configured so as to cause the signaling apparatus (14) to emit a perceivable signal that is customized according to the preferences and the needs of the user.
4. System according to claim 3, wherein said control unit (16) is designed to control said signaling apparatus (14) so that said signaling apparatus (14) emits a perceivable signal with variable features or parameters as a function of said estimation of said kinematic parameters referring to said sensor apparatus (12) .
5. System according to any of the previous claims, wherein said control unit (16) is configured to control the emission of said signal, when the magnitude or module of at least one between said estimation of the acceleration and said estimation of the rotational speed exceeds a signaling threshold value that is predetermined or can be determined discretionally.
6. System according to any of the previous claims, wherein said control unit (16) is configured to control the emission of said signal, when the magnitude or module of at least one between said estimation of the acceleration and said estimation of the rotational speed is below a deactivation threshold value that is predetermined or can be determined discretionally.
7. System according to claim 5 and 6, wherein said signaling threshold value is substantially identical to said deactivation threshold value.
8. System according to any of the previous claims, wherein said control unit (16) is configured to control the emission of a selected or adjusted specific signal among a plurality of different signals as a function of said estimation of kinematic parameters.
9. System according to claim 8, wherein said control unit (16) is configured to store a plurality of movement or gesture profiles that can be carried out by the user, each one of said profiles corresponding to a respective estimation of kinematic parameters that is predetermined or can be determined discretionally and to a respective specific signal; said specific signal being selected or adjusted when said estimation of kinematic parameters detected by said sensor apparatus (12) corresponds to the one associated with a respective profile.
10. System (10; 110) according to any of the previous claims, wherein said sensor apparatus (12) comprises at least one among an accelerometer, a gyroscope and a magnetometer.
11. System (10; 110) according to any of the previous claims, wherein said signaling apparatus (14) comprises at least one device chosen among a sound emitting device (18), a tactile stimulator (20), and a light emitting device (22), which are controlled by said control unit (16; 116) as a function of the data received from said sensor apparatus (12).
12. System (10; 110) according to any of the previous claims, wherein said control unit (16; 116) comprises a processing unit (16a) and a memory (16b) containing information to be used by said processing unit (16a) in order to make the calculation of said estimation of the acceleration and/or to control said signaling apparatus (14).
13. System (10; 110) according to claim 12, wherein said memory (16b) contains information concerning at least one digital coding of the signal to be emitted by said signaling apparatus (14).
14. System (10; 110) according to any of the previous claims, wherein said at least one module (11; 111) comprises a communication interface (24), for connecting said at least one module (11; 111) with an external computer.
15. System (10; 110) according to any of the previous claims, wherein said module (111) comprises a power supply device (26), for supplying power to at least one apparatus among said sensor apparatus (12), said signaling apparatus (14), and said control unit (16; 116).
16. System (110) according to claim 14 or 15 comprising a plurality of said modules (111), which are interconnected to one another.
17. System (110) according to claim 16, wherein said control unit (116) is shared by a group of said modules (111) and is suited to control the sensor apparatuses and the signaling apparatuses of said group of modules (111).
18. System (10; 110) according to any of the previous claims, wherein said system comprises a bracelet or a wristband that can be worn by said user and supports said sensor apparatus (12) and said signaling apparatus (14) .
PCT/IB2015/052749 2014-04-16 2015-04-15 Wearable sensory substitution system, in particular for blind or visually impaired people WO2015159237A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
ITTO2014A000323 2014-04-16
ITTO20140323 2014-04-16

Publications (1)

Publication Number Publication Date
WO2015159237A1 true WO2015159237A1 (en) 2015-10-22

Family

ID=50983026

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2015/052749 WO2015159237A1 (en) 2014-04-16 2015-04-15 Wearable sensory substitution system, in particular for blind or visually impaired people

Country Status (1)

Country Link
WO (1) WO2015159237A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5819206A (en) 1994-01-21 1998-10-06 Crossbow Technology, Inc. Method and apparatus for determining position and orientation of a moveable object using accelerometers
GB2487672A (en) * 2011-01-31 2012-08-01 Univ Sheffield Active sensory augmentation device
US20120221177A1 (en) * 2010-12-10 2012-08-30 Foundation Of Soongsil University-Industry Cooperation Method of controlling navigation of robot using electromyography sensor and acceleration sensor and apparatus therefor
WO2014066516A1 (en) * 2012-10-23 2014-05-01 New York University Somatosensory feedback wearable object


Similar Documents

Publication Publication Date Title
KR101909361B1 (en) Smart wearable devices and methods with attention level and workload sensing
US8228202B2 (en) Transmitting information to a user's body
US10248210B2 (en) Systems and methods for haptically-enabled conformed and multifaceted displays
JP6478461B2 (en) Mobile device with an intuitive alert
EP1656880A1 (en) Image display system, image display device, image display method
US9873200B2 (en) Personal robot
Tsukada et al. Activebelt: Belt-type wearable tactile display for directional navigation
US20170242497A1 (en) Peripheral Vision Head-Mounted Display for Imparting Information to a User Without Distraction and Associated Methods
US20160198319A1 (en) Method and system for communicatively coupling a wearable computer with one or more non-wearable computers
US20100286571A1 (en) System and Method for Providing Body Sway Feedback to a Body of a Subject
US20010029319A1 (en) System and method of monitoring and modifying human activity-based behavior
EP2661663B1 (en) Method and apparatus for tracking orientation of a user
US20180314339A1 (en) Wearable glasses and method of providing content using the same
Gemperle et al. Design of a wearable tactile display
Stefanov et al. The smart house for older persons and persons with physical disabilities: structure, technology arrangements, and perspectives
US20130218456A1 (en) Wearable tactile navigation system
US20140267076A1 (en) Systems and Methods for Parameter Modification of Haptic Effects
US20150145653A1 (en) Device control using a wearable device
US20170013360A1 (en) Multifunctional earphone system for sports activities
AU2004318969A1 (en) Ear associated machine-human interface
US8630633B1 (en) Adaptive, portable, multi-sensory aid for the disabled
US8803682B2 (en) Sleep-posture sensing and monitoring system
CN106464995A (en) Stand-alone multifunctional headphones for sports activities
EP1625841A4 (en) Device and method of applying skin sensory stimulation
US9013264B2 (en) Multipurpose controller for electronic devices, facial expressions management and drowsiness detection

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15721335

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase in:

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15721335

Country of ref document: EP

Kind code of ref document: A1