SE1650649A1 - A system for translating visual movements into tactile stimuli. - Google Patents

A system for translating visual movements into tactile stimuli.

Info

Publication number
SE1650649A1
Authority
SE
Sweden
Prior art keywords
light emitting
emitting device
camera
images
receptor
Prior art date
Application number
SE1650649A
Other languages
Swedish (sv)
Other versions
SE540186C2 (en)
Inventor
Andersson Runo
Original Assignee
Andersson Runo
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Andersson Runo filed Critical Andersson Runo
Priority to SE1650649A priority Critical patent/SE540186C2/en
Priority to PCT/SE2017/050493 priority patent/WO2017196251A1/en
Publication of SE1650649A1 publication Critical patent/SE1650649A1/en
Publication of SE540186C2 publication Critical patent/SE540186C2/en

Classifications

    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B21/00 Teaching, or communicating with, the blind, deaf or mute
    • G09B21/001 Teaching or communicating with blind persons
    • G09B21/003 Teaching or communicating with blind persons using tactile presentation of the information, e.g. Braille displays
    • G09B21/004 Details of particular tactile cells, e.g. electro-mechanical or mechanical layout
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/016 Input arrangements with force or tactile feedback as computer generated output to the user
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B21/00 Teaching, or communicating with, the blind, deaf or mute
    • G09B21/001 Teaching or communicating with blind persons
    • G09B21/003 Teaching or communicating with blind persons using tactile presentation of the information, e.g. Braille displays

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Health & Medical Sciences (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • General Health & Medical Sciences (AREA)
  • Business, Economics & Management (AREA)
  • Educational Administration (AREA)
  • Educational Technology (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present invention relates to a system and method for translating visual movements of an object into tactile stimulations of a user. The system comprises a light emitting device (1) designed to be attached to the object and configured to emit light at a wavelength larger than 700 nm, an IR camera (2) adapted to capture a series of images of the light emitting device, and an image processing unit (4) adapted to receive the series of images from the camera, to determine two dimensional positions of the light emitting device based on the received images and to generate one or more signals representing the determined two dimensional positions of the light emitting device based on the received images. The system further comprises a receptor stimulating device (5) comprising a two dimensional matrix (16) having a plurality of elements (6) adapted to, upon activation, stimulate at least one tactile receptor of a part of the user, and the receptor stimulating device is adapted to receive the signals from the image processing unit, and to activate an element of the matrix (16) corresponding to the received two dimensional position of the light emitting device. (Figure 2)

Description

A system for translating visual movements into tactile stimulations.
Field of the invention

The present invention relates to a system for translating visual movements of an object into tactile stimulations of a user, comprising an IR camera adapted to capture a series of images, an image processing unit adapted to receive the series of images from the camera and to generate one or more signals based on the received images, and a receptor stimulating device comprising a two dimensional matrix having a plurality of elements adapted to, upon activation, stimulate at least one tactile receptor of a part of the user, and the receptor stimulating device is adapted to receive the signals from the image processing unit, and to activate one or more of the elements based on the received signals.
Background of the invention

Persons that are blind or have impaired vision are able to read and visualize their surroundings using a detailed sense of touch. For reading, the Braille display is used, whereby patterns of raised dots pressed on paper represent Braille characters. Tactile receptors under the skin of the fingers allow the user to feel the different characters. Many computer based systems have been developed, whereby objects are translated into signals that can be received by tactile receptors.
For example, DE 197 11 125 and DE 42 41 937 disclose systems that allow a blind person to experience the intensity of colors. A grid pattern of elements is used to send signals to the skin of a person. These elements may be pins moving from and to the skin surface, or electrical elements sending a current to the skin. Colored images from a camera are processed in a computer device to produce signals for an element, whereby the signals have different intensities.
GB 2523355 discloses a system to aid a blind person in learning musical notation. A camera records musical notes on paper and translates these notes to elements positioned under a sole of a foot. The elements are transducers positioned on specific spots under the foot, such that each note has its own position under the foot. Although this system works well for learning musical notes, it cannot be used for quick reading of musical notes while playing a song.
US 8,154,392 discloses a system comprising a touch inducer pad that is in contact with a sole of a foot. The inducer pad has a grid pattern of electro vibrating inducers as elements that are wirelessly connected using a low radiofrequency for transmission. The system is adapted for armed forces in combat, whereby each individual soldier has a gate code wearer's address selector with its own frequency. The system is adapted to send simple signals, such as directions, to the soldiers. The source of the system may be equipped with a camera in combination with software manipulation to "visualize" outlines of an object placed in front of the camera.
US 7,352,356 discloses a system to tactilely simulate a virtually displayed image using a high density set of points (e.g. movable pressure responsive pins) at a finger or foot of a blind person. During scanning, the image is translated into signals for the pins in the matrix in order to visualize the scanned image. A pattern of the image is caused to move across a surface of the skin by selective activation and deactivation of the pins in a matrix of pins. The pressure response is caused using a fluid in a microelectromechanical system (MEMS) comprising an array of microvalve actuators.
Apart from systems that can visualize static objects, systems for recording moving objects have been developed. One of the challenges in the development of these systems is the filtering of "noise", such as movement of other objects in the surroundings. This is especially a problem when the moving object is small against a background where larger objects move at the same time. This filtering can be done by complex computer programs. However, this slows down the translation process to the elements. Another challenge is the translation of fast moving objects. Although software for tracking a recorded small object exists, translation of such fast movement to the elements is difficult to realize in real time, i.e. without delay.
Different types of elements have been developed to overcome this problem. Electro elements that give an electric impulse have a good response time, but are unpleasant for the user, and the signal produced is not always clear or strong enough. Vibration elements sometimes suffer from the same problem. The use of pressure responsive pins as elements provides clear signaling to the tactile receptor. However, the time needed to move the pins, e.g. by using a fluid in a MEMS comprising an array of micro valve actuators, may introduce a latency that prohibits sensing the tactile signals in real time, i.e. without delay.
US 2009/0326604 discloses a system for managing brain functions and improving sensory perception. An electro-tactile screen of elements is placed in contact with the tongue of a person by using an individually molded retainer to hold the screen. The elements are preferably wirelessly connected to a computer device or image processing unit, which sends output signals to the elements. The image processing unit receives signals from e.g. an IR camera placed on the head of a blind person. The system allows a blind person to visualize a ball and catch it. The image processing unit creates a complex multidimensional electro tactile image similar to visual imagery. The stimulation by the element may be an electro-tactile stimulation or a vibro-tactile stimulation, or the input may be audio input, visual input or a temperature dependent input using heat and cold for signaling.
Because the invention in US 2009/0326604 is aimed at improving impaired brain functions, such as balance, the document is not concerned with filtering out background movement "noise" so that small, fast moving objects can be visualized or sensed by a blind person without delay. In this system, the precision of where and when is more important than the speed of a movement. Further, using complex algorithms in an image processing unit slows down the translation of the movement to the elements. Infrared cameras have been used to record moving objects. A problem with infrared cameras may be that these cameras have a latency in their recording. This causes a delay in translation and makes it difficult to translate a moving object to an element without delay.
A blind musician who wishes to play in an orchestra needs to read or sense musical notes, or sense instructions from a conductor, in order to be able to play together with the others. Especially when playing music together with others, significant delays in translation time from signal source to tactile receptor are not acceptable. There is a need for translating visual movements into tactile stimulations without delay in time. The existing systems do not allow a blind person to play in a concert, because they do not allow approximately real time translation from the movement of a baton of the conductor to a tactile signal.
Object and summary of the invention

It is an object of the present invention to at least partly overcome the above problems, and to provide an improved system for translating visual movements using tactile stimulation.
This object is achieved by a system as defined in claim 1.
The system comprises an IR camera adapted to capture a series of images and an image processing unit, such as a computer. This unit is adapted to receive the series of images from the camera and to generate one or more signals based on the received images. The system also comprises a receptor stimulating device comprising a two dimensional matrix of a plurality of elements, which extend along an X-axis and a Y-axis. This device is adapted to, upon activation, stimulate at least one tactile receptor of a part of the user, and the receptor stimulating device is adapted to receive the signals from the image processing unit, and to activate one or more of the elements based on the received signals.
The system further comprises a light emitting device designed to be attached to the object and configured to emit light of a wavelength larger than 700 nm. The IR camera is adapted to capture a series of images of the light emitting device. The image processing unit is adapted to determine two dimensional positions of the light emitting device in a defined 2D coordinate system based on the received images, and to generate one or more signals representing the determined two dimensional positions of the light emitting device, and the receptor stimulating device is adapted to activate at least one of the elements of the matrix corresponding to the received two dimensional position of the light emitting device upon receiving the one or more signals.
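As an illustration only (not part of the claimed system), the position-determination step described above could look roughly like the following Python sketch; NumPy, the threshold value and the per-frame intensity-array format are assumptions:

```python
# Minimal sketch of determining the 2D position of the light emitting device
# in one IR camera frame. Assumes the frame arrives as a 2D NumPy array of
# pixel intensities; the threshold of 200 is an illustrative assumption.
import numpy as np

def led_position(frame: np.ndarray, threshold: int = 200):
    """Return the (x, y) centroid of the bright IR spot, or None if not visible.

    Because the light emitting device emits above 700 nm and the IR camera
    suppresses visible light, the LED is normally the only bright region,
    so a plain intensity threshold is usually sufficient and fast.
    """
    mask = frame >= threshold                  # keep only the bright IR spot
    if not mask.any():
        return None                            # LED outside the camera view
    ys, xs = np.nonzero(mask)                  # pixel coordinates of the spot
    return float(xs.mean()), float(ys.mean())  # centroid in the 2D coordinate system
```

The simplicity of such a step is what makes near real time translation plausible: no background segmentation or heavy filtering is needed, in line with the noise-suppression argument above.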
The light emitting device may be attached to a moving object, such as a stick/baton of a conductor. The system allows for rapid recording of a movement by the IR camera. Background "noise" in the form of movements that do not need to be recorded is prevented by using the light emitting device that emits light of a wavelength larger than 700 nm, or between 690 and 1050 nm, i.e. a wavelength in the IR spectrum.
The system according to the invention allows translation of visual movement into element activation almost without delay, i.e. practically in real time. This makes it possible, for example, for a blind musician to play together with other musicians in an orchestra. The system may be used for many other applications.
According to an embodiment of the invention, the light emitting device comprises at least one light emitting diode. Light emitting diodes (LEDs) are small and can easily be attached to an object, such as the baton of a conductor. Further, LEDs are electrically efficient. A ring of diodes may be used as a light emitting source.
In an embodiment, the IR camera has a resolution of at least 640 x 480. IR cameras may have a latency in the recording. By using an IR camera having a resolution of at least 640 x 480, or a latency of 20 milliseconds or less, the translation of the recorded image to the receptor stimulating device can be achieved without delay.
In an embodiment, the receptor stimulating device is adapted to keep the elements activated for a defined time period after the position has been received. For example, the time period is between 0.1 ms and 1 s. Thus, several of the elements can be activated simultaneously, thereby forming a pattern which can be recognized by the user. The pattern of activated elements corresponds to the movements of the light emitting device and accordingly to the movements of the object.
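A minimal sketch (illustrative only; the 0.5 s default and the use of a monotonic clock are assumptions within the 0.1 ms to 1 s range stated above) of how such a hold period could be handled:

```python
# Sketch of the hold-time behaviour: each activated element stays "on" for a
# configurable period, so that successive positions together form a felt pattern.
import time

class HoldingMatrix:
    def __init__(self, rows: int, cols: int, hold_seconds: float = 0.5):
        self.rows, self.cols = rows, cols
        self.hold = hold_seconds
        self._activated_at = {}            # (row, col) -> activation timestamp

    def activate(self, row: int, col: int) -> None:
        self._activated_at[(row, col)] = time.monotonic()

    def active_elements(self):
        """Return the elements still within their hold period (the current pattern)."""
        now = time.monotonic()
        self._activated_at = {pos: t for pos, t in self._activated_at.items()
                              if now - t < self.hold}
        return set(self._activated_at)
```

Polling active_elements() repeatedly would then yield the set of elements that together form the pattern currently felt by the user.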
In a further embodiment, the receptor stimulating device comprises a pushing member adapted to move each individual element to and from the tactile receptor, depending on the signal from the image processing unit. The pushing member allows all elements to be signaled individually in sequence. This gives the user a tactile replication of the exact movement of the light emitting device.
In one embodiment, the pushing member is a writing ball in a holder. The round surface of the ball allows for a smooth movement along the pins.
In another embodiment, the receptor stimulating device comprises two linear electromagnetic motors adapted to move the pushing member along an X-axis and a Y-axis of the matrix. The motors are advantageously used in the receptor stimulating device because they can operate at high speed, such as at least 5 m/s. This allows movement of the writing ball along the X- or Y-axis at a speed of e.g. 5 m/s and thus prevents delay in signaling the movement to the elements. The receptor stimulating device is preferably robust and comprises only a few parts. The material is light, such as aluminum. The receptor stimulating device can be used by any individual, i.e. the device does not need to be fitted or molded before it can be used. This reduces cost and improves ease and flexibility in use of the device.

In another embodiment, the elements are pressure responsive pins movable along a Z-axis, which extends substantially perpendicular to the X- and Y-axes. When movable pressure responsive pins are used as elements, the pins will be pushed up by the writing ball one by one.

In a further embodiment, the image processing unit uses tracking software to register the movement of the light emitting device. The tracking software improves the speed of transport of the signal from the light emitting device into the movement of the elements in the matrix of the receptor stimulating device.
The object of the invention is also achieved by a method for translating visual movements of an object into tactile stimulations of a user, comprising:
- emitting light at a wavelength larger than 700 nm by a light emitting device,
- capturing a series of images from the emitted light by an IR camera,
- receiving the series of images from the IR camera, determining two dimensional positions of the light emitting device in a defined plane based on the received images and generating one or more signals representing the determined two dimensional positions of the light emitting device by an image processing unit,
- receiving the signals from the image processing unit by a receptor stimulating device comprising a two dimensional matrix of a plurality of elements,
- activating an element of the matrix corresponding to the received two dimensional position of the light emitting device upon receiving the one or more signals, and
- stimulating at least one tactile receptor of a part of the user by a receptor stimulating device.
A further object of the invention relates to a use of the system as defined above or the method defined above for visualizing the movement of an object for a user having impaired vision.
A further object of the invention relates to a use of the system as defined above or the method defined above for visualizing movements of a baton of a conductor for a user having impaired vision, wherein the light emitting device is attached to the baton. The invention relates to a use of the system and the method by a visually impaired person to play music while following visual instructions of a conductor. In one embodiment the invention relates to a use of the system as defined above or the method defined above for providing a tactile pattern rendition.
The terms "without delay" and "real time" refer to a latency of between 0.1 and 0.1 second. The term "receptor" refers to a receptor or tactile receptor positioned under the skin or dermis, epithelia of a mammal, such as a human.
Brief description of the drawings

The invention will now be explained more closely by the description of different embodiments of the invention and with reference to the appended figures.
Fig. 1 shows a block diagram of a system for translating visual movements of an object into tactile stimulations.
Fig. 2 illustrates an example of a system of the invention.
Fig. 3 shows an example of a receptor stimulating device positioned under a foot.
Fig. 4 shows an example of a baton of a conductor provided with a light emitting device.
Fig. 5 shows details of a receptor stimulating device according to one embodiment of the invention.
Figs. 6a-b show two linear motors in a frame of the receptor stimulating device.
Fig. 7 shows an example of a matrix of elements and a position of a writing ball in a matrix.
Fig. 8 illustrates an example of a translation of a movement of a stick into signals of activated elements.
Fig. 9 shows a flow diagram of an example of a method of the invention.
Detailed description of preferred embodiments of the invention

Figure 1 shows a block diagram of a system for translating visual movements of an object into tactile stimulations of a user according to an embodiment of the invention. The system comprises a light emitting device 1 designed to be attached to the object whose movements are to be detected, and configured to emit light at a wavelength larger than 700 nm, i.e. in the IR spectrum. Suitably, the light emitting device 1 is an IR Light Emitting Diode (LED). The system further comprises an IR camera 2 adapted to register the IR light emitted from the light emitting device 1 and to capture a series of images of the light emitting device 1. Preferably, the IR camera is a 2D camera and the images are 2D images. The IR camera is disposed at a distance from the light emitting device 1. Preferably, the IR camera is disposed at a distance of between 0.5 m and 5 m, or between 1 m and 3 m, from the light emitting device 1, so that the view of the camera, and accordingly the images, will contain the light emitting device 1 even though the light emitting device is moved around, without moving the camera. Preferably, the camera 2 has a fixed position in the surroundings of the moving object.
The system also comprises an image processing unit 4 connected to the IR camera 2 and adapted to receive the series of images from the IR camera 2, to determine two dimensional positions of the light emitting device in a defined plane based on the received images, and to generate one or more signals representing the determined two dimensional positions of the light emitting device based on the received images. The positions of the light emitting device are determined in a 2D coordinate system defined in relation to the view of the camera.
The camera comprises a lens, e.g. a wide angle lens, which defines the camera's image capture zone. The 2D coordinate system has two coordinate axes, such as the X and Y axes (X, Y). The image processing unit 4 is, for example, configured to generate one signal corresponding to an X coordinate of the determined position and another signal corresponding to a Y coordinate of the determined position. Alternatively, the signal is a data signal comprising data packages including information about the position, such as the coordinates of the position. The image processing unit 4 comprises input and output means, processing means, such as a CPU, an FPGA or similar hardware, and memory means, such as ROM, RAM or similar hardware. The image processing unit 4 also comprises software for providing image processing of the images. The image processing unit is, for example, a computer, such as a PC. Appropriate data processing means and software for carrying out image processing as such are known to the skilled person and will thus not be explained in detail.
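The two signalling variants mentioned here (one signal per coordinate axis, or a data package carrying the coordinates) might be represented as in the following sketch; the field names and the JSON encoding are illustrative assumptions, not a format defined by the patent:

```python
# Sketch of the two signalling variants: separate X and Y signals, or one
# data package carrying both coordinates.
import json
from dataclasses import dataclass

@dataclass
class PositionSignal:
    x: float          # X coordinate in the camera's 2D coordinate system
    y: float          # Y coordinate in the camera's 2D coordinate system

    def as_separate_signals(self):
        """Variant 1: one signal per coordinate axis."""
        return ("X", self.x), ("Y", self.y)

    def as_data_package(self) -> bytes:
        """Variant 2: a data package holding both coordinates."""
        return json.dumps({"x": self.x, "y": self.y}).encode("utf-8")
```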
The system further comprises a receptor stimulating device 5 comprising a matrix 16 having a plurality of elements 6 adapted to, upon activation, stimulate at least one tactile receptor of a part of the user. For example, the elements are elongated and are moved, upon activation, in a longitudinal direction along a Z-axis Z extending substantially perpendicular to the X- and Y-axes. Thus, tactile receptors of the user are stimulated by applying pressure on the receptor. Alternatively, the elements are caused to vibrate upon activation, and the tactile receptors of a part of the user are stimulated by the vibrations. The matrix defines a two dimensional plane corresponding to the defined plane of the 2D positions of the light emitting device. Each element of the matrix corresponds to a position of the light emitting device. The receptor stimulating device 5 is connected to the image processing unit 4 and adapted to receive the signals from the image processing unit 4, and to activate an element of the matrix 16 corresponding to the received two dimensional position of the light emitting device upon receiving the one or more signals. The element 6 of the matrix having a position in the matrix corresponding to the received two dimensional position is activated upon receiving that position. A series of positions is received. Suitably, the receptor stimulating device 5 is adapted to keep the elements 6 activated for a defined time period of between 0.1 ms and 1 s after they have been activated. Thus, several of the elements can be activated simultaneously or sequentially, thereby forming a pattern, as shown in figure 8, which pattern can be recognized by the user. The pattern of activated elements corresponds to the movements of the light emitting device and accordingly to the movements of the object.
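Mapping a received 2D position onto the matrix could be done by simple scaling, as in this sketch (the image resolution and matrix dimensions are assumed example values, not figures from the patent):

```python
# Sketch of selecting the matrix element whose position corresponds to the
# received 2D position of the light emitting device. 640 x 480 pixels and a
# 16 x 16 element matrix are assumed example dimensions.
def position_to_element(x: float, y: float,
                        image_size=(640, 480),
                        matrix_size=(16, 16)):
    """Scale a camera-plane position to (col, row) indices of the matrix."""
    img_w, img_h = image_size
    cols, rows = matrix_size
    col = min(int(x / img_w * cols), cols - 1)   # clamp to the last column
    row = min(int(y / img_h * rows), rows - 1)   # clamp to the last row
    return col, row
```

Combined with the hold-time sketch above, activating the selected element would then be a call such as matrix.activate(row, col), so that successive positions build up a pattern like the one in figure 8.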
Figure 2 illustrates a practical example of the system. In this example, the light emitting device 1 is attached to a baton 10 moved by a conductor 11. Movement of the light emitting device is recorded with the IR camera 2. The IR camera captures a series of images of the light emitted from the device. The series of images is received via a wire 3 or wirelessly by the image processing unit 4. The image processing unit 4 translates the input from the IR camera into a signal using software. The signal is sent to a receptor stimulating device 5. The receptor stimulating device causes activation of the elements 6 comprised in the receptor stimulating device. In figure 2, the receptor stimulating device is positioned under a sole of a foot 7 of a musician 8 playing a flute, while seated on a chair 9. Preferably, all the parts of the system can be held together and easily transported in a suitcase in order to be used at different locations.
Any light emitting device 1 that emits light at a wavelength larger than 700 nm, or between 700 and 1050 nm, or between 700 and 1000 nm, or between 800 and 900 nm, or between 825 and 875 nm, may be used in the system. Suitably, light emitting diodes (LEDs) are used. The IR camera 2 must have a high resolution that prevents latency in recording of images from the light emitting device 1. Preferably, the IR camera has a resolution of at least 640 x 480, or 1280 x 1024, or 1665 x 1088, or 2048 x 2048. The latency may be below 20, 15, 10, 9, 8, 7, 6, 5, 4, 3 or 2 milliseconds, or between 1 and 20 milliseconds. The pixel size may be between 3 and 7 micrometers, or between 4 and 6 micrometers. The IR camera and/or the image processing unit 4 may use a tracking application that allows tracking of the movement of the light emitting device 1.
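A tracking application could be as simple as the following sketch, which smooths the per-frame detections and bridges frames where the LED is momentarily lost; the smoothing factor is an illustrative assumption and not a value from the patent:

```python
# Minimal sketch of a tracking step: exponentially smooth the per-frame LED
# detections and fall back to the last known position when no spot is found.
class LedTracker:
    def __init__(self, alpha: float = 0.6):
        self.alpha = alpha                 # weight given to the newest detection
        self.position = None               # last estimated (x, y)

    def update(self, detection):
        """`detection` is an (x, y) tuple or None when no bright spot was found."""
        if detection is None:
            return self.position           # keep the last known position
        if self.position is None:
            self.position = detection
        else:
            x, y = self.position
            dx, dy = detection
            self.position = (self.alpha * dx + (1 - self.alpha) * x,
                             self.alpha * dy + (1 - self.alpha) * y)
        return self.position
```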
By using adapted software, the images from the IR camera 2 are translated into a signal, which is sent as an output signal to the receptor stimulating device 5. Various receptor stimulating devices and various elements may be used. The receptor stimulating device is a device capable of receiving signals from the image processing unit 4 and translating these signals into element activity. Such a device may use fluid or air to activate microvalves in electro-tactile elements 6, or the device may use electricity to provide an electrical element impulse or a vibration. Different types of elements may be used, such as electronic or electrical elements, vibration elements, heat elements, pressure elements, and the like.
Figure 3 shows a foot 7 of a user positioned on the elements of the receptor stimulating device. The receptor stimulating device 5 comprises a matrix 16 of a plurality of elements 6. The matrix extends along an X-axis and a Y-axis as shown in figures 7 and 8. The matrix 16 of elements 6 may cover a portion of the area under the foot or the entire area under the foot. The receptor stimulating device may comprise two linear electromagnetic motors positioned in a frame 18 under the matrix 16, as seen in a vertical direction along the Z-axis. One motor 19a can be moved along an X-axis X and one motor 19b can be moved along a Y-axis Y. The motors may be positioned on top of each other as shown in figures 6a and 6b. The motors receive the output signal from the image processing unit 4, or an X,Y-signal. Depending on that signal, the motors will move along the X- and Y-axes of the matrix of the elements 6.
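How the X,Y-signal might be turned into target positions for the two motors is sketched below; the pin pitch and the move_to interface are hypothetical placeholders for whatever the real motor drivers expose:

```python
# Sketch of converting a matrix element index into target positions for the
# two linear motors carrying the pushing member under the pin matrix.
# The 10 mm pin pitch is an assumed example value.
def element_to_motor_targets(col: int, row: int, pin_pitch_mm: float = 10.0):
    """Convert a matrix element index to motor target positions in millimetres."""
    return col * pin_pitch_mm, row * pin_pitch_mm

def move_pushing_member(motor_x, motor_y, col: int, row: int) -> None:
    """Drive both motors so the writing ball sits under element (col, row).

    `motor_x` and `motor_y` stand for whatever motor driver objects the real
    device exposes (a hypothetical interface with a `move_to(mm)` method).
    """
    target_x, target_y = element_to_motor_targets(col, row)
    motor_x.move_to(target_x)
    motor_y.move_to(target_y)
```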
A pushing member 20 may be attached to the motors. This pushing member is in contact with the elements that are positioned above the pushing member in the direction of a Z-axis. As shown in figure 5, the pushing member pushes the elements up. The elements move down by gravity. The pushing member 20 may be a writing ball 20b attached to a holder 20a. The member pushes each individual element up when the motors 19a, 19b move along the X- and Y-axes.
The elements 6 may be movable pins 17. The extent by which the pins are pushed upwards toward the foot must be such that the tactile receptors in the skin are stimulated by the pressure of the pins. An element can be increased in its upward extension. This makes it possible to effectively mark an additional piece of information valuable to the user, such as for musical interpretation. Solenoid magnets can be employed for this purpose.
Figure 9 shows a flow chart illustrating a method for translating visual movements into tactile stimulations. It will be understood that some of the blocks of the flow chart can be implemented by computer program instructions.
The method may comprise the following steps:
- emitting light of a wavelength larger than 700 nm by a light emitting device 1 attached to the object,
- capturing a series of images from the emitted light by an IR camera 2,
- receiving the series of images from the camera, determining two dimensional positions of the light emitting device in a defined plane based on the received images and generating one or more signals representing the determined two dimensional positions of the light emitting device,
- receiving the signals from the image processing unit 4 by a receptor stimulating device 5 comprising a two dimensional matrix of a plurality of elements 6,
- activating an element of the matrix 16 corresponding to the received two dimensional position of the light emitting device 1 upon receiving the one or more signals, and
- stimulating at least one tactile receptor of a part of the user by a receptor stimulating device.
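Put together, the method steps could run as a simple capture-process-actuate loop, as in the sketch below; camera.read_frame(), device.activate() and stop_event are hypothetical interfaces, and the helpers reuse the sketches given earlier in this description:

```python
# Sketch of the overall method as a loop, reusing led_position(),
# LedTracker and position_to_element() from the earlier sketches.
# `camera` and `device` stand in for the real IR camera and receptor
# stimulating device; `stop_event` could be e.g. a threading.Event.
def run(camera, device, tracker, stop_event):
    while not stop_event.is_set():
        frame = camera.read_frame()                  # capture an IR image
        detection = led_position(frame)              # 2D position of the LED
        position = tracker.update(detection)         # smooth / bridge gaps
        if position is None:
            continue                                 # LED not yet seen
        col, row = position_to_element(*position)    # map to a matrix element
        device.activate(col, row)                    # stimulate the receptor
```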
The system or method may be used for visualizing an object for a user who is blind or has impaired vision. The system and the method are especially useful for musicians using wind instruments, which do not allow stimulation of tactile receptors in the mouth of the user. The receptor may be located anywhere on the body of the user. The receptor may be located at the sole of the foot of the user. The user may be a musician and the moving object may be a baton 10 of a conductor 11. The elements 6 may be pins 17 as described above.
Figure 7 shows a top side of the matrix 16. As illustrated in the figure, by pushing up a sequence of pins with the pushing member, a cursor 21 is provided on the matrix. By moving the pushing member 20 along the X-axis a "line" lx can be felt on the sole of the foot 7. By moving the pushing member along the Y-axis a "line" ly can be felt on the sole of the foot 7. The "line" consists of a sequence of individual upward and downward moving pins 17 along the area of the matrix 16.
Figure 8 symbolizes a translation onto the matrix of a movement of the light emitting device, which has been moved up, down and then to the right. These types of movements are commonly used with a baton of a conductor 11 to assign a task to an orchestra. Because the elements 6 react at the same time as the light emitting device is moved, i.e. without delay, the system of the present invention can be used by a blind person playing in an orchestra. Examples of further applications are for educational purposes, such as teaching dance or yoga, where tactile stimulation is the preferred means of receiving instruction. The deaf-blind can be especially appreciative of this form of communication and instruction.
The present invention is not limited to the embodiments disclosed but may be varied and modified within the scope of the following claims. For example, the elements may be vibrating or heat elements. Different methods may be used to move the pins, or different types of pushing member 20 may be applied.

Reference numerals:
1 light emitting device or IR diode
2 IR camera
3 wire
4 image processing unit
5 receptor stimulating device
6 element
7 foot
8 musician
9 chair
10 stick
11 conductor
16 matrix
17 pin
18 frame
19a, 19b motor
20a, 20b pushing member holder, writing ball
21 cursor
r recording
lx, ly line along X- or Y-axis
X X axis
Y Y axis
Z Z axis

Claims (10)

1. A system for translating visual movements of an object into tactile stimulations of a user, comprising:
- an IR camera (2) adapted to capture a series of images,
- an image processing unit (4) adapted to receive the series of images from the camera and to generate one or more signals based on the received images, and
- a receptor stimulating device (5) comprising a two dimensional matrix (16) having a plurality of elements (6) adapted to, upon activation, stimulate at least one tactile receptor of a part of the user, and the receptor stimulating device is adapted to receive the signals from the image processing unit, and to activate one or more of the elements (6) based on the received signals,
characterized in that the system further comprises a light emitting device (1) designed to be attached to the object and configured to emit light at a wavelength larger than 700 nm, the IR camera (2) is adapted to capture images of the light emitting device, the image processing unit (4) is adapted to determine two dimensional positions of the light emitting device in a defined 2D coordinate system based on the received images, and to generate one or more signals representing the determined two dimensional positions of the light emitting device, and the receptor stimulating device (5) is adapted to activate an element of the matrix (16) corresponding to the received two dimensional position of the light emitting device upon receiving the one or more signals.
2. The system according to claim 1, wherein the light emitting device (1) comprises at least one light emitting diode.
3. The system according to claim 1 or 2, wherein the IR camera (2) has a resolution of at least 640 x 480.
4. The system according to any one of the preceding claims, wherein the receptor stimulating device (5) comprises a pushing member (20) adapted to move each individual element (6) to and from the tactile receptor, depending on the signal from the image processing unit (4).
5. The system according to any one of the preceding claims, wherein the receptor stimulating device (5) comprises two linear electromagnetic motors (19a, 19b) adapted to move the pushing member (20) along an X-axis (X) and a Y-axis (Y) of the matrix (16).
6. The system according to any one of the preceding claims, wherein the pushing member (20) is a writing ball (20b) in a holder (20a).
7. The system according to any one of the preceding claims, wherein the elements (6) are pressure responsive pins (17) movable along a Z-axis (Z), which extends substantially perpendicular to the X-axis (X) and Y-axis (Y).
8. The system according to any one of the preceding claims, wherein the image processing unit (4) uses tracking software to register the movement of the light emitting device (1).
9. A method for translating visual movements of an object into tactile stimulations of a user, comprising:
- emitting light at a wavelength larger than 700 nm by a light emitting device (1) attached to the object,
- capturing a series of images from the emitted light by an IR camera (2),
- receiving the series of images from the IR camera, determining two dimensional positions of the light emitting device in a defined 2D coordinate system based on the received images and generating one or more signals representing the determined two dimensional positions of the light emitting device by an image processing unit (4),
- receiving the signals from the image processing unit by a receptor stimulating device (5) comprising a two dimensional matrix (16) of a plurality of elements (6),
- activating an element of the matrix corresponding to the received two dimensional position of the light emitting device upon receiving the one or more signals, and
- stimulating at least one tactile receptor of a part of the user by a receptor stimulating device.
10. Use of the system according to any one of claims 1 to 8 or the method of claim 9, for visualizing movements of a baton of a conductor for a user having impaired vision, wherein the light emitting device is attached to the baton.
SE1650649A 2016-05-13 2016-05-13 A system for translating visual movements into tactile stimulations. SE540186C2 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
SE1650649A SE540186C2 (en) 2016-05-13 2016-05-13 A system for translating visual movements into tactile stimulations.
PCT/SE2017/050493 WO2017196251A1 (en) 2016-05-13 2017-05-15 A system for translating visual movements into tactile stimulations

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
SE1650649A SE540186C2 (en) 2016-05-13 2016-05-13 A system for translating visual movements into tactile stimulations.

Publications (2)

Publication Number Publication Date
SE1650649A1 true SE1650649A1 (en) 2017-11-14
SE540186C2 SE540186C2 (en) 2018-04-24

Family

ID=60266559

Family Applications (1)

Application Number Title Priority Date Filing Date
SE1650649A SE540186C2 (en) 2016-05-13 2016-05-13 A system for translating visual movements into tactile stimulations.

Country Status (2)

Country Link
SE (1) SE540186C2 (en)
WO (1) WO2017196251A1 (en)

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3362496B2 (en) * 1994-03-25 2003-01-07 日本電信電話株式会社 Conductor baton tactile appreciation device for blind people
EP2419808B1 (en) * 2009-04-15 2015-06-10 Koninklijke Philips N.V. A foldable tactile display
WO2010142689A2 (en) * 2009-06-08 2010-12-16 Kieran O'callaghan An object detection device
WO2015054789A1 (en) * 2013-10-18 2015-04-23 Hagedorn Douglas Systems and methods for non-visual spatial interfacing with a computer

Also Published As

Publication number Publication date
SE540186C2 (en) 2018-04-24
WO2017196251A1 (en) 2017-11-16

Similar Documents

Publication Publication Date Title
JP6703688B2 (en) Tactile information conversion device, tactile information conversion method, tactile information conversion program, and element arrangement structure
US8760398B2 (en) Interactive video based games using objects sensed by TV cameras
US9390630B2 (en) Accelerated learning, entertainment and cognitive therapy using augmented reality comprising combined haptic, auditory, and visual stimulation
US20220065580A1 (en) Accelerated Learning, Entertainment and Cognitive Therapy Using Augmented Reality Comprising Combined Haptic, Auditory, and Visual Stimulation
US8111239B2 (en) Man machine interfaces and applications
US20110155044A1 (en) Kinesthetically concordant optical, haptic image sensing device
CA2467228A1 (en) Multi-tactile display haptic interface device
WO2016038953A1 (en) Detection device, detection method, control device, and control method
Alais et al. Multisensory processing in review: from physiology to behaviour
KR100812624B1 (en) Stereovision-Based Virtual Reality Device
US8194924B2 (en) Camera based sensing in handheld, mobile, gaming or other devices
US20140184384A1 (en) Wearable navigation assistance for the vision-impaired
CN1745404B (en) Interactive teaching and learning device with three-dimensional model
Sadihov et al. Prototype of a VR upper-limb rehabilitation system enhanced with motion-based tactile feedback
GB2496521A (en) Computerised musical instrument using motion capture and analysis
US20160321955A1 (en) Wearable navigation assistance for the vision-impaired
WO2018012062A1 (en) Information processing apparatus, information processing method, and program
US20230005457A1 (en) System for generating a signal based on a touch command and on an optical command
Lun Khoo et al. Designing and testing wearable range‐vibrotactile devices
JPH0990867A (en) Tactile sensing presentation device
Ryu et al. Using a vibro-tactile display for enhanced collision perception and presence
CN104933278B (en) A kind of multi-modal interaction method and system for disfluency rehabilitation training
SE1650649A1 (en) A system for translating visual movements into tactile stimuli.
Ozcelik et al. Gesture‐based interaction for learning: time to make the dream a reality
US20200146618A1 (en) Device with a detection unit for the position and orientation of a first limb of a user