EP1894083A1 - Apparatus and method for exploring graphical objects for users - Google Patents

Apparatus and method for exploring graphical objects for users

Info

Publication number
EP1894083A1
Authority
EP
European Patent Office
Prior art keywords
pointing device
graphical object
user
signal
electronic device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP06753970A
Other languages
German (de)
French (fr)
Inventor
Thimoty Barbieri
Licia Sbattella
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Politecnico di Milano
Original Assignee
Politecnico di Milano
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Politecnico di Milano filed Critical Politecnico di Milano
Publication of EP1894083A1 publication Critical patent/EP1894083A1/en
Withdrawn legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/016 Input arrangements with force or tactile feedback as computer generated output to the user
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/01 Indexing scheme relating to G06F3/01
    • G06F2203/014 Force feedback applied to GUI

Definitions

  • assuming that the apparatus 1 is configured with a tactile transducer 5 wearable by the user (e.g. on a finger of a hand) and that the pointing device 5A is the screen 3 of a tablet PC equipped with an optical pen pointer, a possible implementation of the exploration may follow this procedure:
  • edge detection can be consolidated by following such edge with a straight motion, without zig-zag oscillations.
  • the edge of the graphical object 2 has thus been recognized by the blind user.
  • for a blind user and a graphical object 2 consisting of an image of medium complexity, the above procedure typically takes between 180 and 300 seconds.
  • the audio/tactile signals generated by the apparatus 1 may be generated automatically during exploration or upon request by a click, e.g. by tapping the finger twice on the surface or pressing the button of the pen.
  • the above apparatus may provide the following advantages: a) no particular hardware or software configuration is required, since only a stereo audio output (i.e. left and right audio channels) and a USB port for powering the tactile transducer 5 are needed; b) the user can promptly use the apparatus 1, since no special drivers need to be used and/or designed; c) the tactile transducer 5 is controlled by a waveform (Figure 3A) that can be produced with commonly available audio processing software.

Abstract

The present invention relates to an apparatus and a method for exploring a graphical object (2) on a screen (3) of a graphics interface (4) through the movement of a pointing device (5A) for a user. Particularly, the apparatus (1) comprises an electronic device (6) having a buffer (7) for storage of information to be presented to said user; the pointing device (5A) is operatively associated to said electronic device (6) to generate an input signal for addressing said buffer (7) and is characterized in that the pointing device (5A) can define a one-to-one correspondence between the position of the pointing device (5A) and the position of at least one portion of said graphical object (2) for a given movement of the pointing device (5A) controlled by said user and in that it comprises programmed means (11) operating in said electronic device (6) for producing a signal perceptible by said user, as a function of said correspondence between the position of the pointing device (5A) and the position of at least one portion of said graphical object (2).

Description

DESCRIPTION
Title: "Apparatus and method for exploring graphical objects for users"
*******
The present invention relates to an apparatus and a method for exploring graphical objects for users, particularly but without limitation for exploring a graphical object on a screen of a graphics interface for blind users, as defined in the preamble of claims 1 and 18.
Many computer-controlled or computer-assisted reading apparatuses have been proposed, specially designed to aid users in exploring graphical objects or images displayed on a computer screen.
Examples of such apparatus are described in the following documents: a) US 6,125,385, "FORCE FEEDBACK IMPLEMENTATION IN WEB PAGES"; b) US 6,088,017, "TACTILE FEEDBACK MAN-MACHINE INTERFACE DEVICE"; c) US 6,078,308, "GRAPHICAL CLICK SURFACES FOR FORCE FEEDBACK APPLICATIONS TO PROVIDE USER SELECTION USING CURSOR INTERACTION WITH A TRIGGER POSITION WITHIN A BOUNDARY OF A GRAPHICAL OBJECT".
Particularly, the document US 6,125,385 discloses:
- the provision of a mouse that can generate and provide tactile feedback to the user;
- the use, on a web page, of specific areas of such web page "containing" tactile effects;
- an authoring environment that allows tactile effects to be created as a function of specific needs. US 6,125,385 specifically mentions the need to install a plug-in in the Web browser and to use particular programming APIs to link certain portions of the web page to tactile effects that can be loaded on a hard disk of a Personal Computer. These tactile effects may be modified by means of an authoring interface. The mouse, connected to the Personal Computer, transmits the feedback to the user by means of a pair of detents, driven by an electromechanical pulse, to allow the user to experience more interactive force effects during web page navigation.
Therefore, the patent shows that a specific client system needs to be installed on the user's Personal Computer, and that suitable mechanical means, drivable by the electromechanical pulse, need to be integrated in the mouse.
However, such apparatus has the main drawback of making the tactile feedback wholly useless, as the position that the mouse may take is unrelated to any notion of absolute position with respect to the image displayed on the personal computer screen. According to US 6,088,017, a user has to wear a hardware unit, consisting of a glove and other small spring actuators, the latter being arranged all over the body of the user. A virtual three-dimensional image is displayed on the screen of a Personal Computer. By moving the hand that wears the glove, the user is able to grasp or "touch" the solids displayed on the screen and to receive tactile feedback on various parts of his/her body (e.g. independently for each finger). The control software and the wearable hardware are connected to a PC-dedicated interface.
US 6,078,308 is based on the use of a mouse that can generate vibrations and teaches a method for integrating the tactile sensations produced by this mouse with the navigation in a normal user interface having drop-down menus and buttons (GUI). The method is essentially based on the determination of the entering or exiting of the mouse through a boundary of a graphics interface.
Therefore, the above patents provide apparatuses for transmitting vibration information that require the installation of a specially designed software environment to control non-absolute pointing devices, such as a mouse. In view of the above prior art, the object of the present invention is to obviate the above mentioned prior art drawbacks.
According to the present invention, this object is fulfilled by an apparatus and a method for exploring a graphical object on a screen of a graphics interface through the movement of a pointing device for a user, as defined by the features of claims 1 and 18.
The invention provides an apparatus that can allow a user to explore a graphical object displayed on a screen of a graphics interface, such exploration being based on the definition of a one-to-one correspondence between the position of the pointing device and the position of at least one portion of such graphical object. Particularly, the graphical object (or image) is arranged to generate a combination of auditory and tactile sensations.
The invention also provides a method that can allow a user to explore a graphical object displayed on a screen of a graphics interface, the user being able to draw information from the image by him-/herself. The features and advantages of the invention will appear from the following detailed description of one practical embodiment, which is illustrated without limitation in the annexed drawings, in which:
- Figure 1 is a general diagrammatic view of the apparatus according to the present invention;
- Figure 2 shows an embodiment of a device of the apparatus of Figure 1;
- Figures 3A and 3B show two embodiments of waveforms that can be used in the apparatus of Figure 1, according to the present invention;
- Figures 4A and 4B show a graphical object to be explored by the apparatus and method according to the present invention;
- Figure 5 shows another graphical object to be explored by the apparatus and method according to the present invention.
Referring to the accompanying figures, numeral 1 generally designates a preferred embodiment of an apparatus for exploring a graphical object 2 on a screen 3 of a graphics interface 4 through the movement of a pointing device 5A by a user. Particularly, the apparatus 1 comprises an electronic device 6 having a buffer 7 (such as a hard disk) for storage of information to be presented to such user. The pointing device 5A is associated to the electronic device 6.
Such pointing device 5A associated to the electronic device can define a one-to-one correspondence between the position of the pointing device 5A and the position of at least one portion of the graphical object 2 for a given movement of the pointing device controlled by the user.
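The one-to-one correspondence between absolute pointer position and image position described above can be sketched as follows; all concrete values (tablet resolution, image size, region coordinates, labels) are illustrative assumptions introduced here, not taken from the patent:

```python
# Hedged sketch: absolute pointer position -> image portion lookup.
TABLET_W, TABLET_H = 21000, 14800   # pointer device units (assumed)
IMAGE_W, IMAGE_H = 640, 480         # graphical object size in pixels (assumed)

def pointer_to_pixel(x, y):
    """One-to-one mapping: each absolute pointer position corresponds
    to exactly one pixel of the displayed graphical object."""
    px = int(x * IMAGE_W / TABLET_W)
    py = int(y * IMAGE_H / TABLET_H)
    return px, py

# Rectangular active areas of the graphical object: (x0, y0, x1, y1, label)
regions = [(100, 100, 540, 120, "top edge of square"),
           (100, 360, 540, 380, "bottom edge of square")]

def hit_test(px, py):
    """Return the label of the active area under the pointer, if any."""
    for x0, y0, x1, y1, label in regions:
        if x0 <= px <= x1 and y0 <= py <= y1:
            return label
    return None
```

With an absolute device such as a tablet, the same physical spot always maps to the same image portion, which is what makes blind exploration by position possible.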
The electronic device 6 further comprises a keyboard 8 for inputting alphanumeric data, a mouse 9 and a common audio signal reproducing device 10.
The graphics interface 4 is a Graphics User Interface such as Microsoft Windows, Linux, Palm OS, etc.
Figure 1 further shows that the electronic device 6 comprises programmed means 11 and an Operating System 12.
The programmed means 11 allow a signal perceptible by the user to be generated as a function of the correspondence between the position of the pointing device 5A and the position of at least one portion of the graphical object 2. These programmed means are preferably provided in the form of an Image Map using HTML syntax, which is normally supported by Web browsers such as Internet Explorer, Firefox, Netscape, etc. Alternatively, the programmed means may be created with object-oriented languages, such as Visual Basic, C++, etc. The input and output signals to and from the electronic device 6 are generated by means of I/O drivers 13, 14 and 15, which control the pointing device 5A and the audio signal reproducing device 10, the keyboard 8, and the mouse 9, respectively.
It will be appreciated that, although the screen 3 is not physically connected to the electronic device 6 by means of cables or the like, it is outlined by broken lines to represent what a blind user might imagine exploring thanks to the combination of tactile and auditory effects provided by the apparatus 1, as described below.
Nevertheless, the screen 3 may also be physically connected to the electronic device 6, to allow people having normal visual function to follow the progress of the operations performed by the blind user, or to allow a user having normal visual function to interact with applications that can exploit the whole potential of the electronic device 6, i.e. image display, generation of audible sounds and generation of tactile signals. Still with reference to Figure 1, it will be appreciated that, advantageously, by means of an audio device (such as a PC sound card, not shown), the I/O driver 13 manages signal communication between the electronic device 6 and the audio signal reproducing device 10 and the pointing device 5A.
Such audio device typically has at least two output audio channels (a left and a right channel to simulate stereo sound), although audio devices are currently available which have six or more output audio channels.
The electronic device 6 may be an ordinary Personal Computer or an equivalent multimedia system.
According to an embodiment of the present invention, a tactile transducer 5 may be operatively associated to the pointing device 5 A.
The tactile transducer 5 is, for instance, a vibrational transducer. Particularly, the tactile transducer 5 is directly connected to the pointing device 5A, to generate an input signal for addressing the buffer 7 and provide a user-perceptible signal. It will be appreciated that, in a preferred embodiment of the invention, the perceptible signal and the graphical object 2 are downloaded in real time and later stored in the buffer 7 through a data transfer network connection (not shown), such as the Internet and the Web.
Alternatively, the tactile transducer 5 is adapted to be worn by the user, e.g. on a finger, and is therefore separate from the pointing device 5A. In this embodiment, the tactile transducer 5 is operatively associated to the electronic device 6 but physically separated from the pointing device 5A, so as to receive a perceptible signal from the electronic device 6 and provide a perceptible signal to the user.
In other words, a one-to-one correspondence may be defined between the position of the tactile transducer 5 and the position of at least one portion of the graphical object 2 for a given movement of the pointing device 5 A controlled by the user.
In accordance with an advantageous aspect of the present invention, one audio channel of the audio device is operatively connected to the audible signal reproducing device 10, whereas another audio channel is operatively connected to the tactile transducer 5.
The pointing device 5A may be selected from the group consisting of a touch screen display, a pen and tablet PC, a digitizer with a digitizing tablet, etc.
Figure 2 shows a circuit diagram that can control the tactile device 5 according to the present invention.
As is shown in this figure, the perceptible signal on one output audio channel "R" of the audio device, e.g. the right channel, is directly connected to the sound reproducing device 10 (e.g. the two earphones of a headset).
A schematic representation of the perceptible signal on the right audio channel, i.e. the signal that can actuate the audio device 10, is shown in Figure 3A.
On the other hand, the perceptible signal on the other audio channel of the audio device "L", i.e. the left channel, is first amplified by a logarithmic amplifier 16 [whose amplification is given by Vu = α·log(Vi/(Ri·Io)), where Vi is the input voltage, Vu is the output voltage, Ri is the input resistance, Io is the reverse saturation current, and α represents the characteristic parameters of the diode D1], with threshold voltage compensation on the positive half-wave of the signal on the audio channel.
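As a numeric illustration of the amplifier formula above, the following sketch evaluates Vu = α·log(Vi/(Ri·Io)); the component values (Ri, Io, α) are illustrative assumptions only, not values given in the patent:

```python
import math

def log_amp_output(Vi, Ri=10e3, Io=1e-9, alpha=0.06):
    """Output of the logarithmic amplifier: Vu = alpha * log(Vi / (Ri * Io)).
    Ri (input resistance), Io (reverse saturation current) and alpha
    (diode characteristic parameter) are assumed values for illustration."""
    return alpha * math.log(Vi / (Ri * Io))

# Logarithmic compression: doubling Vi adds a fixed alpha*ln(2) to Vu,
# so large swings on the audio channel map to modest output changes.
step = log_amp_output(2.0) - log_amp_output(1.0)
```

This compression is what lets a wide-dynamic-range audio channel drive the fixed-voltage tactile stage in a controlled way.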
Then, the output signal from the feedback amplifier 16 is rectified by the series connection of a rectifier diode 17 and a DC capacitor, and finally transmitted to a power transistor 5B, for instance of the IGBT type, having a bias resistor Rp, the transistor 5B being able to generate the proper operating voltage for the tactile transducer 5.
A schematic representation of the perceptible signal on the left audio channel, i.e. the signal that can actuate the tactile transducer 5, is shown in Figure 3B.
Such left channel perceptible signal is obtained with a high-intensity, low-frequency waveform (usually a simple sinusoid, with amplitude varied to provide signals of different nature).
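The left-channel tactile waveform described above (a low-frequency sinusoid whose amplitude is varied to convey signals of different nature) can be sketched as follows; the sample rate, frequency and amplitude values are illustrative assumptions:

```python
import math

SAMPLE_RATE = 44100  # assumed sound-card sample rate

def tactile_wave(freq=50.0, amplitude=0.8, duration=0.2):
    """Left-channel tactile drive signal: a high-intensity, low-frequency
    sinusoid. Varying `amplitude` yields different tactile stimuli."""
    n = int(SAMPLE_RATE * duration)
    return [amplitude * math.sin(2 * math.pi * freq * t / SAMPLE_RATE)
            for t in range(n)]
```

Because the signal travels on an ordinary audio channel, any commonly available audio processing software can author such waveforms.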
A light-emitting device 18, such as a LED, is connected in parallel with the tactile transducer 5. Such LED is used to promptly advise the user, or the person assisting the blind user, that a circuit abnormality has occurred. In a preferred embodiment, the tactile transducer 5 is formed by an offset motor which strikes a rigid plastic cell as it rotates, thereby generating a vibrational effect, in other words a user-perceptible vibration.
It shall be noted that the supply voltage Vcc required by the tactile transducer 5 may be supplied by the USB port of the Personal Computer. Alternatively, the power source of the tactile transducer 5 might be a battery, or a transformer implementation that can generate the proper supply voltage and current by transforming the normal mains voltage (220 V; 50-60 Hz).
An equalizer 19 is optionally provided between the amplifier 16 and the tactile transducer 5 to adjust the intensity of the signal entering the power transistor 5B.
It will be appreciated that, advantageously, the perceptible signal is a signal composed of an audio signal and a tactile signal. Particularly, the audio signal and the tactile signal are on the right and the left audio channels of the audio device respectively. These audio and tactile signals are simultaneously transmitted both to the tactile transducer 5 and to the sound reproducing device 10.
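The composition of the perceptible signal, with the audible signal on the right channel and the tactile drive signal on the left channel played simultaneously, can be sketched as an interleaving of stereo frames; this is an illustrative sketch of the idea, not the patent's implementation:

```python
def compose_stereo(audio_right, tactile_left):
    """Interleave the audible signal (right channel) and the tactile
    drive signal (left channel) into (L, R) stereo frames, as in a
    stereo WAV file sent to the sound card in a single pass."""
    n = max(len(audio_right), len(tactile_left))
    pad = lambda s: s + [0.0] * (n - len(s))  # pad shorter channel with silence
    left, right = pad(tactile_left), pad(audio_right)
    # One (L, R) frame per sample: both components play simultaneously.
    return list(zip(left, right))
```

A single stereo file thus carries both stimuli, which is why no special driver is needed beyond ordinary audio output.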
Thanks to the electronic device 6 with the tactile transducer 5 interacting with the above mentioned programmed means 11, the user can point directly and absolutely to the area of the screen 3 in which the graphical object 2 is displayed, thereby receiving graphical information, sound information and tactile information. In a first basic embodiment, the graphical object 2 is a bitmap image (although it might also be a JPG or PNG image), with active areas, implemented by the programmed means 11, associated thereto. For example, if the operating system is Microsoft® Windows XP®, the graphics interface 4 is Internet Explorer®, and the application program 11 is an Image Map using HTML syntax, the graphical object 2 enriched with vector plots may be downloaded from an Internet Explorer® Web page. In this specific embodiment, the active areas of the graphical object 2 are connected by a JavaScript callback to the transmission of the perceptible signal, e.g. a stereo WAV file, retrieved from the buffer 7.
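A minimal sketch of how such an HTML Image Map with JavaScript callbacks might be generated is given below; the play() helper, the map name, file names and coordinates are hypothetical, introduced only for illustration:

```python
def image_map_html(image_src, areas):
    """Build an HTML image map whose active areas trigger playback of a
    stereo WAV (the audio-tactile stimulus) via a JavaScript callback.
    `areas` is a list of (x0, y0, x1, y1, wav_file, alt_text) tuples."""
    tags = [
        '<area shape="rect" coords="{},{},{},{}" '
        'onmouseover="play(\'{}\')" alt="{}">'.format(x0, y0, x1, y1, wav, alt)
        for (x0, y0, x1, y1, wav, alt) in areas
    ]
    return ('<img src="{}" usemap="#explore">\n'
            '<map name="explore">\n{}\n</map>'.format(image_src, '\n'.join(tags)))
```

Hovering the absolute pointer over an active area would then fire the assumed play() callback, which streams the stereo WAV to the audio device.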
Therefore, by passing the pointing device 5A with the tactile transducer 5 on the graphical object 2, the user receives the composite (audio-tactile) perceptible signal through the audio device.
Particularly, the audio component of the perceptible signal reproduces an audible signal (e.g. a synthetic voice, a beep, etc.), that can describe the characteristic of the active area associated to the graphical object 2, whereas the tactile component of the perceptible signal actuates the tactile transducer 5 that can provide further characteristics of the active area associated to the graphical object 2.
It shall be noted that the perceptible signal associated to the active areas of the graphical object 2 has been prepared before exploration, which means that the graphical object 2, the areas actuated by the pointing device 5A and the perceptible signal are defined beforehand.
The information to be presented to the user (such as the name, size, position, etc. of the graphical object 2), which is carried by the perceptible signal, i.e. the audible sound and the tactile signal, is stored in the buffer 7.
Alternatively, the active areas of the graphical object and the related perceptible signals (i.e. audio-tactile effects) may be created by using software that automates this process.
In this second embodiment, the user selects a portion of the graphical object 2 to be prepared for exploration. The graphical object 2 is processed by an edge-detection software section. Similarly to what has been described above, active areas are associated to the detected edges, and the perceptible signal (audio-tactile stimulus) selected for edge identification is associated to such areas.
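The automatic edge-detection step could be sketched as follows. This is a minimal illustration, not the patent's actual algorithm: it assumes a binary bitmap encoding (0 = background, 1 = shape) and marks as edge pixels those shape pixels that touch the background, to which active areas and their audio-tactile stimuli could then be attached.

```javascript
// Mark shape pixels that have at least one 4-connected background neighbour
// (or lie on the image border) as edge pixels.
function detectEdges(bitmap) {
  const h = bitmap.length, w = bitmap[0].length;
  const edges = [];
  for (let y = 0; y < h; y++) {
    for (let x = 0; x < w; x++) {
      if (bitmap[y][x] !== 1) continue;
      const neighbours = [[y - 1, x], [y + 1, x], [y, x - 1], [y, x + 1]];
      const onBorder = neighbours.some(
        ([ny, nx]) => ny < 0 || ny >= h || nx < 0 || nx >= w || bitmap[ny][nx] === 0
      );
      if (onBorder) edges.push([x, y]); // candidate for an active area
    }
  }
  return edges;
}
```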
In a third embodiment, the graphical object 2 is dynamically generated from data requested by the user. For example, the profile of a mathematical function may be generated from the requested function values. The active edge exploration areas are automatically associated to the graphical object, and are determined from mathematically or geometrically known image profiles. Here again, perceptible signals (audio-tactile stimuli) are then assigned to the active areas to allow audio-tactile exploration.
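The third embodiment can be sketched by sampling the requested function and treating each sample as an active exploration point. The sampling step and the mapping to screen coordinates are assumptions for illustration only.

```javascript
// Sample a user-requested function f over [xMin, xMax] at (steps + 1) points;
// each returned point would become the centre of an active exploration area.
function functionProfile(f, xMin, xMax, steps) {
  const points = [];
  for (let i = 0; i <= steps; i++) {
    const x = xMin + (i * (xMax - xMin)) / steps;
    points.push({ x, y: f(x) });
  }
  return points;
}
```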
The operation of the above-described apparatus 1 will now be described with reference to Figures 4A and 4B.
Figure 4 shows a possible graphical object 2, e.g. a two-dimensional graphical object such as a square, to be submitted to exploration by the apparatus 1.
It will be appreciated that exploration may concern two categories:
1) exploration of geometric shapes, e.g. circles, squares, complex polygons;
2) search for characteristics within an image, e.g. the study of a mathematical function (Figure 5), the exploration of a geographical map, etc.
Referring to Figures 4A and 4B, considering that the apparatus 1 is configured with a tactile transducer 5 wearable by the user (e.g. on a finger of a hand) and that the pointing device 5A is the screen 3 of a tablet PC equipped with an optical pen pointer, a possible implementation of the exploration may proceed as follows:
- holding the left hand open, lying on the exploration surface 5B of the pointing device 5A, and identifying the edges 5C of the screen 3 of the device 5A;
- with the right hand holding the pen and wearing the tactile transducer 5, moving such right hand in contact with the left hand (which explores the edges 5C) to explore the graphical object 2;
- moving the hands at a linear speed of about 5 cm/sec;
- drawing the diagonals 20 and 21 with the pen/transducer over the surface 5B of the device 5A to define the size of the graphical object 2. When the pen/transducer of the device 5A comes close to at least one portion of the graphical object 2 prepared using the programmed means 11, any information present thereon is signaled by the apparatus 1 through the audio device by the generation of a perceptible signal;
- once the size of the graphical object 2 has been defined, selecting a starting corner 23, where a vibration or an audio signal signals the presence of an edge or a graphical characteristic of the graphical object 2;
- advancing linearly by small amplitude oscillations (shown in Figure 4B by zig-zag arrows 24) to determine the profile of the graphical object 2 on the screen 3;
- once the zig-zag exploration 24 in one direction encounters a change in the direction of the profile 25, exploring the surroundings through 360° to find the new direction;
- following the edge of the graphical object 2 a number of times, until edge detection can be consolidated by following such edge with a straight motion, with no zig-zag oscillations. The edge of the graphical object 2 has thus been recognized by the blind user. The above procedure, in the case of a blind user and a graphical object 2 consisting of an image of medium complexity, takes on average from 180 to 300 seconds.
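The proximity test underlying the zig-zag exploration above can be sketched as follows. The tolerance value is an assumption: the apparatus would fire the perceptible signal whenever the pen position comes within this distance of an edge point of the graphical object 2.

```javascript
// Return true when the pen position is close enough to any edge point
// to trigger the audio-tactile signal (tolerance in pixels, assumed here).
function nearEdge(pen, edgePoints, tolerance = 3) {
  return edgePoints.some(
    ([ex, ey]) => Math.hypot(pen.x - ex, pen.y - ey) <= tolerance
  );
}
```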
It shall be noted that the audio/tactile signals generated by the apparatus 1 may be generated automatically during exploration or upon request by a click, e.g. by tapping the finger twice on the surface or pressing the button of the pen.
Referring now to Figure 5, if exploration does not concern edges but images having differentiated contents, e.g. mathematical functions or a geographical map, it takes a time proportional to the number of information elements to be positioned in one's mental map of the image.
The above apparatus may provide the following advantages:
a) no particular hardware or software configuration is required: only a stereo audio output (i.e. left and right audio channels) and a USB port for power supply to the tactile transducer 5 are needed;
b) the user can promptly use the apparatus 1, no use and/or design of special drivers being required;
c) the tactile transducer 5 is controlled by a waveform (Figure 3A) obtained by commonly available audio processing software.
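Advantage c) relies on the composite signal being an ordinary stereo waveform. A minimal sketch of how such a waveform could be produced with common audio tools follows; the sample rate and the two frequencies (a low-frequency vibration for the transducer, an audible tone for the loudspeaker) are illustrative assumptions, not values from the patent.

```javascript
// Build a two-channel waveform: one channel drives the tactile transducer
// with a low-frequency vibration, the other carries an audible tone.
function compositeSignal(durationSec, sampleRate = 8000) {
  const n = Math.floor(durationSec * sampleRate);
  const left = new Float32Array(n);  // tactile channel: 50 Hz vibration
  const right = new Float32Array(n); // audio channel: 440 Hz tone
  for (let i = 0; i < n; i++) {
    const t = i / sampleRate;
    left[i] = Math.sin(2 * Math.PI * 50 * t);
    right[i] = Math.sin(2 * Math.PI * 440 * t);
  }
  return { left, right };
}
```

Such a buffer, once written to a stereo wav file, needs no special driver: any standard audio output delivers one channel to the transducer and the other to the loudspeaker.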
Those skilled in the art will obviously appreciate that a number of changes and variants may be made to the arrangements as described hereinbefore to meet specific needs, without departure from the scope of the invention, as defined in the following claims.

Claims

1. An apparatus for exploring a graphical object (2) on a screen (3) of a graphics interface (4) through the movement of a pointing device (5A) for a user, said apparatus comprising:
- an electronic device (6) having a buffer (7) for storage of information to be presented to said user;
- said pointing device (5A) being operatively associated to said electronic device (6) to generate an input signal for addressing said buffer (7); wherein said pointing device (5A) is adapted to define a one-to-one correspondence between the position of the pointing device (5A) and the position of at least one portion of said graphical object (2) for a given movement of the pointing device (5A) controlled by said user and in that it comprises programmed means (11) operating in said electronic device (6), said programmed means (11) being adapted to produce a signal perceptible by said user, as a function of said correspondence between the position of the pointing device (5A) and the position of at least one portion of said graphical object (2).
2. An apparatus as claimed in claim 1, characterized in that said pointing device (5A) comprises a tactile transducer (5) which may be operatively associated to said pointing device (5A).
3. An apparatus as claimed in claims 1 to 2, characterized in that said tactile transducer (5) is directly connected to said pointing device (5A).
4. An apparatus as claimed in claims 1 to 2, characterized in that said tactile transducer (5) is adapted to be worn by a user.
5. An apparatus as claimed in claims 1 to 4, characterized in that it comprises an audio device operatively associated to said electronic device (6), having at least one first and one second output audio channels.
6. An apparatus as claimed in claims 1 to 5, characterized in that said signal perceptible by said user is the composition of an audio signal and a tactile signal.
7. An apparatus as claimed in claims 1 to 6, characterized in that said tactile transducer (5) is operatively connected to said electronic device (6) through said first output audio channel of said audio device, said tactile transducer (5) being adapted to produce said signal perceptible by said user as a function of said correspondence between the position of the pointing device (5A) and the position of at least one portion of said graphical object (2).
8. An apparatus as claimed in claims 1 to 7, characterized in that said audio signal is reproduced by a sound reproducing device (10) connected to said electronic device (6) through said second output audio channel of said audio device.
9. An apparatus as claimed in claims 1 to 8, characterized in that said programmed means (11) include an Image Map using HTML syntax, which is present in a Web Browser.
10. An apparatus as claimed in claims 1 to 8, characterized in that said programmed means (11) include object-oriented programs, such as Visual Basic or C++ programs.
11. An apparatus as claimed in any of the preceding claims, characterized in that said graphical object (2) to be represented on a screen (3) of a graphical interface (4) is a two-dimensional graphical object.
12. An apparatus as claimed in any of the preceding claims, characterized in that said at least one portion of said graphical object (2) is a portion adapted to be activated by said pointing device (5A).
13. An apparatus as claimed in any of the preceding claims, characterized in that said user is a blind user.
14. An apparatus as claimed in any of the preceding claims, characterized in that said predetermined movement of said pointing device occurs at a speed of about 5 cm/sec.
15. An apparatus as claimed in any of the preceding claims, characterized in that said electronic device (6) is a personal computer or a multimedia system.
16. An apparatus as claimed in any of the preceding claims, characterized in that said pointing device (5A) is a touch screen, an optical pen with a tablet PC, or a digitizer with a digitizing tablet.
17. An apparatus as claimed in any of the preceding claims, characterized in that said two-dimensional graphical object (2) is an image that can be downloaded from a Web page.
18. A method for exploring a graphical object (2) on a screen (3) of a graphics interface (4) through the movement of a pointing device (5A), said method being characterized in that it comprises the steps of:
- associating an area to be activated by said pointing device (5A) to at least one portion of said graphical object (2);
- defining a one-to-one correspondence between the position of the pointing device (5A) and the position of at least said portion of said graphical object (2) for a given movement of the pointing device (5A) controlled by said user;
- generating a signal perceptible by the user, as a function of said correspondence between the position of the pointing device (5A) and the position of at least one portion of said graphical object.
19. A method as claimed in claim 18, characterized in that it comprises the additional step of preparing said pointing device (5A) to:
- generate an input signal for addressing a buffer (7) in an electronic device (6);
- receive a perceptible signal from said electronic device;
- provide said perceptible signal to said user.
20. A method as claimed in claims 18 to 19, characterized in that said step of associating an area to be activated by said pointing device (5A) to at least one portion of said graphical object occurs before the exploration of the graphical object.
21. A method as claimed in claims 18 to 19, characterized in that said step of associating an area to be activated by said pointing device (5A) to at least one portion of said graphical object occurs automatically.
22. A method as claimed in any of the preceding claims 18 to 21, characterized in that said perceptible signal is a signal composed of an audio signal and a tactile signal.
EP06753970A 2005-06-06 2006-05-30 Apparatus and method for exploring graphical objects for users Withdrawn EP1894083A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
ITMI20051043 ITMI20051043A1 (en) 2005-06-06 2005-06-06 "SYSTEM AND METHOD FOR THE EXPLORATION OF GRAPHIC ITEMS FOR USERS".
PCT/EP2006/005135 WO2006131237A1 (en) 2005-06-06 2006-05-30 Apparatus and method for exploring graphical objects for users

Publications (1)

Publication Number Publication Date
EP1894083A1 true EP1894083A1 (en) 2008-03-05

Family

ID=36649836

Family Applications (1)

Application Number Title Priority Date Filing Date
EP06753970A Withdrawn EP1894083A1 (en) 2005-06-06 2006-05-30 Apparatus and method for exploring graphical objects for users

Country Status (3)

Country Link
EP (1) EP1894083A1 (en)
IT (1) ITMI20051043A1 (en)
WO (1) WO2006131237A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090251421A1 (en) * 2008-04-08 2009-10-08 Sony Ericsson Mobile Communications Ab Method and apparatus for tactile perception of digital images

Family Cites Families (4)

Publication number Priority date Publication date Assignee Title
US5186629A (en) * 1991-08-22 1993-02-16 International Business Machines Corporation Virtual graphics display capable of presenting icons and windows to the blind computer user and method
JP3236180B2 (en) * 1994-12-05 2001-12-10 日本電気株式会社 Coordinate pointing device
US5736978A (en) * 1995-05-26 1998-04-07 The United States Of America As Represented By The Secretary Of The Air Force Tactile graphics display
US6834373B2 (en) * 2001-04-24 2004-12-21 International Business Machines Corporation System and method for non-visually presenting multi-part information pages using a combination of sonifications and tactile feedback

Non-Patent Citations (1)

Title
See references of WO2006131237A1 *

Also Published As

Publication number Publication date
WO2006131237A1 (en) 2006-12-14
ITMI20051043A1 (en) 2006-12-07

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20071114

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LI LT LU LV MC NL PL PT RO SE SI SK TR

17Q First examination report despatched

Effective date: 20080314

DAX Request for extension of the european patent (deleted)
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20080724