EP2067092A2 - Pointing device (Zeigegerät) - Google Patents

Pointing device (Zeigegerät)

Info

Publication number
EP2067092A2
Authority
EP
European Patent Office
Prior art keywords
keyboard
touch surface
contact
touch
symbol
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP07823469A
Other languages
English (en)
French (fr)
Inventor
Vincent Lauer
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from FR0607791A (patent FR2905484A1)
Priority claimed from FR0702850A (patent FR2915294A1)
Priority claimed from FR0704451A (patent FR2917861A1)
Application filed by Individual filed Critical Individual
Publication of EP2067092A2

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/02Input arrangements using manually operated switches, e.g. using keyboards or dials
    • G06F3/0202Constructional details or processes of manufacture of the input device
    • G06F3/021Arrangements integrating additional peripherals in a keyboard, e.g. card or barcode reader, optical scanner
    • G06F3/0213Arrangements providing an integrated pointing device in a keyboard, e.g. trackball, mini-joystick
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/042Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
    • G06F3/0428Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means by sensing at the edges of the touch surface the interruption of optical paths, e.g. an illumination plane, parallel to the touch surface which may be virtual

Definitions

  • The invention relates to pointing devices for pointing at a remote screen, including computer mice, trackballs and touchpads.
  • Computer mice require a horizontal surface to function, which is why they are replaced by less bulky touch pads on laptops. Their use requires a large movement of the hand, which must leave the keyboard to grab the mouse, which wastes time. The repeated use of the mouse, and in particular of its buttons, over many years generates occupational diseases related to the constant repetition of the same gestures.
  • Touch pads are usually positioned horizontally in front of the keyboard, for example in US Patent 6,501,462 entitled "ergonomics touch pad". These touch pads are usually made using a so-called capacitive technology, which requires touching the touch pad with the fingertip and does not allow the detection of a contact made, for example, with the fingernail.
  • Being positioned horizontally, these touch pads take up a large amount of space on a horizontal work table, and their size is often limited for this reason, even when they are not used with laptops.
  • Their use is relatively unintuitive because of the horizontal position and the small size of the pad.
  • Their use is also unpleasant in the long run because it is necessary to slide the pulp of the finger, which is particularly sensitive, over the surface of the touchpad. It has also been proposed to attach such a touchpad to the leg of a user for ease of use (publication "ergonomics touchpad" by the company Synaptics on IP.com on 30/11/2005).
  • Some touch pads can also be positioned horizontally behind the keyboard.
  • touch screens are not meant to point to a remote screen.
  • There are portable computers equipped with such touch screens which can be used in combination with a projector, in which case the same image is superimposed on the touch surface and projected onto a screen.
  • Such laptops are not intended to be used as pointing devices, on the one hand because of their excessive cost, and on the other hand because the displayed image is the one intended to be observed directly, and is therefore not suitable for pointing at a remote screen when the user does not look, or hardly looks, at the touch screen.
  • the tactile surfaces used to detect the position of a finger on a contact surface are of various types.
  • The least costly tactile surfaces to manufacture are so-called "resistive" screens, in which the finger coming into contact with the surface creates resistance variations which are detected and from which the position of the finger can be computed.
  • This type of tactile surface allows the detection of a contact made with the pulp or the nail of a finger, or with a stylus. Surfaces operating by detection of capacitance variations can, for their part, only detect the pulp of a finger. Both types of tactile surfaces, as well as other common types, can only detect one contact at a time, which limits the possibilities of use, for example for simulating the activation of a mouse button.
  • a method of pointing and activating a virtual button using a stylus touching a touch screen is described in US Patent 6,791,536 B2. The pointing method described in this patent requires the use of a stylus, and allows the activation of only a limited number of buttons.
  • Some tactile surfaces make it possible to identify a large number of contacts simultaneously.
  • Such tactile surfaces operate, for example, on the principle of total internal reflection, which means that when a finger touches the surface the fingerprint is visible on the other side and can be recorded by cameras. From the image recorded by the cameras, with appropriate processing, several contacts made with the fingers can be identified.
  • These touch surfaces make it possible to considerably improve the possibilities of use, since several contacts can be detected simultaneously.
  • An example of use of this possibility is illustrated by US Patent 5,870,083.
  • these tactile surfaces are relatively expensive.
  • There are also tactile surfaces including a proximity detector for detecting the position of a finger in the vicinity of a solid surface, even when the finger does not actually touch the solid surface.
  • a set of infrared beams may form a grid in front of a glass plate.
  • the system gives the position of the finger on the glass plate.
  • With such proximity sensors, the position of the finger can be detected, for example, by a camera using a linear sensor, by capacitive sensors in a plate forming the touch surface, or by using multiple cameras with linear or matrix sensors.
  • the position of a finger is detected near a material surface on which the finger can come to bear.
  • An example of a capacitive proximity sensor touch surface is described in US Pat. No. 6,323,846.
  • An example of a linear camera touch surface is described in published patent application number US 2005/0248540.
  • Other examples of tactile surfaces using linear or non-linear cameras are given in US Patents 7,236,162 and 4,782,328.
  • The infrared beam grid mentioned above can work even without a glass plate. In this case, it constitutes an "immaterial" tactile surface. It allows the detection of the movement of the finger touching the surface, but without the tactile feedback associated with touching a material surface.
  • Various types of immaterial tactile surfaces can be made. However, it is difficult to obtain stable pointing with such devices, because the finger, in the absence of support, tends to tremble.
  • the user has no feedback indicating that he is pointing, which is likely to cause false movements of the pointer related to involuntary movements of the user.
  • Some tactile surfaces can detect not only the position of the finger but also the pressure exerted by the finger. Such a touch surface is described for example in US Patent 5,159,159.
  • Touch surfaces with force detection also fall into this category. They usually comprise a rigid plate and at least three force sensors arranged on this plate. A user touching the rigid plate exerts forces on the three sensors that depend on the position of the contact. The forces are measured and the position of the contact is computed from them.
  • This type of tactile surface is described for example in US Patents 4,121,049 and US 5,376,948.
  • In some pointing devices based on tactile surfaces, when the user touches the touch surface, the pointing symbol is immediately positioned at a point that depends only on the position of the contact on the touch surface. In contrast, the user of a computer mouse can pick up the mouse without changing the position of the pointing symbol. In such a tactile-surface-based system, the position of the pointing symbol is immediately disturbed by any use of the touch surface, which is not practical, for example, for moving the pointing symbol over short distances and with precision, which is something the user often has to do.
  • the pointing symbol is moved only when the contact on the touch surface is itself moved without interruption of contact.
  • The above problem is then solved, but the device is deprived of one of the major advantages of touch-based pointing systems, namely that it is possible to point instantly at any point of the screen by correctly placing one's finger on the touch surface.
  • a six-dimensional control device employing a plurality of differently-oriented touch-sensitive surfaces is also disclosed in US Patent 5,805,137.
  • In this device, six tactile surfaces forming a cube are used to control the position and orientation of a real or virtual object.
  • This device is not intended to be used as a replacement for a computer mouse.
  • a device that can be used as a secondary display screen or as a graphics tablet, associated with a main display screen, is illustrated by US Patent 5,995,085.
  • the graphics tablet is not used to point the main display screen; it is superimposed on a secondary display screen and is used to draw directly on this secondary display screen.
  • the graphics tablet is not suitable to be pointed with the finger.
  • the invention aims to solve various problems existing in the state of the art and mentioned above.
  • The invention aims to allow pointing using a principle similar to the touchpad, but with better ergonomics, allowing it to compete favorably with the mouse in terms of ease of use.
  • a current device using a touchpad can be defined as comprising:
  • a computer connected to the main display screen; a keyboard allowing a user to send alphanumeric characters to the computer and arranged so that the user typing alphanumeric characters can simultaneously observe the main display screen, the keyboard defining a mean geometric plane of the keyboard which minimizes the average distance between the keys of the keyboard and this plane; a touch surface not superimposed on the main display screen, arranged so that a user can point with a finger on the touch surface while observing the main display screen, the touch surface defining a mean geometric plane of the touch surface which minimizes the average distance between the touch surface and this plane; and
  • the computer being adapted to move a pointing symbol on the main display screen according to the position of the finger on the touch surface.
  • the invention differs from a system using a conventional touchpad in that the keyboard and the touch surface are arranged so that:
  • the angle between the mean plane of the tactile surface and the mean plane of the keyboard, oriented positively in the direction that goes, at the front of the keyboard, from the side of the keyboard where there are no keys towards the side of the keyboard where the keys are, ranges from 30 degrees to 135 degrees.
  • the intersection between the average plane of the touch surface and the average plane of the keyboard is at the back of all the keys of the keyboard.
  • the intersection between the mean plane of the tactile surface and the average plane of the keyboard is preferably oriented along the transverse dimension of the keyboard, that is to say if the keyboard is approximately rectangular, along its longest side.
  • the intersection between the mean plane of the tactile surface and the mean plane of the keyboard is preferably oriented approximately, that is to say for example to within 30 degrees, in a direction parallel to the width direction of the letters drawn on the central part of the keyboard, which usually corresponds to the transverse dimension of the keyboard. This orientation allows the user facing the keyboard and the main display screen to have the touch surface facing him, substantially parallel to the main display screen, which facilitates pointing and relating the touch surface to the main display screen.
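The "mean plane" of the keyboard or of the touch surface is simply the plane minimizing the average (squared) distance to the keys or to the surface points, and the angle constraint above is the angle between these two planes. The short sketch below illustrates one way to compute such a plane and the angle between two planes; it is only an illustration, and the point coordinates, the function names and the 60-degree example are assumptions, not values taken from the patent.

```python
import numpy as np

def mean_plane_normal(points):
    """Unit normal of the least-squares ("mean") plane of a set of 3-D points,
    i.e. the plane minimizing the average squared distance to the points."""
    pts = np.asarray(points, dtype=float)
    centred = pts - pts.mean(axis=0)
    # The normal is the right singular vector of the smallest singular value.
    _, _, vt = np.linalg.svd(centred)
    return vt[-1]

def plane_angle_deg(normal_a, normal_b):
    """Angle between two planes (0..180 degrees) given their unit normals;
    depending on the normal orientations this may return the supplement."""
    cos_angle = np.clip(np.dot(normal_a, normal_b), -1.0, 1.0)
    return np.degrees(np.arccos(cos_angle))

# Example: keyboard keys lying in a horizontal plane, touch surface raised
# behind the keyboard at about 60 degrees (illustrative coordinates in cm).
keyboard_keys = [(x, y, 0.0) for x in range(0, 40, 4) for y in range(0, 15, 3)]
touch_points = [(x, 15.0 + z * np.cos(np.radians(60)), z * np.sin(np.radians(60)))
                for x in range(5, 35, 5) for z in np.linspace(0.0, 12.0, 4)]

angle = plane_angle_deg(mean_plane_normal(keyboard_keys),
                        mean_plane_normal(touch_points))
print(f"angle between mean planes: {angle:.1f} degrees")  # ~60 (or its supplement, 120)
```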
  • The angle chosen makes it possible to reduce the bulk of the tactile surface on the work table and thus to use a larger tactile surface.
  • The decrease in bulk is generally greatest for angles between 30 and 90 degrees, when the touch surface is above the keyboard, the intersection between the mean plane of the touch surface and the mean plane of the keyboard being at the back of the keyboard.
  • The decrease in bulk can also be maximal in configurations where the angle is between 90 and 135 degrees and where the intersection between the mean plane of the tactile surface and the mean plane of the keyboard is further towards the front of the keyboard.
  • As the angle increases beyond 90 degrees, the intersection between the mean plane of the touch surface and the mean plane of the keyboard being at the back of the keyboard, the bulk gradually increases while remaining significantly lower than that of a touch surface located in the plane of the keyboard.
  • The angle chosen makes pointing more intuitive. When the touch surface is approximately parallel to the main display screen, the mind more easily associates the position of the finger with the position of the pointing symbol.
  • An "ideal" angle is 90 degrees; however, angles of 45 to 135 degrees are already a significant improvement over a tactile surface located in the plane of the keyboard. The reduced bulk also makes it possible to increase the size of the touch surface, which favors a more intuitive use.
  • The chosen angle minimizes the movement of the fingers between the keyboard and the touch surface. This movement is smallest for angles of about 60 degrees but remains in all cases smaller than would be necessary for an equivalent tactile surface in the plane of the keyboard.
  • The exact configuration chosen will generally be a compromise between the various objectives of minimizing bulk, making the pointing intuitive and minimizing the movement of the fingers between the keyboard and the touch surface.
  • An angle of about 60 degrees is, for example, a good compromise. Excessively open angles require a large movement of the finger, which should be avoided.
  • the angle between the average plane of the touch surface and the average plane of the keyboard is therefore between 30 degrees and 110 degrees.
  • A solution facilitating the matching of the touch surface with the main display screen is to make the touch surface substantially vertical, the angle between the average plane of the touch surface and the average plane of the keyboard being, for example, from 85 to 95 degrees.
  • A solution that favors limiting the necessary movement of the user's finger is to use a tactile surface that overhangs the keyboard, the angle between the mean plane of the tactile surface and the average plane of the keyboard being, for example, from 30 to 85 degrees.
  • This position allows an easier passage from the keyboard to the touch surface and vice versa, allowing the fingers of the user to alternate between typing and pointing.
  • The touch surface of the invention should preferably be adapted to detect the position of a finger of which only the nail is in contact with the touch surface. Indeed, if it were necessary to bring the finger pulp into contact with a surface close to vertical, this would cause a strong contortion of the finger or the hand of the user, contrary to the ergonomic objectives of the invention.
  • The touch surface is integral with the keyboard, and the connection between the keyboard and the touch surface is adapted so that the angle between the average plane of the touch surface and the average plane of the keyboard is between 30 degrees and 135 degrees, and so that the touch surface is on the side of the keyboard where the keys of the keyboard are.
  • the touch surface is preferably mechanically linked to the keyboard by a fixed or adjustable link.
  • The fact that the touch surface is linked to the keyboard allows the user to get used to the relative position of the touch surface and the keyboard and to acquire the reflexes needed to point quickly using the touch surface. Just as an experienced user can type text virtually without looking at the keyboard, he must be able to point almost without looking at the touch surface, and go from pointing to typing without having to look at the touch surface. This is only possible if the position of the touch surface is well defined with respect to the keyboard. If it constantly changes when the user moves his keyboard, these reflexes cannot easily be acquired.
  • The keyboard provides a solid base for the touch surface, which would otherwise have to be fixed via a second base on the work table, which is not favorable in terms of bulk.
  • The connection can be adjustable; for example, the touch surface can be rotatable about an axis so that it can be folded onto the keyboard to facilitate storage of the assembly.
  • In some of the positions that can then be reached, the angle between the average plane of the touch surface and the average plane of the keyboard may not be between 30 degrees and 125 degrees. But for at least one position that can be reached, the angle between the average plane of the touch surface and the average plane of the keyboard must be between 30 degrees and 125 degrees.
  • The invention is simpler and less expensive to realize if the connection is fixed. In addition, this solution simplifies the adaptation of the user to the keyboard-touch-surface assembly.
  • a system that can be compared to the invention is a touch-screen portable computer used with a projector, the projection screen being considered as a main display screen and the touch screen display of the portable computer constituting a secondary display screen.
  • Such a system is not intended to achieve an ergonomic workstation, as is the case with the invention.
  • the invention is distinguished by the fact that the touch surface is not superimposed on a reproduction of the image displayed on the main display screen.
  • no computer window displayed on the main display screen is reproduced on the secondary display screen.
  • no part of the main display screen is reproduced on the secondary display screen.
  • The invention is also distinguished from a touch-screen laptop used with a projector in that, in such a system, the user who uses the laptop to point normally does so by looking at the screen of the laptop; the system is not adapted so that the user can point on the laptop while watching the projected image.
  • The secondary display screen can be omitted or simplified, which decreases the cost.
  • the high brightness of the screen of a laptop is bad for the eyes of the user and makes him largely lose the benefit of using a remote main display screen. Removing or simplifying the secondary display screen avoids this disadvantage.
  • The secondary display screen forms, with the touch surface, a touch screen. It is used in the context of the invention to display items useful for pointing and easily identifiable by a user who looks very little at the secondary display screen and concentrates on the main display screen. For example, a tracking symbol reproducing the movement of the pointing symbol is displayed on the secondary display screen. To be easily identifiable, this tracking symbol is different from the pointing symbol used on the main screen. In particular, it is desirable that the ratio of the area of the tracking symbol to the area of the touch screen be at least twice the ratio of the area of the pointing symbol to the area of the main display screen.
  • The invention also relates to a complete computer system comprising a computer, a main display screen, a keyboard and a touch surface, and to a data input device used in this system.
  • This data input device is, for example, a complete assembly consisting of the touch surface and the keyboard, integral with each other, associated with computer interfaces and, if necessary, with a computer-readable medium on which instructions loadable by a processor are recorded.
  • the invention also relates to a tactile surface mounted on a support adapted to be used in the device.
  • The assembly comprising the keyboard and the tactile interface is, for example, characterized by the following features:
  • - It comprises a secondary display screen superimposed on the tactile surface, the secondary display screen and the touch surface together forming a touch screen; - It is adapted to display on the touch screen a tracking symbol, and to move the tracking symbol so that its position follows the movements of a finger on the tactile surface within a pointing zone;
  • the pointing area is rectangular and covers at least half of the surface of the touch screen.
  • A problem in the context of the invention is that it is desirable to be able both: a) to "resume" (pick up) the pointing symbol as one would pick up a computer mouse, without immediately disturbing its position; b) to instantly "point" at distant points of the display screen by properly positioning the finger on the touch surface, as can be done using a stylus on common touch screens.
  • One element of the solution consists in the use of a tracking symbol displayed on the secondary display screen and of a control means adapted to: a) when the contact on the touch screen is moved progressively without interruption of contact, move the tracking symbol to follow the movement of the contact; b) when the contact on the touch screen is interrupted and then restored, then: i) leave the tracking symbol stationary if the recovery point at which contact is restored is in a recovery zone associated with the tracking symbol position, the recovery zone having an area less than a quarter of the area of the touch screen and greater than 1 square millimeter; ii) position the tracking symbol near the recovery point if the recovery point is outside the recovery zone associated with the tracking symbol position.
  • The tracking symbol thus appears as an element that can be "grabbed" like a mouse and then moved gradually (the case where the recovery point is in the recovery zone), but that can also be moved instantly.
  • The recovery zone is at least one square centimeter and, for an inexperienced user, it can be, for example, 10 cm². It preferably coincides with the tracking symbol but may also, for example, be centered on the tracking symbol while being larger. It should be noted that in case (a) the tracking symbol follows the movement of the contact but does not necessarily coincide with the contact. On the main display screen, it is then desirable that the position of the pointing symbol depend solely on the position of the tracking symbol (and not on the position of the contact on the touch screen). In this way, the user can "resume" the pointing symbol without the mere fact of touching the touch screen systematically moving this pointing symbol.
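As an illustration only, the following minimal sketch implements the decision rule of points (a) and (b) above for the tracking symbol. The class name, the coordinate units and the one-unit recovery radius are assumptions chosen for the example, not values taken from the patent.

```python
class TrackingSymbol:
    """Sketch of the recovery-zone behaviour: pick up without moving (b-i),
    jump to a distant point (b-ii), and follow continuous movement (a)."""

    def __init__(self, recovery_radius=1.0):        # e.g. about 1 cm
        self.pos = (0.0, 0.0)                        # tracking-symbol position
        self.recovery_radius = recovery_radius
        self._offset = (0.0, 0.0)                    # symbol position minus finger position

    def contact_started(self, finger):
        dx, dy = self.pos[0] - finger[0], self.pos[1] - finger[1]
        if (dx * dx + dy * dy) ** 0.5 <= self.recovery_radius:
            # (b-i) contact restored inside the recovery zone: the symbol is
            # "picked up" and stays put; only later movement is followed.
            self._offset = (dx, dy)
        else:
            # (b-ii) contact outside the recovery zone: jump to the new point.
            self._offset = (0.0, 0.0)
            self.pos = finger

    def contact_moved(self, finger):
        # (a) continuous movement: the symbol follows, keeping the initial offset.
        self.pos = (finger[0] + self._offset[0], finger[1] + self._offset[1])
```

The pointing symbol on the main display screen would then be driven from the tracking-symbol position alone, scaled to screen pixels, so that touching the touch screen inside the recovery zone never moves it.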
  • This adapted control means is particularly interesting when the touch screen is close to vertical and distinct from the main display screen; however, it can equally be used with a horizontal touch screen. It can also be used in a system where the touch screen itself constitutes the secondary display screen, such as a "tablet PC" or a personal assistant. In this case it makes it possible to point with the finger in systems that usually use a stylus: the stylus is used for its accuracy, and the adapted control means allows satisfactory operation despite the lower absolute accuracy of the pointing. Rather than using a secondary display screen, it is also possible to use a touch surface that is not superimposed on any display screen, that is to say on no screen displaying an image that can change over time. In this case, the device produced is particularly economical.
  • the pointing symbol can have the desired behavior without the need to display a tracking symbol.
  • The problem is then solved, with or without a secondary display screen, by a main display control means adapted to: a) when a contact on the touch surface is progressively moved without interrupting the contact, move a pointing symbol on the main display screen so as to follow, on the main display screen, the displacement of a display point associated with the contact and depending solely on the position of the contact; b) when the contact on the touch surface is interrupted and then restored, then: i) leave the pointing symbol stationary if the recovery point at which contact is re-established is in a recovery zone associated with the position of the pointing symbol, the recovery zone having an area less than one quarter of the area of the touch surface and greater than 1 square millimeter; ii) position the pointing symbol in the vicinity of the display point associated with the recovery point if the recovery point is outside this recovery zone.
  • Each point of the main display screen is associated with a point of the touch surface, and the recovery zone surrounds the point of the touch surface that is associated with the point of the main display screen at which the pointing symbol points.
  • This adapted control means is particularly interesting when the touch surface is close to vertical and not superimposed on a display screen; however, it can equally be used with a horizontal touch surface, although direct pointing then becomes less intuitive. It can also be used in a system where the touch surface is superimposed on the main viewing screen viewed by the user, such as a "tablet PC" or a personal assistant. In this case it makes it possible to point with the finger in systems that usually use a stylus: the stylus is used for its accuracy, and the adapted control means allows satisfactory operation despite the lower absolute accuracy of the pointing.
  • The touch surface can be of the different types mentioned above. Preferably, it must be adapted to detect the position of a contact when this contact takes place between the nail of a finger and the touch surface. This is the case for 4-wire or 5-wire resistive touch surfaces, and for tactile surfaces with force detection. This is also the case for proximity sensors that detect a finger close to the touch surface, whether the contact is made with the fingernail or with the finger pulp. However, this is not the case for the capacitive sensors used in the touch pads of most laptops.
  • a resistive touch surface has the advantage of being inexpensive. However, a significant pressure must be exerted on the touch surface so that it detects the presence of a finger, which can generate finger fatigue when the touch surface is used intensely.
  • a touch-sensitive surface with proximity detection has the advantage of not requiring this pressure. The finger should simply touch the touch surface, without the need to exert pressure.
  • It is advantageous to use a sensitive touch surface capable of detecting not only the position of a contact, but also the pressure exerted by the finger at the point of contact. Indeed, the mouse click can then be replaced by a pressure exerted on the touch surface and exceeding a predetermined threshold. Exerting pressure being a simple gesture for the user, this solution is advantageous compared to simulating a mouse click by other means.
  • A touch-sensitive surface with force detection can, for example, be used. This touch-sensitive surface makes it possible to measure the force exerted by the finger at the point of contact, has the advantage of also being inexpensive, and can be curved without major manufacturing difficulties.
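A minimal sketch of how such a pressure threshold could be turned into a simulated button state is given below; the threshold values, the hysteresis margin and the function name are assumptions for illustration, not values from the patent.

```python
PRESS_THRESHOLD = 1.5    # force (arbitrary units) above which a "click" is registered
RELEASE_THRESHOLD = 1.0  # lower release threshold avoids chattering around the limit

def update_button(force, pressed):
    """Return the new simulated button state given the measured contact force."""
    if not pressed and force > PRESS_THRESHOLD:
        return True       # pressure exceeded the threshold: simulate a button press
    if pressed and force < RELEASE_THRESHOLD:
        return False      # pressure released: simulate a button release
    return pressed        # otherwise keep the current state
```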
  • The touch surface used preferably comprises a "material" plate materializing the position of the touch surface, so that the user touching the touch surface has immediate tactile feedback when the finger touches the touch surface.
  • An immaterial tactile surface, which does not provide this feedback, can also be used. It must be integral with the keyboard, so the elements allowing detection (an infrared beam frame, for example) must be mechanically secured to the keyboard.
  • If a secondary display screen is superimposed on the touch surface, it necessarily has a material surface that generates tactile feedback and is considered part of the touch surface. The use of an immaterial touch surface is therefore possible only in the absence of a secondary display screen superimposed on the touch surface.
  • The invention makes it possible to use a tactile surface that is, for example, vertical and pointed at, for example, with the nail.
  • The horizontal bulk of such a touch surface is low, which makes it possible to use a larger surface; its position, almost parallel to the main display screen, is favorable to an intuitive association of the tactile surface with the main display screen; and it is pointed at with the fingernail, which makes its use more pleasant.
  • the diagonal of the tactile surface is preferably greater than 12 cm and preferably less than 35 cm.
  • the touch surface must be sufficiently close to the keys of the keyboard.
  • The smallest distance between the touch surface and a key of the keyboard should therefore preferably be less than 10 cm.
  • the distance between the center of the keyboard and the center of the touch surface should preferably be less than 25 cm.
  • the most ergonomic keyboards are relatively wide, while the touch surface must keep reasonable dimensions so that each point of the surface is easily reached. This leads to a touch surface whose width is less than 85% of the width of the keyboard, and preferably less than 70% of the keyboard width.
  • the touch surface can be bent downward.
  • The keyboard can be weighted. Just as tactile feedback is desirable from the touch surface, good tactile feedback is desirable from the keyboard. The best tactile feedback is achieved with a keyboard whose keys are mounted on springs. The possibility of using such a keyboard is an important advantage of the invention compared to a keyboard and touch-surface assembly using a single "multitouch" touch surface that plays both the role of keyboard and of pointing device.
  • a material tactile surface is fixed on a support.
  • The side of the support opposite the touch surface is preferably non-tactile, so that accidental contacts on this side do not generate involuntary movements of the pointing symbol.
  • The invention is not limited to a touch surface originally mounted on a keyboard; it also includes a touch surface equipped with a support adapted for attaching a keyboard, which, when assembled with the keyboard, yields the keyboard and touch-surface assembly according to the invention.
  • The touch surface is used to position on the display screen a pointing symbol at a point depending on the position of the contact on the touch surface, and the activation of the buttons of a mouse is simulated by other means.
  • These specialized keys are mounted on springs with a relatively large travel, as on a traditional keyboard, because this type of key provides more pleasant feedback to the user and requires less effort than the traditional buttons of a mouse, which have a smaller travel.
  • These specialized keys are installed on the left of the keyboard so that they can be found easily and intuitively by the user. Note that on a left-handed keyboard these keys would rather be on the right, the more skilled hand being used to point.
  • Activation of the button of a "virtual" mouse can also be simulated from gestures made by the user on the tactile surface, using one or more fingers.
  • The computer can wait for the position of the contact on the touch screen to stabilize for a predetermined period of time. It then displays on the main display screen at least one accessory window near the pointing symbol. By bringing the pointing symbol onto one of these windows and stopping it there, the user, for example, generates a mouse click. The computer then returns the pointing symbol to the point on which it had initially stabilized. It is desirable, in the context of the invention, for the accessory window to be drawn by the computer inside the image of the recovery zone on the main display screen. In this way, after the pointing symbol returns to the point on which it had initially stabilized, the user does not have to move his finger to "resume" the pointing symbol.
  • Alternatively, the computer can simply display a symbol signaling the stabilization and detect a predetermined gesture; for example, the user may have to draw an "8" to trigger a mouse click.
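The stabilization ("dwell") detection that triggers the accessory window could be sketched as below; the stillness radius, the delay and the names used are illustrative assumptions, not values from the patent.

```python
import time

STABLE_RADIUS = 0.3   # touch-surface units within which the contact counts as still
STABLE_DELAY = 0.8    # seconds of stillness before the accessory window is shown

class DwellDetector:
    """Reports when the contact position has been stable for STABLE_DELAY seconds."""

    def __init__(self):
        self.anchor = None   # position around which stillness is measured
        self.since = None    # time at which the contact settled near the anchor

    def update(self, contact, now=None):
        """Feed the current contact position; returns True when the contact has
        been stable long enough, i.e. when the accessory window may be displayed."""
        now = time.monotonic() if now is None else now
        if self.anchor is None:
            self.anchor, self.since = contact, now
            return False
        dx, dy = contact[0] - self.anchor[0], contact[1] - self.anchor[1]
        if (dx * dx + dy * dy) ** 0.5 > STABLE_RADIUS:
            self.anchor, self.since = contact, now   # moved too far: restart the timer
            return False
        return (now - self.since) >= STABLE_DELAY
```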
  • the device according to the invention may comprise a simulation means adapted to:
  • the second contact zone depends on the position of the first finger and does not include the position of the first finger.
  • the simulation means is adapted for
  • the simulation means is also suitable for:
  • the button of a mouse is simulated in a very natural way by a contact of the second finger with the tactile surface which replaces the contact of a finger with the mouse button.
  • This preferably requires a "multitouch" touch surface capable of separately detecting the positions of two fingers.
  • One aspect of the invention is to achieve this by means of a tactile surface that does not detect the position of two fingers separately, but can detect the average position of the two fingers.
  • The device of the invention is adapted to determine the existence and the average position of a contact between a finger and the touch surface, and comprises a series of instructions recorded on a medium and loadable by a processor, adapted for the processor to perform a computer procedure comprising: a) a first step of determining the existence and the average position of a contact on the touch surface; b) a second step of determining the existence and the average position of a contact on the touch surface; c) if in both steps a contact exists on the touch surface, and if in the second step the average position of the contact is in an activation zone which depends on the average contact position in the first step and which does not include the average position of the contact in the first step, a step of simulating the activation of a button of a pointing device and/or of generating a change-of-state indicator.
  • The activation zone is adapted to be reached when, a first finger being in contact with the tactile surface during the first determination step, a second finger comes into contact with the tactile surface during the second determination step, which moves the average position of the contact constituted by the two fingers during the second determination step.
  • The activation zone may be a ring whose dimensions are suited to the spacing between the index finger and the middle finger, so that these two fingers can be used.
  • The continuity zone is a disk surrounding the average position of the contact during the first determination step, the radius of the disk being adapted so that, during a movement of the finger on the tactile surface without breaking contact and at a speed usual for the user, and given the time interval between two determination steps, the average position of the contact remains in the continuity zone.
  • Each determination step detects only a single position, corresponding to an average contact position if the contact comprises two contact points, as is the case with two fingers during the second determination step.
  • This computer procedure is particularly interesting when the touch surface is close to vertical and not superimposed on a display screen; however, it can equally be used with a horizontal touch surface, although direct pointing then becomes less intuitive.
  • This procedure can also be used in a system where the touch surface is superimposed on the main viewing screen viewed by the user, such as a "tablet PC" or a personal assistant. In this case it simulates the activation of a mouse button without unduly disturbing the pointing operation and without using an active or passive stylus. It therefore retains a great interest for this type of device.
  • A solution then consists in "ignoring" any measurement leading to an average contact position located in an intermediate zone between the continuity zone and the activation zone. Therefore, in a preferred version of the method, if in both determination steps a contact exists on the touch surface, and if in the second determination step the average position of the contact is in an intermediate zone separating the continuity zone from the activation zone, then the method repeats the second determination step and then uses the result of the first determination step together with the new result of the second determination step.
  • the simulation and / or generation step is preferably followed by the following steps:
  • a third step of determining the existence and the average position of a contact on the tactile surface; a fourth step of determining the existence and the average position of a contact on the tactile surface;
  • if in both of these steps a contact exists on the tactile surface, and if in the fourth step the average position of the contact is in a deactivation zone depending on the average contact position during the third measurement and not including the average position of the contact during the third measurement, a step of simulating the deactivation of a button of a pointing device and/or of generating a change-of-state indicator.
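Taken together, the continuity, activation, intermediate and deactivation zones describe a small classifier over successive average-position measurements. The sketch below illustrates that logic; the zone radii, the function names and the simple toggling of the button state are assumptions chosen for the example, not parameters given by the patent.

```python
# Zone radii around the previous average position (touch-surface units, illustrative).
CONTINUITY_RADIUS = 0.6   # ordinary one-finger movement stays inside this disk
ACTIVATION_INNER = 1.0    # ring reached when a second finger shifts the average position
ACTIVATION_OUTER = 2.5

def classify(prev_avg, new_avg):
    """Classify the jump of the average contact position between two determinations."""
    d = ((new_avg[0] - prev_avg[0]) ** 2 + (new_avg[1] - prev_avg[1]) ** 2) ** 0.5
    if d <= CONTINUITY_RADIUS:
        return "move"      # same finger moving continuously
    if ACTIVATION_INNER <= d <= ACTIVATION_OUTER:
        return "toggle"    # second finger added or lifted: button state change
    return "ignore"        # intermediate zone: discard this measurement and re-measure

def step(state, contact_exists, avg_pos):
    """state = (previous average position or None, button_down); returns the new state."""
    prev_avg, button_down = state
    if not contact_exists:
        return (None, False)                  # full contact break also releases the button here
    if prev_avg is None:
        return (avg_pos, button_down)
    verdict = classify(prev_avg, avg_pos)
    if verdict == "ignore":
        return (prev_avg, button_down)        # keep the old reference and repeat the measurement
    if verdict == "toggle":
        button_down = not button_down         # press when the second finger lands, release when it lifts
    return (avg_pos, button_down)
```

This is deliberately simplified: in the formulation above, the activation and deactivation zones are defined separately, whereas the sketch treats both as the same ring and simply toggles the button state.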
  • A problem with this kind of device is its sensitivity to ambient light.
  • This sensitivity is reduced by means of a tactile surface comprising at least two cameras generating images from which the position of a contact on the tactile surface is obtained by triangulation, and comprising an absorbent protection covering the zone reached by optical rays parallel to the tactile surface and passing through the center of the front lens of a camera, so as to generate a black background against which a finger in contact with the tactile surface stands out.
  • the absorbent protection takes the form of an absorbent cavity.
  • an auxiliary lighting device is used generating a diffused light directed from the rear to the front of the tactile surface.
  • this device comprises a diffusing surface illuminated by a source of illumination.
  • the lighting device also comprises an absorbent grid traversed by the light from the diffusing surface before it reaches the touch surface.
  • The grid constitutes a set of adjacent holes traversed by the light, and the depth of these holes determines the maximum angular aperture of the beams at the grid exit.
  • By controlling the angular aperture of the beams, the grid makes it possible to prevent the illumination from reaching the absorbent protections directly. It is therefore preferably sized to limit the angular aperture of the light rays, so that they do not directly illuminate an area seen by the camera in the absence of a finger on the touch surface.
  • One way to further reduce the sensitivity to ambient lighting is, according to the invention, to use substantially monochromatic auxiliary lighting and to place monochromatic filters on the path of the light beams going to the sensor of each camera.
  • The external light is strongly attenuated by the monochromatic filters, while the light of the auxiliary lighting device, after diffusion by a finger in contact with the tactile surface, reaches the sensor in full.
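For illustration, the triangulation step mentioned above (and shown schematically in Figure 21) amounts to intersecting, in the plane of the touch surface, the two rays along which each camera sees the finger. The sketch below shows one way to do this; the camera positions, the angles and the function name are illustrative assumptions.

```python
import math

def triangulate(cam_a, angle_a, cam_b, angle_b):
    """Intersect the ray leaving cam_a at angle_a with the ray leaving cam_b at
    angle_b (angles in radians, measured in the plane of the touch surface)."""
    ax, ay = cam_a
    bx, by = cam_b
    dax, day = math.cos(angle_a), math.sin(angle_a)
    dbx, dby = math.cos(angle_b), math.sin(angle_b)
    denom = dax * dby - day * dbx          # 2-D cross product of the two ray directions
    if abs(denom) < 1e-9:
        raise ValueError("rays are parallel: no unique intersection")
    t = ((bx - ax) * dby - (by - ay) * dbx) / denom
    return (ax + t * dax, ay + t * day)

# Example: cameras in two corners of the surface, finger seen at 45 and 135 degrees.
print(triangulate((0.0, 0.0), math.radians(45), (10.0, 0.0), math.radians(135)))
# -> (5.0, 5.0)
```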
  • the object of the invention is to facilitate pointing even though the touch surface is not directly superimposed on the display screen. According to the invention, this is facilitated by the use of a pre-pointing symbol.
  • the area in front of the touch surface is divided into a contact area and a detection area, a finger in the contact area being considered to be in contact with the touch surface, a finger in the detection area being detectable but not considered as touching the tactile surface.
  • the contact zone is adapted so that any finger in contact with the tactile surface is in the contact zone.
  • the device according to the invention preferably comprises a control means adapted to:
  • the control means may comprise a signal processing processor and / or a recording medium on which instructions executable by a processor are recorded and allowing the execution of a suitable procedure.
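A minimal sketch of the contact-zone and detection-zone split described above, and of how it could drive the pre-pointing symbol, is given below; the depth thresholds and the names used are illustrative assumptions, not values from the patent.

```python
CONTACT_DEPTH = 2.0     # mm: closer than this, the finger is considered to touch the surface
DETECTION_DEPTH = 30.0  # mm: beyond this, the finger is no longer detected at all

def classify_finger(distance_mm):
    """Split the space in front of the touch surface into contact and detection zones."""
    if distance_mm <= CONTACT_DEPTH:
        return "contact"    # treated as a touch: drives the pointing symbol
    if distance_mm <= DETECTION_DEPTH:
        return "detection"  # detected but not touching: only the pre-pointing symbol follows
    return "none"
```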
  • FIG. 1 represents a device according to the invention.
  • Figures 2 and 3 illustrate a data input assembly including a keyboard and a touch surface.
  • Figure 4 illustrates the position of specialized keys on the keyboard.
  • Figure 5 illustrates the use of a pointing symbol and a recovery zone.
  • FIG. 6 and FIG. 7 illustrate a procedure for carrying out the principles illustrated in FIG. 5.
  • FIG. 8 and FIG. 9 illustrate a principle of activation of virtual buttons of a mouse, by a user bringing several fingers into contact with the touch surface.
  • Figure 10 illustrates the position of a tracking symbol and a contact area indicator for contact by a second finger.
  • Fig. 11 and Fig. 12 show a procedure for realizing the principles of Figs. 8 and 9.
  • FIG. 13 shows a data input assembly having a keyboard and a touch surface, but without a secondary display screen.
  • Fig. 14 shows a curved touch surface and a laser scanning display device of a curved screen.
  • FIG. 15 shows a modification of FIG. 1 in which a specialized processor manages the various interfaces.
  • Figure 16 shows several possible variations of the data input device of Figure 13.
  • Figure 17 shows further variations of this device.
  • Figure 18 shows in front view an improved pointing device.
  • Figure 19 shows a central cross-section of this pointing device.
  • Figure 20 shows the image of a finger on the sensor of a camera.
  • Figure 21 shows the principle of a triangulation calculation.
  • Figure 22 shows a procedure for positioning a pre-pointing symbol.
  • Figure 23 illustrates the case where two fingers simultaneously come into contact with the touch surface.
  • Figure 24 illustrates the principle of using the pre-pointing symbol.
  • Figure 25 illustrates a modification of the device of Figure 19 in which the touch surface is tilted.
  • FIG. 1 represents an embodiment of the device according to the invention.
  • a computer 212 comprising a processor, a memory, and various interfaces, controls a main display screen 201 through a graphics card 211, receives information from a touchpad surface 213 via a touch screen controller 210, controls a secondary display screen 215 through a graphics card 209, and receives information from the keyboard 206.
  • the secondary display screen 215 and the touch surface 213 together form a touch screen having both a secondary display function and a contact detection function on the touch surface.
  • The touch surface 213 responds to the actions of the fingers of the hand 203 of a user whose eye 202 observes the main display screen 201 but can also look at the keyboard 206 or the secondary display screen 215.
  • The tracking symbol has the role of placing itself at the last known position of the finger on the touch screen, so as to allow the user to quickly locate this position if he wants to put his finger back there in order to "resume" both the tracking symbol and the pointing symbol.
  • Figure 2 and Figure 3 show in more detail the data input device constituted by the keyboard and the touch screen.
  • The touch surface 213 is made, for example, with so-called "4-wire resistive" technology, which makes it possible to detect a contact made with the user's nail. It comprises a glass plate and a polymer membrane, the plate being separated from the membrane by polymer beads, for example.
  • the assembly is represented by the element 204 whose outer surface constitutes the touch surface 213.
  • the secondary display screen 215 is for example a liquid crystal screen, also shown in a simplified manner.
  • the touch surface 213 and the secondary display screen 215 are held by a support plate 905 and a frame 901.
  • a flexible seal 903 separates the touch surface from the secondary display screen.
  • a flexible seal 908 separates the secondary display screen from the support plate 905.
  • a frame 904 makes it possible to fix the distance between the frame 901 and the plate 905.
  • the frames 901 and 904 can be fixed on the plate 905 by gluing or by screwing.
  • the plate 905 is itself fixed by gluing or screwing on a piece 906 ending in an acute angle and comprising a flat part on which is fixed the keyboard, by gluing or screwing.
  • FIG. 2 represents the assembly in section and FIG. 3 represents it in front view; different types of hatching were used to distinguish the different elements.
  • The plate 905, as well as the element 906, may preferably be made of steel, in order to minimize deformation at constant thickness.
  • The assembly shown in Figure 2 can be separated into a keyboard attached to the part 906 and a touch surface attached to the part 905, for easy transport.
  • the width LT of the touch surface is preferably less than 85% of the LC width of the keyboard.
  • the diagonal DT of the tactile surface is preferably between 15 and 35 cm.
  • the angle Q between the plane of the keyboard and the plane of the touch surface is preferably less than 85 degrees and greater than 30 degrees.
  • the angle Q can however go up to 125 degrees within the scope of the invention, although too high angles are not very advantageous.
  • the intersection between the plane of the keyboard and the plane of the touch surface is at the back of the keyboard, the front corresponding to the direction towards which the touch surface is looking.
  • Figure 4 shows the keyboard alone, seen from above.
  • the keyboard may comprise specialized keys 401 and 402 shown in FIG. 4, preferably located to the left of the keyboard and having a surface greater than the surface of the keys representing letters. These specialized keys replace the left and right mouse buttons and thus make it possible to simulate the activation of the mouse buttons, the pointing itself being performed on the touch surface.
  • the letter A has also been represented on one of the keys of the keyboard of FIG. 4.
  • the direction of the width of this letter corresponds normally to the transverse dimension of the keyboard.
  • The intersection 403 between the plane of the tactile surface and the plane of the keyboard is also drawn in FIG. 4. It is oriented along the transverse dimension of the keyboard, which is also the direction of the width of the letter A.
  • Figure 5 illustrates the principle of using the pointing symbol and the tracking symbol.
  • the user touches the touch screen with his finger 301.
  • a tracking symbol 319 constituted by a clear disc is displayed on the touch screen, centered on the point of contact.
  • the pointing symbol 302 is at the same time displayed at a corresponding point on the main display screen. Then the user stops touching the touch screen and the tracking symbol remains in place as indicated by the frame (b). Similarly, the pointing symbol 302 remains in place.
  • the user will then touch the touch screen again, either in order to resume and then move the pointing symbol (frame c) or in order to directly bring the pointing symbol to a new position (frame d).
  • the user touches the touch screen at a point which is inside the area of recovery coinciding with the disc 319.
  • The tracking symbol and the pointing symbol do not move, even though the user has touched the edge of the disc 319 and not its center. If the user then moves his finger, the tracking symbol and the pointing symbol will follow its movement.
  • the user touches a point on the touch screen that is outside the area of recovery. The tracking symbol is immediately centered on this new point and the pointing symbol is also immediately brought to a corresponding point of the main display screen.
  • The user can therefore "resume" the pointing symbol without moving it, by touching approximately the corresponding point of the touch screen, a certain margin of error, defined by the recovery zone, being tolerated. He can also point directly at a distant point. He thus benefits both from the advantages of the mouse (picking the pointer back up) and from an advantage specific to the touch screen (direct pointing). In addition, the pointing symbol behaves like a material element that one would move on the touch screen with a finger. Its use is simple and intuitive.
  • Figures 6 and 7 show a procedure by which the computer manages the recovery zone.
  • Figure 6 schematically illustrates the pointing procedure which runs in the background on the computer 212 and which manages the pointing.
  • Figure 7 shows the same procedure.
  • Figure 7 shows the principle of each step whereas Figure 6 details the operations affecting variables used by the program.
  • Pcour (XPcour, YPcour) represents the position of the tracking symbol on the touch surface.
  • Xpoint, Ypoint are the pixel coordinates of the point pointed at by the pointing symbol on the main display screen 201; Xpoint and Ypoint are proportional to XPcour and YPcour.
  • The tracking symbol is then positioned on the secondary display screen at the coordinates (TXPcour, TYPcour) expressed in pixels of this screen.
  • the position Pl represents the position of the contact (finger) on the tactile surface.
  • Pl represents the position of a contact, measured directly on the tactile surface
  • Pcour represents a position of the tracking symbol calculated by the pointing procedure.
  • Pl has a horizontal coordinate varying from 1 to Ntac and a vertical coordinate varying from 1 to Ntac, the point (1,1) being the lower left corner of the touch surface and the point (Ntac, Ntac) being the upper right corner of the touch surface.
  • S represents the offset between the position of the contact on the touch surface and the position of the tracking symbol.
  • C represents the existence of a contact on the tactile surface.
  • C = 1 means there is a contact,
  • C = 0 means there is no contact.
  • The radius of the disk constituting the recovery zone, measured in the same unit ("pixels" of the touch surface) as the positions Pl, is noted limit_0 and is, for example, Ntac / L, where L is the width in centimeters of the touch surface. This corresponds to a recovery zone with a radius of 1 cm.
  • In step 102, the contact and the position are updated: the variable C receives the state of the contact, namely 1 if at least one finger is in contact with the touch surface, 0 otherwise.
  • The position variable Pl, containing two integers, receives the position of the contact on the touch surface.
  • In step 103, the procedure tests whether the finger was in contact during the update.
  • the state of variable C is tested and according to its state the procedure returns to the beginning or continues.
  • In step 104, the pointing procedure tests whether the position of the contact, characterized by Pl, is in the recovery zone surrounding the position Pcour (which is at the center of the touch surface when the computer starts up but can be anywhere on the touch surface thereafter).
  • To do this, the procedure tests the norm of the vector Pl - Pcour. If this norm is lower than the value limit, this means that Pl is in a disc of radius limit centered on Pcour, this disc constituting the recovery zone.
  • Otherwise, the old Pcour value is not retained and the offset S between the position of the tracking symbol Pcour and the position of the contact Pl is therefore zero.
  • The variable S, representing the offset between the point Pl where the operator's finger is and the point Pcour which is taken into account for the pointing, receives the value 0 in step 113.
  • In step 105, the offset between the position of the tracking symbol Pcour and the position of the contact is calculated.
  • Ypoint = YPcour × Nvert / Ntac.
  • In step 109, the value of C is updated and the variable Pl receives the position of the contact returned by the touch surface.
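Putting the steps above together, the background pointing procedure of Figures 6 and 7 can be sketched in code as follows. The variable names (C, Pl, Pcour, S, limit, Ntac, Nvert) are those of the text; Nhor (the horizontal pixel count of the main display screen) is assumed by symmetry with Nvert, and the sample readings at the end are invented purely to show the behaviour.

```python
def pointing_procedure(readings, Ntac, Nvert, Nhor, limit):
    """readings: iterable of (C, Pl) pairs returned by the touch surface.
    Yields successive (Xpoint, Ypoint) positions of the pointing symbol."""
    Pcour = (Ntac / 2, Ntac / 2)   # tracking-symbol position, centred at start-up
    S = (0.0, 0.0)                 # offset between Pcour and the contact Pl
    touching = False
    for C, Pl in readings:         # steps 102 / 109: contact flag and contact position
        if not C:                  # step 103: no finger on the surface
            touching = False
            continue
        if not touching:           # contact has just been (re-)established
            dx, dy = Pl[0] - Pcour[0], Pl[1] - Pcour[1]
            if (dx * dx + dy * dy) ** 0.5 < limit:            # step 104: inside the recovery zone
                S = (Pcour[0] - Pl[0], Pcour[1] - Pl[1])      # step 105: keep the offset ("resume")
            else:
                S = (0.0, 0.0)                                # step 113: direct pointing, no offset
            touching = True
        Pcour = (Pl[0] + S[0], Pl[1] + S[1])                  # tracking symbol follows the finger
        Xpoint = Pcour[0] * Nhor / Ntac                       # pixel mapping, cf. Ypoint = YPcour × Nvert / Ntac
        Ypoint = Pcour[1] * Nvert / Ntac
        yield Xpoint, Ypoint

# Invented example: touch just inside the recovery zone, slide, lift, then touch far away.
readings = [(1, (510, 500)), (1, (530, 500)), (0, None), (1, (900, 200))]
for xy in pointing_procedure(readings, Ntac=1000, Nvert=1080, Nhor=1920, limit=20):
    print(xy)   # (960, 540) unchanged, then a small move, then a jump to (1728, 216)
```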
  • Figures 8 and 9 illustrate a principle of simulating the activation of the buttons of the mouse, using several fingers.
  • The touch surface used is, for example, a resistive touch surface. These touch surfaces cannot identify the presence of more than one contact. When several contacts occur simultaneously on the surface, these tactile surfaces return to the computer, at least approximately, the average of the positions of the different contacts.
  • a finger of the user touches the touch surface.
  • the tracking symbol (not shown) follows the position of that finger on the touch surface.
  • a pointing symbol 302 on the display screen also follows the position of the finger 301.
  • the position of the finger 301 on the touch surface 213 is at a point 305 indicated by a cross 309 which indicates the center of the tracking symbol.
  • the position value returned by the touch surface corresponds to a position indicated by the cross 309 superimposed on the point of contact 305.
  • the position 305 is surrounded by two second contact zones 306 and 307.
  • The tracking symbol 319 of Figure 5 is not represented, nor is the recovery zone that coincides with this symbol.
  • The user moves his finger 301, followed by the pointing symbol 302, the point 305 and the second contact zones 306 and 307, until the pointing symbol 302 is brought to the menu item 303 displayed on the display screen.
  • the user touches with his second finger the point 308 located in the second contact zone 306.
  • This causes a sudden displacement of the position value detected by the touch surface, which passes from the point 309 of Figure 8 to the point 309 of Figure 9, located in an activation zone which is the image of the zone 306 by a homothety of center 305 and ratio 1/2.
  • This abrupt change of position is identified by the pointing procedure implemented by the computer 212, which interprets it for example as a depression of the left mouse button.
  • This depression of the left mouse button causes, for example, a text window 304 to appear on the display screen.
  • the position detected by the touch surface corresponds approximately to the average of the positions of the two fingers coming into contact with the surface, which is why this position is abruptly modified when the second finger touches the tactile surface.
  • when the pointing procedure detects the depression of the left mouse button, it leaves the pointing symbol 302 and the tracking symbol 319 at their last position before the button was depressed.
  • the user could also have touched, with another finger or with the same finger, the second contact zone 307, which would for example have been recognized as a depression of the right mouse button.
  • the user can immediately withdraw his second finger to return to the situation of FIG. 8.
  • the point 309 then suddenly returns to the point 305 which is recognized as a release of the mouse button.
  • the user can also hold his two fingers in contact with the touch surface and move them together. This movement will be followed by the pointing symbol 302 and the tracking symbol 319, and a release of the mouse button will be identified only when the user raises one of the fingers. This allows, for example, the "drag-and-drop" function.
  • since the pointing procedure has only an average contact position, it moves the tracking symbol 319 to follow the movement of the average contact position represented by the cross 309 of Figure 9, but keeps between the tracking symbol and the average position the gap that exists immediately after the mouse button is depressed. This gap corresponds in Figure 9 to the difference between the point 305, which also corresponds to the position of the tracking symbol, and the average contact position represented by the cross 309. The user can then stop contact with the touch surface.
  • the computer stores the last zone pointed to before the contact is broken and continues to display the pointing symbol 302 and the tracking symbol at their last position before the contact was broken.
  • a recovery zone is defined, for example a disk centered on the last zone pointed to before the contact was broken. If the user makes contact again with the surface inside the recovery zone, the position of the pointing symbol at the moment contact is resumed is not changed, and the pointing symbol then follows only the subsequent movements of the user's finger. Similarly, the position of the tracking symbol is not modified, so as to reproduce on the touch surface the movement of the pointing symbol. If the user makes contact with the touch surface outside the recovery zone, the position of the pointing symbol is changed immediately according to the position of the new contact. Similarly, the position of the tracking symbol is immediately brought to the new contact position. A minimal sketch of the jump-based button detection described above is given below.
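  • Purely as an illustration of the mechanism of Figures 8 and 9 (the radii, the zone layout and the event names below are assumptions, not values from the description), a single-contact surface returning only the averaged position could be monitored as follows:

```python
import math

# Assumed illustrative radii, in touch pixels (not values from the description).
R_JUMP_MIN = 60    # a jump larger than this is treated as a second finger touching
R_RELEASE = 20     # a jump back this close to the old point is treated as a release

def classify_average_jump(prev_pos, new_pos, anchor, pressed):
    """Classify an abrupt change of the averaged position returned by the surface.

    anchor is the position of the single finger before the second finger touched
    (point 305 in Figure 8); prev_pos and new_pos are successive averaged readings.
    Returns one of 'move', 'press_left', 'press_right', 'release'.
    """
    jump = math.hypot(new_pos[0] - prev_pos[0], new_pos[1] - prev_pos[1])
    if not pressed:
        if jump < R_JUMP_MIN:
            return "move"                      # ordinary finger movement
        # Large jump: the averaged point now lies halfway between the first finger
        # and the second contact (homothety of ratio 1/2 around point 305).
        dx = new_pos[0] - anchor[0]
        # Assumed convention: zone 306 on one side -> left button, zone 307 on the
        # other side -> right button.
        return "press_left" if dx >= 0 else "press_right"
    if math.hypot(new_pos[0] - anchor[0], new_pos[1] - anchor[1]) < R_RELEASE:
        return "release"                       # second finger lifted, back near 305
    return "move"                              # both fingers moving together (drag)

# Example: a sudden jump to the right of the anchored point is read as a left click.
print(classify_average_jump((500, 500), (560, 500), (500, 500), pressed=False))
```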
  • a tracking symbol consisting of a disk 319, shown in Figure 10, can be generated, as well as an indicator of the second contact zone constituted by a circle 702, also shown in Figure 10.
  • the second contact zones are centered on the new position of the finger, while the tracking symbol remains at the last position of the finger preceding the break of contact, as indicated above.
  • the indicator 702 is centered on the new position of the contact while the tracking symbol is centered on the old position of the contact and these two symbols are therefore off-center with respect to each other as indicated on Figure 10.
  • Figure 11 schematically shows the pointing procedure which runs in the background on the computer 212 and manages the pointing and activation of the buttons of a "virtual" mouse.
  • Figure 12 shows the same procedure.
  • Figure 12 shows the principle of each step while Figure 11 details operations affecting variables used by the program.
  • This procedure is particularly well suited to the device of Figures 1 and 2, but it can also be used with a touch surface directly superimposed on the main display screen, for example in a "tablet PC" or a personal assistant. This avoids, on a "tablet PC", the use of a stylus, which is restrictive for the user.
  • Pcour (XPcour, YPcour) represents the position of the tracking symbol on the touch surface.
  • Xpoint, Ypoint are the pixel coordinates of the point pointed to by the pointing symbol on the main display screen
  • the tracking symbol is then positioned on the secondary display screen at the coordinates (TXPcour, TYPcour), expressed in pixels of this screen. For simplicity, it is assumed in the following that the coordinates on the touch surface and on the secondary display screen are the same.
  • Positions P1, P2 and P3, like Pcour, represent positions on the touch surface. However, P1, P2 and P3 are positions of a contact, measured directly on the touch surface, while Pcour is a position of the tracking symbol calculated by the pointing procedure.
  • P1, P2 and P3 each have a horizontal coordinate varying from 1 to Ntac and a vertical coordinate varying from 1 to Ntac, the point (1,1) being the lower left corner of the touch surface and the point (Ntac, Ntac) being the upper right corner of the touch surface.
  • the representation conventions are therefore the same for Pcour, P1, P2, P3.
  • S represents the offset between the position measured on the touch surface and the position of the tracking symbol.
  • C represents the existence of a contact on the tactile surface.
  • C = 1 means there is contact, C = 0 means no contact.
  • the continuity zone corresponds to an area in which the point of contact should normally remain between two successive updates if the user moves his finger without breaking the contact. Its exact size therefore depends on the refresh rate, but in general it is very small.
  • the deactivation zone represents the zone into which the average contact position must jump abruptly in order to consider that a button has been released.
  • the pointing procedure continuously updates the position Pcour of the tracking symbol, which also corresponds, up to a proportionality factor, to the position of the pointing symbol. It also identifies the events "button 1 depressed", "button 2 depressed" and "button released", for which it can for example generate software interrupts simulating the activation of the corresponding buttons of a mouse.
  • the computer and the various interfaces manage the inputs and outputs of this procedure, providing in particular the position of the contact on the touch surface and controlling in particular the laser module to display the tracking symbol at the point Pcour.
  • at step 102, the contact and the position are updated: the variable C receives the state of the contact, namely 1 if at least one finger is in contact with the touch surface, 0 otherwise.
  • the position variable P1, containing two integers, receives the position of the contact on the touch surface.
  • at step 103, the procedure tests whether the finger was in contact during the update.
  • the state of variable C is tested and according to its state the procedure returns to the beginning or continues.
  • at step 104, the pointing procedure tests whether the position of the contact, characterized by P1, is in the recovery zone surrounding the position Pcour (which is at the center of the touch surface when the computer starts, but can be anywhere on the touch surface thereafter).
  • to do this, the procedure tests the norm of the vector P1 - Pcour. If this norm is lower than the limit value, P1 is in a disc of radius limit centered on Pcour, this disc constituting the recovery zone.
  • if P1 is not in the recovery zone, the old Pcour value is not resumed, and the offset S between the position of the tracking symbol Pcour and the position of the contact P1 is therefore zero.
  • the variable S, representing the offset between the point P1 where the operator's finger is and the point Pcour which is taken into account for the pointing, receives the value 0 at step 113. If P1 is in the recovery zone, the Pcour value is resumed, and therefore remains unchanged.
  • the offset between the position of the tracking symbol Pcour and the position of the contact is then calculated at step 105.
  • the variable S receives at step 105 the offset between the position of the finger P1 and the position of the tracking symbol Pcour which, unless P1 = Pcour, differs slightly from the position P1.
  • Step 108 calculates the position of the tracking symbol Pcour by shifting the last position of the contact P1 by the offset S calculated in the previous step. The procedure then moves the tracking symbol and the pointing symbol to place them respectively at the point Pcour on the touch surface and at the corresponding point (Xpoint, Ypoint) on the main display screen. When a recovery occurred at step 104, the existence of this offset materializes the difference between the position Pcour of the tracking symbol and the position P1 of the finger in contact with the touch surface. In the absence of a recovery, the offset is zero.
  • at step 109, the value of C is updated and the variable P2 receives the position of the contact returned by the touch surface.
  • the virtual button number 2 is activated in step 115.
  • the computer simulates the depression of the left button of a computer mouse, which generates a state-change indicator that can be a "beep" sound emitted by the computer or simply a visible change on the main display screen, such as the opening of a window.
  • the computer simulates the depression of the right mouse button.
  • the procedure tests at step 127 whether the average position is in a continuity zone, which is a disc of radius L0, with L0 < L1 < L1b < L2 < L2b. If it is in the continuity zone, the current value P1 of the position of the contact on the touch surface is replaced at step 125 by the last position P2, and the procedure returns to step 108 to iterate the detection loop for the activation of the buttons. If it is outside the continuity zone, the procedure "ignores" the value P2 and returns to step 108 without modifying P1.
  • the procedure continues with the step 116 of calculating the offset S.
  • the position returned by the touch surface is now the average position of the two fingers that are in contact with the surface.
  • at step 118, the variables C and P3 are then updated.
  • at step 117, the difference in position is assumed to correspond to a joint movement of the two fingers on the touch surface, which must be followed by the pointing symbol.
  • the variable Pcour is then updated accordingly during step 117.
  • Pcour is actually modified if P3 was in the continuity zone, to take into account the displacement of the position of the contact.
  • the loop comprising steps 117 to 120 and 126, 121 thus moves the pointing symbol until the user raises one or both fingers.
  • at step 117, the tracking symbol and the pointing symbol are moved to their new positions Pcour and (Xpoint, Ypoint) respectively.
  • when the button has been deactivated at step 122, it is again necessary to recalculate the offset S (step 123) and to replace P1 with the last known position of the contact, which is P3.
  • the vector S is increased at step 123 by the vector P2 - P3, and the variable P1 receives the value of P3 at step 124, the procedure then returning to step 108. A simplified sketch of this button-handling loop is given below.
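  • To make the loop easier to follow, here is a minimal sketch of the jump classification and of the offset bookkeeping described above; the radii, the helper names and the simplifications are assumptions made only for illustration:

```python
import math

# Assumed illustrative radii in touch pixels, with L0 < L1 < L1B < L2 < L2B.
L0, L1, L1B, L2, L2B = 10, 50, 90, 120, 180

def norm(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def classify_jump(p1, p2):
    """Classify the displacement from P1 to the new reading P2 (around step 109)."""
    d = norm(p1, p2)
    if d < L0:
        return "continuity"   # ordinary movement: P1 is simply replaced by P2 (step 125)
    if L1 < d < L1B:
        return "button1"      # activation zone 1, e.g. the left mouse button
    if L2 < d < L2B:
        return "button2"      # activation zone 2 (step 115), e.g. the right mouse button
    return "ignore"           # P2 is ignored and P1 is kept unchanged

def follow_drag(p2, p3, pcour):
    """While a button is held, a joint movement of the two fingers moves Pcour
    (steps 117 to 121), but only if P3 stays in the continuity zone around P2."""
    if norm(p2, p3) < L0:
        return (pcour[0] + p3[0] - p2[0], pcour[1] + p3[1] - p2[1]), p3
    return pcour, p2          # outside the continuity zone: P3 is ignored

def after_release(s, p2, p3):
    """Once the release has been detected (step 122), the offset S absorbs the
    jump P2 - P3 (step 123) and P1 is replaced by P3 (step 124)."""
    return (s[0] + p2[0] - p3[0], s[1] + p2[1] - p3[1]), p3

# Example: a 70-pixel jump of the averaged position is read as virtual button 1.
print(classify_jump((500, 500), (570, 500)))
```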
  • Button 1 and Button 2 can respectively simulate the left and right buttons of a computer mouse, or any other keyboard key or mouse button.
  • the number of buttons can be increased by increasing or diversifying the activation zones.
  • Activation zones are not necessarily annular and they can take any type of shape, the ring shape being however simple and ergonomic.
  • a tracking symbol on the touch screen is optional. Indeed, the tracking symbol is mostly a help for the novice user. The experienced user can find the recovery zone without the help of the tracking symbol.
  • the deletion of the tracking symbol does not change the procedures of Figures 6, 7, 11, 12, except that the tracking symbol is no longer displayed (but its position, which also corresponds to the position of the pointing symbol, must still be calculated). Removing the tracking symbol makes it possible to use a touch surface that is not superimposed on a secondary display screen, which is particularly economical.
  • Fig. 13 shows a modification of Fig. 6, in which the secondary display screen 215 has been removed and only the touch surface 213 is used.
  • a curved touch surface facilitates access to the tactile surface for the fingers of the hand.
  • a display device consisting of a screen scanned by a laser spot easily adapts to a curved touch surface and makes it possible to obtain a secondary display screen at a lower cost.
  • Figure 14 schematically illustrates such a display device.
  • This has a touch surface 213 which is curved, integral with the keyboard 206, and transparent.
  • On the rear face of this touch surface is a diffusing surface 225.
  • a collimated laser diode 228 emits a beam and two galvanometric mirrors 227, 226 make it possible to control the direction of this beam and therefore the position of the laser spot 224 formed on the surface.
  • the touch surface 213 may be of the force sensor type, which is well suited to the use of curved surfaces.
  • the computer thus controls the display via a control card of the galvanometric mirrors, shown.
  • the scanning assembly 229 comprising the laser diode and the galvanometric mirrors is linked to the keyboard and to the touch surface.
  • the device of Figure 1 has the defect of requiring multiple interfaces directed to the computer and of requiring complex management on the part of the computer. This problem is solved by the device of Figure 15, in which only the elements differing from Figure 1 have been renumbered.
  • a control device 601 receives the information coming from the touch surface 213 via an interface 605. It controls the display 215 via the interface 604, so as to display the tracking symbol.
  • the control device 601 includes a memory in which executable instructions are stored, and a processor executing these instructions. The instructions stored in the memory of the device 601 are adapted so that the processor follows the procedure of FIGS.
  • the controller 601 moves the tracking symbol on the screen 603 as determined by this procedure, and sends to the computer 212, via the interface 607, the current position Pcour of the tracking symbol.
  • the displacement of the pointing symbol on the low-resolution screen 215 being imprecise, the controller 601 nevertheless returns to the computer 212 a position as precise as possible, limited only by the performance of the touch surface 213 and its interface 605, which can detect very small displacements. This position returned to the computer 212 then allows it to control the position of the pointing symbol on the screen 201 with the necessary accuracy.
  • the touch surface 213 may be covered with a protective fabric diffusing the light coming from the display. Since the required precision is low, this diffusion is not troublesome.
  • the controller 601 manages the various interfaces and returns to the computer 212 only the useful information. The controller 601 may also relay information from the keyboard 206, received through the interface 606, for transmission to the computer 212 through the interface 607.
  • the assembly consisting of the display 603, the touch surface 602, the keyboard 206, the interfaces 604, 605, 606, 607 and the controller 601 may be constituted by a touch-screen laptop, the processor of this computer constituting the controller 601, and the position information of the pointing symbol being transmitted to the main computer 212 by an Ethernet or USB connection.
  • this solution is excessively expensive, because the processor is then oversized and the secondary display screen 603 used is unnecessarily precise and unnecessarily bright.
  • the controller 601 is a microcontroller of a reasonable cost
  • the screen 603 is of low definition and weakly illuminated or not illuminated
  • the interface 607 is a USB interface commonly used for keyboards
  • the assembly is integrated in the keyboard case.
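  • As an illustration of how such a controller might tie the pieces together (the interface stubs, the report format and the polling structure below are assumptions, not part of the description), a simplified event loop could look like this:

```python
import time

# Hypothetical hardware-access stubs standing in for the interfaces 604 to 607.
def read_touch_surface():        # interface 605: returns (contact, (x, y))
    return (False, (0, 0))

def read_keyboard():             # interface 606: returns a list of key events
    return []

def draw_tracking_symbol(pos):   # interface 604: update the secondary display
    pass

def send_to_computer(report):    # interface 607: e.g. a USB/HID-style report
    pass

def controller_loop(iterations=1000):
    pcour = (1000, 1000)         # tracking-symbol position, assumed to start centered
    for _ in range(iterations):
        contact, p1 = read_touch_surface()
        keys = read_keyboard()
        if contact:
            # The pointing procedure (recovery zone, offset, virtual buttons)
            # would update pcour here; it is omitted in this sketch.
            pcour = p1
            draw_tracking_symbol(pcour)
        # Only the useful information is forwarded to the computer 212.
        send_to_computer({"pcour": pcour, "keys": keys})
        time.sleep(0.01)         # assumed 100 Hz polling rate
```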
  • a touch surface capable of detecting the pressure exerted by the fingers can be used. Such a touch surface is described for example in US 5,159,159.
  • Another type of pressure sensing touch surface is a force sensing surface. In this case, the click of the mouse can be simulated by a higher pressure exerted by the finger. The user touches the touchpad to point, and presses harder to click.
  • the complexity of the procedure of Figures 11 and 12 results in part from the fact that resistive touch surfaces detect only one contact at a time. To improve reliability and to simplify the procedure, it is advantageous to use a touch surface capable of simultaneously detecting multiple contacts. In addition, resistive touch surfaces require significant finger pressure on the surface. This pressure is lower when only the nail comes into contact with the surface, but still remains significant. To avoid this problem, it is advantageous to use proximity-sensing touch surfaces that do not require finger pressure on the touch surface.
  • the procedures of Figures 6, 7, 11 and 12 can be used with horizontal touch surfaces, not necessarily attached to the keyboard, which facilitates the use of certain types of touch surfaces that cannot be operated with the fingernail, such as capacitive touch surfaces.
  • the touch surface 213, close to vertical in Figure 1, is then replaced by a horizontal touch surface independent of the keyboard.
  • the procedures of Figures 6, 7, 11 and 12 can also be used with a touch surface directly superimposed on a main display screen viewed by the user, such as in a personal assistant or a "tablet PC".
  • the main display screen 201 and the secondary display screen 215 are replaced by a single display screen.
  • the tracking symbol and the pointing symbol are then combined into a single symbol, which may for example comprise an arrow (pointing symbol) and a circle indicating the limits of the recovery zone (tracking symbol).
  • the procedures of Figures 6, 7, 11 and 12 can also be used with a touch surface 213 which is not superimposed on any display screen and is used only to point at a remote display screen.
  • the pointing symbol remains displayed but there is no display of a tracking symbol.
  • the position Pcour of the tracking symbol is still calculated, but essentially constitutes an intermediate used to determine the position (Xpoint, Ypoint) of the pointing symbol.
  • Figure 16 illustrates several possible variations of the device of Figure 13, also adaptable to the case where a secondary display screen is superimposed on the touch surface.
  • Figure 16 (a) shows an angle Q of 90 degrees.
  • Figure 16 (b) shows a 90-degree angle, but the touchpad has been positioned above the keyboard and not behind the keyboard.
  • Figure 16 (c) shows a high angle close to 120 degrees.
  • Figure 16 (d) shows an angle close to 30 degrees.
  • FIG. 17 (a) illustrates a variation of the device of FIG. 13, comprising a compound articulation:
  • Figure 17 (b) illustrates a variation of the device according to the invention, in which the touch surface is mounted on a support independent of the keyboard. The keyboard and the touch-surface support are then placed on a table 932.
  • Figures 18 to 24 illustrate an improved embodiment of the keyboard and touch-surface assembly and of the control procedures. This embodiment uses a type of touch surface whose general principle is described in US Pat. Nos. 7,236,162 and 4,782,328.
  • Figure 18 schematically shows the keyboard and touch-surface assembly, in front view.
  • Figure 19 shows the same assembly seen in a central cutting plane.
  • the touch surface comprises a transparent plate 1008. This plate is connected by means of pads (1012 for example) to a holding plate 1003 made of steel.
  • the studs are equipped with force sensors for measuring the pressure exerted by a finger on the surface
  • the plate 1003 is pierced with a hole that accommodates a grid 1005, for example made of black cardboard, and a diffusing plate 1006, for example made of white paper.
  • the diffusing plate 1006 is backlit by the bulb 1007 fixed on the part 906. The light emitted by the bulb 1007 therefore passes through the diffusing film 1006, the grid 1005 and the plate 1008 before possibly illuminating a finger coming into contact with the plate 1008.
  • the plate 1003 is fixed on the part 906 and the cameras 1001, 1002 are fixed on the plate 1003.
  • the cameras 1001, 1002 produce two images, i.e. a stereoscopic image, from which the position of the finger is calculated.
  • the force sensors can detect a pressure on the plate 1008, which replaces for example the depression of the left button of a mouse.
  • also visible are the protection 1020, which is for example made of absorbent and diffusing black cardboard, and the grid 1005, which can also be made of absorbent and diffusing black cardboard.
  • in Figure 19, the portion of the protection 1020 which effectively cuts the central cutting plane is shown in thicker lines; for example, this part is pointed to by the arrow 1022.
  • the interior 1026 of the protection has been shown darker than its outside 1027. The useful part of the image forming on the sensors of the cameras comes from inside the protection.
  • the edge 1022 protects the cavities 1024 and 1025 against part of the ambient light: ambient light oriented in the plane of the figure does not penetrate directly into the cavities 1024 and 1025 without first being diffused by the keyboard, since the extreme ray 1029 does not reach the inside of the cavities.
  • the second cavity 1025 is formed by two folds 1023 and 1021 of the protection. It is better protected from the ambient light than the cavity 1024 and its bottom is blacker.
  • the orientation of the fold 1023 is determined so that the cameras do not see the side of this fold which is on the side of the cavity 1025, the largest.
  • the fact that the camera can see the top of the fold and the side of the fold which is inside the cavity 1024 creates a slightly shifted boundary between the images of the two cavities on the sensor of the camera. As can be seen in FIG.
  • the protection 1020 protects the top and the sides of the touch surface 1008, ie the entire area in the useful field of view of the cameras 1001 and 1002.
  • the cameras are approximately focused on the entrance of the cavity 1025, limited by the folds 1021 and 1023, so as to have on the sensors a black area and therefore little disturbed by the possible variations of the external lighting.
  • the useful diameter of the front lens of the cameras, as well as the relative position of the camera and the plate 1008, are adjusted so that the image of the entrance of the cavity 1025 is not disturbed by the presence of the window 1008.
  • a typical useful diameter may be of the order of 2 mm in order to be able to detect the presence of a finger about 1 mm from the surface 1008.
  • a smaller diameter facilitates the optical design and makes it possible to detect a finger closer to the touch surface; a larger diameter makes it possible to use a weaker bulb 1007.
  • the grid 1005 makes it possible to limit the opening of the light beam coming from the diffusing surface 1006 and passing through the transparent plate 1008.
  • the light that has passed through this grid has a maximum inclination, relative to a direction orthogonal to the surface 1008, which is limited by the width of the openings of the grid and by the thickness of the grid in the horizontal direction.
  • the most inclined ray 1028 passing through the lowest openings of the grid emerges without being stopped by the protection 1020, which ends in the drawing at the upper end of the inclined right edge 1022.
  • the image obtained on the sensor of a camera in the presence of a finger approaching the touch surface is shown in Figure 20. The image is dark except for the image of the finger, which is lighter, and possibly more blurred than in the drawing.
  • the coordinate in the direction in which the sensor is shortest is roughly proportional to the angle Q shown in Figure 19.
  • the coordinate in the direction in which the sensor is longest is roughly proportional to the angle θ shown in Figure 18.
  • the image is analyzed by a processor, which may be the processor of a computer or a specialized signal-processing processor replacing the controller 601 of Figure 15, in the case where the entire system is organized as in Figure 15.
  • the signal processor receives the data coming from the two cameras, from the force sensors and from the keyboard. It analyzes these data and transmits to the computer only the data coming from the keyboard, the information on the depression of the virtual mouse buttons and the position of the various symbols used for pointing.
  • the processor can follow for example the procedure of Figures 6 and 7 so as to work with a recovery zone. This procedure is modified by removing the display of the tracking symbol, since the device of Figures 18 and 19 does not have a secondary display screen on which to display a tracking symbol. On the other hand, the computation of Pcour remains useful, since it is from Pcour that the position of the pointing symbol on the display screen is calculated.
  • This procedure is also modified by replacing step 102 with the procedure described in FIG. 22, and replacing step 109 with the same procedure described in FIG. 22.
  • the pre-pointing symbol is for example a translucent or flashing pointer of the same shape as the pointing symbol. Its role is to replace the direct view of the finger or stylus near the screen, which the user of a touch screen usually has and which is not available in the present invention, because the touch surface is not superimposed directly on the display screen.
  • the user sees his finger pass over the screen and adjusts the position of the finger according to this image, so as to aim at a point correctly on the first attempt.
  • here the user does not see his finger, but he sees the pre-pointing symbol, which he uses in the same way to touch the touch surface directly at a target point, much more precisely than he could without the pre-pointing symbol.
  • Figure 24 shows the use of the pre-pointing symbol.
  • the finger points at a point on the touch surface and the pointer is displayed at the corresponding point on the main display screen.
  • the finger moves away from the screen abruptly and the pointing symbol remains in place.
  • the finger approaches the screen without coming into contact. The pointing symbol remains, but the pre-pointing symbol is displayed at a point corresponding to the position of the finger.
  • the finger moves along the screen, downwards, but does not come into contact. Only the pre-pointing symbol follows it.
  • the finger finally comes into contact with the screen. The pointing symbol then replaces the pre-pointing symbol.
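  • As an illustration only (the state names, the return values and the thresholds are assumptions), the choice between the two symbols can be summarized as a small decision function:

```python
def symbols_to_show(finger_detected, z, contact_limit=1.0):
    """Decide which symbol follows the finger, given whether the cameras see a
    finger and its estimated height z above the touch surface (sketch only)."""
    if not finger_detected:
        return {"pointing": "stays at last position", "pre_pointing": "hidden"}
    if z > contact_limit:
        # Hovering finger: the pre-pointing symbol follows it, the pointing
        # symbol stays where it was.
        return {"pointing": "stays at last position", "pre_pointing": "follows finger"}
    # Contact: the pointing symbol replaces the pre-pointing symbol.
    return {"pointing": "follows finger", "pre_pointing": "hidden"}

# Example: a finger detected 3 mm above the surface only moves the pre-pointing symbol.
print(symbols_to_show(True, z=3.0))
```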
  • at step 1101, the images coming from the two cameras are loaded by the processor.
  • the processor identifies whether the images are uniformly dark or whether they comprise a bright part. For example, the images are considered to have a bright part if the illumination of at least a fixed number of pixels is above a set illumination limit value. In the case where the two images comprise a bright part, it is considered that a finger is present and the procedure goes to step 1104. In the opposite case, the procedure, at step 1103, sets the value of the variable C of Figure 6 to 0, then ends and returns control to the procedure of Figure 6, which goes to the next step.
  • the procedure extracts from the image present on one camera the point B of Figure 20, which is the point corresponding to the minimum value of the coordinate proportional to the angle Q among all the "bright" pixels whose illumination is above the illumination limit value or, if several bright pixels have the same coordinate or have a coordinate lower than a contact limit value below which contact is considered, the center of gravity of these pixels.
  • the procedure is the same with the image on the other camera.
  • the procedure calculates the coordinates X, Y, Z of the finger from the coordinates of the points B obtained on each camera. For this, it calculates the X, Y, Z coordinates using triangulation formulas that can be established, according to the geometric characteristics of the system, according to the principle illustrated in FIG. 21.
  • the formulas for calculating the coordinates X, Y are obtained according to the principle shown in Figure 21 (a). These are the coordinates of the intersection A of the rays coming from the optical centers O1, O2 of the two cameras and characterized by the angles θ1, θ2, themselves obtained from the coordinates of the points B on the two cameras.
  • the optical centers of the two cameras correspond approximately to the position of their front lens.
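  • For illustration only, and under assumed conventions that are not specified in the description (O1 = (x1, y1) and O2 = (x2, y2) taken in the plane of the touch surface, the angles θ1 and θ2 measured from the x axis, with tan θ1 different from tan θ2), the intersection A = (X, Y) of the two rays can be written:

```latex
X = \frac{x_1 \tan\theta_1 - x_2 \tan\theta_2 + y_2 - y_1}{\tan\theta_1 - \tan\theta_2},
\qquad
Y = y_1 + (X - x_1)\,\tan\theta_1
```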
  • at step 1106, the procedure tests whether Z is less than a final contact limit value below which contact with the touch surface is considered to exist. If Z is less than the final contact limit value, the procedure proceeds to step 1108; otherwise it proceeds to step 1107.
  • at step 1108, the procedure assigns the value 1 to the variable C of the procedure of Figure 6, and the coordinates (X, Y) to the point P1 of the procedure of Figure 6. It then ends and returns control to the procedure of Figure 6, which moves to the next step. A sketch of steps 1101 to 1108 is given below.
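  • Purely as a reading aid (the image layout, the thresholds, the pixel-to-angle mapping and the crude height estimate below are all assumptions, not details of the description), the analysis of the two camera images could be sketched as follows:

```python
import math

# Assumed illustrative thresholds (not from the description).
LIGHT_LIMIT = 128         # illumination above which a pixel is considered bright
MIN_BRIGHT_PIXELS = 20    # minimum number of bright pixels to assume a finger is seen
CONTACT_LIMIT = 0.02      # assumed threshold standing in for the final contact limit

def bright_part_present(image):
    """Test of the kind used before step 1104: does the image have a bright part?"""
    return sum(1 for row in image for pix in row if pix > LIGHT_LIMIT) >= MIN_BRIGHT_PIXELS

def fingertip_angles(image, q_per_row=0.002, theta_per_col=0.002):
    """Find point B (the bright pixel with the smallest Q-proportional coordinate)
    and convert its pixel coordinates into the angles (Q, theta). Assumed mapping."""
    bright = [(r, c) for r, row in enumerate(image)
                     for c, pix in enumerate(row) if pix > LIGHT_LIMIT]
    r, c = min(bright)                    # smallest row index taken as smallest Q
    return r * q_per_row, c * theta_per_col

def intersect(o1, th1, o2, th2):
    """Intersection of the two rays from the optical centers (cf. Figure 21 (a))."""
    x = (o1[0] * math.tan(th1) - o2[0] * math.tan(th2) + o2[1] - o1[1]) \
        / (math.tan(th1) - math.tan(th2))
    return x, o1[1] + (x - o1[0]) * math.tan(th1)

def analyze(img1, img2, o1, o2):
    """Rough equivalent of steps 1101 to 1108: returns (C, P1) for Figure 6."""
    if not (bright_part_present(img1) and bright_part_present(img2)):
        return 0, None                    # step 1103: no finger seen, C = 0
    q1, th1 = fingertip_angles(img1)
    q2, th2 = fingertip_angles(img2)
    x, y = intersect(o1, th1, o2, th2)    # X, Y by triangulation
    z = (q1 + q2) / 2                     # crude stand-in for the height estimate Z
    if z < CONTACT_LIMIT:                 # step 1106
        return 1, (x, y)                  # step 1108: C = 1, P1 = (X, Y)
    return 0, None                        # step 1107: finger seen but not in contact
```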
  • since the activation of the left mouse button is done by pressing on the touch surface, the procedure of Figure 12 only needs to activate the right mouse button. This is done for example by adopting a very high limit L1b in the procedure of Figure 11, so that only one button can be activated, namely button 1, which must then simulate the right mouse button.
  • step 1108 also incorporates the display of the pointing symbol or its displacement, and the "END" step is replaced by a return to the "start” step to make the procedure iterative.
  • the left mouse button can be activated by a pressure on the touch surface, measured by the pressure sensors. It is also possible to define two pressure limit values, so that a low but non-zero pressure simulates the depression of a first mouse button, and a higher pressure simulates the depression of a second mouse button.
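  • As a sketch only (the threshold values are assumptions, not part of the description), the two-threshold scheme could be implemented like this:

```python
# Assumed illustrative pressure thresholds, in arbitrary sensor units.
P_BUTTON1 = 0.5    # low but non-zero pressure  -> first mouse button
P_BUTTON2 = 2.0    # clearly higher pressure    -> second mouse button

def buttons_from_pressure(pressure):
    """Map the pressure measured by the force sensors to virtual button states."""
    if pressure >= P_BUTTON2:
        return {"button1": False, "button2": True}
    if pressure >= P_BUTTON1:
        return {"button1": True, "button2": False}
    return {"button1": False, "button2": False}   # light touch: pointing only

# Example: a light press activates only the first button.
print(buttons_from_pressure(0.8))   # {'button1': True, 'button2': False}
```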
  • the positioning of the cameras 1001 and 1002, away from the useful area of the touch surface 1008, is adapted to allow the use of the pre-pointing symbol over the entire touch surface. Indeed, if the cameras were too close to the corner of the touch surface, as is the case in US Pat. No. 4,782,328 and US Pat. No. 7,236,162, then, given the angle of the beams reaching the sensors, it would not be possible to detect a finger that has not yet reached the touch surface when that finger is too close to the camera. This constraint is even more important if it is desired that the direct light reaching the sensors come only from the interior of the cavities 1024, 1025. This positioning of the cameras also makes it possible to reduce the manufacturing constraints on the optics of the cameras.
  • the cameras 1001 and 1002 must also both be positioned on the same side of the touch surface, the line joining the front lenses of the cameras being sufficiently below the useful area of the touch surface 1008. Indeed, if the useful area of the touch surface passed through the line joining the front lenses of the two cameras, the position of a finger along this line could not be determined.
  • the cameras 1001 and 1002 may possibly be replaced by linear cameras which only image the cavity 1025. In this case the pre-pointing symbol cannot be used.
  • protections can be added to the system, in the form of transparent or reflecting plates according to whether or not they must be traversed by the light. Such protections make it possible, for example, to completely enclose the bulb 1007 to prevent light leakage, and to protect the complete optical path of the beams going to the cameras so that it can be reached by the user's fingers only in the area actually corresponding to the touch surface, and also to protect it from dust or any external objects.
  • to reduce the influence of external light, it is possible to replace the bulb 1007 with an LED or another device generating substantially monochromatic illumination, and to place a monochromator filter on the optical path of the light reaching the sensor of a camera.
  • the monochromator filter can for example be placed just in front of the CCD or CMOS sensor of the camera. In this case the ambient light loses much of its intensity through the monochromator filter, while the light from the LED keeps its intensity.
  • Figures 18 and 19 were drawn, for simplicity, with a vertical touch surface, but the touch surface can be tilted forward to improve the ergonomics of the device. Protections can be added on the sides of the keyboard to completely prevent ambient light from reaching the inside of the cavities 1024 and 1025 directly, without first being diffused by a protection. In Figures 19 and 18 it is also possible to remove the glass plate 1008. This does not prevent the operation of the touch surface, but it removes the immediate tactile feedback to the user performing a pointing action, since the pointing is then done in the air, without any material contact, on an immaterial surface defined only by an observation plane of the cameras. Moreover, in this case it is no longer possible to simulate the depression of a mouse button according to a pressure measured on the plate 1008.
  • the touch surface can also be made slightly curved.
  • the step 1106 must therefore be replaced by a condition dependent on X, Y, Z.
  • Figure 25 shows a modification of Figure 19, in which the surface 1008 is tilted and curved.
  • the studs 1012 equipped with force sensors can advantageously be replaced by the device of FIG. 26 (b).
  • the plate 1003 is pierced with a hole closed by a plate 1060 and in which is housed a compression spring 1064.
  • the spring 1064 presses a plate 1061 glued on or otherwise connected to the plate 1008.
  • the plate 1061 itself bears on the plate 1062 integral with the plate 1003 via a connecting element 1063.
  • the plate 1061 and the plate 1003 are metallic and connected to the two terminals of a device for detecting the existence of a contact. This device transmits information about the existence of a contact to the processor that retransmits them to the computer.
  • the part 1064 comes into contact with the plate 1008 and the contact is established.
  • the computer simulates the depression of the left button of a mouse.
  • This device is simpler than the force sensors and has the advantage of providing immediate tactile feedback to the user confirming the depression of the mouse button.
  • infrared cameras can also be used. In this case a source of infrared lighting is not essential, since the finger is, because of its temperature, an infrared emitter.
  • An infrared light source can nevertheless be used.
  • a source may be a heating plate replacing the diffusing element 1006.
  • An infrared diode may also be used instead of the bulb 1007. If this diode emits in a reduced wavelength spectrum a monochromator filter selecting this wavelength and placed in front of the sensor can improve the detection.
  • the touch surface of Figures 18 and 19 may be used with other types of devices.
  • the touch surface can be separated from the keyboard, to allow the user to use an existing keyboard.
  • the backlight 1007 makes the system more reliable, but it is optional and can be removed and replaced by ambient lighting, by other auxiliary lighting, or by the light of an LCD screen superimposed on the plate 1008, which also allows the use of the device in the presence of a display screen superimposed on the touch surface.
  • this display screen can also be used as a secondary display screen and allow the display of a tracking symbol, or, in a conventional touch-screen device, it can also be used as a main display screen.
  • the preferred embodiment of the invention uses the screen-keyboard assembly of Figures 18 and 19, modified as shown in Figure 26 to allow a simpler equivalent of the "mouse click", and modified, as shown in FIG., so that the touch surface is slightly tilted but not curved, so as not to complicate the system excessively.
  • the preferred embodiment uses the computer arrangement of Figure 15, where the controller processor 601 executes instructions enabling the procedure of Figures 11 and 12 to be performed, supplemented by the procedure of Figure 22 as indicated above, and modified so as not to display the tracking symbol.
  • the processor 601 transmits to the computer 212, via the interface 607, all the data concerning the position of the pointing symbol, the position of the pre-pointing symbol, the state of the virtual buttons of the mouse, and the data coming from the keyboard.
  • the device according to the invention is applicable as a replacement for computer mice and provides an improvement in the ergonomics of pointing systems.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Position Input By Displaying (AREA)
  • Input From Keyboards Or The Like (AREA)
EP07823469A 2006-09-01 2007-09-03 Zeigegerät Withdrawn EP2067092A2 (de)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
FR0607791A FR2905484A1 (fr) 2006-09-01 2006-09-01 Dispositif de pointage
FR0702850A FR2915294A1 (fr) 2007-04-19 2007-04-19 Methode et dispositif d'activation d'un bouton virtuel
FR0704451A FR2917861A1 (fr) 2007-06-20 2007-06-20 Dispositif et methode de pointage
PCT/FR2007/001425 WO2008025904A2 (fr) 2006-09-01 2007-09-03 Dispositif de pointage

Publications (1)

Publication Number Publication Date
EP2067092A2 true EP2067092A2 (de) 2009-06-10

Family

ID=38992652

Family Applications (1)

Application Number Title Priority Date Filing Date
EP07823469A Withdrawn EP2067092A2 (de) 2006-09-01 2007-09-03 Zeigegerät

Country Status (2)

Country Link
EP (1) EP2067092A2 (de)
WO (1) WO2008025904A2 (de)

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2142711A (en) * 1983-07-04 1985-01-23 Philips Electronic Associated Manually operable x-y signal generator
SE515663C2 (sv) * 1996-08-23 2001-09-17 Ericsson Telefon Ab L M Pekskärm och användning av pekskärm
JP2000112567A (ja) * 1998-09-30 2000-04-21 Internatl Business Mach Corp <Ibm> 携帯型電子装置
JP2003330591A (ja) * 2002-05-08 2003-11-21 Toshiba Corp 情報処理装置およびコンピュータの操作方法

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See references of WO2008025904A3 *

Also Published As

Publication number Publication date
WO2008025904A3 (fr) 2009-07-09
WO2008025904A2 (fr) 2008-03-06

Similar Documents

Publication Publication Date Title
TWI543038B (zh) 用於實體使用者介面上之影像投影之裝置、系統、及方法
JP5693972B2 (ja) 切替え可能なディフューザを備える対話型サーフェイスコンピュータ
US8259240B2 (en) Multi-touch sensing through frustrated total internal reflection
TWI536227B (zh) 光學觸控偵測
US9298255B2 (en) Transmissive display apparatus and operation input method
US20140168153A1 (en) Touch screen systems and methods based on touch location and touch force
JP2007506180A (ja) 表示モニタのための座標検出システム
US8184101B2 (en) Detecting touch on a surface via a scanning laser
US20070046625A1 (en) Input method for surface of interactive display
US20090128499A1 (en) Fingertip Detection for Camera Based Multi-Touch Systems
US20120023423A1 (en) Orientation free user interface
US20100001963A1 (en) Multi-touch touchscreen incorporating pen tracking
WO2010098911A2 (en) Dynamic rear-projected user interface
WO2012039969A2 (en) Interactive display
EP2188701A2 (de) Mehrfachberührungserfassung durch frustrierte innere totalreflexion
JP2012508913A (ja) 一体型タッチセンシングディスプレー装置およびその製造方法
US20100201636A1 (en) Multi-mode digital graphics authoring
US20120127084A1 (en) Variable light diffusion in interactive display device
US20140016103A1 (en) Interactive control system, methods of making the same, and devices incorporating the same
EP2067092A2 (de) Zeigegerät
US9213444B2 (en) Touch device and touch projection system using the same
WO2014082928A1 (fr) Système et procédé de communication reproduisant une interactivité de type physique
TWM419988U (en) Touch-to-control input apparatus
BE1023596B1 (fr) Système interactif basé sur des gestes multimodaux et procédé utilisant un seul système de détection
WO2009121199A1 (fr) Procede et dispositif pour realiser une surface tactile multipoints a partir d'une surface plane quelconque et pour detecter la position d'un objet sur une telle surface

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

AK Designated contracting states

Kind code of ref document: A2

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LI LT LU LV MC MT NL PL PT RO SE SI SK TR

AX Request for extension of the european patent

Extension state: AL BA HR MK RS

R17D Deferred search report published (corrected)

Effective date: 20090709

DAX Request for extension of the european patent (deleted)
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20100112