EP2742409A1 - Method for characterizing a touch on a tactile display - Google Patents

Method for characterizing a touch on a tactile display

Info

Publication number
EP2742409A1
EP2742409A1 (application EP12761644.9A)
Authority
EP
European Patent Office
Prior art keywords
touch
current
zone
type
characterization
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP12761644.9A
Other languages
English (en)
French (fr)
Inventor
Gowri RIES
Marianne FICHOUX
Julien Olivier
Pascal Joguet
Guillaume Largillier
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nissha Printing Co Ltd
Original Assignee
Stantum SAS
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Stantum SAS filed Critical Stantum SAS
Publication of EP2742409A1


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03: Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041: Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0416: Control or interface arrangements specially adapted for digitisers
    • G06F3/04166: Details of scanning methods, e.g. sampling time, grouping of sub areas or time sharing with display driving
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03: Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041: Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/044: Digitisers, e.g. for touch screens or touch pads, characterised by capacitive transducing means
    • G06F3/0446: Digitisers, e.g. for touch screens or touch pads, characterised by capacitive transducing means, using a grid-like structure of electrodes in at least two directions, e.g. using row and column electrodes

Definitions

  • the present invention relates to the field of tactile sensors for displays, in particular multi-contact touch sensors. These displays are typically equipped with a display screen superimposed on the touch sensor. The screen is used to display graphic objects that are intended to be manipulated by a user of the display using data acquired by the touch sensor.
  • the present invention relates to the characterization of the type of touch (or contact) on such touch sensors.
  • the characterization of the type of touch makes it possible to distinguish the touch of a palm of a user's hand (which does not correspond to a user manipulation) from that of a stylus or a user's finger (which do correspond to user manipulations).
  • the system is able to distinguish the data related to writing (from the stylus), data from a finger (for the selection of a text box for example), or data from the palm of the hand (which can be ignored for example).
  • EP 1 717 677 discloses a method for discriminating fingerprints on a touch screen according to whether they correspond to fingers or palms of hands. This document also discloses discrimination between fingerprints corresponding to a left hand and those corresponding to a right hand. The method according to this document implements a comparison of the contact points with the touch screen according to spatial criteria.
  • EP 2 159 670 discloses a touch type discrimination based solely on an analysis of the size of the different shapes touching the touch sensor. According to this document, any shape detected on the touch sensor whose length or width is greater than a reference distance is considered to come from the touch of a palm. If the detected shape does not meet this criterion, it is considered to have come from the touch of a finger or a stylus.
  • US 2009/095540 discloses touch type discrimination in the context of a virtual keyboard typing system on a touch screen. The system discriminates between the touch of a palm of a hand and that of a finger. This discrimination is done in two stages.
  • first, a spatial analysis is performed to compare the sizes of the affected areas on the touch screen and to group together areas that are close to each other.
  • second, an analysis of the amplitude of the signal representing the activation of a cell of the touch screen is performed.
  • the object corresponding to the imprint is then of the "palm of hand" type.
  • any footprint located outside a square with 8 cm sides centered on a "palm-of-hand" footprint, and located at a distance greater than a threshold distance from that footprint, is considered to be a finger print.
  • This document also discloses the detection of a fingerprint corresponding to a stylus, but only when the stylus is active and communicates with the touch sensor by electromagnetic waves. This document allows proper detection when the hand interacting with the touch screen is laid flat, which is the proper position for typing on a keyboard. However, it does not provide a sufficiently powerful and fast solution for other applications, including applications for writing and drawing.
  • the stylus is in fact detected by a specific means of activation or deactivation, which also implies a higher cost and a lack of flexibility compared to a passive stylus of any material.
  • the processing speed is not fast enough for an application intended for writing, in which the speed of movement of the stylus is greater than the speed of typing on a virtual keyboard, because this document provides for the acquisition of the entire surface before attempting to distinguish between a palm of a hand and a finger.
  • the present invention falls within this framework.
  • a first aspect of the invention relates to a touch characterization method on a surface of a tactile sensor comprising the following steps, implemented from tactile data obtained following a scanning of detection elements of said touch sensor:
  • By attributing to the current zone to be characterized a characteristic value (for example a probability) for each of the possible types of support (for example according to spatial, morphological, temporal or other criteria), the present invention notably allows a better interpretation of the tactile data obtained following the scanning of detection elements of a touch sensor by a system incorporating this sensor.
  • Embodiments allow the distinction between several types of support by assigning to each tactile fingerprint detected by the sensor probabilities of belonging to each of the types of support. These probabilities are calculated according to discriminant criteria and contextual criteria and are refined at each scan phase from which the tactile data are derived.
  • the present invention allows applications for writing by hand and / or drawing, thanks to a characterization of contact points in real time and with a short processing time to discriminate different types of contacts.
  • the characteristic values correspond to probabilities of belonging to types of support. However, it is possible to consider other types of characteristic values such as higher or lower scores depending on whether the current zone meets criteria specific to a type of touch, or other.
  • the characteristic values correspond to types of touch such as “finger”, “palm of hand”, “stylus” or other. Depending on the intended applications, other types of touch can be considered.
  • the characterization state comprises, at a given moment, all the characteristic values considered before a decision is made to attribute the current zone to a type of touch.
  • the state of characterization can thus evolve over time, according to the updates made for the different characteristic values.
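As a rough illustrative sketch only (not taken from the patent; the class name, value scale, and increments are hypothetical), the characterization state described above can be modeled as a set of characteristic values per touch type that evolves with each update, with disqualification removing a type from consideration:

```python
# Hypothetical sketch of a characterization state: one characteristic
# value (here, a probability) per touch type, updated over successive scans.
class CharacterizationState:
    def __init__(self, touch_types, initial_value=None):
        # Equal initial values unless a type is to be favored (see the
        # initialization step described in the text).
        n = len(touch_types)
        self.values = {t: (initial_value if initial_value is not None else 1.0 / n)
                       for t in touch_types}

    def update(self, touch_type, delta):
        """Increase or decrease the characteristic value of one touch type."""
        if touch_type in self.values:
            self.values[touch_type] = min(1.0, max(0.0, self.values[touch_type] + delta))

    def disqualify(self, touch_type):
        """Drop a touch type that has become irrelevant, lightening later updates."""
        self.values.pop(touch_type, None)

state = CharacterizationState(["finger", "palm", "stylus"])
state.update("stylus", 0.4)   # discriminant criteria favor the stylus
state.disqualify("palm")      # contextual criterion: palm not considered
```

The decision to attribute a touch type is deferred: the state simply accumulates evidence until a later characterization step selects a type.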
  • the method further comprises the following steps of:
  • the definition step may for example comprise the following steps:
  • defining the zone to be treated as a new zone to be characterized, the zone to be treated thus forming the current zone to be characterized.
  • the analyzed areas may change over time and depending on the user's manipulations of the sensor.
  • the integration of a contact zone can furthermore allow the updating of characteristic values.
  • an area to be treated is called a "bounding box".
  • An encompassing area corresponds to an area of the surface of the touch sensor comprising a set of triggered detection elements. Such encompassing areas make it possible to smooth the touch detection on the touch sensor, for example by taking detection artifacts into account (ignoring detection elements that are triggered but isolated from any other triggered element, including non-triggered detection elements that are surrounded by triggered detection elements, or other).
  • the test aims in particular to determine whether the area to be treated, defined from the tactile data of the detection elements, should be considered as a new current area or whether it can be considered as part of an existing current area according to a criterion.
  • the integration test includes an application of a distance criterion between the zone to be treated and the zone already defined.
  • An area awaiting characterization may have a plurality of characteristic values that do not make it possible to select a touch type to definitively qualify it.
  • over time, some touch types may become irrelevant.
  • the application of the disqualification criterion can eliminate these types of touch and lighten the calculations by no longer taking into account the corresponding characteristic values.
  • disqualification criteria may be part of a category of criteria called “contextual criteria” for accelerating data processing.
  • Another category of criteria called “discriminant criteria” are related to tactile data and the determination of the state of characterization.
  • the update step includes the following steps of:
  • the update can be done by analysis of the surface, the weight, the distance to other areas of known type, the speed of movement or other.
  • the decrease or increase can be done in increments or according to the evolution of these parameters.
  • a parameter may for example be relative to the geometry of the current zone (shape, size or other), to the dynamics of the current zone (speed of movement or other) or to other aspects.
  • the method may further include an initialization step in which initial values are assigned to characteristic values respectively associated with touch types on the surface of the touch sensor of a set of touch types.
  • the initial values are equivalent for each characteristic value.
  • This initialization step may make it possible to favor (or not) a type of touch by attributing a higher initial value to the corresponding characteristic value.
  • This initialization step may also take into account initial values of parameters.
  • the method may further include a characterization step in which a touch type is selected from the touch types which are associated with the characteristic values of the set defining the characterization state of said current zone to be characterized.
  • This characterization step makes it possible to qualify a current analyzed zone by attributing to it a type of touch thus putting an end to the characterization process.
  • the characterization step comprises comparing said characteristic values of the set defining the characterization state of said current zone to be characterized to third selection thresholds.
  • the type of touch selected is the one whose associated characteristic value is the highest of the characteristic values of the set defining the characterization state of said current zone to be characterized.
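The selection rule above (threshold comparison followed by picking the highest characteristic value) can be sketched as follows; the function name and threshold value are assumptions, not part of the patent:

```python
# Hypothetical illustration of the characterization step: compare each
# characteristic value to a selection threshold, then pick the touch type
# whose value is the highest among those that pass.
def characterize(values, threshold=0.8):
    """Return the selected touch type, or None if no value passes the threshold."""
    candidates = {t: v for t, v in values.items() if v >= threshold}
    if not candidates:
        return None  # the zone stays "to be characterized"
    return max(candidates, key=candidates.get)
```

For example, `characterize({"finger": 0.3, "palm": 0.9, "stylus": 0.85})` selects `"palm"`, while a state with no value above the threshold leaves the zone awaiting characterization.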
  • when the same type of touch is selected for at least two distinct zones to be characterized, the method further comprises a characterization conflict management step in which only one of said at least two zones is characterized by said same type of touch, according to a conflict management criterion.
  • characterization errors are thus limited (for example, typically only one stylus may be used at a time).
  • the update step is cyclically implemented.
  • the update cycle is stopped when an update number reaches a fourth threshold.
  • the update cycle is stopped for this fifth value.
  • the calculations are lightened by no longer taking into account a characteristic value associated with a type of disqualified touch.
  • when it is decided to characterize said current zone by a touch type, the method further comprises a step of applying a reliability criterion to a sixth characteristic value associated with this type of touch and, depending on a result of this step, stopping the update cycle for the characteristic values.
  • said at least one characteristic value is a probability for said current area to be characterized to be of the type with which said value is associated.
  • the parameters are chosen from:
  • these parameters can be part of the discriminating criteria.
  • a second aspect of the invention relates to a computer program as well as a computer program product and a storage medium for such a program and product, allowing the implementation of a method according to the first aspect when the program is loaded and executed by a processor of a touch data processing device.
  • a third aspect of the invention relates to a device for processing tactile data obtained following a scanning of detection elements of a touch sensor to characterize a touch on a surface of said touch sensor, configured for the implementation of a method according to the first aspect of the invention.
  • such a device comprises a processing unit configured to update at least a first characteristic value of a current area of the surface of the touch sensor to be characterized, said first characteristic value being associated with a type of touch on the surface of the touch sensor, and to determine a characterization state of said current zone to be characterized, as a function of said at least one first updated characteristic value, by a set of characteristic values associated with types of touch on the surface of the touch sensor.
  • a fourth aspect of the invention relates to a touch sensor having a capture interface for acquiring tactile data representative of an activation of at least one touch sensing element of a surface of the touch sensor, and a device according to the third aspect.
  • a fifth aspect of the invention relates to a touch screen having a display screen juxtaposed with a touch sensor according to the fourth aspect.
  • the objects according to the second, third, fourth and fifth aspects of the invention provide at least the same advantages as those provided by the method according to the first aspect.
  • Objects according to the third, fourth and fifth aspects of the invention may comprise means …
  • FIG. 1 illustrates a touch screen according to embodiments
  • FIG. 2 illustrates an integration test of a zone to be treated (or zone encompassing) in a current zone
  • FIG. 3 is a general flowchart of steps implemented in embodiments
  • FIGS. 4a and 4b illustrate the interaction between a touch sensor and an application software according to embodiments
  • FIGS. 5a and 5b illustrate the framework for implementing a first exemplary embodiment
  • FIG. 6 illustrates the conflict management in the first exemplary embodiment
  • FIG. 7 is a flow chart of steps implemented in the first exemplary embodiment
  • FIGS. 8a, 8b, 9a and 9b illustrate the frame of implementation of a second exemplary embodiment
  • FIG. 10 illustrates the conflict management in the second exemplary embodiment
  • FIG. 11 illustrates the discriminating criterion of distance
  • FIG. 12 illustrates the discriminant size criterion
  • FIG. 13 illustrates the discriminating criterion of speed
  • FIG. 14 is a flowchart of steps implemented in the second exemplary embodiment
  • FIG. 15 illustrates a calculation of the probability of belonging to a touch type according to a third exemplary embodiment
  • FIG. 16 is a flowchart of steps implemented in the third exemplary embodiment; and
  • FIGS. 17a and 17b illustrate the implementation framework of the invention for a virtual keyboard.
  • the embodiments of the invention provide a quick and effective means of discriminating contact types with a touch screen.
  • FIG. 1 shows a touch screen 10 according to one embodiment of the invention.
  • a touch screen can be defined as an assembly superimposing a display screen and a touch sensor to allow direct manipulation of a graphical user interface (an expression often abbreviated by the acronym GUI).
  • a transparent touch sensor can be placed on the display screen, but it is also possible to arrange the touch sensor inside the screen or under the screen (for example using infrared or pressure sensors).
  • the touch screen 10 of Figure 1 comprises a touch sensor 11 for detecting touch on its surface, juxtaposed with a display screen 12.
  • the touch sensor 11 is of the "multicontact" type, that is to say it is arranged to detect several simultaneous support points in different places.
  • the touch sensor 11 can then be a matrix sensor consisting of a set of cells (that is to say individual elements of a matrix) arranged in rows and columns.
  • Such a structure can be particularly adapted to the implementation of "multicontact” sensors. Indeed, the cells of the matrix can be tested individually to detect the presence of support points.
  • a matrix touch sensor can be composed of active cells (TFT transistors, an acronym for "Thin Film Transistor", piezoelectric elements, or other).
  • the matrix touch sensor may be composed of passive cells corresponding to the intersection between a row and a column of the matrix. Detection of a fulcrum (or "point of contact") can then be done by measuring a change of an electrical characteristic at this intersection.
  • a fulcrum defines the contact exerted in a given place on a touch sensor.
  • the contact can be made by any part of the body (palm, finger or other), or by an object (stylus or other).
  • a passive matrix sensor can be resistive or capacitive, that is to say that the electrical characteristic whose change is measured corresponds respectively to an electrical resistance or an electrical capacitance.
  • the multicontact matrix touch sensor 11 is disposed above the display screen 12.
  • this touch sensor 11 is transparent in order to allow the visualization of the data displayed on the underlying display screen 12.
  • the touch screen 10 also includes a capture interface 13, a main processing unit 14 comprising a main processor (not shown) and a graphics processing unit 15 having a graphics processor.
  • the capture interface 13 makes it possible in particular to acquire measured data at the level of the multicontact tactile sensor 11.
  • This capture interface 13 contains the acquisition and analysis circuits necessary for the acquisition of the data, which can then be transmitted to the processor of the main processing unit 14 for processing, then implementation of the various functions of the touch screen 10.
  • the touch screen also comprises a memory unit 16.
  • This memory unit comprises a random access memory for temporarily storing calculation data used during the implementation of a method according to one embodiment.
  • the memory unit also comprises a non-volatile memory (for example of the EEPROM type) for storing, for example, a computer program according to one embodiment for its execution by the main processor of the main processing unit 14. Examples of embodiments are described in document EP 1 719 047 concerning the various applications and uses of such a touch screen 10.
  • The invention proposes a solution for discriminating the type of contact on the touch screen by associating it with at least one characteristic value (for example a probability of belonging to a type of support that can be envisaged by the system), and then updating this characteristic value in real time with each acquisition of the touch data (or information).
  • a type of contact corresponds to the origin of the point of contact with a user.
  • the contact can be made by a part of the body of the user (palm of hand, finger, elbow, forearm or other), or by an object (stylus or other).
  • the touch data is acquired during a touch sensor scan phase in which each sensor cell is probed (typically at a frequency of 100 Hertz) to determine whether a contact is occurring on the screen at the location where the cell is found. These data are then processed.
  • Areas to be processed are defined from activated areas consisting of sets of activated neighboring cells. They can be set during or after the sequential scan.
  • the definition of an encompassing zone makes it possible, in particular, to differentiate a fulcrum from background noise that can be generated by the scanning.
  • a current area is thus a set that may include one or more bounding areas.
  • the current areas may be part or all of a fulcrum. These areas are created to optimize the calculation times by reducing the number of areas to be analyzed.
  • a current area can be characterized by different discriminating criteria, including:
  • size parameters (the size of a current zone may for example be defined by its area),
  • morphology parameters (e.g. the weight of the area, the shape of the area, the orientation of the area, or other),
  • time parameters (for example the life time, i.e. the time during which a current zone remains waiting to be qualified as a type of touch, or other).
  • the weight of a current area characterizes the pressing force on an area of the touch sensor.
  • This parameter can for example be deduced by comparing the number of cells activated in the zone with the total number of cells (activated or not) included in this zone. Alternatively, or in combination, this parameter can be determined using a pressure sensor (for example piezoelectric) disposed at the zone.
  • Another weighting parameter for the current area may be the average weight which is defined as the weight per unit area in the area.
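The weight parameters above can be sketched numerically; this is an illustrative assumption (function names, the boolean-grid representation of a zone, and the cell area are hypothetical), following the text's definition of weight as the ratio of activated cells to all cells in the zone, and average weight as weight per unit area:

```python
# Hypothetical sketch of the "weight" parameters of a current zone.
def zone_weight(cells):
    """cells: 2D list of booleans (True = activated cell) covering the zone.
    Weight = ratio of activated cells to the total number of cells."""
    total = sum(len(row) for row in cells)
    activated = sum(sum(1 for c in row if c) for row in cells)
    return activated / total if total else 0.0

def average_weight(cells, cell_area_mm2=1.0):
    """Average weight = weight per unit area (cell area is an assumed constant)."""
    area = sum(len(row) for row in cells) * cell_area_mm2
    return zone_weight(cells) / area if area else 0.0
```

For a 2x2 zone with three activated cells, the weight is 0.75; as the text notes, a pressure sensor could replace or complement this ratio-based estimate.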
  • the shape of a current area as well as its evolution over time as a function of aggregations of bounding areas may be useful for characterizing the touch corresponding to a finger of a user's hand.
  • the orientation of a current zone can be determined from the shape of this zone, in particular to take into account the direction of a point of support created by a finger or a stylus, for example during writing or drawing a shape on the screen.
  • a current zone can be characterized by the parameters mentioned above, other parameters, or their evolution over time. At a given moment, the characteristic value or values associated with a current zone make it possible to determine its membership in a type of support.
  • a probability of belonging of the current zone to each type of possible support on the screen is calculated. As sweeps and acquisitions of tactile data progress, these probabilities evolve and it is for example determined that the current area is of a given type when the corresponding probability reaches the value 1. The current area is then "qualified" as that type.
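The scan-by-scan evolution of these probabilities can be sketched as below; the per-scan increments and the function name are illustrative assumptions, only the qualification rule (probability reaching 1) comes from the text:

```python
# Hypothetical sketch: each scan nudges the membership probabilities of a
# current zone, and the zone is "qualified" once one probability reaches 1.
def run_scans(probabilities, scan_deltas):
    """scan_deltas: per scan, a dict of probability increments per touch type."""
    for deltas in scan_deltas:
        for touch_type, delta in deltas.items():
            p = probabilities[touch_type] + delta
            probabilities[touch_type] = min(1.0, max(0.0, p))
        for touch_type, p in probabilities.items():
            if p >= 1.0:
                return touch_type  # the current zone is qualified with this type
    return None  # still awaiting characterization
```

Starting from equal probabilities, a few scans whose discriminant criteria favor the stylus drive P(stylus) to 1 and qualify the zone.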
  • a test is performed to determine whether or not to integrate the new corresponding bounding zone into an existing current area.
  • this test is performed according to a distance criterion.
  • if the new bounding zone 200 is within a perimeter 201, corresponding to the maximum possible area for a current area, around an existing current area 202, then the bounding area is integrated therewith.
  • a new current zone 203 is created comprising the zones 200 and 202.
  • a new perimeter 204 is defined around the new current zone 203 for a future integration test.
  • the characteristics of the new current zone are then updated.
  • the first step S300 corresponds to the acquisition phase of the tactile data.
  • the touch sensor is scanned, and sensor cells that are activated by the touch of a user (via a stylus or a finger, for example) emit an activation signal which makes it possible to determine that the sensor is touched at the level of the cell.
  • step S301 is implemented in which bounding areas are defined, as already mentioned above.
  • it is then determined for each bounding zone, in step S302, whether or not it is to be integrated into an existing current zone.
  • if the bounding box is not to be integrated into a current area, a new current area is created in step S303. The method then proceeds to step S304, updating the characteristic values associated with touch types. For a new current zone, this is a step of initializing these values.
  • otherwise, the characteristic values associated with the touch types for the area in which the bounding box is to be integrated are updated, taking into account the newly integrated area.
  • the characteristics of the current zone are then recorded and the state of characterization of the current zone is determined during step S305.
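The steps S300 to S305 above can be sketched end to end; everything here (the zone representation, the 15 mm radius reused from the later example, the update counter) is an illustrative assumption:

```python
# Hypothetical end-to-end sketch of one scan cycle (steps S300-S305):
# bounding zones from the acquisition are either merged into existing
# current zones (S302/S304) or seed new current zones (S303/S304).
import math

def process_scan(bounding_zones, current_zones, radius_mm=15.0):
    """bounding_zones: list of (x, y) centers from one scan (S300/S301).
    current_zones: list of dicts with 'center' and 'state' entries."""
    for bx, by in bounding_zones:
        # S302: integration test against existing current zones.
        for zone in current_zones:
            cx, cy = zone["center"]
            if math.hypot(bx - cx, by - cy) <= radius_mm:
                # S304: update the zone (here, just its center and a counter).
                zone["center"] = ((bx + cx) / 2, (by + cy) / 2)
                zone["state"]["updates"] += 1
                break
        else:
            # S303: create a new current zone; S304 initializes its values.
            current_zones.append({"center": (bx, by), "state": {"updates": 0}})
    # S305: the characteristics / characterization state are recorded.
    return current_zones
```

Running one scan with two distant bounding zones creates two current zones; a later nearby bounding zone is merged into the first instead of creating a third.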
  • the discriminant criteria may be the most likely to modify the characterization state of a current zone, i.e. the probability that it belongs to a type of support. These criteria are based on tactile data. They can in particular be spatial, morphological, temporal or other criteria, among others:
  • the weight of the current zone (for example, it is possible to modify a current characteristic value of the current zone according to the weight of the current zone at the time of measurement during the acquisition phase),
  • the contextual criteria may for example correspond to criteria imposed, in particular by the user, via modifications of the preferences of the application software or by the programmer for the proper functioning of the application software. These criteria are independent of the data from the acquisition.
  • the application software (or application program) is understood as the part of the highest software layer of the system comprising the touch screen. This is the part visible by the user dedicated to a particular use (video games, word processing, internet browser or other).
  • the contextual criteria may notably correspond to:
  • the detection level (the number of types of support to be taken into account can be chosen, for example to detect only the touch of the stylus and the palm of the hand, or to detect all the touches of the palm, the stylus and the finger),
  • the reliability of the calculated value can be determined and, according to a reliability test, if the current zone is already qualified by a type of support whose discrimination is considered reliable, then the current zone retains its qualification without subsequent updating of the characteristic values concerning it.
  • the system can thus send the application software the information of the type of contact support in real time.
  • the characterization of the zone is complete (that is, the associated type has been determined, for example because the characteristic probability of one type of support has reached the value 1)
  • the association between the current zones and the characteristic values can be managed entirely by the touch sensor.
  • the sensor comprises analysis means configured to acquire the touch data (S400), to process the touch data (S401), to apply the discriminant criteria for each current zone (S402), to apply the contextual criteria (S403), to associate one or more characteristic values with each current zone (for example, membership probabilities of a contact type, S404) and to associate a cursor with a current area (S405).
  • a cursor (or pointer) is a computer object associated with a fulcrum (contact) and characterized by at least one unique identification number and coordinates characterizing the position of a bounding area or a current area.
  • the result of step S405, that is to say the cursor-current zone association, is transmitted to the application software for processing and management of the graphical interface.
  • the sensor manages the calculation of the characteristic values with the application of the discriminant criteria, and the application software manages the application of the contextual criteria.
  • step S403 is no longer implemented by the sensor but by the application software.
  • the characteristic values calculated for the current zones are probabilities of belonging to a type of touch (or support).
  • the first exemplary embodiment concerns application software for writing or drawing by means of a passive stylus.
  • a user manipulates a stylus 500 with his right hand 501 to write the word "hello".
  • the application software displays the inscription according to the passage of the stylus on the surface of the screen 502.
  • the application software also displays icons 503 that the user can select with the stylus to activate software functions.
  • in FIG. 5b, the current zones corresponding to the user's touches are shown.
  • the zones 504 correspond to the touch of the palm of the user's hand.
  • the zones 505 correspond to the touch of the stylus.
  • the touch characterization method according to the invention allows the system to differentiate these types of touch (or support). As shown in FIG. 5b, the palm prints are broken up because of the folds of the hand.
  • the system must differentiate between a touch made by a palm of the hand and a touch made by a stylus.
  • the software should take only one stylus into account, and an additional rule is needed to determine, in the event of a conflict (detection of two or more styluses), which one is considered.
  • a conflict being the detection of two or more styluses.
  • a current zone is defined by its coordinates in the plane of the screen, its weight (as defined above), its size and the probabilities P(palm) and P(stylus) of belonging to the touch types "palm" and "stylus".
  • after updating the data of the current zones during a step S600, it is determined, during a step S601, whether the number of current zones whose current characterization state indicates the "stylus" type is greater than 1. If this is the case, a step S602 is implemented to select the current zone having the highest position. The highest current zone is then qualified as the "stylus" type, and this type becomes inaccessible to the other current zones until the qualified zone disappears. If no conflict is detected during step S601, step S603 is implemented for the only current zone whose current characterization state indicates the "stylus" type.
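The S601/S602 conflict rule can be sketched as follows. The zone layout (a `y` coordinate growing downward, so the smallest `y` is the highest position) and the way a disqualified zone loses the type are illustrative assumptions:

```python
def resolve_stylus_conflict(zones):
    """If more than one current zone is characterized as 'stylus' (S601),
    only the zone with the highest position keeps the type (S602); the
    others lose it until the qualified zone disappears."""
    candidates = [z for z in zones if z.get("type") == "stylus"]
    if len(candidates) <= 1:
        return zones                  # no conflict: the S603 path
    winner = min(candidates, key=lambda z: z["y"])   # highest on screen
    for z in candidates:
        if z is not winner:
            z["type"] = None          # stylus type made inaccessible
    return zones
```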
  • a new bounding box is grouped with a current zone if it appears within the maximum area of an existing current zone (the maximum area of a current zone being, for example, a circle with a radius of 15 mm centered on its barycenter).
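A minimal sketch of this grouping rule, using the 15 mm radius given above (units in millimetres; the zone structure is an illustrative assumption):

```python
import math

def group_bounding_box(box_center, zones, max_radius=15.0):
    """A new bounding box joins an existing current zone if its centre falls
    within that zone's maximum area (a circle of radius 15 mm around the
    zone's barycenter); otherwise it starts a new current zone."""
    for zone in zones:
        if math.dist(box_center, zone["barycenter"]) <= max_radius:
            zone["boxes"].append(box_center)
            return zone
    new_zone = {"barycenter": box_center, "boxes": [box_center]}
    zones.append(new_zone)
    return new_zone
```

This is what makes the broken-up palm prints of FIG. 5b coalesce into a small number of current zones rather than many isolated contacts.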
  • the discriminating criteria are as follows.
  • the current probabilities are updated: the probability P(stylus) is increased by 0.125 while the probability P(palm) is decreased by 0.125.
  • the "stylus" type is disqualified: the probability P(stylus) is set to zero whereas the probability P(palm) is set to 1, thus qualifying the current zone as the "palm" type.
  • if the variation VW of the average weight W of a current zone over a time interval T is less than a threshold value VW1 of 15, then the current probabilities are updated: the probability P(stylus) is increased by 0.125 while the probability P(palm) is decreased by 0.125,
  • the "palm" type is disqualified: the probability P(palm) is set to zero whereas the probability P(stylus) is set to 1, thus qualifying the current zone as the "stylus" type.
  • the current probabilities are updated: the probability P(stylus) is increased by 0.125 while the probability P(palm) is decreased by 0.125,
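The ±0.125 update rule used by these criteria can be sketched as follows. The clamping to [0, 1] and the qualification/disqualification at the bounds are how the text describes the mechanism; the function and field names are hypothetical:

```python
def update_probs(zone, delta=0.125):
    """Nudge the membership probabilities of a current zone toward 'stylus'
    (delta > 0) or toward 'palm' (delta < 0), clamping to [0, 1].
    Reaching 1 qualifies a type; the other type is then disqualified."""
    zone["P_stylus"] = min(1.0, max(0.0, zone["P_stylus"] + delta))
    zone["P_palm"] = min(1.0, max(0.0, zone["P_palm"] - delta))
    if zone["P_stylus"] >= 1.0:
        zone["type"] = "stylus"   # qualified: P(palm) has reached zero
    elif zone["P_palm"] >= 1.0:
        zone["type"] = "palm"
    return zone
```

Starting from equal probabilities of 0.5, four consecutive favourable scans (4 × 0.125) are enough to qualify a type.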
  • the touch data is analyzed and a current zone is defined during step S700.
  • a test on the size of the current zone is then implemented during a step S701, in which the surface S of the current zone is compared with the threshold value S1.
  • the probabilities P(palm) and P(stylus) are set to 1 and 0, respectively, in a step S702.
  • the probabilities are updated during a step S703: P(palm) is decreased by 0.125 and P(stylus) is increased by 0.125.
  • the variation VW of the weight of the current zone is compared with the threshold value VW1 during a step S704.
  • the calculation of the weight variation is for example carried out between two measurements corresponding to two scans.
  • step S702 is implemented.
  • a step S705, similar to step S703, is implemented.
  • the probabilities are updated during a step S707: P(palm) is increased by 0.125 and P(stylus) is decreased by 0.125.
  • the average weight W of the current zone is compared with W1 during a step S709.
  • the probabilities P(palm) and P(stylus) are set to 0 and 1, respectively, in a step S710.
  • step S702 is implemented.
  • after step S707, a test S711 is implemented to determine whether the probability P(stylus) has reached 1 on average during the period.
  • the probability is definitively set to 1 for P (stylus) and zero for P (palm) in step S710.
  • the current probabilities are recorded during a step S712, thus defining the current characterization state of the current zone.
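The S700-S712 decision chain above can be reconstructed as a sketch. S1 (5 mm², taken from the size criterion given later in the text) and VW1 (15) appear in the text; W1 and the exact ordering of branches the text leaves implicit are assumptions of this illustration:

```python
def step(zone, delta):
    """Clamped +/-delta update of the two membership probabilities."""
    zone["P_stylus"] = min(1.0, max(0.0, zone["P_stylus"] + delta))
    zone["P_palm"] = min(1.0, max(0.0, zone["P_palm"] - delta))

def classify_scan(zone, S1=5.0, VW1=15.0, W1=10.0):
    """One pass over the S701-S710 tests of the first example."""
    if zone["S"] > S1:                 # S701 -> S702: too large for a stylus
        zone["P_palm"], zone["P_stylus"] = 1.0, 0.0
    elif zone["VW"] >= VW1:            # S704 -> S702: palm-like weight variation
        zone["P_palm"], zone["P_stylus"] = 1.0, 0.0
    else:
        step(zone, +0.125)             # S703: small area favours the stylus
        step(zone, +0.125)             # S705: stable weight favours the stylus
        if zone["W"] >= W1:            # S709 -> S710: heavy mean weight -> stylus
            zone["P_palm"], zone["P_stylus"] = 0.0, 1.0
    return zone
```

Step S711 (the average-over-time test) and S712 (recording the state) would then run on top of these per-scan updates.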
  • the second exemplary embodiment concerns application software for writing or drawing by means of a passive stylus, with possible finger interaction.
  • a user manipulates a stylus 800 with his right hand 801 to write the word "bonjour", that is, "hello" in English.
  • the application software displays the writing as the stylus passes over the surface of the screen 802.
  • the application software also displays icons 803 that the user can select with a finger 804 of his left hand to activate software functions.
  • in FIG. 8b, the current zones corresponding to the user's touches are shown.
  • the zones 805 correspond to the touch of the palm of the user's right hand,
  • the zones 806 correspond to the touch of the stylus,
  • the zone 807 corresponds to the touch of the palm of the user's left hand (or of his thumb), and
  • the zone 808 corresponds to the finger 804 of his left hand.
  • the touch characterization method according to the invention allows the system to differentiate these types of touch (or support). As shown in FIG. 8b, the palm prints are broken up because of the folds of the hand.
  • FIGS. 9a and 9b show the use of the same application software by a user holding the stylus in his left hand.
  • a new bounding box is grouped with a current zone if it appears within the maximum area of an existing current zone (the maximum area of a current zone being, for example, a circle with a radius of 15 mm centered on its barycenter).
  • the "palm" type or the "finger" type can be associated with an infinite (or at least very large) number of current zones, while the "stylus" type can only be associated with one current zone at a time.
  • a current zone is defined by its coordinates, its weight, its size and its probabilities P(palm), P(finger) and P(stylus) of belonging respectively to the types "palm", "finger" and "stylus".
  • after updating the data of the current zones during a step S1000, it is determined, during a step S1001, whether the number of current zones whose current characterization state indicates the "stylus" type is greater than 1. If this is the case, a step S1002 is implemented to select the current zone having the leftmost position. The leftmost current zone is then qualified as the "stylus" type, and this type becomes inaccessible to the other current zones until the qualified zone disappears. If no conflict is detected during step S1001, step S1003 is implemented for the only current zone whose current characterization status indicates the "stylus" type.
  • the discriminating criteria are as follows.
  • assume that the current zone ZD1 is the zone currently being analyzed, and that the current zones ZD2, ZD3, ZD4 and ZD5 are already qualified as the "palm" type.
  • the distance between the zone ZD1 and the zones ZD2 or ZD3 is greater than D2, so the probabilities P(stylus) and P(finger) are increased by 0.125 while the probability P(palm) is decreased by 0.25.
  • the distance between the zones ZD1 and ZD4 and the zones ZD2 or ZD3 is less than D1, so the probability P(palm) is increased by 0.125 while the probability P(stylus) is decreased by 0.125.
  • the probability P(finger) is unchanged.
  • the distance between the zones ZD1 and ZD5 and the zones ZD2 or ZD3 is between D1 and D2, so the probabilities are unchanged.
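This distance criterion relative to already-qualified palm zones can be sketched as follows. The text gives no numeric values for D1 and D2, so the defaults here are illustrative assumptions, as are the structure names:

```python
import math

def distance_criterion(zone, palm_zones, D1=30.0, D2=60.0):
    """Contextual distance test: far from every qualified palm (> D2)
    favours stylus and finger; close to a palm (< D1) favours palm, with
    P(finger) unchanged; in between, the probabilities are unchanged."""
    d = min(math.dist(zone["pos"], p["pos"]) for p in palm_zones)
    if d > D2:
        zone["P_stylus"] += 0.125
        zone["P_finger"] += 0.125
        zone["P_palm"] -= 0.25
    elif d < D1:
        zone["P_palm"] += 0.125
        zone["P_stylus"] -= 0.125   # P(finger) unchanged, as in the text
    return zone
```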
  • the size of the current zone may be a discriminating criterion for the detection of the palm. Indeed, if the area of the current zone is greater than both the limit area S1 of 5 mm² of a stylus and the characteristic surface S2 of 1.5 cm² of a fingerprint on a touch sensor, then it is possible to consider directly that the type of the current zone is "palm", because the palm of the hand has the largest potential contact surface with the sensor.
  • box 1200 shows the scale of the threshold surfaces S1 and S2.
  • the areas of the zones ZD1 and ZD5 are smaller than S1, so the probability P(stylus) is increased by 0.25 while the probabilities P(finger) and P(palm) are decreased by 0.125.
  • the area of the zone ZD4 is between S1 and S2, so the probabilities P(finger) and P(palm) are unchanged while the probability P(stylus) is set to zero, which disqualifies the "stylus" type for zone ZD4.
  • the probability P(palm) is set to 1, and the probabilities P(finger) and P(stylus) are set to 0, which qualifies the zone as "palm".
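The size criterion can be sketched with the thresholds given in the text (S1 = 5 mm², S2 = 1.5 cm² = 150 mm²); the function and field names are hypothetical:

```python
def size_criterion(zone, S1=5.0, S2=150.0):
    """Size test (areas in mm^2): below S1 favours the stylus; between S1
    and S2 rules out the stylus; above S2 qualifies the zone directly as a
    palm, since the palm has the largest potential contact surface."""
    if zone["S"] < S1:
        zone["P_stylus"] += 0.25
        zone["P_finger"] -= 0.125
        zone["P_palm"] -= 0.125
    elif zone["S"] <= S2:
        zone["P_stylus"] = 0.0          # stylus disqualified for this zone
    else:
        zone["P_palm"] = 1.0            # qualified directly as "palm"
        zone["P_finger"] = zone["P_stylus"] = 0.0
    return zone
```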
  • a current zone with a large average weight can be considered a stylus, while a current zone with a low average weight is considered a palm.
  • the criterion of variation of the weight of the current zone corresponds to the variation of the average weight of the current zone during a given time, for example between two scans. This variation is small in the case of a stylus, intermediate in the case of a finger and large for a palm.
  • the probabilities are modified as follows:
  • V1, for example 50 mm/s,
  • V2, for example 20 mm/s,
  • V1 and V2 are limit speeds delimiting three characteristic intervals of typical speeds of a stylus (if V > V1), a finger (if V2 < V < V1) and a palm (if V < V2).
  • the probability P(palm) is then increased by 0.25, and the probabilities P(finger) and P(stylus) are decreased by 0.125.
  • the probability P(stylus) is then increased by 0.25, and the probabilities P(finger) and P(palm) are decreased by 0.125.
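The speed criterion can be sketched with the limit speeds given above. The text only spells out the palm and stylus updates; the update for the middle (finger) band is assumed here by symmetry:

```python
def speed_criterion(zone, V1=50.0, V2=20.0):
    """Speed test (mm/s): above V1 the movement is typical of a stylus,
    below V2 of a palm, and in between of a finger. The finger-band
    update (+0.25/-0.125) is an assumption, not stated in the text."""
    if zone["V"] > V1:
        zone["P_stylus"] += 0.25
        zone["P_finger"] -= 0.125
        zone["P_palm"] -= 0.125
    elif zone["V"] < V2:
        zone["P_palm"] += 0.25
        zone["P_finger"] -= 0.125
        zone["P_stylus"] -= 0.125
    else:
        zone["P_finger"] += 0.25
        zone["P_stylus"] -= 0.125
        zone["P_palm"] -= 0.125
    return zone
```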
  • a touch-type membership test is performed based on the average membership probability for each type. If the average probability relative to one type is greater than a threshold (PM1, for example 0.75) while the average probabilities relating to the other types are lower than another threshold (PM2, for example 0.25), then this type is qualified.
  • PM1, for example 0.75,
  • PM2, for example 0.25,
  • a membership test is performed after a time T" (for example 40 ms) for each type according to the average probabilities PM(palm), PM(finger), PM(stylus).
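The averaging test can be sketched as a sliding window over the last few scans (40 ms = 4 scans at 100 Hz, with PM1 = 0.75 and PM2 = 0.25 as given above); the class and method names are hypothetical:

```python
from collections import deque

class MembershipTest:
    """Qualify a touch type when its mean probability over the window T''
    exceeds PM1 while every other type's mean stays below PM2."""
    def __init__(self, types=("palm", "finger", "stylus"), window=4,
                 PM1=0.75, PM2=0.25):
        self.history = {t: deque(maxlen=window) for t in types}
        self.PM1, self.PM2 = PM1, PM2

    def update(self, probs):
        """Record one scan's probabilities; return the qualified type, if any."""
        for t, p in probs.items():
            self.history[t].append(p)
        means = {t: sum(h) / len(h) for t, h in self.history.items()}
        for t, m in means.items():
            if m > self.PM1 and all(o <= self.PM2
                                    for u, o in means.items() if u != t):
                return t          # this type is qualified
        return None
```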
  • FIG. 14 is a flowchart summarizing the steps implemented in the second exemplary embodiment.
  • in step S1400, the touch data is acquired and then processed in step S1401 to define current zones.
  • in step S1402, it is determined whether a current zone is of the "palm" type.
  • the zone is qualified as this type during a step S1404, and a corresponding cursor is assigned to the zone during step S1405.
  • the probabilities associated with each type are updated during a step S1403, before proceeding to step S1404 to determine a current characterization state of the current zone.
  • the third exemplary embodiment concerns application software for writing or drawing using a passive stylus, with possible finger interaction and in which several styluses can be used.
  • this third example builds on the two previous examples; some explanations are therefore not repeated here, and the person skilled in the art can refer to the preceding examples for more details.
  • a new bounding box is grouped with a current zone if it appears within the maximum area of an existing current zone (the maximum area of a current zone being, for example, a circle with a radius of 15 mm centered on its barycenter).
  • the "palm” type, the "finger” type or the “stylus” type can be associated with an infinite number (or at least a very large number) of current zones.
  • a current zone is defined by its coordinates, its weight, its size and its probabilities P(palm), P(finger) and P(stylus) of belonging respectively to the types "palm", "finger" and "stylus".
  • the touch data is analyzed and a current zone is defined during step S1500.
  • a test on the size of the current zone is then implemented during a step S1501, in which the surface S of the current zone is compared with a threshold value S2 of 1.5 cm². If the area of the current zone is greater than S2, then the probabilities P(palm) and P(finger) are set to 1 and 0, respectively, in a step S1502.
  • the probabilities are updated during a step S1503: P(finger) is decreased by 0.125 and P(stylus) is increased by 0.125.
  • the variation VW of the weight of the current zone is compared with a threshold value VW2 of 20.
  • the calculation of the variation of weight is for example carried out between two measurements corresponding to two scans.
  • step S1502 is implemented.
  • a step S1505, similar to step S1503, is implemented.
  • the speed of the center of the current zone is compared with a threshold value V2 of 20 mm / s during a step S1506.
  • the probabilities are updated in a step S1507: P(finger) is increased by 0.125 and P(stylus) is decreased by 0.125.
  • the average weight W of the current zone is compared with a threshold value W2 of 10 during a step S1509.
  • P(palm) and P(stylus) are set to 0 and 1, respectively, during a step S1510.
  • step S1502 is implemented.
  • a test S1511 is implemented to determine whether the probability P(finger) has reached 1 on average during a period T" of 40 ms (corresponding to 4 scans at 100 Hz).
  • the probability is definitively set to 1 for P(stylus) and to zero for P(finger) in step S1510.
  • the current probabilities are recorded during a step S1512, thus defining the current characterization state of the current zone.
  • the current probabilities are also recorded in step S1512.
  • FIG. 16 is a flowchart summarizing the steps implemented in the third exemplary embodiment.
  • in step S1600, the touch data is acquired and then processed in step S1601 to define current zones.
  • in step S1602, it is determined whether a current zone is of the "palm" type.
  • the zone is qualified as this type during a step S1608, and a corresponding cursor is assigned to the zone during step S1609.
  • the probabilities associated with each remaining type, i.e. "palm" and "stylus", are updated in a step S1604 before proceeding to step S1608 to determine a current characterization state of the current zone.
  • a step S1605 is implemented to determine if the "stylus" type is disqualified.
  • the probabilities associated with each remaining type, i.e. "palm" and "finger", are updated in a step S1606 before proceeding to step S1608 to determine a current characterization state of the current zone.
  • step S1607 is implemented to update the probabilities associated with each type.
  • step S1608 is then implemented to record the probabilities for the current zone and thus determine the current characterization state of the zone.
  • Step S1609 is then implemented.
  • a computer program for carrying out a method according to an embodiment of the invention can be written by the person skilled in the art upon reading the flowcharts of FIGS. 3, 4a, 4b, 6, 7, 10, 14, 15 and 16 and the present detailed description.
  • the present invention is not limited to the embodiments described, other variations and combinations of features are possible.
  • the present invention can be implemented for application software providing a touch keyboard, as illustrated by FIGS. 17a and 17b.
  • a user can select keys 1700 of a virtual keyboard shown on a screen 1701 with his right hand 1702 to type text as he would on a conventional physical keyboard.
  • in FIG. 17b, the current zones corresponding to the user's touches are shown.
  • the zones 1703 correspond to the touch of the palm of the user's hand and the zones 1704 correspond to the touch of the fingers.
  • the touch characterization method according to the invention allows the system to differentiate these types of touch (or support).

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
EP12761644.9A 2011-08-12 2012-08-06 Verfahren zur kennzeichnung einer berührung auf einem taktilen display Withdrawn EP2742409A1 (de)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
FR1157336A FR2979025A1 (fr) 2011-08-12 2011-08-12 Procede de caracterisation de toucher sur un ecran tactile
PCT/FR2012/051849 WO2013024225A1 (fr) 2011-08-12 2012-08-06 Procédé de caractérisation de toucher sur un écran tactile

Publications (1)

Publication Number Publication Date
EP2742409A1 true EP2742409A1 (de) 2014-06-18

Family

ID=46880745

Family Applications (1)

Application Number Title Priority Date Filing Date
EP12761644.9A Withdrawn EP2742409A1 (de) 2011-08-12 2012-08-06 Verfahren zur kennzeichnung einer berührung auf einem taktilen display

Country Status (3)

Country Link
EP (1) EP2742409A1 (de)
FR (1) FR2979025A1 (de)
WO (1) WO2013024225A1 (de)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2015146177A (ja) * 2014-01-06 2015-08-13 船井電機株式会社 入力装置
US9430085B2 (en) * 2014-09-12 2016-08-30 Microsoft Technology Licensing, Llc Classification of touch input as being unintended or intended
US9804707B2 (en) 2014-09-12 2017-10-31 Microsoft Technology Licensing, Llc Inactive region for touch surface based on contextual information
US9626020B2 (en) * 2014-09-12 2017-04-18 Microsoft Corporation Handedness detection from touch input
US9495052B2 (en) 2014-12-19 2016-11-15 Synaptics Incorporated Active input device support for a capacitive sensing device
US10037112B2 (en) 2015-09-30 2018-07-31 Synaptics Incorporated Sensing an active device'S transmission using timing interleaved with display updates

Family Cites Families (8)

Publication number Priority date Publication date Assignee Title
EP1717684A3 (de) 1998-01-26 2008-01-23 Fingerworks, Inc. Verfahren und Vorrichtung zur Integration von manuellen Eingaben
US7190348B2 (en) * 2000-12-26 2007-03-13 International Business Machines Corporation Method for touchscreen data input
FR2866726B1 (fr) 2004-02-23 2006-05-26 Jazzmutant Controleur par manipulation d'objets virtuels sur un ecran tactile multi-contact
US8018440B2 (en) * 2005-12-30 2011-09-13 Microsoft Corporation Unintentional touch rejection
US8130203B2 (en) * 2007-01-03 2012-03-06 Apple Inc. Multi-touch input discrimination
US20090095540A1 (en) 2007-10-11 2009-04-16 N-Trig Ltd. Method for palm touch identification in multi-touch digitizing systems
TW201011605A (en) 2008-09-01 2010-03-16 Turbotouch Technology Inc E Method capable of preventing mistakenly triggering a touch panel
US8514187B2 (en) * 2009-09-30 2013-08-20 Motorola Mobility Llc Methods and apparatus for distinguishing between touch system manipulators

Non-Patent Citations (2)

Title
None *
See also references of WO2013024225A1 *

Also Published As

Publication number Publication date
WO2013024225A1 (fr) 2013-02-21
FR2979025A1 (fr) 2013-02-15

Similar Documents

Publication Publication Date Title
EP2235615B1 (de) Elektronische analyseschaltung mit modulation der abtasteigenschaften für taktilen mehrkontakt-passivmatrixsensor
EP2235614B1 (de) Elektronische analyseschaltung mit wechselnder kapazitiver/resistiver messung für taktilen mehrkontakt-passivmatrixsensor
WO2013024225A1 (fr) Procédé de caractérisation de toucher sur un écran tactile
US11880565B2 (en) Touch screen display with virtual trackpad
US20180088786A1 (en) Capacitive touch mapping
EP2956846B1 (de) Verfahren, vorrichtung und speichermedium zur navigation auf einem anzeigebildschirm
EP2310932A1 (de) Verfahren zur acquisition und analyse eines mehrkontakt-tastsensors unter verwendung eines dichotomen prinzips und elektronische schaltung und mehrkontakt-tastsensor zur implementierung eines solchen verfahrens
CN107835968A (zh) 力曲线和无意输入控制
FR2952730A1 (fr) Dispositif a ecran tactile multimode
US20180046319A1 (en) Method to adjust thresholds adaptively via analysis of user's typing
EP2671140B1 (de) Verfahren und vorrichtung für datenerfassung mit einem mehrfachberührungs-crossbar-netzwerksensor
US10678381B2 (en) Determining handedness on multi-element capacitive devices
FR2925715A1 (fr) Circuit electronique d'analyse a alternance axe d'alimentaiton/axe de detection pour capteur tactile multicontacts a matrice passive
US20240192806A1 (en) Diffusion-based handedness classification for touch-based input
WO2013153338A1 (fr) Génération perfectionnée de commandes dans un équipement à écran tactile
US8531412B1 (en) Method and system for processing touch input
US12056311B2 (en) Touch screen and trackpad touch detection
FR3017470A1 (fr) Procede de saisie sur un clavier numerique, interface homme machine et appareil mettant en œuvre un tel procede
EP4145253A1 (de) Verfahren zur analyse der aktivität eines benutzers eines elektronischen endgeräts
WO2023229646A1 (en) Using touch input data to improve fingerprint sensor performance
FR2946768A1 (fr) Procede d'entree tactile d'instructions de commande d'un programme d'ordinateur et systeme pour la mise en oeuvre de ce procede

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20140303

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

DAX Request for extension of the european patent (deleted)
19U Interruption of proceedings before grant

Effective date: 20150422

19W Proceedings resumed before grant after interruption of proceedings

Effective date: 20170403

RAP1 Party data changed (applicant data changed or rights of an application transferred)

Owner name: NISSHA PRINTING CO., LTD.

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

17Q First examination report despatched

Effective date: 20170907

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20180118