WO2020192563A1 - Recognition and tracking of multiple objects when placed on a capacitive touchscreen - Google Patents

Recognition and tracking of multiple objects when placed on a capacitive touchscreen

Info

Publication number
WO2020192563A1
Authority
WO
WIPO (PCT)
Prior art keywords
identified
pattern
individual
patterns
candidate
Prior art date
Application number
PCT/CN2020/080310
Other languages
English (en)
Inventor
Jacky CHO
Original Assignee
Cho Jacky
Priority date
Filing date
Publication date
Application filed by Cho Jacky filed Critical Cho Jacky
Priority to CN202080024216.5A priority Critical patent/CN113632055B/zh
Publication of WO2020192563A1 publication Critical patent/WO2020192563A1/fr

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03: Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041: Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/0416: Control or interface arrangements specially adapted for digitisers
    • G06F 3/04166: Details of scanning methods, e.g. sampling time, grouping of sub areas or time sharing with display driving
    • G06F 3/041661: Details of scanning methods using detection at multiple resolutions, e.g. coarse and fine scanning; using detection within a limited area, e.g. object tracking window
    • G06F 3/044: Digitisers, e.g. for touch screens or touch pads, characterised by capacitive transducing means

Definitions

  • the present invention relates to a technique for distinguishing and tracking a plurality of objects placed on a capacitive touchscreen, where each object produces a plurality of touches at selected touch points detectable by the touchscreen.
  • FIG. 1 depicts an example of this scenario.
  • Each object has its own set of anchor points and information points.
  • the triangular pattern is uniquely identifiable by a touchscreen processor such that the processor can compare the identified pattern against a database of known patterns to detect which object corresponds to the identified pattern. The object can then be identified.
  • the information points are associated with an object. Appearance and disappearance of the information points can be switched over time, and the information points are used by the object to transmit data information from the object to the touchscreen processor. It is desirable to simultaneously identify all the cards/objects and their locations and orientations when they are placed on the touchscreen, based on the collection of touch points that are activated on the touchscreen, viz., that are detected by the touchscreen.
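The switching of information points over time effectively forms a low-rate data channel from the object to the touchscreen processor. The disclosure does not specify a modulation scheme; the sketch below assumes the simplest possible one, one bit per sensing frame, where presence of the touch at the information-point location encodes 1 and absence encodes 0 (the function name and the coincidence tolerance are illustrative assumptions, not taken from the patent):

```python
from typing import List, Tuple

Point = Tuple[float, float]

def decode_information_point(frames: List[List[Point]],
                             info_location: Point,
                             tolerance: float = 3.0) -> List[int]:
    """Decode one bit per frame from the appearance (1) or
    disappearance (0) of an information point at a known location."""
    def present(points: List[Point]) -> bool:
        # A detected touch "coincides" with the information point if it
        # falls within a small tolerance box around the known location.
        return any(abs(px - info_location[0]) <= tolerance and
                   abs(py - info_location[1]) <= tolerance
                   for px, py in points)
    return [1 if present(frame) else 0 for frame in frames]
```

A real decoder would additionally need clock recovery and debouncing, since frames may be dropped and touch positions jitter within the screen's resolution.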
  • the present disclosure provides a method for distinguishing and tracking multiple objects placed on a touchscreen.
  • An individual object is configured to generate imitated touches on the touchscreen.
  • the method comprises steps (a)-(d).
  • In the step (a), touch points made on the touchscreen by the objects at a time instant are identified and located.
  • An individual candidate pattern includes three of the touch points forming a triangle congruent to a fingerprint triangle associated with a known predefined object in a set of predefined objects. Said three of the touch points are regarded as three anchor points of the individual candidate pattern.
  • a set of identified patterns is updated according to the candidate patterns and the locations of the touch points identified in the step (a) .
  • An individual identified pattern is used for indicating a detected object on the touchscreen and is assigned with plural touch points generated by the detected object.
  • the touch points assigned to the individual identified pattern include three anchor points for identifying the detected object.
  • the set of identified patterns is updated by an updating process.
  • the updating process comprises incorporating a subset of the candidate patterns into the set of identified patterns. The subset is selected such that in the subset, each anchor point belongs to one and only one candidate pattern for avoiding a single anchor point from being shared by plural candidate patterns.
  • the updating process further comprises purging the set of identified patterns of any identified pattern that is invalid.
  • An invalid identified pattern is defined to have no more than one anchor point coinciding with any of the touch points identified in the step (a) .
  • processing steps are repeated for a next time instant.
  • the processing steps include the steps (a) - (c) .
  • any touch point identified in the step (a) and coinciding with any assigned touch point in the set of identified patterns determined for the time instant is excluded from consideration in finding out the candidate patterns. This advantageously reduces the required computation and storage in comparison to execution without knowledge of the set of identified patterns.
  • the updating process further comprises creating a point set of the individual candidate pattern based on locations of the three anchor points of the individual candidate pattern.
  • the point set is a bounded area representing a size and an orientation of the known predefined object.
  • the bounded area may be a circle or any other shape.
  • the bounded area may also be a triangle, a quadrilateral or a polygon.
  • the subset of the candidate patterns is selected by a selecting process.
  • the selecting process comprises: initializing the subset to be a null set; sequentially examining the candidate patterns one by one to determine whether a candidate pattern under examination is invalid; and responsive to finding that the candidate pattern under examination is not invalid, including the candidate pattern under examination into the subset.
  • An invalid candidate pattern is defined such that any anchor point of the invalid candidate pattern is shared, or is already used, by any identified pattern in the set of identified patterns or by any candidate pattern already included in the subset.
  • the candidate patterns are arranged in an ascending order of degrees of pattern overlapping.
  • a respective degree of pattern overlapping of the individual candidate pattern is computed by summing three numbers each being a total number of respective point sets that an individual anchor point of the individual candidate pattern is enclosed by.
  • the purging of the set of identified patterns comprises: validating a respective touch point assigned to the individual identified pattern by verifying whether the respective touch point is one of the touch points identified in the step (a) and is enclosed by the point set of the individual identified pattern; and responsive to finding that the respective touch point is invalid, removing the respective touch point from the individual identified pattern.
  • An incomplete identified pattern is defined to have only two anchor points coinciding with the touch points identified in the step (a), such that the incomplete identified pattern has a missing anchor point. It is also preferable that, responsive to finding that the individual identified pattern is incomplete, a location of the missing anchor point is estimated from the two coinciding anchor points, such that the individual identified pattern is re-assigned with three anchor points.
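Estimating the missing anchor from the two surviving ones is possible because the fingerprint triangle is known in advance. The sketch below assumes the correspondence between the two detected anchors and two template vertices has already been established; it computes the rigid transform (rotation plus translation) mapping the template onto the screen and applies it to the third template vertex (function and parameter names are illustrative, not from the disclosure):

```python
import math
from typing import Tuple

Point = Tuple[float, float]

def estimate_missing_anchor(a1: Point, a2: Point,
                            A1: Point, A2: Point, A3: Point) -> Point:
    """Estimate the missing third anchor of an incomplete pattern.

    a1, a2 are the two anchor points still detected on screen; A1, A2, A3
    are the corresponding vertices of the known fingerprint triangle.
    """
    # Rotation angle between the template edge A1A2 and the observed edge a1a2.
    ang = (math.atan2(a2[1] - a1[1], a2[0] - a1[0])
           - math.atan2(A2[1] - A1[1], A2[0] - A1[0]))
    c, s = math.cos(ang), math.sin(ang)
    # Rotate the template vector A1->A3, then translate it to start at a1.
    vx, vy = A3[0] - A1[0], A3[1] - A1[1]
    return (a1[0] + c * vx - s * vy, a1[1] + s * vx + c * vy)
```

Reflections need not be considered because the anchors of a physical object keep a fixed winding when the object sits face-down on the screen.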
  • after the incorporating of the subset into the set of identified patterns and the purging of the set of identified patterns are performed, preferably the updating process allocates the remaining touch points, i.e. those identified in the step (a) and not assigned as anchor points in the set of identified patterns, to different identified patterns as information points according to respective point sets of the different identified patterns and locations of the remaining touch points.
  • after the remaining touch points are allocated, preferably the updating process updates attributes of the individual identified pattern according to data received through any information point allocated to the individual identified pattern.
  • the updating process performs computing or updating a geometry of the individual identified pattern according to locations of the three anchor points of the individual identified pattern.
  • the geometry of the individual identified pattern comprises a reference point of the individual identified pattern on the touchscreen and an orientation of the individual identified pattern with respect to the touchscreen.
  • the method further comprises a step (e) .
  • a trajectory of the individual identified pattern is forecasted such that respective trajectories of on-screen icons tracking the objects are displayable on the touchscreen in advance for compensating for latency of the touchscreen.
  • the processing steps further include the step (e) .
  • the trajectory of the individual identified pattern may be forecasted by a forecasting process.
  • a time series of respective geometries of the individual identified pattern over a plurality of different time instants is obtained.
  • the plurality of different time instants includes the time instant.
  • a linear regression technique is then used to estimate a future geometry of the individual identified pattern based on the time series of respective geometries of the individual identified pattern such that the trajectory of the individual identified pattern is forecasted.
  • the linear regression technique may be a polynomial-based linear regression technique.
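A degree-1 polynomial fit is one concrete instance of such a polynomial-based linear regression. The sketch below (illustrative, not the patent's pseudocode) fits each geometry component independently over the time series and extrapolates it to a future time instant:

```python
from typing import List, Tuple

def forecast_next(times: List[float], values: List[float],
                  t_next: float) -> float:
    """Closed-form least-squares fit of v = b + m*t, then extrapolation."""
    n = len(times)
    mean_t = sum(times) / n
    mean_v = sum(values) / n
    var_t = sum((t - mean_t) ** 2 for t in times)
    cov = sum((t - mean_t) * (v - mean_v) for t, v in zip(times, values))
    m = cov / var_t
    b = mean_v - m * mean_t
    return b + m * t_next

def forecast_geometry(history: List[Tuple[float, float, float, float]],
                      t_next: float) -> Tuple[float, float, float]:
    """history holds (t, x, y, theta) samples of one identified pattern;
    each geometry component is forecast independently."""
    ts = [h[0] for h in history]
    return tuple(forecast_next(ts, [h[k] for h in history], t_next)
                 for k in (1, 2, 3))
```

Higher polynomial degrees would follow the same normal-equation approach; a first-order fit already compensates for touchscreen latency when the object moves at a roughly constant velocity.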
  • the present disclosure also provides an apparatus comprising a touchscreen and one or more processors.
  • the touchscreen is controllable by the one or more processors to sense touches made on the touchscreen.
  • the one or more processors is configured to execute a process for distinguishing and tracking multiple objects placed on the touchscreen according to any of the embodiments of the method, where an individual object is configured to generate imitated touches on the touchscreen.
  • the present disclosure provides a non-transitory computer readable medium storing a program.
  • the program when executed on one or more processors, causes the one or more processors to execute a process for distinguishing and tracking multiple objects placed on a touchscreen according to any of the embodiments of the method, where an individual object is configured to generate imitated touches on the touchscreen.
  • FIG. 1 is a photograph of multiple objects placed on a touchscreen of a computing device, showing that the user can interact with the computing device by placing the objects on particularly selected locations on the touchscreen.
  • FIG. 2 depicts a flowchart showing a process flow of an algorithm for exemplarily illustrating a disclosed method for distinguishing and tracking multiple objects placed on a touchscreen.
  • FIG. 3 depicts an example of creating a plurality of candidate patterns from a plurality of touch points by performing step S1 of the algorithm.
  • FIG. 4 depicts the creation of point sets (shown as circles) after step S2.1 is performed, showing that overlapping of circles is present where the overlapped circles are considered invalid circles and should be discarded.
  • FIG. 5 depicts a plurality of identified patterns after the whole step S2 of the algorithm is completed, showing that any overlapped circle is removed.
  • FIG. 6 depicts an example for pictorially illustrating prediction of movement of an identified pattern on the touchscreen by performing the step S5 of the algorithm.
  • FIG. 7 depicts a flow of exemplary processing steps as used in the disclosed method.
  • FIG. 8 depicts a flowchart showing exemplary steps of an updating process for updating the set of identified patterns in accordance with certain embodiments of the disclosed method.
  • FIG. 9 depicts exemplary steps of a selecting process for selecting candidate patterns to be incorporated into the set of identified patterns in accordance with certain embodiments of the disclosed method.
  • FIG. 10 depicts exemplary steps for forecasting a trajectory of each identified pattern in accordance with certain embodiments of the disclosed method.
  • FIG. 11 depicts an apparatus used for implementing the disclosed method, where the apparatus includes a touchscreen and one or more processors.
  • the method is illustrated by first describing a computer-implemented algorithm for distinguishing and tracking the objects placed on the touchscreen.
  • the algorithm is a practical realization of an exemplary embodiment of the disclosed method. Details of the method are further elaborated by generalizing the disclosed algorithm.
  • a system model and variables used herein are described as follows. Let M be the number of touch points activated on the screen at any time instant. That is, M touch points are detected on the screen at the aforesaid time instant. It is desired to determine the number of objects placed on the screen and to distinguish these objects according to the activated touch points that are detected by a touchscreen processor. Let N be the actual number of objects placed on the screen. Since one object contains three anchor points and possibly includes one or more information points, it follows that N is upper bounded by M/3. That is, N ≤ M/3. Denote P as an object pattern (or a pattern in short), which is a set of points representing an object O. Let {a1, a2, a3} be the anchor points of P.
  • Let {A1, A2, A3} be the anchor point triangle of P, where the anchor point triangle is a unique predefined shape such that no other 3 points in P except {a1, a2, a3} can form a triangle congruent to the triangle formed by {A1, A2, A3}. That is, for any points p, q, r in P, Δpqr ≅ ΔA1A2A3 implies {p, q, r} = {a1, a2, a3}.
  • Δpqr stands for a triangle formed by the three points p, q and r on the touchscreen, and ≅ means “congruent to”.
  • the anchor point triangle of P is also named as a fingerprint triangle, because the fingerprint triangle is associated with the represented object O and is thus useful for object identification.
  • Let {x, y, θ} be the geometry of P, where (x, y) is the location of a certain pre-determined reference point of P on the screen, and θ is an orientation of P with respect to the screen.
  • the pre-determined reference point may be a certain anchor point of P.
  • the orientation θ may be defined as an angle of rotation about the reference point (x, y) from a horizontal axis of the screen to a longest side of Δa1a2a3 connecting to the reference point.
  • a1, a2 and a3 may be arranged in an anticlockwise sense with the side a1a2 being the longest side of Δa1a2a3.
  • the reference point (x, y) may be selected to be a1, so that the orientation θ can be determined from the side a1a2 and the horizontal axis of the screen.
  • the anchor-point locations of Δa1a2a3 are usable to calculate {x, y, θ}, which is the geometry of the represented object O. Note that for a unique, unambiguous determination of the geometry, the three sides of the fingerprint triangle are required to be mutually different. Hence, the fingerprint triangle of the object O shall not be an isosceles triangle or an equilateral triangle.
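Under these conventions (a1a2 the longest side, anticlockwise winding, θ measured from the horizontal axis), the geometry computation can be sketched as follows. This is an illustrative sketch assuming a y-up coordinate convention; on a y-down screen coordinate system the winding test would flip sign. The function name is an assumption:

```python
import math
from itertools import permutations
from typing import Tuple

Point = Tuple[float, float]

def pattern_geometry(p: Point, q: Point, r: Point) -> Tuple[float, float, float]:
    """Compute the geometry {x, y, theta} of a pattern from its three anchor
    points: a1 is an endpoint of the longest side, (a1, a2, a3) wind
    anticlockwise, and theta is the angle of side a1a2 from the horizontal."""
    def dist(u: Point, v: Point) -> float:
        return math.hypot(u[0] - v[0], u[1] - v[1])
    # Find the labelling where a1a2 is the longest side and the winding
    # (a1, a2, a3) is anticlockwise (positive cross product). For a scalene
    # triangle exactly one labelling satisfies both conditions.
    for a1, a2, a3 in permutations((p, q, r)):
        cross = ((a2[0] - a1[0]) * (a3[1] - a1[1])
                 - (a2[1] - a1[1]) * (a3[0] - a1[0]))
        longest = dist(a1, a2) >= max(dist(a2, a3), dist(a1, a3))
        if longest and cross > 0:
            theta = math.atan2(a2[1] - a1[1], a2[0] - a1[0])
            return (a1[0], a1[1], theta)
    raise ValueError("degenerate (collinear) anchor points")
```

The requirement that the fingerprint triangle be scalene is visible here: with an isosceles triangle two labellings would tie on the longest-side test and the returned θ would be ambiguous.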
  • the disclosed algorithm is illustrated with the aid of FIG. 2, which depicts a process flow of the algorithm.
  • the algorithm is executed as a loop, where the loop includes a sequence of operation steps.
  • the sequence of operation steps is executed for a set of M touch points detected by the screen at one time instant.
  • the sequence of operation steps is repeated for respective sets of touch points obtained over consecutive time instants.
  • the numbers of touch points obtained at different time instants are most likely different because objects may be newly placed on, or withdrawn from, the screen.
  • erroneous detection of a touch point that is actually non-existent (viz., false positive) by the touchscreen electronics/processors occasionally occurs.
  • a set of identified patterns is obtained. Each identified pattern includes pattern-related data such as the pattern’s geometry.
  • the set of identified patterns is updated by recursively repeating the sequence of operation steps each time with a new set of detected touch points.
  • the algorithm includes steps S1 to S5.
  • In the step S1, candidate patterns are found from the M detected touch points.
  • the step S1 is illustrated by the following pseudocode.
  • the screen has a finite resolution in resolving different touch points. Therefore, the location of a touch point is an estimate and may deviate slightly from the actual location.
  • the tolerable error may be, e.g., 10%.
  • a knowledge of the screen resolution is required for determining whether Δp1p2p3 is congruent to ΔA1A2A3.
  • a triangle Δp1p2p3 similar to ΔA1A2A3 with a scaling factor within a predetermined small range is identified. It is then assumed that Δp1p2p3 is congruent to ΔA1A2A3 so as to create a candidate pattern.
  • whether Δp1p2p3 is truly congruent to ΔA1A2A3 is verified at a later stage with side information such as a location of an information point assigned to the candidate pattern.
  • as a result of executing the step S1, N′ candidate patterns are formed, where N′ ≥ N. These N′ candidate patterns will be individually assessed in subsequent steps to find out which N out of the N′ candidate patterns belong to the actual objects on the screen.
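The step S1 can be sketched as an exhaustive search over triples of detected touch points, comparing sorted side lengths against each known fingerprint triangle with a relative tolerance (the 10% figure follows the example above; the function signature and the fingerprint representation as a side-length triple are illustrative assumptions):

```python
import math
from itertools import combinations
from typing import Dict, List, Tuple

Point = Tuple[float, float]
Candidate = Tuple[str, Tuple[Point, Point, Point]]

def find_candidate_patterns(points: List[Point],
                            fingerprints: Dict[str, Tuple[float, float, float]],
                            tolerance: float = 0.10) -> List[Candidate]:
    """Test every triple of detected touch points against the known
    fingerprint triangles; a relative side-length tolerance absorbs the
    finite resolution of the touchscreen."""
    candidates = []
    for triple in combinations(points, 3):
        # The three side lengths of the triple, sorted ascending.
        sides = sorted(math.hypot(u[0] - v[0], u[1] - v[1])
                       for u, v in combinations(triple, 2))
        for name, template in fingerprints.items():
            ref = sorted(template)
            if all(abs(s - t) <= tolerance * t for s, t in zip(sides, ref)):
                candidates.append((name, triple))
    return candidates
```

The search is O(M³) in the number of detected touch points, which is one reason the method later excludes touch points already assigned to identified patterns: fewer free points means far fewer triples to test.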
  • FIG. 3 depicts an example of creating a plurality of candidate patterns from a plurality of touch points by performing the step S1. Note that one touch point may be shared by different patterns, resulting in overlapped patterns. Overlapped patterns are clearly not valid; otherwise one touch point on the screen would have been created by at least two physical objects. Overlapped patterns are required to be removed in subsequent steps.
  • In the step S2, invalid (i.e. overlapped) candidate patterns are discarded from the N′ candidate patterns.
  • the step S2 is illustrated by the following pseudocode with the aid of FIGS. 4 and 5.
  • Let S_i be the total sum of the numbers of circles that the anchor points of P′_i are enclosed by, i.e. S_i = n_1 + n_2 + n_3, where n_k (k = 1, 2, 3) is the number of circles that the k-th anchor point of P′_i is enclosed by.
  • C_i is created as a circle with a predefined size and position with respect to the anchor points of P′_i. Since a pattern is associated with a physical object in practice, and since the boundary and dimension of the object are known, the size of the circle C_i can be determined. For example, one may set C_i to be a circle with a diameter of 5 cm and a center selected to be the mid-point between the first and second anchor points of P′_i.
  • the circle is only one implementation option of a bounded area that represents a size and an orientation of a known predefined object. Apart from the circle, other realizations of the bounded area used for C i include a rectangle, a triangle, a quadrilateral, a polygon, etc.
  • In the step S2.3, all the N′ candidate patterns are first arranged in the ascending order of S_i. The arranged sequence of N′ candidate patterns is then processed one by one by the steps S2.4.1 (including S2.4.1.1 and S2.4.1.2) and S2.4.2 (including S2.4.2.1).
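The overlap scoring and greedy selection of the steps S2.1-S2.4 can be sketched as follows. The circle placement (mid-point of the first two anchors) follows the example given above; the radius value and all names are illustrative assumptions:

```python
from typing import List, Set, Tuple

Point = Tuple[float, float]
Candidate = Tuple[Point, Point, Point]   # the three anchor points

def select_non_overlapping(candidates: List[Candidate],
                           radius: float = 2.5) -> List[Candidate]:
    """Build one circle per candidate, score each candidate by its overlap
    degree S_i (total circles enclosing its anchors), then greedily keep
    candidates in ascending order of S_i, discarding any candidate whose
    anchor points are already used by a kept candidate."""
    circles = [(((a[0] + b[0]) / 2, (a[1] + b[1]) / 2), radius)
               for a, b, _ in candidates]

    def enclosed(p: Point) -> int:
        # Number of candidate circles that enclose point p.
        return sum((p[0] - c[0]) ** 2 + (p[1] - c[1]) ** 2 <= r * r
                   for (c, r) in circles)

    degrees = [sum(enclosed(p) for p in cand) for cand in candidates]
    used: Set[Point] = set()
    selected = []
    for _, cand in sorted(zip(degrees, candidates), key=lambda z: z[0]):
        if not any(p in used for p in cand):
            selected.append(cand)
            used.update(cand)
    return selected
```

Processing the least-overlapped candidates first is what lets the greedy pass keep a maximum number of non-overlapped patterns, as noted later in the disclosure.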
  • FIG. 4 depicts the creation of point sets (shown as circles) after the step S2.1 (including the step S2.1.1) is performed. Note that overlapping of the circles is present.
  • FIG. 5 depicts a plurality of identified patterns after the step S2 is completed, showing that any overlapped circle is removed.
  • In the step S3, the geometry of each identified pattern is validated and updated.
  • the step S3 is illustrated by the following pseudocode.
  • In the step S3.1.1.1, that “the touch point under consideration does not exist” means that this touch point as assigned to P_i is not found among the currently-considered M detected touch points (detected by the screen).
  • In the step S3.2.2.2, ≇ means “not congruent to”.
  • In the step S4, the attributes of the identified patterns that survived the step S3 are updated.
  • the step S4 is illustrated by the following pseudocode.
  • the attributes of a pattern can be used to identify which physical object is placed on the screen, and different user interface (UI) response can be provided.
  • After the step S4.1, the set of identified patterns has been updated.
  • In the step S5, the geometries of the patterns are forecasted, and the objects are displayed. For the sake of convenience and without loss of generality, consider that the current time instant is zero. Let previous time instants be −T1, −T2, etc. Exemplarily, the step S5 is illustrated by the following pseudocode.
  • In the step S5.1.2, it is possible that other linear-regression techniques may be used.
  • FIG. 6 provides an example for pictorially illustrating the above-mentioned procedure.
  • An aspect of the present disclosure is to provide a method for distinguishing and tracking multiple objects placed on a capacitive touchscreen.
  • the method is a generalization of the above-described algorithm.
  • An individual object is configured to generate imitated touches on the touchscreen, and the touchscreen is configured to sense the imitated touches. Respective locations and orientations of the objects on the touchscreen are determined. Furthermore, the objects are tracked such that future location and orientation of the individual object are predicted. The predicted location and orientation are useful for practical applications in which, e.g., an on-screen icon following the individual object is intended to be displayed in advance through some visual animation.
  • FIG. 7 depicts a flow of exemplary processing steps as used in the disclosed method.
  • In a step 710, touch points made at a time instant on the touchscreen by the objects are identified and located. For convenience and without loss of generality, this time instant is referred to as a present time instant.
  • Candidate patterns are then identified in a step 720.
  • the candidate patterns are created and determined from locations of the touch points identified in the step 710.
  • An individual candidate pattern includes three of the touch points forming a triangle congruent to a fingerprint triangle associated with a known predefined object in a set of predefined objects.
  • the aforesaid three of the touch points are regarded as three anchor points of the individual candidate pattern.
  • the step 720 is realizable by the step S1 of the algorithm.
  • a set of identified patterns is updated in a step 730 according to the candidate patterns and the locations of the touch points identified in the step 710.
  • An individual identified pattern is used for indicating a detected object on the touchscreen and is assigned with plural touch points generated by the detected object.
  • the touch points assigned to the individual identified pattern include three anchor points for identifying the detected object.
  • the set of identified patterns is updated by an updating process.
  • the updating process is at least arranged to incorporate a subset of the candidate patterns into the set of identified patterns (step 731) , and to purge the set of identified patterns of any identified pattern that is invalid (step 732) .
  • the selection of the subset of the candidate patterns is made such that in the subset, each anchor point belongs to one and only one candidate pattern for avoiding a single anchor point from being shared, or used, by plural candidate patterns.
  • An invalid identified pattern is defined to have no more than one anchor point coinciding with any of the touch points identified in the step 710.
  • the step 730 is realizable by the steps S2 and S3 of the algorithm, and optionally by the step S4 thereof.
  • the rationale for defining the invalid identified pattern by the aforementioned criterion is as follows. If no anchor point assigned to the individual identified pattern coincides with any detected touch point identified in the step 710, it is almost certain that the object associated with the individual identified pattern has been moved away from its previously detected position. If there is only one such anchor point, it is also likely that the object has been moved away. However, if two assigned anchor points coincide with detected touch points identified in the step 710, it is likely that the remaining assigned anchor point simply failed to be detected and the object has not actually been moved away. If the object has been moved away from its previously recorded position, the individual identified pattern associated with the object is outdated and should be removed from the set of identified patterns. Conversely, if the object remains intact at the previously recorded position, the individual identified pattern is still informative and can be retained in the set of identified patterns.
  • In a step 750, the steps 710, 720 and 730 are repeated for a next time instant.
  • In the step 720 for the next time instant, it is highly preferable that any touch point identified in the step 710 and coinciding with any assigned touch point in the set of identified patterns determined for the present time instant is excluded from consideration in finding out the candidate patterns.
  • the exclusion of the touch points already assigned to any identified pattern is also implemented in the step S1.1 of the algorithm.
  • FIG. 8 depicts a flowchart showing exemplary steps taken by the updating process in updating the set of identified patterns.
  • In a step 810, a point set of the individual candidate pattern is created based on the locations of the three anchor points of the individual candidate pattern.
  • the point set is a bounded area representing a size and an orientation of the known predefined object.
  • the step 810 is realizable by the step S2.1 of the algorithm.
  • the bounded area may be shaped according to the shape of the known predefined object.
  • the bounded area may be shaped as a circle, a rectangle, a triangle, a quadrilateral, a polygon, etc.
  • the bounded area is preferably set as a circle because only two parameters, i.e. center and radius of the circle, are required to specify the circle. Since the three anchor-point locations and the boundary of the known predefined object are known, generally the center and radius of the circle can be easily determined.
  • Those skilled in the art may also design each object in the set of predefined objects with a constraint between the center of the object and the set of three anchor points.
  • the constraint is that the center of the object coincides with a centroid of the fingerprint triangle formed by the three anchor points.
  • Another example is that the center of the object coincides with a mid-point of the longest side of the fingerprint triangle.
  • In a step 820, the subset of the candidate patterns is selected by a selecting process.
  • the steps are realizable by the steps S2.2-S2.4 of the algorithm.
  • FIG. 9 depicts exemplary steps of the selecting process.
  • the subset is gradually built up with acceptable candidate patterns by examining the candidate patterns one by one.
  • the subset is initialized to be a null set, i.e. an empty set (step 910) .
  • the candidate patterns are sequentially examined one by one to determine whether a candidate pattern under examination is invalid, whereby the candidate pattern under examination, if found to be not invalid, is included into the subset (step 915) .
  • An invalid candidate pattern is defined such that any anchor point of the invalid candidate pattern is shared, or is already used, by any identified pattern in the set of identified patterns or by any candidate pattern already included in the subset.
  • the rationale behind such definition is that valid candidate patterns are to be included into the set of identified patterns, and different identified patterns are non-overlapping.
  • the step 915 may be implemented as follows.
  • the candidate patterns are first arranged in an ascending order of degrees of pattern overlapping (step 920) .
  • a respective degree of pattern overlapping of the individual candidate pattern is computed by summing three numbers. Each of these three numbers is given by a total number of respective point sets that an individual anchor point of the individual candidate pattern is enclosed by.
  • the candidate patterns are one-by-one examined in the aforementioned ascending order. If the candidate pattern under examination is not invalid (step 930) , the candidate pattern under examination is added to the subset (step 935) such that the subset is expanded. In case the candidate pattern under examination is found to be invalid (the step 930) , this candidate pattern is discarded (step 936) and is not added into the subset.
  • the step 930 and the conditional steps 935, 936 are repeated until all the candidate patterns are examined (step 940) .
  • the step 920 is realizable by the steps S2.2 and S2.3 of the algorithm.
  • the steps 935 and 936 are realizable by the steps S2.4.1 and S2.4.2, respectively, of the algorithm.
  • sorting of the candidate patterns by the step 920 has an advantage of ensuring that a maximum number of non-overlapped patterns is identified.
  • after the subset of the candidate patterns is obtained in the step 820, the subset is incorporated into the set of identified patterns in a step 825.
  • each of the touch points assigned to the individual identified pattern is validated by verifying if a touch point under consideration is one of the touch points identified in the step 710 and is enclosed by the point set of the individual identified pattern.
  • the touch point under consideration is valid if it coincides with one touch point identified in the step 710 and is also enclosed by the point set. If the touch point under consideration is found to be invalid, this touch point is removed from the individual identified pattern.
  • the step 830 is realizable by the step S3.1 of the algorithm.
  • In a step 832, any invalid identified pattern in the set of identified patterns is found out.
  • An invalid identified pattern has none, or only one, assigned anchor point coinciding with the touch points identified in the step 710.
  • Any invalid identified pattern in the set of identified patterns is removed so as to purge the set of identified patterns in a step 835.
  • The steps 832 and 835 are collectively realizable by the step S3.2.1 (including the step S3.2.1.1) of the algorithm.
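The steps 830, 832 and 835 amount to one pass over the set of identified patterns. The following sketch assumes each identified pattern is a dict holding "anchors" (its assigned touch-point ids) and "point_set" (the enclosed touch-point ids); both names, and the dict layout itself, are assumptions for illustration only.

```python
def validate_and_purge(identified, sensed_points):
    """Drop assigned touch points that are not among the points sensed
    in the step 710 or fall outside the pattern's point set (step 830),
    then purge patterns left with fewer than two valid anchor points
    (steps 832 and 835)."""
    kept = []
    for pattern in identified:
        pattern["anchors"] = [
            a for a in pattern["anchors"]
            if a in sensed_points and a in pattern["point_set"]
        ]
        # Invalid: none or only one coinciding anchor point remains.
        if len(pattern["anchors"]) >= 2:
            kept.append(pattern)
    return kept
```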
  • The step 731 of incorporating the subset of the candidate patterns into the set of identified patterns includes the steps 820 and 825.
  • The step 732 of purging the set of identified patterns of any invalid identified pattern includes the steps 830, 832 and 835. Although FIG. 8 shows, for illustrative purposes, that the step 731 precedes the step 732 in execution order, other execution orders may be used, since identified patterns newly introduced into the set of identified patterns in executing the step 731 are already valid, while the step 732 is intended to remove invalid identified patterns only. It follows that the step 732 may precede the step 731 in execution.
  • The updating process further comprises a step 850 of finding out and repairing any incomplete identified pattern in the set of identified patterns.
  • An incomplete identified pattern is defined to have only two anchor points coinciding with the touch points identified in the step 710. Note that the incomplete identified pattern has a missing anchor point.
  • If the individual identified pattern is found to be incomplete, it is repaired by estimating the location of the missing anchor point from the two coinciding anchor points, viz. the aforementioned two anchor points of the individual identified pattern coinciding with the touch points identified in the step 710.
  • The individual identified pattern, after repairing, is re-assigned three anchor points.
  • The step 850 is realizable by the step S3.2.2.1 of the algorithm.
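One way to estimate the missing anchor point, sketched below, is to assume the pattern's three anchor locations from the previous time instant are still available, solve the rigid transform (rotation plus translation) carrying the two surviving anchors onto their current positions, and apply it to the previous position of the lost anchor. The patent does not prescribe this particular estimator; it is one plausible realization.

```python
import math

def estimate_missing_anchor(old_pair, new_pair, old_missing):
    """old_pair / new_pair: the two surviving anchors' (x, y) locations
    in the previous and current frames; old_missing: the lost anchor's
    previous location.  Returns the estimated current location."""
    (ox1, oy1), (ox2, oy2) = old_pair
    (nx1, ny1), (nx2, ny2) = new_pair
    # Rotation between the old and new inter-anchor vectors.
    theta = (math.atan2(ny2 - ny1, nx2 - nx1)
             - math.atan2(oy2 - oy1, ox2 - ox1))
    c, s = math.cos(theta), math.sin(theta)
    # Rotate the missing anchor's offset about the first anchor,
    # then translate to the first anchor's new location.
    mx, my = old_missing[0] - ox1, old_missing[1] - oy1
    return (nx1 + c * mx - s * my, ny1 + s * mx + c * my)
```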
  • The updating process further comprises a step 860 of computing or updating a geometry of the individual identified pattern in the set of identified patterns.
  • The step 860 is executed after the steps 731 and 732 are performed.
  • The geometry is computed or updated according to locations of the three anchor points of the individual identified pattern.
  • The geometry of the individual identified pattern comprises a reference point of the individual identified pattern on the touchscreen and an orientation of the individual identified pattern with respect to the touchscreen.
  • The step 860 is realizable by the step S3.2.2.3.1 of the algorithm.
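As a concrete but assumed convention for the step 860: take the centroid of the three anchor points as the reference point and the direction from the centroid to a designated first anchor as the orientation. The patent fixes neither convention; any reference point and angle definition that is reproducible across frames would serve.

```python
import math

def pattern_geometry(anchors):
    """anchors: three (x, y) anchor locations.  Returns the assumed
    geometry: (reference point, orientation in radians)."""
    xs, ys = zip(*anchors)
    cx, cy = sum(xs) / 3.0, sum(ys) / 3.0   # centroid as reference point
    # Orientation: angle of the centroid-to-first-anchor vector.
    theta = math.atan2(anchors[0][1] - cy, anchors[0][0] - cx)
    return (cx, cy), theta
```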
  • In a step 870, it is preferable to allocate the remaining touch points, identified in the step 710 but not assigned as anchor points in the set of identified patterns, to different identified patterns as information points according to respective point sets of the different identified patterns and locations of the remaining touch points.
  • The step 870 is realizable by the step S4.1 of the algorithm.
  • After the remaining touch points are allocated in the step 870, it is preferable to update attributes of the individual identified pattern according to data received through any information point allocated to the individual identified pattern (step 875) .
  • An individual attribute may be any type of information conveyed through the one or more allocated information points.
  • The step 875 is realizable by the step S4.2 of the algorithm.
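A minimal sketch of the allocation in the step 870, again under an assumed dict layout (keys "anchors" and "point_set" are illustrative names): each leftover sensed point goes to a pattern whose point set encloses it. How ties are broken when several point sets enclose the same point is not specified by the description; first match wins in this sketch.

```python
def allocate_information_points(identified, sensed_points):
    """sensed_points: set of touch-point ids identified in the step 710.
    Points not used as anchor points become information points of the
    first pattern whose point set encloses them."""
    anchors = {a for p in identified for a in p["anchors"]}
    for point in sensed_points - anchors:
        for pattern in identified:
            if point in pattern["point_set"]:
                pattern.setdefault("info_points", []).append(point)
                break
```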
  • A trajectory of the individual identified pattern is forecast in a step 740 such that respective trajectories of on-screen icons tracking the objects are displayable on the touchscreen in advance for compensating for latency of the touchscreen.
  • The step 740 is realizable by the step S5 of the algorithm.
  • The step 740 is also repeated for the next time instant.
  • FIG. 10 depicts exemplary steps for realizing the step 740.
  • A time series of respective geometries of the individual identified pattern is obtained over a plurality of different time instants (step 1010) .
  • The plurality of different time instants includes the present time instant.
  • A linear regression technique is used to estimate a future geometry of the individual identified pattern based on the time series of respective geometries of the individual identified pattern (step 1020) .
  • The linear regression technique may be a polynomial-based linear regression technique.
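The forecast of the steps 1010 and 1020 can be sketched per geometry component (the reference point's x coordinate, then y, then the orientation). Shown below is the degree-one case of a polynomial-based linear regression, fitted by ordinary least squares; the patent also allows higher polynomial degrees, which this sketch does not cover.

```python
def forecast_component(times, values, t_future):
    """Least-squares straight-line fit v ~ a + b*t over the recorded
    time series of one geometry component, evaluated at the future
    time instant t_future."""
    n = len(times)
    mean_t = sum(times) / n
    mean_v = sum(values) / n
    slope = (sum((t - mean_t) * (v - mean_v) for t, v in zip(times, values))
             / sum((t - mean_t) ** 2 for t in times))
    intercept = mean_v - slope * mean_t
    return intercept + slope * t_future
```

Forecasting each geometry component a frame or two ahead lets the on-screen icon be drawn where the object is about to be, masking the touchscreen's reporting latency.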
  • FIG. 11 depicts an apparatus 1100 used for implementing the disclosed method.
  • The apparatus 1100 comprises a touchscreen 1110 and one or more processors 1120.
  • The touchscreen 1110 is controllable by the one or more processors 1120. It follows that the one or more processors 1120 are operable to configure the touchscreen 1110 to sense touches made by the objects, and to identify and locate the objects in accomplishing the step 710 of the disclosed method.
  • The one or more processors 1120 are configured to execute a process for distinguishing and tracking multiple objects placed on the touchscreen 1110 according to any of the embodiments of the disclosed method.
  • The one or more processors 1120 may be, or may include, a touchscreen processor, which is a processor integrated into the touchscreen 1110.
  • An individual processor may be realized by a microcontroller, a general-purpose processor, or a special-purpose processor such as an application-specific integrated circuit (ASIC) or a digital signal processor (DSP) , or by reconfigurable logic such as a field-programmable gate array (FPGA) .
  • Each of the embodiments of the disclosed method may be implemented in the apparatus 1100 by programming the one or more processors 1120.
  • The resultant program causes the one or more processors 1120 to execute the aforementioned process for distinguishing and tracking the multiple objects.
  • The program may be stored in a non-transitory computer-readable medium, such as an optical disk.

Abstract

According to the present invention, an algorithm for distinguishing and tracking multiple objects placed on a touchscreen comprises the following five steps: step S1, finding candidate patterns from locations of touch points detected on the touchscreen, the touch points being detected at a certain time instant; step S2, eliminating invalid (e.g. overlapping) candidate patterns from the candidate patterns obtained in step S1; step S3, validating and updating the geometry of each identified pattern; step S4, updating the attributes of the identified patterns retained in step S3; step S5, predicting the geometry of the patterns and displaying the object. Steps S1 to S5 are repeated for a next time instant. In executing step S1 for the next time instant, the candidate patterns are free of any already-identified pattern so as to reduce the computation and storage required in executing step S1.
PCT/CN2020/080310 2019-03-26 2020-03-20 Distinguishing and tracking multiple objects when placed on a capacitive touchscreen WO2020192563A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202080024216.5A CN113632055B (zh) 2019-03-26 2020-03-20 Distinguishing and tracking multiple objects when placed on a capacitive touchscreen

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201962823735P 2019-03-26 2019-03-26
US62/823,735 2019-03-26

Publications (1)

Publication Number Publication Date
WO2020192563A1 true WO2020192563A1 (fr) 2020-10-01

Family

ID=72605784

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/080310 WO2020192563A1 (fr) 2019-03-26 2020-03-20 Distinguishing and tracking multiple objects when placed on a capacitive touchscreen

Country Status (3)

Country Link
US (1) US11036333B2 (fr)
CN (1) CN113632055B (fr)
WO (1) WO2020192563A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2012017205A2 * 2010-08-06 2012-02-09 Disruptive Limited Interaction with touchscreen devices
US20120249430A1 * 2011-03-31 2012-10-04 Oster David Phillip Multi-Touch Screen Recognition of Interactive Objects, and Application Thereof
US20180211071A1 * 2017-01-24 2018-07-26 Kazoo Technology (Hong Kong) Limited Card with electrically conductive points that are readable
WO2019008109A1 * 2017-07-05 2019-01-10 HAYDALE TECHNOLOGIES (Thailand) Company Limited Information carriers and methods for encoding and reading such information carriers
CN109416723A (zh) * 2016-05-13 2019-03-01 实立科技 (香港) 有限公司 Substrate with readable conductive patterns


Also Published As

Publication number Publication date
US20200310642A1 (en) 2020-10-01
CN113632055B (zh) 2022-10-18
US11036333B2 (en) 2021-06-15
CN113632055A (zh) 2021-11-09


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 20779929; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 20779929; Country of ref document: EP; Kind code of ref document: A1)