CN109656457B - Multi-finger touch method, device, equipment and computer readable storage medium - Google Patents


Info

Publication number
CN109656457B
CN109656457B (application CN201710936886.4A)
Authority
CN
China
Prior art keywords
finger
frame
touch
group
touch screen
Prior art date
Legal status
Active
Application number
CN201710936886.4A
Other languages
Chinese (zh)
Other versions
CN109656457A (en)
Inventor
谭登峰
郭昱
佘二永
Current Assignee
Beijing Zen Ai Technology Co ltd
Original Assignee
Beijing Zen Ai Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Beijing Zen Ai Technology Co ltd filed Critical Beijing Zen Ai Technology Co ltd
Priority to CN201710936886.4A priority Critical patent/CN109656457B/en
Publication of CN109656457A publication Critical patent/CN109656457A/en
Application granted granted Critical
Publication of CN109656457B publication Critical patent/CN109656457B/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/23 Clustering techniques
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0416 Control or interface arrangements specially adapted for digitisers
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/042 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
    • G06F3/0425 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/041 Indexing scheme relating to G06F3/041 - G06F3/045
    • G06F2203/04104 Multi-touch detection in digitiser, i.e. details about the simultaneous detection of a plurality of touching locations, e.g. multiple fingers or pen and finger

Abstract

The invention provides a multi-finger touch method, device, equipment, and computer-readable storage medium. The method comprises: receiving touch information of the finger touch points input by a user on a first touch screen in each frame; clustering the finger contacts in each frame to obtain a plurality of groups of finger contacts; tracking the motion of each group of finger contacts; and, when the moving direction, moving speed, and distance from the starting point of the groups of finger contacts meet preset conditions, executing corresponding operations in response to the touch of those groups. The technical scheme provided by the embodiments of the invention obtains correct multi-finger contact trajectories and improves recognition efficiency and accuracy, so that the user's touch operations can be responded to accurately.

Description

Multi-finger touch method, device, equipment and computer readable storage medium
Technical Field
The invention relates to the technical field of touch interaction, in particular to a multi-finger touch method, a multi-finger touch device, multi-finger touch equipment and a computer-readable storage medium based on screen interaction.
Background
In multi-point touch, when touch objects move on the touch screen, the touch points in successive frames cannot be directly and accurately associated one to one. As a result, correct touch-point trajectories cannot be obtained, recognition efficiency and accuracy are low, and the touch screen cannot accurately respond to the user's touch operations.
Disclosure of Invention
In view of this, embodiments of the present invention provide a multi-finger touch method, apparatus, device and computer-readable storage medium.
In a first aspect, an embodiment of the present invention provides a multi-finger touch method, where the method includes:
receiving touch information of each frame of finger touch point input by a user on a first touch screen;
clustering and grouping the finger contacts in each frame to obtain a plurality of groups of finger contacts;
tracking the motion of each set of finger contacts;
and when the moving direction, the moving speed and the distance from the starting point of the multiple groups of finger contacts meet preset conditions, responding to the touch control of the multiple groups of finger contacts to execute corresponding operation.
In some embodiments, the multi-finger touch method includes:
acquiring coordinate information of the finger contacts in each frame, and clustering the finger contacts in each frame according to the coordinate information by any one of a DBSCAN clustering algorithm, a K-value clustering algorithm, a hierarchical (system) clustering algorithm, or a minimum-distance clustering algorithm, to obtain a plurality of groups of finger contacts.
In some embodiments, the multi-finger touch method further includes:
numbering each group of finger contacts so that each group of finger contacts is assigned a unique group number and each finger contact has a unique ID number;
calculating the clustering center coordinate of each group of finger contacts;
for the Nth frame, acquiring the predicted cluster center coordinates, in the Nth frame, of the cluster center coordinates of each group of finger touch points in the (N-1)th frame; comparing the distances between the cluster center coordinates of each group of finger touch points in the Nth frame and the predicted cluster center coordinates; and assigning the same group number to the two groups whose distance is minimal, where N is an integer greater than or equal to 4; for the Nth frame, acquiring the predicted contact, in the Nth frame, of each finger contact in the (N-1)th frame; comparing the finger contacts in the Nth frame with the predicted contacts; and assigning the same ID number to the two finger contacts whose distance is minimal;
judging whether groups with the same group number in adjacent frames share at least one finger contact with the same ID number; if so, judging the groups with the same group number to be a continuous group; if not, judging them to be a discontinuous group;
and when the moving direction, the moving speed and the distance from the starting point meet preset conditions, responding to the touch control of the continuous groups to execute corresponding operation.
In some embodiments, the multi-finger touch method includes:
for the second frame, calculating the distance between the clustering center coordinates in the first frame and the second frame, and tracking the group numbers of the two groups of finger contacts with the smallest distance between the clustering center coordinates to be the same;
for the second frame, calculating the distance between the finger contacts in the first frame and the second frame, and tracking the ID numbers of the two finger contacts with the minimum distance between the finger contacts to be the same;
for the third frame, acquiring the speed of the clustering center of each group of finger touch points in the second frame according to the clustering center coordinates of each group of finger touch points in the first two frames and the inter-frame time interval, and acquiring each group of predicted clustering center coordinates of the clustering center coordinates of each group of finger touch points in the second frame in the third frame according to the speed of the clustering center of each group of finger touch points;
and for the third frame, acquiring the speed of each finger touch point in the second frame according to the coordinate information of each finger touch point in the first two frames and the inter-frame time interval, and acquiring each finger predicted touch point of each finger touch point in the third frame in the second frame according to the speed of each finger touch point.
In some embodiments, the multi-finger touch method includes:
for the Nth frame, acquiring the speed and acceleration of the cluster center of each group of finger touch points in the (N-1)th frame according to the cluster center coordinates of each group of finger touch points in the previous N-1 frames and the inter-frame time interval, and acquiring each group's predicted cluster center coordinates in the Nth frame according to the speed and acceleration of each group's cluster center;
and for the Nth frame, acquiring the speed and acceleration of each finger touch point in the (N-1)th frame according to the coordinate information of each finger touch point in the previous N-1 frames and the inter-frame time interval, and acquiring each finger's predicted contact in the Nth frame according to the speed and acceleration of each finger touch point.
In some embodiments, the multi-finger touch method further includes:
when N is greater than 4, for the Nth frame, obtaining the speed, the acceleration, and the first derivative of the acceleration of the cluster center of each group of finger touch points in the (N-1)th frame according to the cluster center coordinates of each group of finger touch points in the previous N-1 frames and the inter-frame time interval, and obtaining each group's predicted cluster center coordinates in the Nth frame according to the speed, the acceleration, and the first derivative of the acceleration of each group's cluster center;
and for the Nth frame, obtaining the speed, the acceleration, and the first derivative of the acceleration of each finger touch point in the (N-1)th frame according to the coordinate information of each finger touch point in the previous N-1 frames and the inter-frame time interval, and obtaining each finger's predicted contact in the Nth frame according to the speed, the acceleration, and the first derivative of the acceleration of each finger touch point.
In some embodiments, the multi-finger touch method further includes:
when the moving direction of the cluster center of the continuous group of the Nth frame is within a preset moving direction range, the moving speed is within a preset moving speed range, and the distance from the starting point is within a preset distance range, judging that the continuous group meets a preset condition, and executing a corresponding operation in response to the touch of the continuous group; or when the moving direction of the cluster center of the continuous group of the Nth frame is within a preset moving direction range and the distance from the starting point is within a preset distance range, judging that the continuous group meets a preset condition, and executing a corresponding operation in response to the touch of the continuous group; or when the distance of the cluster center of the continuous group of the Nth frame from the starting point is within a preset distance range, judging that the continuous group meets a preset condition, and executing a corresponding operation in response to the touch of the continuous group.
In some embodiments, the Nth frame is the last frame.
In some embodiments, the multi-finger touch method further includes:
and deleting the discontinuous group when the group with the same group number is judged to be the discontinuous group.
In some embodiments, the multi-finger touch method further includes:
in response to the touch of the multiple groups of finger contacts, outputting the image content displayed on the first touch screen, or preset information associated with that image content, for display on a second touch screen; or
in response to the touch of the multiple groups of finger contacts, outputting the image content displayed on the first touch screen, or preset information associated with that image content, for simultaneous display on the first touch screen and the second touch screen.
In some embodiments, the multi-finger touch method further includes:
when the moving direction, the moving speed and the distance from the starting point of only one group of the clustering centers of the finger contacts meet preset conditions, responding to the touch control of the group of the finger contacts to execute corresponding operation; or
And when the moving direction, the moving speed and the distance from the starting point of the clustering centers of at least two groups of finger contacts all meet preset conditions, responding to the touch control of the multiple groups of finger contacts to execute corresponding operation according to the priority set by each group of finger contacts.
In some embodiments, the multi-finger touch method further includes:
in response to the touch of the multiple groups of finger contacts, outputting the image content displayed on the first touch screen, or preset information associated with that image content, to a second touch screen corresponding to the moving direction of the cluster centers of the continuous groups.
In a second aspect, an embodiment of the present invention provides a multi-finger touch device, where the device includes:
the receiving unit is used for receiving touch information of each frame of finger touch point input by a user on the first touch screen;
the segmentation unit is used for clustering and grouping the finger contacts of each frame to obtain a plurality of groups of finger contacts;
a tracking unit for tracking the movement of each set of finger contacts;
and the execution unit is used for responding to the touch control of the multiple groups of finger contacts to execute corresponding operation when the moving directions, the moving speeds and the distances from the starting points of the multiple groups of finger contacts all meet preset conditions.
In a third aspect, an embodiment of the present invention provides an apparatus for multi-touch, where the apparatus includes a memory and a processor, where,
the memory is used for storing executable program codes;
the processor is configured to read executable program code stored in the memory to perform the multi-touch method of the first aspect.
In a fourth aspect, an embodiment of the present invention provides a computer-readable storage medium comprising instructions which, when executed on a computer, cause the computer to execute the multi-touch method according to the first aspect.
According to the multi-finger touch method, device, equipment, and computer-readable storage medium provided by the embodiments of the invention, the finger touch points in each frame are divided into a plurality of groups, and the motion of each group of finger touch points is then tracked; when the moving direction, moving speed, and distance from the starting point of the groups of finger touch points meet preset conditions, corresponding operations are executed in response to the touch of those groups. The technical scheme provided by the embodiments of the invention therefore obtains correct multi-finger contact trajectories and improves recognition efficiency and accuracy, so that the user's touch operations can be responded to accurately.
Drawings
Fig. 1 is a flowchart illustrating a multi-touch method according to an embodiment of the invention.
FIG. 2 is a diagram illustrating segmentation of finger contacts in a frame, in accordance with an embodiment of the present invention.
Fig. 3 is a flowchart of the multi-finger touch method according to another embodiment of the invention.
Fig. 4 a-4 c are schematic diagrams of tracking the motion of each group of finger contacts according to another embodiment of the present invention.
Fig. 5 is a schematic view illustrating a scene of a screen interaction system according to still another embodiment of the present invention.
Fig. 6 is a schematic diagram of outputting image content or related information on the first touch screen for display on the second touch screen according to another embodiment of the present invention.
Fig. 7 is a schematic structural diagram of a multi-finger touch device according to another embodiment of the invention.
Fig. 8 is a schematic structural diagram of a multi-finger touch device according to still another embodiment of the invention.
Detailed Description
The technical means of the present invention will be described in further detail with reference to specific embodiments. It should be understood that the detailed description and specific examples, while indicating the invention, are intended for purposes of illustration only and are not intended to limit the scope of the invention.
Referring to fig. 1, fig. 1 is a schematic flow chart of a multi-touch method according to an embodiment of the present invention. The method comprises the following steps:
s11: and receiving touch information of each frame of finger touch points.
As is known to those skilled in the art, when a user performs multi-touch on a touch screen, the touch information of the user's finger contacts can be collected by various means, such as the screen itself or sensors attached to it, as in resistive touch screens, capacitive touch screens, electromagnetic-induction touch screens, infrared-frame touch screens, surface-acoustic-wave touch screens, optical touch screens, and the like. Over time, the sensor continues to collect touch input; each collected frame of touch information is sent to the processor, and the processor receives each frame of touch information.
The touch may be a direct touch or an off-screen touch, in which case the touch of the corresponding frame is captured by capturing a gesture operation in front of the screen, for example. The touch screen may comprise, for example, an optical touch screen or the like.
The touch screen may further include, for example, the following: a screen that can directly display images (such as an LCD display) or that does not display images itself (such as a wall or a cloth curtain) and onto which information is projected by a projection device, with an infrared light curtain and an infrared camera arranged on the surface of the screen. When the screen is touched, the light distribution of the infrared light curtain at the touch point changes: for example, part of the infrared light at the touch point is reflected off the screen by the touching finger, or is transmitted through the screen due to the finger's action. The infrared camera captures an infrared image including the touch point, thereby capturing the user's touch input, that is, collecting touch information. Over time, the infrared camera collects multiple frames of touch information; each frame is sent to the processor, and the processor receives each frame of touch information.
The touch screen may further include, for example, the following: a screen that can directly display images (such as an LCD display) or that does not display images itself (such as a wall or a cloth curtain) and onto which information is projected by a projection device, with an infrared light curtain arranged on the surface of the screen and an infrared camera arranged on the surface of, or near, the screen. As above, when the screen is touched, the light distribution of the infrared light curtain at the touch point changes, the infrared camera captures infrared images including the touch point and thereby collects touch information, and each collected frame of touch information is sent to and received by the processor.
In addition to the above, various screen touch technologies and touch screens exist in the prior art, and since these technologies are well known to those skilled in the art, they are not described herein.
S12: and clustering and grouping the finger contacts in each frame to obtain a plurality of groups of finger contacts.
In this step, the processor receives and parses the touch information of the finger touch points in each frame and obtains the coordinate information of each finger touch point in each frame. The coordinate information of each finger contact is then used to cluster the finger contacts in each frame by any one of a DBSCAN clustering algorithm, a K-value clustering algorithm, a hierarchical (system) clustering algorithm, or a minimum-distance clustering algorithm. Conditions can be set appropriately when clustering, such as limiting the maximum number of points in each cluster to fewer than five, or fixing the number of clusters. Since clustering points according to their positions, adjusting coefficients, and so on are well known to those skilled in the art, these algorithms are not explained here.
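The grouping step can be sketched in code. The following is an illustrative minimum-distance (single-linkage) grouping, one of the algorithm families named above, not the patent's actual implementation; the `max_gap` threshold is an assumed value.

```python
import math

def group_contacts(points, max_gap=80.0):
    """Group one frame's contact coordinates (pixels): contacts closer than
    max_gap (directly, or through a chain of contacts) join the same group.
    Minimal single-linkage sketch; the 80.0 px threshold is an assumption."""
    parent = list(range(len(points)))  # union-find forest

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path halving
            i = parent[i]
        return i

    for i in range(len(points)):
        for j in range(i + 1, len(points)):
            if math.dist(points[i], points[j]) <= max_gap:
                parent[find(i)] = find(j)  # merge the two groups

    groups = {}
    for i, p in enumerate(points):
        groups.setdefault(find(i), []).append(p)
    return list(groups.values())

# Three contacts of one hand and two of another, ~500 px apart: two groups.
frame = [(10, 10), (40, 15), (70, 5), (520, 300), (555, 310)]
groups = group_contacts(frame)
```

In practice the threshold would be tuned to finger spacing at the screen's resolution, and a library implementation (e.g. a DBSCAN routine) could replace this sketch.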
Referring to fig. 2, fig. 2 is a schematic diagram illustrating clustering, or segmenting, the finger contacts in one frame according to an embodiment of the present invention. Fig. 2 shows, by way of example, a frame containing 13 finger contacts. In the inventor's experiments, the DBSCAN clustering algorithm achieved the best grouping effect; fig. 2 shows the 13 finger contacts of one frame divided into four groups a, b, c, and d by the DBSCAN clustering algorithm.
It can be understood that when the fingers of several people touch the touch screen, the finger contacts of the different people can be clustered into different groups, so that each person's finger contacts can be tracked synchronously. This avoids the recognition errors and low recognition efficiency caused in the prior art by tracking the whole pile of points in each frame according to the relationship between finger contacts in two adjacent frames.
S13: the motion of each set of finger contacts is tracked.
In this step, each set of finger touch points is first matched, and then each finger touch point in each matched set is tracked, so that the motion of each set of finger touch points can be tracked.
S14: and when the moving direction, the moving speed and the distance from the starting point of the multiple groups of finger contacts meet preset conditions, responding to the touch control of the multiple groups of finger contacts to execute corresponding operation.
In this step, by tracking the motion of each group of finger contacts, when the moving direction, the moving speed, and the distance from the starting point of a certain group of finger contacts satisfy preset conditions, corresponding operations are executed in response to the touch of the group of finger contacts.
For example, if the moving direction, the moving speed, and the distance from the starting point of a group of finger contacts in the current frame all satisfy the preset conditions, corresponding operations are executed in response to the touch of that group. Touch operations corresponding to each satisfied preset condition may be stored in the touch screen device in advance, so as to perform different operations or controls on the image content displayed on the touch screen, for example: moving the image left or right, enlarging, reducing, deleting, marking, writing, turning the page, or adjusting the layout of the image on the touch screen. These are merely examples.
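The condition check described above can be sketched as follows. The threshold values, angle convention, and return semantics are hypothetical illustrations, not values taken from the patent.

```python
def satisfies_preset(direction_deg, speed, distance,
                     dir_range=(-30.0, 30.0),     # allowed heading, degrees
                     speed_range=(50.0, 3000.0),  # pixels per second
                     min_distance=150.0):         # pixels travelled from start
    """Return True when a group's motion meets all three preset conditions:
    moving direction in range, moving speed in range, and distance from the
    starting point at least min_distance. All thresholds are assumptions."""
    return (dir_range[0] <= direction_deg <= dir_range[1]
            and speed_range[0] <= speed <= speed_range[1]
            and distance >= min_distance)

# A roughly horizontal swipe of 200 px at 500 px/s qualifies;
# a vertical swipe (90 degrees) does not.
hit = satisfies_preset(10.0, 500.0, 200.0)
miss = satisfies_preset(90.0, 500.0, 200.0)
```

A real device would keep one such rule per stored gesture and dispatch the associated operation (move, zoom, page turn, and so on) when a rule fires.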
Another example is: when the moving direction, the moving speed and the distance from the starting point of more than two groups of finger contacts meet preset conditions, only the touch control of one group of finger contacts is allowed to be responded, so that the disorder of multi-finger touch control is avoided.
Another example is: when the moving direction, the moving speed and the distance from the starting point of more than two groups of finger contacts meet preset conditions, touch control of any group of finger contacts is not responded, and disorder of multi-finger touch control is avoided.
For another example, when the moving direction, the moving speed, and the distance from the starting point of two or more groups of finger contacts satisfy the preset conditions, priorities may be set for responses of the two or more groups of finger contacts, and corresponding touch operations may be sequentially responded according to the priority order of the different groups of finger contacts.
For another example, when the moving direction, the moving speed, and the distance from the starting point of two or more groups of finger contacts satisfy the preset conditions, priorities may be set for responses of the two or more groups of finger contacts, and when the touch of the finger contact of the high-priority group is responded, the touch of other groups of finger contacts may be ignored, thereby avoiding confusion.
When the multi-finger touch method provided by the embodiment of the invention is utilized, the correct multi-finger contact track can be obtained, and the recognition efficiency and precision are improved, so that the touch operation of a user can be accurately responded.
Referring to fig. 3, fig. 3 is a flowchart illustrating a multi-touch method according to another embodiment of the invention. The method comprises the following steps:
s21: and receiving touch information of each frame of finger touch point input by a user on the first touch screen.
S22: and clustering and grouping the finger contacts in each frame to obtain a plurality of groups of finger contacts.
S23: each group of finger contacts is numbered such that each finger contact is assigned a unique group number and each finger contact has a unique ID number.
In this step, when the finger contacts in each frame are divided into multiple groups, each group of finger contacts is marked so that each group of finger contacts has a unique group number and each finger contact has a unique ID number in each frame. The ID number of each finger contact does not change in different frames, but as the finger moves, the group number to which each finger contact is assigned may change during grouping.
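An illustrative sketch of this numbering step for a first (or newly appearing) frame is shown below; later frames would inherit contact IDs through the tracking step rather than calling this. The function name and numbering scheme are assumptions for illustration.

```python
from itertools import count

_contact_ids = count(1)  # process-wide counter: IDs stay unique across frames

def label_new_groups(groups, first_free_group_number=1):
    """groups: list of lists of (x, y) contacts from the clustering step.
    Returns {group_number: [(contact_id, (x, y)), ...]} where every group
    gets a unique group number and every contact a fresh unique ID."""
    labeled = {}
    for offset, contacts in enumerate(groups):
        group_no = first_free_group_number + offset
        labeled[group_no] = [(next(_contact_ids), xy) for xy in contacts]
    return labeled

frame1 = [[(10, 10), (40, 15)], [(520, 300)]]
labeled = label_new_groups(frame1)
```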
S24: and calculating the cluster center coordinates of each group of finger touch points.
In this step, the cluster center coordinates of each group of finger contacts can be obtained from the coordinate information of the contacts in the group. For example, if the coordinates of the finger contacts in a certain group are (X11, Y11), (X12, Y12), and (X13, Y13), the cluster center coordinates (X1, Y1) of that group are

X1 = (X11 + X12 + X13) / 3,

Y1 = (Y11 + Y12 + Y13) / 3.
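The cluster center formula above (the arithmetic mean of the group's contact coordinates) can be sketched in Python for an arbitrary number of contacts; the patent's example uses three. The function name is chosen here for illustration.

```python
def cluster_center(contacts):
    """Cluster center of one group of finger contacts: the arithmetic
    mean of the contact coordinates, per the formulas above."""
    n = len(contacts)
    return (sum(x for x, _ in contacts) / n,
            sum(y for _, y in contacts) / n)
```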
S25: and for the Nth frame, obtaining the predicted clustering center coordinate of each group of finger touch points in the Nth frame of the clustering center coordinate of each group of finger touch points in the N-1 th frame, comparing the distance between the clustering center coordinate of each group of finger touch points in the Nth frame and the predicted clustering center coordinate, and tracking the group numbers of two groups of finger touch points with the minimum distance in the two groups of finger touch points to be the same, wherein N is an integer more than or equal to 4.
For the 4th frame and subsequent frames, the moving speed and acceleration of the cluster center of each group of finger contacts can be obtained from the cluster center coordinates of the group in the previous frames and the inter-frame time interval. The predicted cluster center coordinates of each group in the next frame can then be obtained from that moving speed and acceleration, and the group in the Nth frame whose cluster center coordinates are at the minimum distance from a predicted cluster center coordinate is assigned the same group number as the corresponding group.
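The velocity-and-acceleration prediction described here can be sketched as a constant-acceleration extrapolation from the three most recent cluster centers. This is a minimal sketch under the assumption of a uniform frame interval `dt`; the function name is hypothetical.

```python
def predict_center(c_nm3, c_nm2, c_nm1, dt):
    """Predict the cluster center in frame N from the centers of the
    three previous frames (N-3, N-2, N-1), assuming constant
    acceleration over the frame interval dt."""
    # velocities between consecutive frames
    v1 = ((c_nm2[0] - c_nm3[0]) / dt, (c_nm2[1] - c_nm3[1]) / dt)
    v2 = ((c_nm1[0] - c_nm2[0]) / dt, (c_nm1[1] - c_nm2[1]) / dt)
    # acceleration from the change in velocity
    a = ((v2[0] - v1[0]) / dt, (v2[1] - v1[1]) / dt)
    # extrapolate one frame ahead: x + v*dt + 0.5*a*dt^2
    return (c_nm1[0] + v2[0] * dt + 0.5 * a[0] * dt * dt,
            c_nm1[1] + v2[1] * dt + 0.5 * a[1] * dt * dt)
```

For a center moving at constant speed the prediction simply continues the straight-line motion; for an accelerating center, the extrapolation bends accordingly.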
Further, in some embodiments, for the third frame, the moving speed of the cluster center of each group of finger contacts can be obtained from the cluster center coordinates of the group in the first two frames and the inter-frame time interval. The predicted cluster center coordinates of each group in the third frame can then be obtained from that moving speed and the inter-frame time interval, and each cluster center in the third frame is compared with the predicted cluster centers and assigned the same group number as the group whose predicted cluster center is at the smallest distance from it.
Further, in some embodiments, for the second frame, the distances between the cluster center coordinates of the groups in the second frame and those in the first frame may be calculated, and the two groups whose cluster center coordinates are at the smallest distance are assigned the same group number. In some embodiments, it may also first be determined whether that minimum distance is within a preset range, and only if so are the two groups at the smallest cluster center distance assigned the same group number.
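The second-frame rule just described is plain nearest-neighbor matching between cluster centers, with an optional maximum-jump check standing in for the preset range. A minimal Python sketch, with hypothetical names:

```python
from math import hypot

def match_groups(prev_centers, cur_centers, max_jump=None):
    """Match each cluster center in the current frame to the nearest
    center in the previous frame.
    prev_centers: {group_number: (x, y)} from the previous frame.
    cur_centers:  list of (x, y) centers in the current frame.
    Returns {index_in_cur_centers: group_number}. If max_jump is set,
    matches farther than max_jump are rejected (the optional
    preset-range check)."""
    assignment = {}
    for i, c in enumerate(cur_centers):
        best, best_d = None, float("inf")
        for gno, p in prev_centers.items():
            d = hypot(c[0] - p[0], c[1] - p[1])
            if d < best_d:
                best, best_d = gno, d
        if max_jump is None or best_d <= max_jump:
            assignment[i] = best
    return assignment
```

As the text below explains, this purely distance-based rule can mismatch fast-moving groups, which is why later frames use predicted centers instead of previous-frame centers.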
The following description refers to the accompanying drawings. Referring to fig. 4a to 4c, these are schematic diagrams illustrating tracking of the motion of each group of finger contacts according to another embodiment of the present invention. The groups of finger contacts in the second to fourth frames are taken as an example; for clarity, the individual contacts within each group are not shown, and each group is indicated only by a circle. In fig. 4a, a(1), b(1), and c(1) represent the positions of the a-th, b-th, and c-th groups of finger contacts in the second frame. In fig. 4b, the dotted portions a(1), b(1), and c(1) represent the positions of the a-th, b-th, and c-th groups in the second frame, and the solid portions a(2), b(2), and c(2) represent their positions in the third frame. In fig. 4c, the dotted portions a(2), b(2), and c(2) represent the positions of the groups in the third frame, the solid portions a(3), b(3), and c(3) represent their positions in the fourth frame, and the dotted portion b0 indicates the predicted cluster center coordinates of the b-th group in the fourth frame. The tracking of the motion of the b-th group of finger contacts is described below in conjunction with figs. 4a to 4c.
For the third or fourth frame, if matching between groups continued to be performed purely according to the minimum distance between the cluster center coordinates of the groups in adjacent frames, mismatches could occur. For example, referring to fig. 4c, for the fourth frame, the distance h between the b-th group b(2) in the third frame and the c-th group c(3) in the fourth frame is the smallest; matching purely by minimum distance between adjacent frames would therefore wrongly match the b-th group b(2) in the third frame to the c-th group c(3) in the fourth frame. Therefore, to improve the tracking accuracy between groups, the following tracking method is adopted for the finger contacts from the third frame onward:
For the frames after the third frame, please refer to fig. 4c. The moving speed and acceleration of the cluster center of the b-th group of finger contacts can be determined from the cluster center coordinates of the b-th group in the first to third frames and the sampling interval. From that moving speed and acceleration, the predicted cluster center coordinates b0 of the b-th group in the fourth frame can be obtained (shown as the dotted portion in fig. 4c). The distance between the cluster center coordinates of each group in the fourth frame and the predicted cluster center coordinates b0 is then calculated; the cluster center of group b(3) in the fourth frame is found to be at the minimum distance from the predicted cluster center b0, so the contacts b(3) in the fourth frame are matched into the b-th group. Mismatching is thereby avoided and the tracking accuracy is improved.
For the third frame, the speed of the cluster center of each group of finger contacts in the second frame is obtained from the cluster center coordinates of the groups in the first two frames and the inter-frame time interval, and the predicted cluster center coordinates in the third frame of each group's cluster center in the second frame are obtained from that speed. The distances between the cluster center coordinates of each group in the third frame and the predicted cluster center coordinates are then compared, and the two groups at the smallest distance are assigned the same group number.
S26: and for the Nth frame, acquiring each finger predicted contact point of each finger contact point in the Nth frame in the Nth-1 frame, comparing the finger contact point in the Nth frame with the finger predicted contact points, and tracking the ID number of the finger contact point with the minimum distance in the two finger predicted contact points to be the same.
In this step, the tracking method for each finger touch point in each group is similar to the tracking method for each group in step S25, and can be understood with reference to step S25, which is not described herein again.
After the tracking of the groups of finger contacts in each frame is completed, the finger contacts within each group are tracked. It is determined whether groups with the same group number in adjacent frames share at least one finger contact with the same ID number. If so, a finger has been touching within the group throughout, so the group is determined to be a continuous group and the touch is tracked; if not, the user may have lifted the fingers entirely off the screen during the touch process, which is described in detail below.
S27: judging whether at least one finger contact with the same ID number exists in groups with the same group number in adjacent frames, if so, judging that the groups with the same group number are continuous groups, and executing step S28; if not, the group with the same group number is judged to be a discontinuous group, and step S29 is executed.
In this step, since the user may lift the finger or leave the touch screen during the finger touch process, if at least one finger contact with the same ID number exists in the group with the same group number in two adjacent frames, the group is determined to be a continuous group; if no finger contact with the same ID number exists in the group with the same group number, the group is a moved group or a newly added group, and the group is judged to be a discontinuous group.
For convenience of understanding, please refer to fig. 2 again. Assume the a-th group in the first frame contains three finger contacts with IDs 1, 2, and 3. If the a-th group in the second frame contains any one, any two, or all three of the IDs 1, 2, and 3, the a-th group is determined to be a continuous group; if the a-th group in the second frame contains none of the IDs 1, 2, and 3, the a-th group is determined to be a discontinuous group.
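The continuity test of step S27 reduces to checking whether the ID sets of the same-numbered group in two adjacent frames intersect. A one-line sketch (hypothetical function name):

```python
def is_continuous(prev_ids, cur_ids):
    """S27 continuity rule: a group with the same group number in two
    adjacent frames is continuous if the frames share at least one
    finger-contact ID number."""
    return bool(set(prev_ids) & set(cur_ids))
```

With the fig. 2 example, a first-frame group {1, 2, 3} stays continuous as long as the second frame retains any of those IDs.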
In some embodiments, when N > 4, for the Nth frame, the speed, the acceleration, and the first derivative of the acceleration of the cluster center of each group of finger contacts are obtained from the cluster center coordinates of the group in the previous N−1 frames and the inter-frame time interval, and the predicted cluster center coordinates in the Nth frame of each group's cluster center are obtained from that speed, acceleration, and first derivative of the acceleration.

Likewise, for the Nth frame, the speed, the acceleration, and the first derivative of the acceleration of each finger contact are obtained from the coordinate information of the contact in the previous N−1 frames and the inter-frame time interval, and the predicted contact point in the Nth frame of each finger contact in the (N−1)th frame is obtained from that speed, acceleration, and first derivative of the acceleration.
After the predicted cluster center coordinates and the predicted finger contact points are obtained, the remaining processing is as described previously and is omitted here. By also considering the first derivative of the acceleration, the predicted cluster center coordinates and the predicted contact points in the next frame can be predicted more accurately.
S28: obtaining the moving direction, the moving speed and the distance from the starting point of the cluster center of the continuous group of the Nth frame according to the cluster center coordinates of the finger contacts in the continuous group with the same group number in the Nth frame and the first frame, judging that the continuous group meets a preset condition when the moving direction of the cluster center of the continuous group is within a preset moving direction range, the moving speed is within a preset moving speed range and the distance from the starting point is within a preset distance range, and responding to the touch control of the continuous group to execute corresponding operation; or when the moving direction of the clustering center of the continuous group is within a preset moving direction range and the distance from the initial point is within a preset distance range, judging that the continuous group meets a preset condition, and responding to the touch of the continuous group to execute corresponding operation; or when the distance between the clustering center of the continuous group and the starting point is in a preset distance range, judging that the continuous group meets a preset condition, and responding to the touch control of the continuous group to execute corresponding operation.
In this step, assume the coordinates of the finger contacts in a certain continuous group of the Nth frame are (X11, Y11), (X12, Y12), and (X13, Y13); the cluster center coordinates (X1, Y1) of the continuous group are then

X1 = (X11 + X12 + X13) / 3,

Y1 = (Y11 + Y12 + Y13) / 3.

Assume the coordinates of the finger contacts in the group with the same group number as the continuous group in the first frame are (X01, Y01), (X02, Y02), and (X03, Y03); the cluster center coordinates (X0, Y0) of that group are

X0 = (X01 + X02 + X03) / 3,

Y0 = (Y01 + Y02 + Y03) / 3.

The displacement from the cluster center of the same-numbered group in the first frame to the cluster center of the continuous group in the Nth frame is ΔX = X1 − X0, ΔY = Y1 − Y0. The moving direction of the Nth frame relative to the first frame is L = ΔX/ΔY, and the distance from the starting point is

R = √(ΔX² + ΔY²).
The moving speed is V = ΔY/t, where t is the time interval from the acquisition of the first frame to the acquisition of the Nth frame. When the moving direction L of the cluster center of a continuous group is within the preset moving direction range, the distance R from the starting point is within the preset distance range, and the moving speed V is within the preset moving speed range, that continuous group is judged to satisfy the preset conditions.
In this embodiment, the speed is calculated from the difference between the ordinates of the cluster center of the continuous group in the Nth frame and in the first frame. In other embodiments, the difference between the abscissas may be used instead, or the speed may be calculated from the distance between the cluster centers of the continuous group in the Nth frame and in the first frame.
In other embodiments of the present invention, the Nth frame is the last frame; that is, when the user's touch operation ends, it is determined whether a continuous group in the last frame satisfies the preset conditions, and if so, the corresponding operation is executed in response to the finger touches in that continuous group.
In this embodiment, in response to the finger touches in the continuous group to perform corresponding operations, a possible implementation manner is to output image content displayed on the first touch screen for display on the second touch screen in response to the continuous group touches, or output preset information associated with the display content on the first touch screen for display on the second touch screen, where the associated information may be content such as text, picture, table, video, web page link, and the like. Specifically, the processor may send image content displayed on the first touch screen or call pre-stored associated information to the multi-screen stitching processor, the multi-screen stitching processor sends the image signal content or the associated information to the second touch screen, and only the second touch screen displays the image content or the associated information; or the multi-screen splicing processor sends the image signal content or the associated information to the first touch screen and the second touch screen, and the first touch screen and the second touch screen simultaneously display the image content or the associated information.
In the description of the above embodiments, for ease of understanding, a multi-screen stitching processor is introduced to illustrate: and the multi-screen splicing processor sends the image content or the associated information on the first touch screen to the second touch screen or the first touch screen and the second touch screen. However, it will be understood by those skilled in the art that the functions of the multi-screen splicing processor can also be integrated into a processor, and the processor implements the related functions of the multi-screen splicing processor.
In other embodiments of the present invention, the corresponding operation may further be executed according to the moving direction of the cluster center of a continuous group, with control information corresponding to each moving direction stored in the processor in advance. For example, assume the upper left corner of the touch screen is the coordinate origin, rightward is the positive X direction, leftward the negative X direction, upward the positive Y direction, and downward the negative Y direction, and let K be the angle of the moving direction of the cluster center of the continuous group relative to the X axis. When K is greater than 0° and less than or equal to 45°, the finger slides upward, and the image content or associated information of the first touch screen is displayed on a second touch screen above the first touch screen; when K is greater than 45° and less than 135°, the finger slides leftward, and the image content or associated information is displayed on a second touch screen to the left of the first touch screen; when K is greater than or equal to 135° and less than or equal to 225°, the finger slides downward, and the image content or associated information is displayed on a second touch screen below the first touch screen; and when K is greater than or equal to 225° and less than or equal to 315°, the finger slides rightward, and the image content or associated information is displayed on a second touch screen to the right of the first touch screen.
It should be noted that the above merely exemplifies triggering, according to the moving direction of the cluster center of a continuous group, the display of the image content or associated information of the first touch screen on the second touch screen corresponding to that direction. In practical applications, which direction's second touch screen displays the image content or associated information of the first touch screen can be set in a user-defined manner.
In other embodiments of the present invention, the direction in which the image content or associated information of the first touch screen is displayed on a second touch screen may be determined from the displacement ΔX of the abscissa of the cluster center of a continuous group in the Nth frame relative to that of the corresponding group in the first frame, or from the displacement ΔY of the ordinate. For example: when ΔX > 0, the finger slides rightward, and the image content or associated information of the first touch screen is displayed on a second touch screen to its right; when ΔX < 0, the finger slides leftward, and it is displayed on a second touch screen to its left; when ΔY > 0, the finger slides upward, and it is displayed on a second touch screen above it; and when ΔY < 0, the finger slides downward, and it is displayed on a second touch screen below it.
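The sign rules above can be sketched as a small dispatch function. The patent uses either ΔX or ΔY alone; picking the dominant axis when both are supplied is an assumption made here so the sketch handles diagonal motion, and the function name is hypothetical.

```python
def target_screen(dx=0.0, dy=0.0):
    """Map the displacement of a continuous group's cluster center to
    the neighbouring second screen, following the sign rules above
    (convention assumed here: +X rightward, +Y upward).
    When both dx and dy are given, the dominant axis decides -- an
    assumption beyond the patent text, which uses one axis at a time."""
    if abs(dx) >= abs(dy):
        return "right" if dx > 0 else "left"
    return "above" if dy > 0 else "below"
```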
S29: the non-contiguous group is deleted.
In this step, since the user may lift all fingers off the first touch screen during the touch process, the group in which the finger contacts were located is determined to be a discontinuous group in that case, and the discontinuous group is deleted.
In other embodiments of the present invention, when a certain group is determined to be a non-contiguous group, the non-contiguous group may not be processed.
With the multi-finger touch method provided by this embodiment of the invention, the correct multi-finger contact trajectories can be obtained and the recognition efficiency and accuracy are improved, so that the user's touch operations can be responded to accurately.
One possible application scenario of the technical solution provided by the embodiments of the invention is multi-screen linkage. Multi-screen linkage uses two or more high-definition touch display systems as the hardware platform of an interactive multi-screen display system: one of the displays can be selected as a master control screen, and the other screens are controlled through linkage with the master screen, presenting a richer display mode to the user. The display content is divided into multiple objects, such as text, pictures, and videos, bringing the user a brand-new interactive display form. However, the existing multi-screen linkage technology can only control the other screens through touch on the master screen and cannot realize interaction among the other screens. Realizing interaction among the other screens and enriching the image display modes is therefore an urgent problem.
An application scenario when the multi-finger touch method provided by the embodiment of the invention is applied to a multi-screen linkage system is specifically described below with reference to fig. 5.
Referring to fig. 5, fig. 5 is a schematic view of a screen interaction system according to another embodiment of the present invention. The screen interaction system includes a first touch screen m1, a second touch screen m2, infrared lasers g1 and g2, infrared cameras b1 and b2, a processor s, and a multi-screen stitching processor. The first touch screen m1 and the second touch screen m2 are used for displaying images; they can be stitched together by the multi-screen stitching processor or arranged separately in space. The infrared lasers g1 and g2 are used to form a light curtain parallel to the first touch screen m1 and the second touch screen m2; their number can equal the number of touch screens, here 2, with g1 and g2 arranged above the first touch screen m1 and the second touch screen m2, respectively. The infrared cameras b1 and b2 are used to capture each frame image of the user's finger contacts on the first touch screen m1; the first touch screen m1 and the second touch screen m2 are within the viewing angle of the infrared cameras b1 and b2, and the number of cameras can likewise equal the number of touch screens, here 2. When the infrared lasers g1 and g2 are turned on, a laser plane of uniform thickness is formed parallel to the surfaces of the first touch screen m1 and the second touch screen m2. When the user touches the first touch screen m1, an optical signal of the finger contact is formed on the first touch screen m1; the infrared camera b1 captures each frame image of the finger contacts on the first touch screen m1 and transmits the captured frames to the processor s.
The processor s divides the finger contacts in each frame image into multiple groups and tracks the motion of each group separately. When the moving direction, the moving speed, and the distance from the starting point of a certain group of finger contacts satisfy the preset conditions, the processor s executes the corresponding operation in response to the touch of that group, and the multi-screen stitching processor outputs the image content on the first touch screen m1, or preset information associated with it, for display on the second touch screen m2 or for simultaneous display on the first touch screen m1 and the second touch screen m2.
Referring to fig. 6, fig. 6 is a schematic diagram illustrating outputting image content or associated information of a first touch screen for display on a second touch screen according to another embodiment of the present invention. When the user performs a multi-finger touch on the first touch screen m1 and the moving directions, moving speeds, and distances from the starting points of the groups of finger contacts satisfy the preset conditions, the image content on the first touch screen m1 (or preset information associated with it) is displayed on the second touch screen m2.
With the multi-finger touch method provided by this embodiment of the invention, the correct multi-finger contact trajectories can be obtained and the recognition efficiency and accuracy are improved, so that the user's touch operations can be responded to accurately. At the same time, interaction among multiple screens in the multi-screen linkage technology can be realized, making the image display modes richer.
In order to better understand the multi-finger touch method provided in an embodiment of the present invention, another embodiment of the present invention further provides a multi-finger touch device. The meaning of the noun is the same as that of the multi-finger touch method, and specific implementation details can refer to the description in the method embodiment.
Referring to fig. 7, fig. 7 is a schematic structural diagram of a multi-finger touch device according to another embodiment of the present invention. The apparatus comprises an acquisition unit 31, a segmentation unit 32, a tracking unit 33 and an execution unit 34.
The acquisition unit 31 is configured to receive touch information of each frame of finger contacts input by a user on the first touch screen; the segmentation unit 32 is configured to cluster and group the finger contacts in each frame to obtain multiple groups of finger contacts; the tracking unit 33 is configured to track the movement of each group of finger contacts; and the execution unit 34 is configured to execute the corresponding operation in response to the touch of the groups of finger contacts when their moving directions, moving speeds, and distances from the starting points satisfy all of the preset conditions, or any one or any two of them.
In a possible implementation manner, the segmentation unit 32 is configured to obtain coordinate information of the finger contacts in each frame, and perform clustering and grouping on the finger contacts in each frame by using any one of a DBSCAN clustering algorithm, a K value clustering algorithm, a system clustering algorithm, or a minimum distance clustering algorithm according to the coordinate information to obtain multiple groups of finger contacts.
In a possible embodiment, the tracking unit 33 is configured to number the groups of finger contacts obtained by the segmentation unit 32, so that each finger contact is assigned a unique group number and has a unique ID number; calculate the cluster center coordinates of each group of finger contacts; for the Nth frame (N being an integer greater than or equal to 4), obtain the predicted cluster center coordinates in the Nth frame of each group's cluster center in the (N−1)th frame, compare the distances between the cluster center coordinates of each group in the Nth frame and the predicted cluster center coordinates, and assign the same group number to the two groups at the smallest distance; for the Nth frame, obtain the predicted contact point in the Nth frame of each finger contact in the (N−1)th frame, compare the finger contacts in the Nth frame with the predicted contact points, and assign the same ID number to the finger contact and predicted contact point at the smallest distance; determine whether groups with the same group number in adjacent frames share at least one finger contact with the same ID number, judging them a continuous group if so and a discontinuous group if not; and determine whether the moving direction, speed, and distance of the cluster center of a continuous group satisfy the preset conditions, and if so, execute the corresponding operation in response to the touch of the continuous group.
In other embodiments, the execution unit 34 is configured to output, in response to the touch of the multiple sets of finger contacts, the image content displayed on the first touch screen or the associated information with the display content of the first touch screen for display on the second touch screen.
Specifically, when the moving direction, the moving speed, and the distance from the starting point of only one group of the clustering centers all satisfy the preset conditions, the execution unit 34 executes the corresponding operation in response to the touch of the group of finger contacts; or, when the moving direction, the moving speed, and the distance from the starting point of at least two groups of finger contacts all satisfy the preset conditions, the execution unit 34 responds to the touch control of the multiple groups of finger contacts to execute corresponding operations according to the priority set by each group of finger contacts; alternatively, when the moving direction, the moving speed, and the distance from the starting point of at least two groups of cluster centers all satisfy the preset conditions, the execution unit 34 does not respond to the preset conditions, so as to avoid touch confusion.
The multi-finger touch device provided by the embodiment of the invention belongs to the same concept as the multi-finger touch method, any one of the methods provided in the multi-finger touch method embodiment can be operated on the multi-finger touch device, and the specific implementation process is described in the multi-finger touch method embodiment, and is not described herein again.
In the multi-finger touch device provided by the embodiment of the invention, the functional units may all be integrated in one processing chip, each unit may exist alone physically, or two or more units may be integrated in one module. The integrated module may be implemented in the form of hardware or in the form of a software functional module. If implemented as a software functional module and sold or used as a stand-alone product, the integrated unit may also be stored in a computer-readable storage medium, such as a read-only memory, a magnetic disk, or an optical disk.
With the multi-finger touch device provided by the embodiment of the invention, accurate multi-finger contact trajectories can be obtained, and recognition efficiency and precision are improved, so that a user's touch operations can be responded to accurately.
Referring to FIG. 8, FIG. 8 is a schematic structural diagram of a multi-finger touch device according to another embodiment of the present invention. At least a portion of the multi-finger touch method described above in connection with FIG. 1 may be implemented by a multi-finger touch device 400, the device 400 including a processor 41, a memory 42, and a bus 43.
In some instances, the multi-finger touch device 400 may also include an input device 401, an input port 402, an output port 403, and an output device 404. The input port 402, the processor 41, the memory 42, and the output port 403 are connected to one another via the bus 43, while the input device 401 and the output device 404 are connected to the bus 43 via the input port 402 and the output port 403, respectively, and thereby to the other components of the device 400. It should be noted that the output port and the input port may also be represented by I/O interfaces. Specifically, the input device 401 receives input information from the outside and transmits it to the processor 41 through the input port 402; the processor 41 processes the input information based on computer-executable instructions stored in the memory 42 to generate output information, stores the output information temporarily or permanently in the memory 42, and then transmits it to the output device 404 through the output port 403; the output device 404 outputs the output information to the outside of the device.
The memory 42 includes mass storage for data or instructions. By way of example, and not limitation, memory 42 may include an HDD, a floppy disk drive, flash memory, an optical disk, a magneto-optical disk, magnetic tape, or a Universal Serial Bus (USB) drive or a combination of two or more of these. Memory 42 may include removable or non-removable (or fixed) media, where appropriate. The memory 42 may be internal or external to the device, where appropriate. In a particular embodiment, the memory 42 is a non-volatile solid-state memory. In particular embodiments, memory 42 includes Read Only Memory (ROM). Where appropriate, the ROM may be mask-programmed ROM, Programmable ROM (PROM), Erasable PROM (EPROM), Electrically Erasable PROM (EEPROM), electrically rewritable ROM (EAROM), or flash memory or a combination of two or more of these.
Bus 43 comprises hardware, software, or both to couple the components of device 400 to one another. By way of example, and not limitation, bus 43 may include an Accelerated Graphics Port (AGP) or other graphics bus, an Enhanced Industry Standard Architecture (EISA) bus, a Front Side Bus (FSB), a HyperTransport (HT) interconnect, an Industry Standard Architecture (ISA) bus, an InfiniBand interconnect, a Low Pin Count (LPC) bus, a memory bus, a Micro Channel Architecture (MCA) bus, a Peripheral Component Interconnect (PCI) bus, a PCI Express (PCIe) bus, a Serial Advanced Technology Attachment (SATA) bus, a Video Electronics Standards Association local bus (VLB), another suitable bus, or a combination of two or more of these. The bus 43 may comprise one or more buses, where appropriate. Although specific buses have been described and shown in the embodiments of the invention, any suitable buses or interconnects are contemplated by the invention.
When the multi-finger touch method described in connection with FIG. 1 is implemented by the device 400 shown in FIG. 8, the input device 401 receives touch information for each frame of finger contacts input by the user on the first touch screen. In particular embodiments, the I/O interface connected to the output device may comprise hardware, software, or both, providing one or more interfaces for communication between the device 400 and one or more I/O devices. One or more of these I/O devices may allow communication between a person and the device 400; by way of example, and not limitation, an I/O device may comprise a camera. Where appropriate, the I/O interface may comprise one or more devices or software drivers enabling the processor 41 to drive one or more of these I/O devices. Based on the computer-executable instructions stored in the memory 42, the processor 41 divides the finger contacts in each frame to obtain multiple groups of finger contacts, tracks the motion of each group of finger contacts, and, when the moving direction, the moving speed, and the distance from the starting point of the groups of finger contacts satisfy the preset conditions, responds to the touch of the multiple groups of finger contacts by executing the corresponding operations.
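The division of each frame's contacts into groups can be done by any of the clustering algorithms the method permits. Below is a minimal single-link, minimum-distance sketch; the threshold `max_gap` and the function name are illustrative assumptions, and DBSCAN, K-means, or hierarchical clustering could be substituted.

```python
import math

def group_contacts(points, max_gap=80.0):
    """Group one frame's finger contacts: two contacts belong to the
    same group if a chain of contacts, each at most max_gap apart,
    connects them (single-link minimum-distance clustering)."""
    groups = []  # list of groups; each group is a list of (x, y) points
    for p in points:
        # find every existing group the new point links to
        linked = [g for g in groups
                  if any(math.hypot(p[0] - q[0], p[1] - q[1]) <= max_gap
                         for q in g)]
        merged = [p]
        for g in linked:  # merge all linked groups with the new point
            merged.extend(g)
            groups.remove(g)
        groups.append(merged)
    return groups
```

Four contacts at (0, 0), (10, 0), (500, 500), (505, 505) thus form two groups of two, roughly one group per hand.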
According to some embodiments, a computer-readable storage medium is provided that may include instructions that, when executed on a computer, may cause the computer to perform the multi-touch method described above.
Therefore, by executing the instructions in the computer-readable storage medium, a computer receives touch information of each frame of finger contacts input by a user on the first touch screen, clusters and groups the finger contacts in each frame to obtain multiple groups of finger contacts, tracks the motion of each group of finger contacts, and, when the moving direction, the moving speed, and the distance from the starting point of the multiple groups satisfy preset conditions, responds to the touch of the multiple groups of finger contacts by executing the corresponding operation. An accurate multi-finger contact trajectory can thus be obtained, and recognition efficiency and precision are improved, so that the user's touch operations are responded to accurately.
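The final step, deciding which neighbouring screen receives the content, reduces to classifying the continuous group's displacement. The sketch below combines the claimed one-axis rules (X > 0 right, X < 0 left, Y > 0 up, Y < 0 down, with screen Y growing upward) by taking the dominant axis; that combination, and the function name, are assumptions for illustration.

```python
def swipe_direction(start, end):
    """Classify a continuous group's swipe from the displacement of its
    cluster center between the first and the Nth frame. The dominant
    displacement component decides the direction (a simplification of
    the separate X-only and Y-only rules)."""
    dx, dy = end[0] - start[0], end[1] - start[1]
    if abs(dx) >= abs(dy):
        return "right" if dx > 0 else "left"
    return "up" if dy > 0 else "down"
```

A cluster center moving from (0, 0) to (5, 1) is classified as a rightward swipe, so the content would be sent to the second touch screen on the right of the first.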
In some examples, a computer program product containing instructions is provided which, when run on a computer, causes the computer to perform the multi-touch method described above.
In some examples, a computer program is provided which, when run on a computer, causes the computer to perform the multi-touch method described above.
In the above examples, the implementation may be in whole or in part by software, hardware, firmware, or any combination thereof. When implemented in software, the embodiments may be realized in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the processes or functions described in accordance with the embodiments of the invention are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable device. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, they may be transmitted from one website, computer, server, or data center to another by wired means (e.g., coaxial cable, optical fiber, Digital Subscriber Line (DSL)) or wireless means (e.g., infrared, radio, microwave). The computer-readable storage medium can be any available medium that can be accessed by a computer, or a data storage device such as a server or data center that integrates one or more available media. The available medium may be a magnetic medium (e.g., floppy disk, hard disk, magnetic tape), an optical medium (e.g., DVD), or a semiconductor medium (e.g., a Solid State Disk (SSD)), among others.
While the invention has been described with reference to specific embodiments, the invention is not limited thereto, and various equivalent modifications and substitutions can be easily made by those skilled in the art within the technical scope of the invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (9)

1. A multi-finger touch method is characterized by comprising the following steps:
receiving touch information of each frame of finger touch point input by a user on a first touch screen;
clustering and grouping the finger contacts in each frame to obtain a plurality of groups of finger contacts;
tracking the motion of each set of finger contacts;
the tracking of the movement of each group of finger contacts includes numbering each group of finger contacts, such that each group of finger contacts is assigned a unique group number and each finger contact has a unique ID number;
calculating the clustering center coordinate of each group of finger contacts;
for the Nth frame, obtaining, from the cluster center coordinates of each group of finger contacts in the (N-1)th frame, the predicted cluster center coordinates of each group in the Nth frame, comparing the distances between the cluster center coordinates of each group of finger contacts in the Nth frame and the predicted cluster center coordinates, and tracking the two groups of finger contacts with the smallest distance as having the same group number, wherein N is an integer greater than or equal to 4, and each group's predicted cluster center coordinates are determined according to the cluster center coordinates of the finger contacts in the first to third frames, the sampling interval, and the moving speed and acceleration of the cluster center of the finger contacts;
for the Nth frame, obtaining, for each finger contact in the (N-1)th frame, the predicted contact of that finger in the Nth frame, comparing the finger contacts in the Nth frame with the predicted contacts, and tracking the finger contact and predicted contact with the smallest distance as having the same ID number;
when the moving direction, the moving speed and the distance from the starting point of the multiple groups of finger contacts meet preset conditions, responding to the touch control of the multiple groups of finger contacts to execute corresponding operations, wherein the corresponding operations comprise:
in response to groups with the same group number in adjacent frames containing at least one finger contact with the same ID number, judging the groups with the same group number to be a continuous group; taking the upper left corner of the touch screen as the coordinate origin, the rightward direction as the positive direction of the X axis, the leftward direction as the negative direction of the X axis, the upward direction as the positive direction of the Y axis, and the downward direction as the negative direction of the Y axis, and taking as K the direction angle, relative to the positive X axis, in which the cluster center of the fingers of the continuous group moves from its coordinates in the first frame to its coordinates in the Nth frame: when 45 degrees < K ≤ 135 degrees, the fingers slide upwards, and a first instruction is sent for displaying the image content or associated information in the first touch screen on a second touch screen located above the first touch screen; when 135 degrees < K ≤ 225 degrees, the fingers slide leftwards, and a second instruction is sent for displaying the image content or associated information in the first touch screen on a second touch screen located on the left side of the first touch screen; when 225 degrees < K ≤ 315 degrees, the fingers slide downwards, and a third instruction is sent for displaying the image content or associated information in the first touch screen on a second touch screen located below the first touch screen; when 315 degrees < K ≤ 360 degrees or 0 degrees ≤ K ≤ 45 degrees, the fingers slide rightwards, and a fourth instruction is sent for displaying the image content or associated information in the first touch screen on a second touch screen located on the right side of the first touch screen;
or, determining on which side of the first touch screen the second touch screen that displays the image content or associated information is located according to the displacement X of the abscissa of the continuous group's cluster center in the Nth frame relative to the abscissa of the corresponding group's cluster center in the first frame, or according to the displacement Y of the ordinate of the continuous group's cluster center in the Nth frame relative to the ordinate of the corresponding group's cluster center in the first frame, including: when X > 0, the fingers slide rightwards, and a fifth instruction is sent for displaying the image content or associated information on the first touch screen on a second touch screen on the right side of the first touch screen; when X < 0, the fingers slide leftwards, and a sixth instruction is sent for displaying the image content or associated information on the first touch screen on a second touch screen on the left side of the first touch screen; when Y > 0, the fingers slide upwards, and a seventh instruction is sent for displaying the image content or associated information on the first touch screen on a second touch screen above the first touch screen; when Y < 0, the fingers slide downwards, and an eighth instruction is sent for displaying the image content or associated information on the first touch screen on a second touch screen below the first touch screen.
2. The multi-finger touch method according to claim 1, comprising:
acquiring coordinate information of the finger contacts in each frame, and clustering and grouping the finger contacts in each frame according to the coordinate information by using any one of a DBSCAN clustering algorithm, a K-means clustering algorithm, a hierarchical clustering algorithm, or a minimum-distance clustering algorithm, to obtain multiple groups of finger contacts.
3. The multi-finger touch method according to claim 1, comprising:
for the second frame, calculating the distance between the clustering center coordinates in the first frame and the second frame, and tracking the group numbers of the two groups of finger contacts with the smallest distance between the clustering center coordinates to be the same;
for the second frame, calculating the distance between the finger contacts in the first frame and the second frame, and tracking the ID numbers of the two finger contacts with the minimum distance between the finger contacts to be the same;
for the third frame, obtaining the speed of the cluster center of each group of finger contacts in the second frame according to the cluster center coordinates of each group in the first two frames and the inter-frame time interval, and obtaining, from the cluster center coordinates of each group in the second frame and the speed of each cluster center, the predicted cluster center coordinates of each group in the third frame;
and for the third frame, obtaining the speed of each finger contact in the second frame according to the coordinate information of each finger contact in the first two frames and the inter-frame time interval, and obtaining, from the speed of each finger contact, the predicted contact of each finger in the third frame.
4. The multi-finger touch method according to any one of claims 1-3, wherein, for the Nth frame, the speed and the acceleration of the cluster center of each group of finger contacts in the (N-1)th frame are obtained according to the cluster center coordinates of each group in the previous N-1 frames and the inter-frame time interval, and the predicted cluster center coordinates of each group in the Nth frame are obtained according to the speed and the acceleration of each cluster center;
and for the Nth frame, the speed and the acceleration of each finger contact in the (N-1)th frame are obtained according to the coordinate information of each finger contact in the previous N-1 frames and the inter-frame time interval, and the predicted contact of each finger in the Nth frame is obtained according to the speed and the acceleration of each finger contact.
5. The multi-finger touch method according to any one of claims 1-3, wherein when N > 4, for the Nth frame, the speed, the acceleration, and the first derivative of the acceleration of the cluster center of each group of finger contacts in the (N-1)th frame are obtained according to the cluster center coordinates of each group in the previous N-1 frames and the inter-frame time interval, and the predicted cluster center coordinates of each group in the Nth frame are obtained according to the speed, the acceleration, and the first derivative of the acceleration of each cluster center;
and for the Nth frame, the speed, the acceleration, and the first derivative of the acceleration of each finger contact in the (N-1)th frame are obtained according to the coordinate information of each finger contact in the previous N-1 frames and the inter-frame time interval, and the predicted contact of each finger in the Nth frame is obtained from the (N-1)th frame according to the speed, the acceleration, and the first derivative of the acceleration of each finger contact.
6. The multi-finger touch method according to any one of claims 1-3, wherein when the moving direction of the cluster center of the continuous group in the Nth frame is within a preset moving direction range, the moving speed is within a preset moving speed range, and the distance from the starting point is within a preset distance range, it is determined that the continuous group satisfies the preset condition, and a corresponding operation is performed in response to the touch of the continuous group; or when the moving direction of the cluster center of the continuous group in the Nth frame is within a preset moving direction range and the distance from the starting point is within a preset distance range, it is determined that the continuous group satisfies the preset condition, and a corresponding operation is performed in response to the touch of the continuous group; or when the distance between the cluster center of the continuous group in the Nth frame and the starting point is within a preset distance range, it is determined that the continuous group satisfies the preset condition, and a corresponding operation is performed in response to the touch of the continuous group.
7. The multi-finger touch method according to any one of claims 1-3, wherein the Nth frame is the last frame.
8. The multi-finger touch method according to any one of claims 1-3, wherein when the groups having the same group number are determined to be discontinuous groups, the discontinuous groups are deleted.
9. The multi-finger touch method according to any one of claims 1-3, wherein, in response to the touch of the multiple groups of finger contacts, the image content displayed on the first touch screen, or preset information associated with the image content displayed on the first touch screen, is output for display on the second touch screen; or
in response to the touch of the multiple groups of finger contacts, the image content displayed on the first touch screen, or preset information associated with the image content displayed on the first touch screen, is output for simultaneous display on the first touch screen and the second touch screen.
CN201710936886.4A 2017-10-10 2017-10-10 Multi-finger touch method, device, equipment and computer readable storage medium Active CN109656457B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710936886.4A CN109656457B (en) 2017-10-10 2017-10-10 Multi-finger touch method, device, equipment and computer readable storage medium

Publications (2)

Publication Number Publication Date
CN109656457A CN109656457A (en) 2019-04-19
CN109656457B true CN109656457B (en) 2021-10-29

Family

ID=66108811

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710936886.4A Active CN109656457B (en) 2017-10-10 2017-10-10 Multi-finger touch method, device, equipment and computer readable storage medium

Country Status (1)

Country Link
CN (1) CN109656457B (en)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110212907B (en) * 2019-06-13 2023-06-02 深圳秋田微电子股份有限公司 Knob control method, capacitive knob and electrical equipment
CN110309023B (en) * 2019-06-20 2022-11-15 佛吉亚歌乐电子(丰城)有限公司 Off-line detection method for touch screen of vehicle machine
KR102340281B1 (en) 2020-02-19 2021-12-17 주식회사 픽셀스코프 Method ahd device for motion recognizing with lider sensor
CN113434068A (en) * 2021-05-28 2021-09-24 北京信和时代科技有限公司 Control method and device for suspension shortcut menu, electronic equipment and storage medium
WO2023272639A1 (en) * 2021-06-30 2023-01-05 东莞市小精灵教育软件有限公司 Stable finger frame detection method and computer-readable storage medium
CN113485579A (en) * 2021-06-30 2021-10-08 东莞市小精灵教育软件有限公司 Finger stable frame detection method and device and computer readable storage medium
CN114253417B (en) * 2021-12-02 2024-02-02 Tcl华星光电技术有限公司 Multi-touch point identification method and device, computer readable medium and electronic equipment
TWI824723B (en) * 2022-09-16 2023-12-01 大陸商北京集創北方科技股份有限公司 Finger tracking correction method, electronic chip and information processing device

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102622120A (en) * 2011-01-31 2012-08-01 宸鸿光电科技股份有限公司 Touch path tracking method of multi-point touch control panel
CN103389799A (en) * 2013-07-24 2013-11-13 清华大学深圳研究生院 Method for tracking motion trail of fingertip
CN103885707A (en) * 2014-02-27 2014-06-25 四川长虹电器股份有限公司 Multi-touch technology based human-computer interaction method and remote controller
CN106527917A (en) * 2016-09-23 2017-03-22 北京仁光科技有限公司 Multi-finger touch operation identification method for screen interactive system


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant