CN110162257A - Multi-contact touch method, device, equipment and computer-readable storage medium - Google Patents

Multi-contact touch method, device, equipment and computer-readable storage medium

Info

Publication number
CN110162257A
Authority
CN
China
Prior art keywords
screen
touch
touch information
sub
point
Prior art date
Legal status
Pending
Application number
CN201810150615.0A
Other languages
Chinese (zh)
Inventor
谭登峰
Other inventors have requested that their names not be disclosed
Current Assignee
Beijing Zen-Ai Technology Co Ltd
Original Assignee
Beijing Zen-Ai Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Zen-Ai Technology Co Ltd filed Critical Beijing Zen-Ai Technology Co Ltd
Priority to CN201810150615.0A priority Critical patent/CN110162257A/en
Publication of CN110162257A publication Critical patent/CN110162257A/en


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F 2203/048 Indexing scheme relating to G06F3/048
    • G06F 2203/04808 Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously; e.g. using several fingers or a combination of fingers and pen

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention discloses a multi-contact touch method, wherein the method includes: receiving touch information from a screen; determining the number of point sets of the touch points according to the touch information; if the number of point sets of the touch points is equal to 1, performing a first operation according to the touch information; and if the number of touch points is greater than or equal to 3 and the number of point sets of the touch points is 2, performing a second operation according to the touch information. The invention alleviates the cumbersome operation of the prior art: complex operations can be triggered directly from the touch information of multiple finger contacts, improving the efficiency and convenience of operation.

Description

Multi-contact touch method, device, equipment and computer readable storage medium
Technical Field
The invention relates to the technical field of intelligent interaction, in particular to a multi-contact touch method, a multi-contact touch device, multi-contact touch equipment, a multi-contact touch system and a computer-readable storage medium.
Background
With the advent of the information age, human-information interaction technology plays an increasingly important role. Among such technologies, the large-screen touch linkage display is one of the most modern visualization tools available today and, thanks to its intuitiveness, is widely applied in many fields.
As shown in fig. 1, which is a schematic structural diagram of a prior-art large-screen touch-control linkage display system, the system includes: a server 101, at least one computer (three exemplary computers 102, 103, 104 are illustrated in fig. 1), a multi-screen stitching processor 105, and a screen (the screen may include at least one sub-screen; three exemplary sub-screens 106, 107, 108 are illustrated in fig. 1). The multi-screen stitching processor 105 is connected to the server 101, to the computers 102, 103, 104, and to the sub-screens 106, 107, 108, and the server 101 is connected to the computers 102, 103, 104. The multi-screen stitching processor 105 is configured to receive the content of the computers 102, 103, 104 and transmit it to the sub-screens 106, 107, 108 for display. The server 101 is configured to receive touch information of finger touch points collected by sensors (such as the cameras in fig. 1) attached to the sub-screens 106, 107, 108, to process the touch information to obtain a corresponding touch instruction, and to send the touch instruction to the corresponding computer so that the computer executes the corresponding operation.
As is well known, existing touch control only responds to the touch information of a single finger touch point or of two finger touch points; for example, a file is moved or the screen is zoomed according to the movement track of a single touch point or the relative movement distance of two touch points. Richer gesture operations are implemented either by predefining a complicated specific movement track (for example, a 'W' shape) for a specific operation instruction, or through multiple steps, such as first inputting 'W' to put the screen into a layout-adjustment mode and then adjusting the layout of the display windows on the screen by continuously recognizing the movement direction of the gesture. That is, the operations that fingers can perform in the prior art are limited, or inconvenient and inefficient.
Disclosure of Invention
In view of the foregoing problems in the prior art, embodiments of the present invention provide a touch method, device, apparatus, and computer-readable storage medium for point-set gestures.
A first aspect of the present invention provides a multi-touch method, wherein the method comprises:
receiving touch information from a screen;
determining the number of point sets of the touch points according to the touch information; if the number of point sets of the touch points is determined to be equal to 1, performing a first operation according to the touch information; and if the number of touch points is determined to be greater than or equal to 3 and the number of point sets of the touch points is determined to be 2, performing a second operation according to the touch information.
In some embodiments, determining the number of point sets of the touch points comprises: determining whether the distance between the two farthest-apart contacts is smaller than a first preset threshold; if so, the number of point sets of the contacts is determined to be 1; if not, it is determined to be 2.
In some embodiments, determining the number of point sets of the touch points comprises: determining whether the distance, in a first direction, between the contacts farthest apart in that direction is smaller than a first preset threshold; if so, the number of point sets of the contacts is determined to be 1; if not, it is determined to be 2.
In some embodiments, the first direction is a horizontal direction.
In some embodiments, the first preset threshold is determined according to the distance between the two most distant finger tips when a single palm is spread.
In some embodiments, the method includes: if it is determined from the touch information that the number of point sets within a preset time interval is equal to 1, performing the first operation according to the touch information; and if it is determined from the touch information that the number of touch points within the preset time interval is greater than or equal to 3 and the number of point sets of the touch points is 2, performing the second operation according to the touch information.
In some embodiments, performing the first operation according to the touch information includes: executing corresponding operation on the content in the display window on the screen according to the touch information; the executing the second operation according to the touch information includes: and executing corresponding operation on the display window on the screen according to the touch information.
In some embodiments, performing the first operation according to the touch information includes: and moving, amplifying, reducing, deleting, marking, writing or turning pages of the content in the display window on the screen according to the touch information.
In some embodiments, the screen is formed by at least one sub-screen being tiled by the multi-screen tiling processor, the touch information is from a first sub-screen on the screen, and a second operation is performed according to the touch information, including:
controlling the multi-screen splicing processor to output the display window of the first sub-screen to other sub-screens for display according to the touch information; or controlling the multi-screen splicing processor to output the display window of the first sub-screen to the first sub-screen and other sub-screens for display according to the touch information; or
Controlling a multi-screen splicing processor to restore display windows on other sub-screens to the first sub-screen for display or restore display windows distributed across other sub-screens to the first sub-screen for display according to touch information; or
Controlling a multi-screen splicing processor to output pre-stored information associated with a display window of a first sub-screen to the first sub-screen for display according to the touch information; or
And controlling the multi-screen splicing processor to output pre-stored information associated with the display window of the first sub-screen to other sub-screens for display according to the touch information.
In some embodiments, the method comprises:
after the number of point sets of the contact points is determined to be 2, contacts whose distance in the first direction from the foremost contact in that direction is within a first preset threshold are divided into a first point set, and contacts whose distance in the first direction from the rearmost contact in that direction is within the first preset threshold are divided into a second point set.
In some embodiments, the method further comprises:
calculating the minimum distance between the points in the first point set and the coordinates of the points in the second point set in the first direction, judging whether the minimum distance between the two point sets is larger than a second preset threshold value, and if so, executing a second operation according to touch information; or,
calculating the distance between two contact points with the smallest distance in the first point set and the second point set; and judging whether the distance between the two contacts with the minimum distance in the first point set and the second point set is greater than a second preset threshold value or not, and if so, executing a second operation according to the touch information.
In some embodiments, the method further comprises:
calculating the minimum distance between the points in the first point set and the coordinates of the points in the second point set in the first direction, judging whether the calculated minimum distance between the two point sets is larger than a second preset threshold value or not, and if not, not executing a second operation; or,
calculating the distance between two contact points with the smallest distance in the first point set and the second point set; and judging whether the distance between the two contact points with the minimum distance between the first point set and the second point set is greater than a second preset threshold value or not, and if not, not executing a second operation.
In some embodiments, performing the second operation according to the touch information includes: the second operation is performed according to touch information of touch points in a first point set of the 2 point sets and touch information of touch points in a second point set of the 2 point sets.
In some embodiments, performing the second operation according to the touch information includes: the second operation is performed according to a change of the contact points in a first point set of the 2 point sets with respect to the contact points in a second point set of the 2 point sets.
In some embodiments, the screen is formed by splicing at least one sub-screen via a multi-screen splicing processor, the display window is located on a first sub-screen, and performing the second operation according to the touch information includes:
controlling a multi-screen splicing processor to amplify the display window according to the touch information, so that the display window is distributed across a first sub-screen and other sub-screens; or
And controlling the multi-screen splicing processor to reduce the display window according to the touch information, so that the display window is reduced from full-screen display of the first sub-screen to partial display occupying the first sub-screen.
In some embodiments, performing the second operation according to the touch information includes:
respectively calculating the central coordinates of a first point set and a second point set in the 2 point sets;
calculating the variation of the distance between the center coordinates of the 2 point sets; enlarging or reducing the display window on the screen according to the variable quantity; or,
enlarging or reducing the display window according to the variation of the distance between two closest contacts in the first point set and the second point set; or,
and enlarging or reducing the display window according to the variation of the distance between two closest contact points in the first direction in the first point set and the second point set.
In some embodiments, performing the first operation according to the touch information includes: and calculating the center coordinates of the touch points, and outputting a pattern of a certain shape at a position corresponding to the center coordinates.
In some embodiments, performing the second operation according to the touch information includes: and respectively calculating the center coordinates of the touch points of each point set in the 2 point sets, and outputting a figure with a certain shape at a position corresponding to the center coordinates and/or outputting a connecting line of the center coordinates of the touch points in the first point set and the second point set in the 2 point sets.
In some embodiments, the contact points are divided into one or two point sets using any one of the DBSCAN clustering algorithm, K-means (K-value) clustering, hierarchical (system) clustering, or minimum-distance clustering.
A second aspect of the present invention provides a multi-touch device, wherein the device comprises a memory and a processor, wherein,
the memory is used for storing executable program codes;
the processor is used for reading the executable program codes stored in the memory so as to execute the multi-touch method.
A third aspect of the present invention provides a computer-readable storage medium comprising instructions that, when executed on a computer, cause the computer to perform the multi-touch method.
A fourth aspect of the present invention provides a multi-touch system, the system comprising at least one computer, a server, and a multi-screen stitching processor, wherein one screen is formed by stitching at least one sub-screen via the multi-screen stitching processor, a display window of the at least one computer is displayed on the screen in a predetermined layout via the multi-screen stitching processor, and the screen is adapted to collect a touch and display the display window; the server includes: a memory and a processor; the memory is used for storing executable program codes; the processor is used for reading the executable program codes stored in the memory so as to execute the multi-touch method.
A fifth aspect of the present invention provides a multi-touch system, the system including at least one computer, a server, a multi-screen stitching processor, and a screen, the screen being formed by at least one sub-screen being stitched together by the multi-screen stitching processor, for displaying at least one display window of the at least one computer on the screen in a predetermined layout by the multi-screen stitching processor, the screen being further adapted to collect a touch and display the display window; the server includes: a memory and a processor; the memory is used for storing executable program codes; the processor is used for reading the executable program codes stored in the memory so as to execute the multi-touch method.
In some embodiments, the screen comprises an infrared light curtain forming device and an infrared camera, wherein the infrared light curtain forming device is used for forming an infrared light curtain on the surface of the screen; the infrared camera is used for collecting touch input and sending the touch input to the server.
In some embodiments, the screen comprises a capacitive, resistive, infrared frame, or surface acoustic wave touch input and display screen.
The point-set-gesture touch method, device, equipment and computer-readable storage medium provided by the embodiments of the invention alleviate the problem that finger touch operations in the prior art are very limited or cumbersome; they simplify the operation steps, improve the convenience and efficiency of human-computer interaction, and at the same time offer more modes and possibilities of human-computer interaction.
Drawings
Fig. 1 is a schematic structural diagram of a large-screen touch-control linkage display system in the prior art.
FIG. 2 is a schematic diagram of a division of a finger contact into a first set of points and a second set of points, according to an embodiment of the invention.
FIG. 3 is a schematic structural diagram of a multi-touch device according to an embodiment of the invention.
Fig. 4 is a schematic structural diagram of a multi-touch device according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, but not all, embodiments of the present invention.
It should be noted that the embodiments and features of the embodiments in the present application may be combined with each other without conflict. The present application will be described in detail below with reference to the embodiments with reference to the attached drawings.
According to one embodiment of the present invention, a multi-contact touch method (also called a point-set-gesture touch method) includes:
S11: receiving touch information from the screen.
In the present application, "screen" covers any screen that integrates touch control and display, i.e. a screen that displays images in any manner and supports touch input; the touch on it can be captured in various ways, such as capacitive, magnetic, thermal or optical sensing, by the screen itself or by a sensor attached to the screen, and then processed by a processor or a server. The touch screen may be, for example, a capacitive, resistive, infrared-frame or surface-acoustic-wave touch input and display screen.
The touch screen also covers, for example, the following screens: the screen can directly display images (such as an LCD display), and an infrared light curtain and an infrared camera are arranged on the surface of the screen. When a finger touches a display window on the surface of the screen, an infrared camera attached to the screen can capture touch information of a finger touch point, and the information contains touch position information. The touch signal is then passed to a processor or server or computer for subsequent determination and processing, as will be described below. The infrared light curtain may be emitted by infrared laser diodes located at the edges of the screen, for example: the infrared laser diode array may be disposed over any one edge of the screen.
The touch screen also covers, for example, the following screens: an infrared light curtain is arranged on the surface of the screen, and the screen is provided with a projection device for displaying images and an infrared camera. When a finger touches a display window on the surface of the screen, an infrared camera attached to the screen can capture a touch signal of a finger touch point, and the signal contains touch position information. The touch information is then passed to a processor or server or computer for subsequent determination and processing, as will be described below. The infrared light curtain is for example realized by an infrared laser emitter placed at the edge of the screen or an infrared light curtain projector placed behind the screen.
In addition to the above, various screen touch technologies and touch screens exist in the prior art, and since these technologies are well known to those skilled in the art, they are not described herein.
In this application, the screen may also be formed by splicing at least two sub-screens together, for example, by a multi-screen splicing processor, and the sub-screens spliced together may be disposed adjacent to each other in space or may be disposed separately in space. Here, the multi-screen splicing processor may also enable the at least two sub-screens to respectively display different signal source contents, or the same signal source content, or may also enable the at least two sub-screens to display different pictures, where the different pictures are spliced into a complete signal source content or picture content.
The splicing processor may be implemented in hardware or as software running on a server and is well known to those skilled in the art; for example, the multi-screen splicing controllers produced by Bird Science and Technology are commercially available, so it is not described further here.
The content of at least one computer is displayed on the at least two sub-screens through corresponding display windows, for example: the content of each computer is displayed on different sub-screens through respective display windows. The server can read the corresponding relation between the content of each computer and each sub-screen and the layout information of each sub-screen on the large screen formed by splicing through the multi-screen splicing processor.
S12: judging the number of the point sets of the touch points according to the touch information, and if the number of the point sets of the touch points is equal to 1, executing a first operation, for example, executing step S13; if the number of the touch points is greater than or equal to 3 and the number of the touch point sets is 2, a second operation is performed according to the touch information, for example, step S14 is performed.
The touch information may include information such as the shape of a touch point, the duration of the touch, and the touch trajectory. A point set is, in short, a set of touch points that lie close together; for example, if the two hands are placed on the screen apart from each other, the touch points form 2 point sets.
In the present application, the number of point sets of the touch points is determined according to the touch information. For example, the touch points may be divided into one or two point sets using any one of the DBSCAN clustering algorithm, K-means (K-value) clustering, hierarchical (system) clustering, or minimum-distance clustering: if the points falling within one distribution radius are regarded as one point set, then two groups of points far apart from each other form two point sets. Alternatively, the judgment method proposed below by the present invention may be used to determine or divide the point sets.
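As an illustration only (not the patented method itself), the following sketch groups contact coordinates into point sets with a simple minimum-distance (single-linkage) rule of the kind mentioned above; the radius parameter is a hypothetical stand-in for the distribution radius and would be tuned in practice.

```python
from math import hypot

def cluster_contacts(contacts, radius):
    """Group (x, y) contact points into point sets: two contacts belong to the same
    set if they are within `radius` of each other, directly or through a chain of
    other contacts (minimum-distance / single-linkage clustering)."""
    clusters = []                      # each cluster is a list of (x, y) points
    for p in contacts:
        # find every existing cluster that already contains a point close to p
        near = [c for c in clusters
                if any(hypot(p[0] - q[0], p[1] - q[1]) <= radius for q in c)]
        if not near:
            clusters.append([p])       # p starts a new point set
        else:
            # merge all clusters reachable from p into one and add p to it
            merged = [q for c in near for q in c] + [p]
            clusters = [c for c in clusters if c not in near] + [merged]
    return clusters

# Example: two hands placed apart on the screen yield two point sets
touches = [(100, 200), (120, 210), (140, 190), (600, 205), (620, 215)]
print(len(cluster_contacts(touches, radius=150)))   # -> 2
```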
In this step, the preset number of touch points is set to 3: when the number of finger touch points is 3, 4, 5, ... and the number of point sets of the touch points is 2, step S14 is performed; if the number of point sets of the touch points is equal to 1, step S13 is performed.
It should be noted that, in the present embodiment, the preset number is only 3 for example, in some other embodiments of the present invention, the preset number may also be another number, for example, the number may be set by a user according to a use habit in an actual application, or the number may be customized according to a user requirement, and may be any number of 4 to 10.
In some embodiments, it may also be determined from the touch information whether the number of point sets within a preset time interval is equal to 1, or whether the number of touch points within the preset time interval is greater than or equal to 3 and the number of point sets of the touch points is 2. The preset time interval may be a short period, for example 1 s. If it is determined that the number of finger touch points falling on the screen within 1 s is greater than or equal to 3 and the number of point sets of the touch points is 2, the second operation is performed according to the touch information; if the number of point sets of the touch points is equal to 1, the first operation is performed according to the touch information. When the touch information of the first 1 s is used to determine the number and point sets of the finger contacts, performing the first or second operation may also include responding to the continued touch information after that 1 s; in this case the information within 1 s is used to determine the number of contacts in advance, and the number of contacts is assumed to remain at that initial value. Of course, the number of contacts may also be determined continuously in the subsequent process, and the response mode adjusted in time so that the first or second operation is performed. Performing the corresponding first or second operation according to the touch information may also mean performing the corresponding operation according to the touch information over the whole touch process and a pre-stored correspondence between touch information and operations. In addition, when there are multiple display windows on the screen, the number of finger contacts and the number of point sets falling within each display window may be determined separately, and the signal source corresponding to the respective display window may be operated. A sketch of this decision logic is given below.
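A minimal sketch of the decision in step S12, assuming the contacts have already been gathered within the 1 s window; count_point_sets, first_operation and second_operation are caller-supplied placeholders (for example the clustering sketch above), not the patented implementations.

```python
def dispatch_touch(contacts, count_point_sets, first_operation, second_operation,
                   min_contacts=3):
    """Step S12 (sketch): decide which operation to run from the contacts collected
    within the preset time window (e.g. 1 s). `contacts` is a list of (x, y) points."""
    n_sets = count_point_sets(contacts)
    if n_sets == 1:
        first_operation(contacts)          # S13: operate on the window content
    elif len(contacts) >= min_contacts and n_sets == 2:
        second_operation(contacts)         # S14: operate on the display window itself
    # otherwise: no predefined response
```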
S13: and executing corresponding operation on the content in the display window on the screen according to the touch information.
In this step, for example, when the server determines that the number of the point sets of the finger touch input in the display window is equal to 1, the server may send the touch information of the finger touch to a computer or a server corresponding to the display window, so that the computer or the server performs corresponding operations on the content in the display window according to the touch information of the finger touch, for example: and moving, amplifying, reducing, deleting, marking, writing or page turning the content in the display window according to the touch track or the touch action.
S14: and executing corresponding operation on the display window on the screen according to the touch information.
In the present application, performing an operation on a display window includes displaying content as an operation object with the display window itself or the display window as a whole. For example, assuming that the screen is formed by splicing at least one sub-screen via a multi-screen splicing processor, the touch information is from a first sub-screen on the screen, and a second operation is performed according to the touch information, including:
controlling the multi-screen splicing processor to output the display window of the first sub-screen to other sub-screens for display according to the touch information; or
Controlling a multi-screen splicing processor to output a display window of the first sub-screen to the first sub-screen and other sub-screens for display according to the touch information; or
Controlling a multi-screen splicing processor to restore display windows on other sub-screens to the first sub-screen for display or restore display windows distributed across other sub-screens to the first sub-screen for display according to touch information; or
Controlling a multi-screen splicing processor to output pre-stored information associated with a display window of a first sub-screen to the first sub-screen for display according to the touch information; or
And controlling the multi-screen splicing processor to output pre-stored information associated with the display window of the first sub-screen to other sub-screens for display according to the touch information.
With respect to steps S13 and S14, various more specific first or second operations are also conceivable. For example, when the two hands are apart and four to five fingers of each hand slide outward on the screen, the server or computer may control the stitching processor to enlarge the display window on the screen in response to the motion or touch information, thereby completing the second operation; or, when the two hands are apart and four to five fingertips of each hand move simultaneously in one direction on the screen, the display window is pushed to the other sub-screen(s) in that direction for display, thereby completing the second operation; and when only one hand slides on the screen, the content pointed to by the slide is moved, thereby completing the first operation.
Performing the second operation according to the touch information may further include: the second operation is performed according to touch information of touch points in a first point set of the 2 point sets and touch information of touch points in a second point set of the 2 point sets. For example, if the touch information in the first set of points indicates that the touch point therein is still, i.e. the finger is pressed on the screen for a long time, and the gesture in the second set of points is moving, at this time, a predefined corresponding operation may be performed in conjunction with the combined touch information of the first set of points and the second set of points, e.g. directly displaying the content of the display window on all screens in full screen.
Performing the second operation according to the touch information may further include: the second operation is performed according to a change of the contact points in a first point set of the 2 point sets with respect to the contact points in a second point set of the 2 point sets. For example, if the first set of points is moved down and the second set of points is moved up, a rotation operation on the window may be performed.
For another example, assuming that the screen is formed by splicing at least one sub-screen via a multi-screen splicing processor, the display window is located on the first sub-screen, and performing the second operation according to the touch information may include:
controlling a multi-screen splicing processor to amplify the display window according to the touch information, so that the display window is distributed across a first sub-screen and other sub-screens; or
And controlling the multi-screen splicing processor to reduce the display window according to the touch information, so that the display window is reduced from full-screen display of the first sub-screen to partial display occupying the first sub-screen.
In addition, performing the second operation according to the touch information may further include:
respectively calculating the central coordinates of a first point set and a second point set in the 2 point sets;
calculating the variation of the distance between the center coordinates of the 2 point sets; enlarging or reducing the display window on the screen according to the variable quantity; or,
enlarging or reducing the display window according to the variation of the distance between two closest contacts in the first point set and the second point set; or,
and enlarging or reducing the display window according to the variation of the distance between two closest contact points in the first direction in the first point set and the second point set.
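A sketch of the zoom variant described in the options above, under the assumption that the contacts of the two point sets are sampled at the start and at the end of the gesture; how the resulting factor is applied to the window is left to the splicing processor and is not specified here.

```python
from math import hypot

def centroid(points):
    """Average coordinate of a point set."""
    xs, ys = zip(*points)
    return sum(xs) / len(xs), sum(ys) / len(ys)

def dist(p, q):
    return hypot(p[0] - q[0], p[1] - q[1])

def zoom_factor(first_start, second_start, first_end, second_end):
    """Ratio of the centre-to-centre distance of the two point sets at the end of the
    gesture to that at the start; > 1 suggests enlarging the window, < 1 shrinking it."""
    d_start = dist(centroid(first_start), centroid(second_start))
    d_end = dist(centroid(first_end), centroid(second_end))
    return d_end / d_start if d_start else 1.0
```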
In addition, performing the second operation according to the touch information may further include:
and respectively calculating the center coordinates of the touch points of each point set in the 2 point sets, and outputting a figure with a certain shape at a position corresponding to the center coordinates and/or outputting a connecting line of the center coordinates of the touch points in the first point set and the second point set in the 2 point sets.
Performing the first operation according to the touch information may further include: calculating the center coordinates of the touch points and outputting a pattern of a certain shape at the position corresponding to the center coordinates.
A point-set-gesture touch method provided according to another embodiment of the present invention includes:
s21: touch information from a screen is received.
S22: judging whether the distance between two contact points with the farthest distance (for example, the finger contact point positioned on the leftmost side and the finger contact point positioned on the rightmost side) is smaller than a first preset threshold, if so, judging that the number of point sets of the contact points is 1, and executing step S24; if not, the number of the point sets of the contact points is determined to be 2, and step S25 is executed.
In some embodiments, it may also be determined whether an absolute value of a difference between an abscissa of the leftmost finger contact and an abscissa of the rightmost finger contact in a certain direction, for example, the horizontal direction, is smaller than a first preset threshold, if so, the number of point sets of the contacts is determined to be 1, and step S24 is executed; if not, the number of the point sets of the contact points is determined to be 2, and step S25 is executed.
In this embodiment, different sub-screens may correspond to different infrared cameras, each sub-screen lying within the working space of its camera; for example, one sub-screen may correspond to one infrared camera. Because the working spaces of the individual infrared cameras may differ, the touch information of the finger contacts in each camera's coordinate system can be converted into touch information in the screen coordinate system. In some embodiments, a single infrared camera may also cover all sub-screens. The origin may be placed at the upper-left corner of the screen, with the rightward direction taken as the positive X direction and the leftward direction as the negative X direction; the infrared laser or infrared laser array may be arranged at the screen edge corresponding to the positive X direction. A sketch of such a coordinate conversion is given below.
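A minimal sketch of the camera-to-screen conversion mentioned above, assuming each infrared camera sees exactly one sub-screen and that a plain scale-and-offset calibration is sufficient (no lens distortion or perspective correction); the geometry values in the example are hypothetical.

```python
def camera_to_screen(point, camera_size, subscreen_rect):
    """Map a contact detected in one infrared camera's image (pixel coordinates) to
    coordinates on the large screen formed by stitching the sub-screens."""
    (cx, cy), (cw, ch) = point, camera_size
    sx, sy, sw, sh = subscreen_rect        # sub-screen origin and size on the large screen
    return sx + cx / cw * sw, sy + cy / ch * sh

# Example: a blob at pixel (320, 240) of a 640x480 camera watching the sub-screen
# that occupies the region (1920, 0, 1920, 1080) of the stitched screen
print(camera_to_screen((320, 240), (640, 480), (1920, 0, 1920, 1080)))  # (2880.0, 540.0)
```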
In this step, taking the X axis as the horizontal direction, the finger contact with the smallest abscissa and the finger contact with the largest abscissa along the positive X direction can be identified from the position information of the finger contacts, i.e. the leftmost and the rightmost finger contacts. It is then determined whether the distance between the abscissa of the leftmost contact and the abscissa of the rightmost contact is smaller than a first preset threshold. The first preset threshold may be set from an empirical value, for example according to the distance between the two farthest-apart fingertips when a single palm is spread. When this distance is smaller than the first preset threshold, the finger contacts are regarded as a single point set, i.e. the number of point sets is 1; when it is greater than the first preset threshold, the finger contacts are regarded as a double point set, i.e. the number of point sets is 2, as in the sketch below.
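The check just described reduces to comparing the horizontal extent of the contacts against the first preset threshold; a sketch, with the threshold value left to the caller:

```python
def count_point_sets_by_span(contacts, first_threshold):
    """Step S22 (sketch): if the horizontal distance between the leftmost and the
    rightmost contact is smaller than the first preset threshold (roughly the span
    of a spread palm), all contacts are treated as one point set; otherwise as two."""
    xs = [x for x, _ in contacts]
    return 1 if max(xs) - min(xs) < first_threshold else 2
```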
In some other embodiments of the present invention, the infrared light curtain may be projected onto the screen surface by an infrared light curtain projector, for example projected from the back of the screen. When a finger acts on a display window of a sub-screen on which the infrared light curtain is arranged, the light distribution of the infrared light curtain at the finger contact changes because of the touch: for example, part of the infrared light at the contact is diffusely reflected away from the first sub-screen by the touching finger and is captured by an infrared camera located in front of the first sub-screen; or part of the infrared light at the contact passes through the first sub-screen because of the touching finger and is captured by an infrared camera located behind the first sub-screen. The infrared camera then sends an infrared image including the finger contact, in the form of an electrical signal, to the processor, or to an image processing unit in the server or computer, which processes and analyses the received signal to obtain the touch information generated when the user touches the first sub-screen. The touch information may include touch position information and/or touch trajectory information, and the touch position may change over time or exhibit a certain distribution characteristic.
In some other embodiments of the present invention, the infrared laser or the infrared laser array may also be disposed at an edge of the screen corresponding to the Y-axis, and in some other embodiments of the present invention, the infrared laser or the infrared laser array may be disposed at least two edges of the screen at the same time, which is not limited in this respect.
For convenience of description, in the present embodiment and the following embodiments, the infrared laser or the infrared laser array is disposed above the screen, the upper left corner of the screen is used as an origin, the X axis is used as a horizontal direction, and the first coordinate is used as an abscissa for example.
S24: and executing a first operation on the content in the display window on the screen according to the touch information of the finger touch point in the single point set.
In the foregoing determination process, if it is determined that a distance between an abscissa of a leftmost finger contact and an abscissa of a rightmost finger contact in an abscissa direction is smaller than a first preset threshold, or a distance between contacts farthest away is smaller than the first preset threshold, it indicates that the finger contacts on the same palm may act on a display window, and the finger contacts at this time are considered to be a single-point set, a processor, a server, or a computer may obtain corresponding touch instructions according to touch information of the finger contacts in the single-point set, and then send the touch instructions to a multi-screen stitching processor, so as to control the multi-screen stitching processor to execute a first operation on content in the display window corresponding to the signal source according to the touch instructions. When the multi-screen splicing processor is implemented by software and is part of a server or a computer, the first operation can be directly implemented by the server or the computer.
For example: when each finger contact in the single point set lasts for a period of time (for example, 1 second) in the display window and the position information of each finger contact stays within a preset range (that is, the variation of the abscissa and ordinate of each finger contact relative to the sub-screen is within the preset range), the contacts in the single point set can be regarded as a long-press gesture. A touch instruction corresponding to the long-press gesture can be stored in the server or computer in advance; for example, the long-press of a single point set can be mapped to shutdown or startup, so that a shutdown operation is executed in response to the long-press of the single point set. A sketch of such a check follows.
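A sketch of the long-press detection described above, assuming each contact is reported as timestamped samples; the 1-second duration is the example value from the text and the movement tolerance is a hypothetical choice.

```python
def is_long_press(tracks, min_duration=1.0, tolerance=10.0):
    """`tracks` maps each contact id of the single point set to its samples
    [(t, x, y), ...]. The gesture counts as a long press when every contact has lasted
    at least `min_duration` seconds and has stayed within `tolerance` (screen units)
    of its initial position."""
    for samples in tracks.values():
        t0, x0, y0 = samples[0]
        if samples[-1][0] - t0 < min_duration:
            return False
        if any(abs(x - x0) > tolerance or abs(y - y0) > tolerance
               for _, x, y in samples):
            return False
    return True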
By using the point-set-gesture touch scheme provided by the embodiment of the invention, in which the single point set and the double point set are introduced, the operation steps are simplified, the convenience and efficiency of human-computer interaction are improved, and more modes and possibilities of human-computer interaction are provided.
In some embodiments of the present invention, when the finger touch point is determined to be a single point set, the method may further include:
s24 a: and calculating the center coordinates of the touch points, and outputting a pattern of a certain shape at a position corresponding to the center coordinates.
In this step, the position coordinates of each finger touch point in the single point set can be averaged, the center coordinates of the finger touch points in the single point set can be obtained, and then a certain shape of figure, such as a circle, a triangle, an ellipse, etc., is output to the display window at the center coordinates of the finger touch points in the single point set, so that the user can visually see the positions of the finger touch points.
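A sketch of step S24a, computing the centre of the single point set; draw_marker in the usage comment is a hypothetical rendering call, since the text only requires that some figure be output at that position.

```python
def contact_center(points):
    """Step S24a (sketch): average the coordinates of the contacts in the single point
    set to obtain the centre at which a figure (circle, triangle, ellipse, ...) is drawn."""
    xs, ys = zip(*points)
    return sum(xs) / len(xs), sum(ys) / len(ys)

# e.g. draw_marker(contact_center(single_set), shape="circle")  # draw_marker is hypothetical
```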
S25: and executing a second operation on the display window according to the touch information.
In this step, when the number of finger contacts in the display window is greater than the preset number and the distance between the abscissa of the leftmost finger contact and the abscissa of the rightmost finger contact is greater than the first preset threshold (which may be set from an empirical value, for example in relation to the horizontal distance between the two farthest-apart fingers of a spread palm), the server or computer can obtain a corresponding touch instruction according to the touch information of the finger contacts in the double point set and then send the touch instruction to the multi-screen splicing processor, so as to control the multi-screen splicing processor to execute the second operation on the display window corresponding to the signal source according to the touch instruction.
In the method of performing the second operation on the display window according to the touch information of the finger touch points in the double point set according to an embodiment of the invention, step S25 may include:
s251: and increasing the first coordinate of the leftmost finger touch point by the first preset threshold value to obtain the first coordinate of the boundary of the first point set.
In this step, as shown in fig. 2, which is a schematic diagram of dividing the finger contacts into a first point set and a second point set according to another embodiment of the present invention, there are 7 finger contacts A1, A2, A3, A4, A5, A6, and A7. From the abscissa of each finger contact in the horizontal direction it can be determined that A1 is the leftmost finger contact and A7 is the rightmost finger contact; the abscissa of contact A1 is then increased by the first preset threshold to obtain the right-boundary abscissa of the first point set.
In the prior art, a clustering algorithm is usually adopted to group finger contacts. However, the inventor found that such grouping is somewhat random: when two palms are close to each other in a display window, finger contacts of different palms that happen to lie close together are easily placed into the same group, causing incorrect division and affecting the user experience.
S252: dividing a finger contact point located between the abscissa of the leftmost finger contact point (A1) and the right boundary abscissa of the first set of points into a first set of points.
In this step, the finger contacts A1, A2, and A3 are divided into the first point set, as shown by the dashed and dotted lines in fig. 2.
S253: and subtracting the first preset threshold value from the abscissa of the rightmost finger touch point to obtain the left boundary coordinate of the second point set.
In this step, referring again to fig. 2, A7 is the rightmost finger touch point; the first preset threshold is subtracted from the abscissa of contact A7 to obtain the left-boundary abscissa of the second point set.
S254: finger contacts located between the left boundary abscissa of the second set of points and the abscissa of the rightmost finger contact are divided into a second set of points.
In this step, as shown by the dashed lines in FIG. 2, the finger contacts A4, A5, A6, and A7 are divided into a second set of points.
The above steps can also be summarized as follows: contacts whose distance in the first direction from the foremost contact in that direction is within the first preset threshold are divided into the first point set, and contacts whose distance in the first direction from the rearmost contact in that direction is within the first preset threshold are divided into the second point set. A sketch of this partition is given below.
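A sketch of steps S251-S254, partitioning the contacts by the two boundary abscissas; the coordinates assigned to A1-A7 below are hypothetical, and only the resulting grouping mirrors fig. 2.

```python
def split_by_boundaries(contacts, first_threshold):
    """Steps S251-S254 (sketch): contacts whose abscissa lies within `first_threshold`
    of the leftmost contact form the first point set; contacts within `first_threshold`
    of the rightmost contact form the second point set."""
    xs = [x for x, _ in contacts]
    left, right = min(xs), max(xs)
    right_boundary = left + first_threshold        # S251: right boundary of set 1
    left_boundary = right - first_threshold        # S253: left boundary of set 2
    first_set = [p for p in contacts if p[0] <= right_boundary]   # S252
    second_set = [p for p in contacts if p[0] >= left_boundary]   # S254
    return first_set, second_set

# Hypothetical coordinates for the seven contacts A1..A7 of fig. 2
contacts = [(100, 300), (130, 280), (160, 310),                  # A1, A2, A3
            (620, 290), (650, 305), (680, 285), (700, 300)]      # A4..A7
first, second = split_by_boundaries(contacts, first_threshold=200)
print(len(first), len(second))   # -> 3 4
```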
S255: and calculating the distance between the two finger touch points with the minimum distance in the first point set and the second point set.
In this step, the minimum distance between the abscissa of the point in the first point set and the abscissa of the point in the second point set may be calculated.
In other embodiments of the present invention, when the infrared laser or infrared laser array is disposed at the screen edge in the negative Y direction, the minimum distance between the ordinates of the two closest finger contacts may be calculated instead.
S256: and judging whether the distance between the two finger contacts with the minimum distance in the first point set and the second point set is greater than a second preset threshold value, if so, executing a step S257, and if not, executing a step S258.
In this step, it is determined whether the distance between the two closest finger contacts of the first and second point sets is greater than the second preset threshold. If it is greater, step S257 is executed and the second operation is performed on the display window according to the touch information of the finger contacts in the first and second point sets; if it is not greater, step S258 is executed and the touch information of the finger contacts in the first and second point sets is not responded to, i.e. the second operation is not performed.
Corresponding to considering the distance via abscissas in step S255, in this step it is determined whether the minimum distance between the abscissas of the points in the first point set and those in the second point set is greater than the second preset threshold. If so, step S257 is executed and the second operation is performed on the display window according to the touch information of the finger contacts in the first and second point sets; if not, step S258 is executed and the touch information is not responded to, i.e. the second operation is not performed.
S257: and executing a second operation on the display window according to the touch information of the finger touch point in the double-point set.
In this step, when the distance between two closest finger contacts in the first point set and the second point set is greater than a second preset threshold, the finger contacts at that time are considered to be not only a double-point set but also an effective double-point set, and the processor, the server, or the computer may obtain a corresponding touch instruction according to touch information of the finger contacts in the double-point set, and then send the touch instruction to the multi-screen splicing processor, so as to control the multi-screen splicing processor to execute a second operation on a display window corresponding to the signal source according to the touch instruction.
S258: and not responding to the touch information of the finger touch points in the double-point set.
In this step, when the distance between the two closest finger contacts of the first and second point sets is smaller than the second preset threshold, the finger contacts of the two point sets may overlap or be too close to each other, and the processor, server, or computer does not respond to the touch information of the two point sets. A sketch of this validity check follows.
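A sketch of the validity check of steps S255-S258 on the full 2-D distance; the second preset threshold is supplied by the caller.

```python
from math import hypot

def double_set_is_valid(first_set, second_set, second_threshold):
    """Steps S255-S256 (sketch): the double-point-set gesture is acted on only when the
    smallest distance between any contact of the first set and any contact of the second
    set exceeds the second preset threshold; otherwise the touch is ignored (S258)."""
    min_dist = min(hypot(x1 - x2, y1 - y2)
                   for x1, y1 in first_set
                   for x2, y2 in second_set)
    return min_dist > second_threshold
```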
In some embodiments, the method may further comprise:
s2571 a: and outputting the central coordinates of the finger touch points in the first point set and the second point set to the display window for displaying in a certain shape, and outputting the connecting lines of the central coordinates of the finger touch points in the first point set and the second point set to the display window for displaying.
In this step, the coordinates of the finger contacts in the first point set and the second point set may be averaged to obtain the central coordinates of the finger contacts in the first point set and the second point set, and then the central coordinates of the finger contacts in the first point set and the second point set are output to a display window according to a certain shape, such as a circle, a triangle, an ellipse, and the like, so that the user can visually see the positions of the finger contacts.
S26: and executing corresponding operation on the content in the display window according to the touch information.
In this step, the processor, the server, or the computer may send the touch information of the finger touch to the computer corresponding to the display window, so that the computer performs corresponding operations on the content in the display window according to the touch information of the finger touch, for example: and moving, amplifying, reducing, deleting, marking, writing or page turning the content in the display window, and the like.
In this application, in some embodiments, performing the first operation according to the touch information includes: executing corresponding operation on the content in the display window on the screen according to the touch information; the executing the second operation according to the touch information includes: and executing corresponding operation on the display window on the screen according to the touch information.
In some embodiments, performing the first operation according to the touch information includes: and moving, amplifying, reducing, deleting, marking, writing or turning pages of the content in the display window on the screen according to the touch information.
In some embodiments, performing the first operation according to the touch information includes: and calculating the center coordinates of the touch points, and outputting a pattern of a certain shape at a position corresponding to the center coordinates.
In some embodiments, if it is assumed that the screen is formed by splicing at least one sub-screen via a multi-screen splicing processor, and the touch information is from a first sub-screen on the screen, performing a second operation according to the touch information includes:
controlling the multi-screen splicing processor to output the display window of the first sub-screen to other sub-screens for display according to the touch information; or controlling the multi-screen splicing processor to output the display window of the first sub-screen to the first sub-screen and other sub-screens for display according to the touch information; or
Controlling a multi-screen splicing processor to restore display windows on other sub-screens to the first sub-screen for display or restore display windows distributed across other sub-screens to the first sub-screen for display according to touch information; or
Controlling a multi-screen splicing processor to output pre-stored information associated with a display window of a first sub-screen to the first sub-screen for display according to the touch information; or
And controlling the multi-screen splicing processor to output pre-stored information associated with the display window of the first sub-screen to other sub-screens for display according to the touch information.
In some embodiments, performing the second operation according to the touch information includes: the second operation is performed according to touch information of touch points in a first point set of the 2 point sets and touch information of touch points in a second point set of the 2 point sets.
In some embodiments, if the screen is formed by splicing at least one sub-screen via a multi-screen splicing processor and the display window is located on the first sub-screen, performing the second operation according to the touch information includes:
controlling the multi-screen splicing processor to enlarge the display window according to the touch information, so that the display window becomes distributed across the first sub-screen and other sub-screens; or
controlling the multi-screen splicing processor to reduce the display window according to the touch information, so that the display window shrinks from full-screen display on the first sub-screen to occupying only part of the first sub-screen.
In some embodiments, performing the second operation according to the touch information includes one of the following (a minimal sketch follows this list):
respectively calculating the central coordinates of a first point set and a second point set of the 2 point sets;
calculating the variation of the distance between the central coordinates of the 2 point sets, and enlarging or reducing the display window on the screen according to that variation; or,
enlarging or reducing the display window according to the variation of the distance between the two closest contacts in the first point set and the second point set; or,
enlarging or reducing the display window according to the variation of the distance, in the first direction, between the two contacts in the first point set and the second point set that are closest in that direction.
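The sketch below illustrates the first two alternatives above: the display window is enlarged or reduced according to how the distance between the centres of the two point sets changes between two successive frames. The frame representation, the example coordinates, and the scaling policy are illustrative assumptions.

```python
import math
from typing import List, Tuple

Point = Tuple[float, float]
PointSets = Tuple[List[Point], List[Point]]   # (first point set, second point set)

def center_of(points: List[Point]) -> Point:
    """Average the coordinates of all contacts in one point set."""
    xs, ys = zip(*points)
    return (sum(xs) / len(xs), sum(ys) / len(ys))

def center_distance(sets: PointSets) -> float:
    """Distance between the central coordinates of the two point sets."""
    (x1, y1), (x2, y2) = center_of(sets[0]), center_of(sets[1])
    return math.hypot(x2 - x1, y2 - y1)

def scale_factor(prev: PointSets, curr: PointSets) -> float:
    """Ratio of current to previous centre distance; >1 enlarges, <1 reduces."""
    prev_d = center_distance(prev)
    return center_distance(curr) / prev_d if prev_d > 0 else 1.0

# Example: the two point sets move apart between frames, so the window is enlarged.
prev = ([(100.0, 100.0), (110.0, 105.0)], [(300.0, 100.0), (310.0, 104.0)])
curr = ([(80.0, 100.0), (90.0, 105.0)], [(330.0, 100.0), (340.0, 104.0)])
new_width = 800 * scale_factor(prev, curr)   # assumes an 800-pixel-wide display window
```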
In some embodiments, performing the second operation according to the touch information includes: respectively calculating the central coordinates of the touch points of each of the 2 point sets, and outputting a figure of a certain shape at the positions corresponding to the central coordinates and/or outputting a line connecting the central coordinates of the first point set and the second point set.
It will be understood by those skilled in the art that the above are merely examples of such operations; the first operation and the second operation may be interchanged, or further operation modes may be defined as needed, so no further examples are given here.
In order to better implement the touch method for point-set gestures provided by the embodiments of the invention, an embodiment of the invention also provides a touch device for point-set gestures. The terms used here have the same meanings as in the touch method for point-set gestures of the above embodiments, and specific implementation details may refer to the description of the method embodiments.
Fig. 3 is a schematic structural diagram of a touch device for point-set gestures according to another embodiment of the present invention. As shown in Fig. 3, the touch device includes a receiving unit 31, a judging unit 32, a first executing unit 33, and a second executing unit 34.
The receiving unit 31 is configured to receive touch information from a screen; the judging unit 32 is configured to judge the number of touch points and the number of point sets of the touch points according to the touch information; the first executing unit 33 is configured to execute a first operation according to the touch information when the judging unit determines that the number of point sets of the touch points is equal to 1; and the second executing unit 34 is configured to execute a second operation according to the touch information when the number of touch points is greater than or equal to 3 and the number of point sets is 2.
According to some embodiments of the present invention, performing the first operation according to the touch information includes: executing a corresponding operation on the content in the display window on the screen according to the touch information; and performing the second operation according to the touch information includes: executing a corresponding operation on the display window on the screen according to the touch information. Performing the first operation according to the touch information may include: moving, enlarging, reducing, deleting, annotating, writing on, or page-turning the content in the display window on the screen according to the touch information.
In some embodiments of the present invention, determining the number of point sets of the touch points according to the touch information includes: judging whether the distance between the two contacts farthest apart is smaller than a first preset threshold; if so, the number of point sets of the contacts is judged to be 1; if not, it is judged to be 2.
In some embodiments of the present invention, determining the number of point sets of the touch points according to the touch information includes: judging whether the distance, in a first direction, between the two contacts farthest apart in that direction is smaller than the first preset threshold; if so, the number of point sets of the contacts is judged to be 1; if not, it is judged to be 2.
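Both judgements above reduce to a single distance test against the first preset threshold. A minimal sketch follows, assuming contacts are given as (x, y) coordinates, that the first direction is taken as the x axis, and an illustrative threshold value; neither the function name nor the constant comes from the patent.

```python
import math
from typing import List, Tuple

Point = Tuple[float, float]
FIRST_PRESET_THRESHOLD = 220.0   # assumed value, e.g. roughly a spread palm's span in pixels

def count_point_sets(contacts: List[Point], first_direction_only: bool = False) -> int:
    """Return 1 or 2 point sets by comparing the farthest-contact distance to the threshold."""
    if len(contacts) < 2:
        return 1
    if first_direction_only:
        # distance measured only along the first direction (taken here as the x axis)
        xs = [x for x, _ in contacts]
        farthest = max(xs) - min(xs)
    else:
        farthest = max(math.hypot(x2 - x1, y2 - y1)
                       for i, (x1, y1) in enumerate(contacts)
                       for (x2, y2) in contacts[i + 1:])
    return 1 if farthest < FIRST_PRESET_THRESHOLD else 2
```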
By introducing the single point set and the double point set, the touch device for point-set gestures provided by the embodiments of the invention simplifies the operation steps, improves the convenience and efficiency of human-computer interaction, and provides more modes and possibilities of human-computer interaction.
Fig. 4 is a schematic structural view of a multi-touch device according to still another embodiment of the present invention. As shown in Fig. 4, at least part of the touch method and the touch device for point-set gestures described above may be implemented by a multi-touch apparatus 400 that includes a processor 403, a memory 404, and a bus 410.
In some instances, the multi-touch device 400 may also include an input device 401, an input port 402, an output port 405, and an output device 406. The input port 402, the processor 403, the memory 404, and the output port 405 are connected to one another, and the input device 401 and the output device 406 are connected to the bus 410 through the input port 402 and the output port 405, respectively, and thereby to the other components of the device 400. Note that the output port and the input port may also be referred to as I/O interfaces. Specifically, the input device 401 receives input information from outside and transmits it to the processor 403 through the input port 402; the processor 403 processes the input information based on computer-executable instructions stored in the memory 404 to generate output information, stores the output information temporarily or permanently in the memory 404, and then transmits it to the output device 406 through the output port 405; and the output device 406 outputs the output information to the exterior of the device 400.
The memory 404 includes mass storage for data or instructions. By way of example, and not limitation, memory 404 may include an HDD, floppy disk drive, flash memory, optical disk, magneto-optical disk, magnetic tape, or a Universal Serial Bus (USB) drive or a combination of two or more of these. Memory 404 may include removable or non-removable (or fixed) media, where appropriate. Memory 404 may be internal or external to device 400, where appropriate. In a particular embodiment, the memory 404 is a non-volatile solid-state memory. In a particular embodiment, the memory 404 includes Read Only Memory (ROM). Where appropriate, the ROM may be mask-programmed ROM, Programmable ROM (PROM), Erasable PROM (EPROM), Electrically Erasable PROM (EEPROM), electrically rewritable ROM (EAROM), or flash memory or a combination of two or more of these.
Bus 410 includes hardware, software, or both, coupling the components of device 400 to one another. By way of example, and not limitation, bus 410 may include an Accelerated Graphics Port (AGP) or other graphics bus, an Enhanced Industry Standard Architecture (EISA) bus, a Front Side Bus (FSB), a HyperTransport (HT) interconnect, an Industry Standard Architecture (ISA) bus, an InfiniBand interconnect, a Low Pin Count (LPC) bus, a memory bus, a Micro Channel Architecture (MCA) bus, a Peripheral Component Interconnect (PCI) bus, a PCI Express (PCIe) bus, a Serial Advanced Technology Attachment (SATA) bus, a Video Electronics Standards Association Local Bus (VLB), another suitable bus, or a combination of two or more of these. Bus 410 may include one or more buses 410, where appropriate. Although specific buses have been described and shown in the embodiments of the invention, any suitable bus or interconnect is contemplated by the invention.
When the touch device for point-set gestures is implemented by the apparatus 400 shown in Fig. 4, the input device 401 receives the touch information of finger contacts input in a display window on a screen. In a specific embodiment, the I/O interface connected to an output device may include hardware, software, or both, providing one or more interfaces for communication between the apparatus 400 and one or more I/O devices. The apparatus 400 may include one or more of these I/O devices, where appropriate, and one or more of them may allow communication between a person and the apparatus 400. By way of example, and not limitation, an I/O device may include a touch screen, a video camera, another suitable I/O device, or a combination of two or more of these, and may include one or more sensors. Embodiments of the present invention contemplate any suitable I/O devices and any suitable I/O interfaces for use with them. Where appropriate, the I/O interface may include one or more devices or software drivers enabling the processor 403 to drive one or more of these I/O devices, and may itself include one or more I/O interfaces. Although particular I/O interfaces are described and illustrated, embodiments of the present invention contemplate any suitable I/O interface. The processor 403, based on the computer-executable instructions stored in the memory 404, determines whether the number of finger contacts is greater than a preset number; if so, it performs a corresponding operation on the display window according to the touch information, and if not, it performs a corresponding operation on the content in the display window according to the touch information. The results of the operation are then displayed, as needed, via the output port 405 and the output device 406.
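A minimal sketch of that final decision, with an assumed preset number and hypothetical handler callbacks; it only illustrates the branch between window-level and content-level operations.

```python
PRESET_NUMBER = 2   # assumed value for illustration only

def dispatch(touch_info, contacts, operate_on_window, operate_on_content):
    """Branch taken by processor 403: window-level vs. content-level operation."""
    if len(contacts) > PRESET_NUMBER:
        operate_on_window(touch_info)     # e.g. move, resize, or route the display window
    else:
        operate_on_content(touch_info)    # e.g. write, annotate, or turn pages in the window
```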
According to some embodiments, a computer-readable storage medium is provided that may include instructions which, when executed on a computer, cause the computer to perform the touch method for point-set gestures described above.
In some examples, a computer program product containing instructions is provided that, when run on a computer, cause the computer to perform the touch method of point set gestures described above.
In some examples, a computer program is provided which, when run on a computer, causes the computer to perform the touch method of point set gestures described above.
In the above examples, the implementation may be realized in whole or in part by software, hardware, firmware, or any combination thereof. When software is used, the implementation may take the form, in whole or in part, of a computer program product comprising one or more computer instructions. When the computer instructions are loaded and executed on a computer, the processes or functions described in the embodiments of the invention are produced, in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable device. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another, for example from one website, computer, server, or data center to another website, computer, server, or data center via a wired (e.g., coaxial cable, optical fiber, digital subscriber line (DSL)) or wireless (e.g., infrared, radio, microwave) connection. The computer-readable storage medium may be any available medium that a computer can access, or a data storage device such as a server or data center integrating one or more available media. The available medium may be a magnetic medium (e.g., floppy disk, hard disk, magnetic tape), an optical medium (e.g., DVD), or a semiconductor medium (e.g., a solid state disk (SSD)), among others.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment of the present invention.
Some embodiments of the present invention further provide a multi-touch system. The system includes at least one computer, a server, and a multi-screen splicing processor; one screen is formed by splicing at least one sub-screen together via the multi-screen splicing processor, a display window of the at least one computer is displayed on the screen in a predetermined layout by the multi-screen splicing processor, and the screen is adapted to capture touch and display the display window. The server includes a memory and a processor; the memory is used for storing executable program code, and the processor is configured to read the executable program code stored in the memory to execute the multi-touch method, which is not described again here in view of the foregoing description of the related method.
Some embodiments of the present invention further provide a multi-touch system including at least one computer, a server, a multi-screen splicing processor, and a screen. The screen is formed by splicing at least one sub-screen together via the multi-screen splicing processor, which displays at least one display window of the at least one computer on the screen in a predetermined layout; the screen is further adapted to capture touch and display the display window. The server includes a memory and a processor; the memory is used for storing executable program code, and the processor is configured to read the executable program code stored in the memory to execute the multi-touch method, which is not described again here in view of the foregoing description of the related method.
In the touch system, the screen may include an infrared light curtain forming device and an infrared camera; the infrared light curtain forming device is configured to form an infrared light curtain on the surface of the screen, and the infrared camera is used for collecting touch input and sending it to the server.
In the above touch system, the screen may comprise a capacitive, resistive, infrared frame, or surface acoustic wave touch input and display screen.
As mentioned above, in the present application the screen is suitable for collecting touch input and displaying the display window. The screen may include an infrared-light-curtain touch screen, or various other touch screens whose touch input is captured by the corresponding sensors, such as vector-pressure-sensing, capacitive, resistive, infrared-frame, near-field-imaging, electromagnetic-induction, and surface-acoustic-wave touch screens; that is, the screen may capture touch input in one of the vector-pressure-sensing, capacitive, resistive, infrared-frame, near-field-imaging, electromagnetic-induction, and surface-acoustic-wave modes. The screen may likewise display the signal source by one of liquid crystal display, plasma display, organic light emitting diode display, vacuum fluorescent display, and projection display. Since these touch-capturing and display manners, and screens integrating them, are well known to those skilled in the art, they are not described here. In this application, for convenience of description, such a screen that can accept touch input and perform display is simply referred to as a screen; accordingly, those skilled in the art will understand that the screen of this application is intended to include the touch sensor (or collector) and the display unit that implement the touch input and the display. For example, the screen of this application covers the following:
the screen comprises a component for collecting touch input in one of a vector pressure sensing type, a capacitance type, a resistance type, an infrared frame type, a near field imaging type, an electromagnetic induction type and a surface acoustic wave type, and further comprises a liquid crystal display unit, a plasma display unit, an organic light emitting diode display unit, a vacuum fluorescent display unit or a projection display unit.
In some embodiments, the screen may include an infrared light curtain forming device and an infrared camera; the infrared light curtain forming device is used for forming an infrared light curtain on the surface of the screen, and the infrared camera is used for collecting touch input and sending it to a server.
In some embodiments, the infrared light curtain may be formed on the screen surface by an infrared laser or a laser array above the screen; when a finger or similar object acts on the screen carrying the light curtain, the user's touch input is captured by the infrared camera. For example, when a finger contact is input into a display window on the screen, the touch changes the light distribution of the infrared light curtain at the contact position: part of the infrared light at the contact is diffusely reflected off the screen by the touching finger and captured by an infrared camera located in front of the screen or sub-screen, or part of the infrared light at the contact passes through the screen owing to the touching finger and is captured by an infrared camera located behind the screen or sub-screen. The infrared camera then sends an infrared image including the touch input to the server as an electrical signal; the server processes and analyzes the received signal to obtain the touch information produced when the user touches the display window on any sub-screen, and the touch information may include position, track, and/or area information of the touch points.
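The patent does not prescribe how the server extracts touch points from the infrared image; one common approach is to threshold the bright spots produced where fingers disturb the light curtain and take the centroid of each blob, as sketched below with OpenCV. The threshold and minimum-area values are illustrative, and the mapping from camera coordinates to screen coordinates (e.g., via a calibration homography) is omitted.

```python
import cv2
import numpy as np

def touch_points_from_ir_frame(frame_gray: np.ndarray,
                               brightness_threshold: int = 200,
                               min_area: float = 15.0):
    """Return a list of (x, y) touch coordinates found in one grayscale IR frame."""
    _, binary = cv2.threshold(frame_gray, brightness_threshold, 255,
                              cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    points = []
    for c in contours:
        if cv2.contourArea(c) < min_area:
            continue                      # ignore small noise blobs
        m = cv2.moments(c)
        points.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))
    return points
```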
For ease of understanding, the multi-screen splicing processor has been introduced above to describe sending the content corresponding to a display window to the other sub-screens for display. Those skilled in the art will understand, however, that the functions of the multi-screen splicing processor can be integrated into the server, with the server's processor implementing them; the advantage of doing so is that the size of the system can be reduced.
While the invention has been described with reference to specific embodiments, the invention is not limited thereto, and various equivalent modifications and substitutions can be easily made by those skilled in the art within the technical scope of the invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (10)

1. A multi-touch method, wherein the method comprises:
receiving touch information from a screen;
judging the number of point sets of the touch points according to the touch information, and if the number of the point sets of the touch points is judged to be equal to 1, executing a first operation according to the touch information; and if the number of the contact points is judged to be more than or equal to 3 and the number of the point sets of the contact points is judged to be 2, executing a second operation according to the touch information.
2. The multi-touch method according to claim 1, wherein determining the number of touch point sets comprises: judging whether the distance between the contacts with the farthest distance is smaller than a first preset threshold value or not; if yes, judging that the number of the point sets of the contact points is 1; if not, judging that the number of the point sets of the contact points is 2.
3. The multi-touch method according to claim 1, wherein determining the number of touch point sets comprises: judging whether the distance between the contacts farthest away in the first direction is smaller than a first preset threshold value or not; if yes, judging that the number of the point sets of the contact points is 1; if not, judging that the number of the point sets of the contact points is 2.
4. The multi-contact touch method according to claim 2 or 3, wherein the first preset threshold is determined according to a distance between two finger tips that are farthest apart when a single palm is spread.
5. The multi-touch method according to claim 1, comprising: if it is determined from the touch information that the number of point sets of the touch points within a preset time interval is equal to 1, performing a first operation according to the touch information; and if it is determined from the touch information that the number of touch points within the preset time interval is greater than or equal to 3 and the number of point sets of the touch points is 2, performing a second operation according to the touch information.
6. The multi-touch method according to claim 1, wherein performing the first operation according to the touch information comprises: executing corresponding operation on the content in the display window on the screen according to the touch information; the executing the second operation according to the touch information includes: and executing corresponding operation on the display window on the screen according to the touch information.
7. The multi-touch method of claim 1, wherein the screen is formed by splicing at least one sub-screen via a multi-screen splicing processor, the touch information comes from a first sub-screen on the screen, and performing a second operation according to the touch information comprises:
controlling the multi-screen splicing processor to output the display window of the first sub-screen to other sub-screens for display according to the touch information; or controlling the multi-screen splicing processor to output the display window of the first sub-screen to the first sub-screen and other sub-screens for display according to the touch information; or
controlling the multi-screen splicing processor to restore display windows on other sub-screens to the first sub-screen for display, or to restore display windows distributed across other sub-screens to the first sub-screen for display, according to the touch information; or
controlling the multi-screen splicing processor to output pre-stored information associated with the display window of the first sub-screen to the first sub-screen for display according to the touch information; or
controlling the multi-screen splicing processor to output pre-stored information associated with the display window of the first sub-screen to other sub-screens for display according to the touch information.
8. The multi-touch method according to claim 2 or 3, comprising:
after the number of point sets of the contacts is judged to be 2, points whose distance in the first direction from the contact at the front end in that direction is within the first preset threshold are divided into a first point set, and points whose distance in the first direction from the contact at the rear end in that direction is within the first preset threshold are divided into a second point set.
9. The multi-touch method of claim 8, further comprising:
calculating the minimum distance, in the first direction, between the coordinates of the points in the first point set and the points in the second point set, judging whether this minimum distance between the two point sets is greater than a second preset threshold, and if so, executing a second operation according to the touch information; or,
calculating the distance between the two contacts, one in the first point set and one in the second point set, whose distance is smallest; and judging whether this smallest distance between the first point set and the second point set is greater than the second preset threshold, and if so, executing the second operation according to the touch information.
10. The multi-touch method of claim 8, comprising:
calculating the minimum distance, in the first direction, between the coordinates of the points in the first point set and the points in the second point set, judging whether this calculated minimum distance between the two point sets is greater than a second preset threshold, and if not, not executing a second operation; or,
calculating the distance between the two contacts, one in the first point set and one in the second point set, whose distance is smallest; and judging whether this smallest distance between the first point set and the second point set is greater than the second preset threshold, and if not, not executing a second operation.
CN201810150615.0A 2018-02-13 2018-02-13 Multiconductor touch control method, device, equipment and computer readable storage medium Pending CN110162257A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810150615.0A CN110162257A (en) 2018-02-13 2018-02-13 Multiconductor touch control method, device, equipment and computer readable storage medium

Publications (1)

Publication Number Publication Date
CN110162257A true CN110162257A (en) 2019-08-23

Family

ID=67635449

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810150615.0A Pending CN110162257A (en) 2018-02-13 2018-02-13 Multiconductor touch control method, device, equipment and computer readable storage medium

Country Status (1)

Country Link
CN (1) CN110162257A (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102855077A (en) * 2011-07-01 2013-01-02 宫润玉 Mode switching method for multifunctional touchpad
CN102768595A (en) * 2011-11-23 2012-11-07 联想(北京)有限公司 Method and device for identifying touch operation instructions on touch screen
CN103970360A (en) * 2013-01-30 2014-08-06 北京汇冠新技术股份有限公司 Multipoint-touch-based gesture identification method and system

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110045868A (en) * 2019-03-25 2019-07-23 深圳市德名利电子有限公司 Touch point modification method, touch device and electronic equipment based on algorithm of birdsing of the same feather flock together
CN110045868B (en) * 2019-03-25 2022-07-26 深圳市德明利技术股份有限公司 Touch point correction method based on clustering algorithm, touch device and electronic equipment
CN114253417A (en) * 2021-12-02 2022-03-29 Tcl华星光电技术有限公司 Multi-touch-point identification method and device, computer readable medium and electronic equipment
CN114253417B (en) * 2021-12-02 2024-02-02 Tcl华星光电技术有限公司 Multi-touch point identification method and device, computer readable medium and electronic equipment

Similar Documents

Publication Publication Date Title
US10761610B2 (en) Vehicle systems and methods for interaction detection
US9069386B2 (en) Gesture recognition device, method, program, and computer-readable medium upon which program is stored
US10203764B2 (en) Systems and methods for triggering actions based on touch-free gesture detection
US20150268789A1 (en) Method for preventing accidentally triggering edge swipe gesture and gesture triggering
US11188125B2 (en) Information processing apparatus, information processing meihod and program
US7880720B2 (en) Gesture recognition method and touch system incorporating the same
US9529527B2 (en) Information processing apparatus and control method, and recording medium
US20160004373A1 (en) Method for providing auxiliary information and touch control display apparatus using the same
US20140189579A1 (en) System and method for controlling zooming and/or scrolling
US9454260B2 (en) System and method for enabling multi-display input
US10521101B2 (en) Scroll mode for touch/pointing control
US9880721B2 (en) Information processing device, non-transitory computer-readable recording medium storing an information processing program, and information processing method
CN108733302B (en) Gesture triggering method
US10969827B2 (en) Electronic device and method for controlling user interface therein
US10402080B2 (en) Information processing apparatus recognizing instruction by touch input, control method thereof, and storage medium
US20180052598A1 (en) Multi-touch based drawing input method and apparatus
CN109101173B (en) Screen layout control method, device, equipment and computer readable storage medium
US20140152569A1 (en) Input device and electronic device
US20150205483A1 (en) Object operation system, recording medium recorded with object operation control program, and object operation control method
TWI499938B (en) Touch control system
EP2799970A1 (en) Touch screen panel display and touch key input system
CN110162257A (en) Multiconductor touch control method, device, equipment and computer readable storage medium
JP2017215842A (en) Electronic apparatus
KR20140086805A (en) Electronic apparatus, method for controlling the same and computer-readable recording medium
TWI408488B (en) Interactive projection system and system control method thereof

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication (Application publication date: 20190823)