CN115993894A - Touch response method and device, interaction panel and storage medium - Google Patents


Info

Publication number
CN115993894A
Authority
CN
China
Prior art keywords
touch
functional logic
point information
display screen
touch operation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111209550.0A
Other languages
Chinese (zh)
Inventor
林德熙
李少珺
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangzhou Shiyuan Electronics Thecnology Co Ltd
Guangzhou Shirui Electronics Co Ltd
Original Assignee
Guangzhou Shiyuan Electronics Thecnology Co Ltd
Guangzhou Shirui Electronics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangzhou Shiyuan Electronics Thecnology Co Ltd, Guangzhou Shirui Electronics Co Ltd filed Critical Guangzhou Shiyuan Electronics Thecnology Co Ltd
Priority to CN202111209550.0A priority Critical patent/CN115993894A/en
Priority to PCT/CN2022/120117 priority patent/WO2023065939A1/en
Publication of CN115993894A publication Critical patent/CN115993894A/en
Pending legal-status Critical Current

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 — Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 — Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 — Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041 — Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application discloses a touch response method and device, an interactive tablet, and a storage medium, where the touch response precision of the touch frame fitted to the interactive tablet reaches a set precision range. The method comprises: receiving a first touch operation, wherein the first touch operation is generated based on touch point information fed back by the touch frame when a user switches the touch object and brings the switched touch object into contact with the display screen; and executing first functional logic corresponding to the first touch operation, wherein the first functional logic is related to the current usage scenario. With this method, the tablet can actively distinguish whether the touch object currently acting on the display screen has been switched, and can sensitively and rapidly determine the touch operation that the switched object's contact with the display screen corresponds to in the current usage scenario, so that the touch operation is responded to effectively and touch response efficiency on the interactive tablet is improved.

Description

Touch response method and device, interaction panel and storage medium
Technical Field
The present disclosure relates to the field of touch technologies of electronic devices, and in particular, to a touch response method, a touch response device, an interaction panel, and a storage medium.
Background
An interactive tablet is an integrated device that controls the content shown on a display screen through touch technology and supports human-machine interaction. It combines the functions of a projector, electronic whiteboard, projection screen, sound system, television, and video-conference terminal; it is widely used in education and teaching, enterprise meetings, and commercial exhibition, and can effectively improve the communication environment and raise group communication efficiency.
At present, interaction between the interactive tablet and the user is realized mainly through touch: for example, a user can interact by touching the tablet's touch screen with a finger, or by touching it with a stylus. In some application scenarios, such as writing on an electronic whiteboard integrated into the tablet, a stylus is used as the touch medium during writing, while for element selection, dragging, or zooming the user switches to touching with a finger.
In practical applications, if the touch object in contact with the touch screen of the interactive tablet is switched, the tablet has difficulty determining which kind of touch object is currently in contact with the display screen. Since it cannot effectively distinguish what kind of object is touching the display screen, the tablet cannot respond effectively to the object's touch operation, natural interaction cannot be achieved, and the user experience of the interactive tablet suffers.
Disclosure of Invention
In view of this, the embodiments of the present application provide a touch response method and device, an interactive tablet, and a storage medium, so as to respond effectively to touch operations when the touch object in contact with the interactive tablet is switched.
In a first aspect, an embodiment of the present application provides a touch response method applied to an interactive tablet, where the touch response precision of the touch frame fitted to the tablet reaches a set precision range. The method includes:
receiving a first touch operation, wherein the first touch operation is generated based on touch point information fed back by the touch frame when a user switches the touch object and brings the switched touch object into contact with the display screen;
and executing first functional logic corresponding to the first touch operation, wherein the first functional logic is related to the current usage scenario.
Further, the method further comprises:
receiving at least two second touch operations, wherein the second touch operations are generated based on touch point information fed back by the touch frame for each touch object when the number of touch objects controlled by the user is greater than one;
and executing second functional logic corresponding to each second touch operation, wherein the second functional logic is related to the current usage scenario.
Further, the method further comprises:
receiving a third touch operation, wherein the third touch operation is generated based on touch point information fed back by the touch frame for the touch object when the touch object controlled by the user is in single-point contact with the display screen;
and executing third functional logic corresponding to the third touch operation, wherein the third functional logic is related to the current usage scenario.
Further, before executing the third functional logic corresponding to the third touch operation, the method further includes:
if a first execution object contained in the third functional logic also exists in fourth functional logic and the fourth functional logic is in an execution state, ignoring the third touch operation and canceling execution of the third functional logic;
wherein the fourth functional logic is executed when a fourth touch operation is received, the fourth touch operation being generated based on touch point information fed back by the touch frame when a touch object controlled by the user contacts the display screen.
Further, the method further comprises:
receiving a fifth touch operation, wherein the fifth touch operation is generated based on touch point information fed back by the touch frame when a touch object controlled by the user is in multi-point contact with the display screen;
and executing fifth functional logic corresponding to the fifth touch operation, wherein the fifth functional logic is related to the current usage scenario.
Further, before executing the fifth functional logic corresponding to the fifth touch operation, the method further includes:
if a second execution object contained in the fifth functional logic also exists in the fourth functional logic and the fourth functional logic is in an execution state, ignoring the fifth touch operation and canceling execution of the fifth functional logic.
Further, the method further comprises:
if the fourth touch operation is received while the fifth functional logic is being executed, and the fourth functional logic corresponding to the fourth touch operation contains a second execution object that is in the fifth functional logic, ignoring the fourth touch operation and canceling execution of the fourth functional logic.
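The arbitration rule described above — a newly received touch operation is ignored when its functional logic shares an execution object with functional logic that is already in an execution state — can be sketched in Python as follows. This is a minimal illustration only; the function name and the set-based representation of execution objects are assumptions, not the patent's implementation:

```python
def should_ignore(new_objects: set, executing: list) -> bool:
    """Return True when the new functional logic's execution objects
    intersect the execution objects held by any currently executing
    functional logic, in which case the new touch operation is ignored."""
    return any(new_objects & held for held in executing)

# While a drag holding element "e1" is executing, a single tap on "e1"
# is ignored, but an operation on a different element "e2" proceeds.
assert should_ignore({"e1"}, [{"e1"}]) is True
assert should_ignore({"e2"}, [{"e1"}]) is False
```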
Further, the controlled touch object is a first touch object or a second touch object,
where the first touch object is specifically a passive stylus,
and the second touch object is specifically a body part of the user, the body part including a finger, the palm, and the back of the hand.
Further, in the current usage scenario, the functional logic executed in response to a touch operation includes: handwriting, element selection, element dragging, and element zooming;
the elements include: lines, text, images, graphics, video, and the canvas.
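As an illustration of how a touch operation might be mapped to scenario-dependent functional logic of the kinds just listed, the following Python sketch uses a hypothetical dispatch table. The scene, medium, and gesture names are invented for the example; the patent does not prescribe this structure:

```python
# Hypothetical (scene, medium, gesture) -> functional-logic mapping.
DISPATCH = {
    ("whiteboard", "stylus", "move"):  "handwriting",
    ("whiteboard", "finger", "tap"):   "element_selection",
    ("whiteboard", "finger", "drag"):  "element_drag",
    ("whiteboard", "finger", "pinch"): "element_zoom",
}

def functional_logic(scene: str, medium: str, gesture: str) -> str:
    """Look up the functional logic for a touch operation; unknown
    combinations fall through to a no-op."""
    return DISPATCH.get((scene, medium, gesture), "no_op")

assert functional_logic("whiteboard", "stylus", "move") == "handwriting"
```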
Further, after the touch object contacts the display screen, the step of generating the touch operation based on the touch point information fed back by the touch frame includes:
acquiring the touch point information fed back through the touch frame, and extracting the height and width of the touch point included in the touch point information;
determining the medium attribute of the touch object from the width/height range in which the height and width of the touch point fall;
and determining the touch operation matching the touch object's contact behavior according to the medium attribute and the other information in the touch point information.
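A minimal Python sketch of the medium-classification step — determining the medium attribute from the width/height range of the contact — might look like the following. The thresholds are purely illustrative and would need calibration against a real touch frame:

```python
def classify_medium(width_mm: float, height_mm: float) -> str:
    """Infer the touch medium from the contact's width/height range.

    Illustrative thresholds only (not from the patent): a passive
    stylus tip yields a very small contact, a fingertip a medium
    one, and a palm or back of hand a large one.
    """
    size = max(width_mm, height_mm)
    if size < 5:
        return "stylus"
    if size < 25:
        return "finger"
    return "palm"

assert classify_medium(3, 4) == "stylus"
```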
Based on the above, acquiring the touch point information fed back through the touch frame includes:
identifying each touch signal through a hardware circuit in the touch frame, wherein a touch signal is generated as the touch object moves on the display screen;
and obtaining, through the human interface device (HID) standard protocol, the touch point information fed back by the touch frame for each touch signal,
wherein one piece of touch point information corresponds to one touch point, and the touch point information includes: the touch point coordinates, the touch point height and width, and the touch rotation angle.
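The touch point information just described could be represented roughly as follows. This is a hypothetical Python structure: the field names and the decoded-report format are assumptions for illustration, since real HID digitizer reports are binary and device-specific:

```python
from dataclasses import dataclass

@dataclass
class TouchPoint:
    """One piece of touch point information fed back by the touch frame."""
    contact_id: int  # distinguishes simultaneous touch points
    x: float         # touch point coordinates
    y: float
    width: float     # contact width reported by the frame
    height: float    # contact height reported by the frame
    angle: float     # touch rotation angle, in degrees

def parse_report(report: dict) -> TouchPoint:
    """Build a TouchPoint from one (hypothetical) decoded HID report."""
    return TouchPoint(
        contact_id=report["id"],
        x=report["x"], y=report["y"],
        width=report["w"], height=report["h"],
        angle=report.get("angle", 0.0),  # angle may be absent
    )

pt = parse_report({"id": 0, "x": 512.0, "y": 384.0, "w": 3.0, "h": 3.5})
```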
Further, after acquiring the touch point information fed back through the touch frame, the method further includes:
processing each piece of touch point information so that every piece has a unified unit format and data structure.
On this basis, processing each piece of touch point information includes:
converting the units of the data in the touch point information into a unified set unit format according to the acquired size information of the touch frame and the screen resolution information;
and recording the touch point information with a data structure corresponding to the set unit format.
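The unit-conversion step can be illustrated as follows: mapping raw touch frame coordinates into screen pixels using the frame's size information and the screen resolution. This sketch assumes simple linear scaling; real firmware may additionally apply calibration offsets:

```python
def to_pixels(raw, frame_size_raw, resolution):
    """Convert a raw touch frame coordinate pair into pixel coordinates.

    raw            -- (x, y) in the frame's native units
    frame_size_raw -- full-scale (width, height) of the frame's raw range
    resolution     -- screen resolution in pixels, e.g. (1920, 1080)
    """
    rx, ry = raw
    fw, fh = frame_size_raw
    pw, ph = resolution
    return (rx * pw / fw, ry * ph / fh)

# The midpoint of the raw range lands at the center of the screen.
assert to_pixels((16384, 16384), (32768, 32768), (1920, 1080)) == (960.0, 540.0)
```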
In a second aspect, an embodiment of the present application provides a touch response device configured on an interactive tablet, where the touch response precision of the touch frame fitted to the tablet reaches a set precision range. The device includes:
a receiving module for receiving a first touch operation, the first touch operation being generated based on touch point information fed back by the touch frame when a user switches the touch object and brings the switched touch object into contact with the display screen;
and an execution module for executing first functional logic corresponding to the first touch operation, the first functional logic being related to the current usage scenario.
In a third aspect, embodiments of the present application provide an interactive tablet, including:
the touch frame, whose touch response precision reaches a set precision range, for responding through a hardware circuit to the touch operations of a touch object;
the display screen, covered by the touch frame to form a touch screen, for displaying interactive content;
one or more processors;
a storage means for storing one or more programs;
the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method as provided in the first aspect of the present application.
In a fourth aspect, embodiments of the present application also provide a storage medium containing computer-executable instructions for performing the method of the first aspect when executed by a computer processor.
The touch response method and device, interactive tablet, and storage medium provided by the present application can be executed by an interactive tablet whose touch frame has a touch response precision reaching the set precision range. The method can receive a first touch operation and execute the first functional logic corresponding to it, where the first touch operation is generated based on touch point information fed back by the touch frame when the user switches the touch object and brings the switched object into contact with the display screen, and the first functional logic is related to the current usage scenario. With this technical solution, an interactive tablet whose hardware is fitted with a high-precision touch frame can actively distinguish whether the touch object currently acting on the display screen has been switched, and can sensitively and rapidly determine the touch operation that the switched object's contact corresponds to in the current usage scenario, so that the touch operation is responded to effectively. Compared with existing implementations, the method of this embodiment achieves active recognition of, and rapid, effective response to, touch object switching by the interactive tablet, and thereby improves touch response efficiency on the tablet.
Drawings
Other features, objects and advantages of the present application will become more apparent upon reading of the detailed description of non-limiting embodiments, made with reference to the following drawings, in which:
fig. 1 is a schematic flow chart of a touch response method according to an embodiment of the present application;
FIG. 1a is a schematic diagram illustrating an exemplary implementation of the first function logic in the touch response method according to the first embodiment of the present disclosure;
FIG. 1b is a schematic diagram illustrating another exemplary implementation of the first function logic in the touch response method according to the first embodiment of the present disclosure;
FIG. 1c is a diagram showing a response effect of the third function logic in the touch response method according to the first embodiment of the present disclosure;
fig. 2 is a block diagram of a touch response device according to a second embodiment of the present application;
fig. 3 is a schematic structural diagram of an interactive tablet according to a third embodiment of the present application.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the present application more apparent, the following detailed description of the embodiments of the present application will be given with reference to the accompanying drawings. It should be understood that the described embodiments are merely some, but not all, of the embodiments of the present application. All other embodiments, based on the embodiments herein, which would be apparent to one of ordinary skill in the art without making any inventive effort, are intended to be within the scope of the present application.
When the following description refers to the accompanying drawings, the same numbers in different drawings refer to the same or similar elements, unless otherwise indicated. The implementations described in the following exemplary examples are not representative of all implementations consistent with the present application. Rather, they are merely examples of apparatus and methods consistent with some aspects of the present application as detailed in the accompanying claims.
In the description of this application, it should be understood that the terms "first," "second," "third," and the like are used merely to distinguish between similar objects and are not necessarily used to describe a particular order or sequence, nor should they be construed to indicate or imply relative importance. The specific meaning of the terms in this application will be understood by those of ordinary skill in the art as the case may be. Furthermore, in the description of the present application, unless otherwise indicated, "a plurality" means two or more. "and/or", describes an association relationship of an association object, and indicates that there may be three relationships, for example, a and/or B, and may indicate: a exists alone, A and B exist together, and B exists alone. The character "/" generally indicates that the context-dependent object is an "or" relationship.
Example 1
Fig. 1 is a schematic flow chart of a touch response method according to an embodiment of the present application. The embodiment is applicable to giving a touch response to the contact of a touch object after the object contacts the display screen of an interactive tablet. The method may be performed by a touch response device, which may be implemented in software and/or hardware and may be configured in an interactive tablet, specifically in a processor of the tablet; the processor may be the host processor of its intelligent processing system. A touch frame whose touch response precision reaches the set precision range is arranged in the tablet, and the touch frame is electrically connected to the display screen.
In practical applications, the hardware of the interactive tablet consists of a display screen, an intelligent processing system, and the like, combined by integral structural members, and is supported by a dedicated software system.
The display screen may specifically be a light-emitting diode (LED) display screen, an organic light-emitting diode (OLED) display screen, a liquid crystal display (LCD) screen, or the like. Specifically, the display screen of the interactive tablet is a touch screen: an inductive liquid-crystal display device. When a graphic button on the screen is touched, the screen's touch feedback system drives the corresponding connected devices according to a pre-programmed program; it can replace a mechanical button panel and, together with the liquid crystal picture, produce vivid video and audio effects. By technical principle, touch screens fall into five basic categories: vector pressure-sensing, resistive, capacitive, infrared, and surface-acoustic-wave touch screens. By working principle and the medium used to transmit information, touch screens fall into four categories: resistive, capacitive, infrared, and surface acoustic wave.
The optical touch sensors forming the touch frame can scan a touch object on the surface of the display screen with optical signals. The touch object may be a finger, a stylus, or the like, and different touch objects can be regarded as different touch media for interacting with the tablet. In addition, to protect the display screen from being scratched by a touch object, a cover glass is arranged on its surface; in the embodiments of this specification, the surface of the display screen therefore refers to the surface of its cover glass.
Generally, the touch response of the interactive tablet can be regarded as the response that the tablet gives, at the software application level, after a touch object touches the display screen and triggers some interface on it, so that the various interactive applications are carried out. Specifically, one key link in the tablet's touch response is the touch operation generated when the touch frame reacts to a touch object contacting the display screen; the touch frame then passes the corresponding touch operation information to the intelligent processing system at the application level, which produces the touch response to the touch operation.
It should be noted that in conventional applications the interactive tablet is usually fitted with a non-high-precision touch frame whose touch response precision lies in a conventional range. When responding to the touch signal of a touch object, such a frame may be unable to identify the size of the object's contact area on the display screen, so in touch writing mode it is hard to judge what type of touch object the user is writing with, or which touch medium (finger or stylus) is touching; likewise, in touch erase mode it is hard to judge what type of object the user is erasing with, and erasing can only be completed in the conventional way. Moreover, a non-high-precision touch frame can hardly guarantee that the same type of touch object presents the same contact area throughout a touch.
Meanwhile, in the interaction of the tablet, after different touch objects contact the display screen, the touch response given by the intelligent processing system at the application level may differ between usage scenarios. When the tablet presents the interactive interface of an electronic whiteboard and the interface is touched with a stylus, the corresponding touch response can be taken to be the rendering of handwriting; when it is touched with a finger, the corresponding response can be taken to be the selection or dragging of some element.
In addition, while the tablet interacts with the user through touch responses, one situation that may arise is that the user needs to trigger different functional logic by changing the touch object used on the tablet: for example, dragging or zooming the displayed content with finger-based multi-touch, or writing with a stylus and then, when writing parameters need to be set mid-writing, switching to a finger to make the selections for those settings.
Therefore, for existing interactive tablets integrating touch frames of conventional touch response precision, when the user switches the touch object during use, the tablet cannot actively learn that the controlled touch object has been switched — that is, it cannot actively distinguish whether the touch object is a hand or a pen — and instead needs the outside world to send it a switching signal to inform it that the touch object has changed. The existing ways for the tablet to recognize a switch of the touch object in contact with the display screen mainly include the following:
1) Active selection by the user, triggered mainly through interface buttons. For example, when the user wants to touch the tablet by hand, they first click the hand-touch button presented on the interface to enter hand-touch mode; the tablet then responds to hand-touch operations and executes functional logic such as element selection, zooming, and dragging. Likewise, when the user wants to touch with a pen, they must click the pen-touch button on the interface to enter pen-touch mode, after which the tablet responds to pen-touch operations and executes functional logic such as handwriting. The drawback of this actively triggered mode is mainly that when the user needs to switch touch objects they must switch back and forth manually, which interrupts the original interaction and prevents natural interaction.
2) Use of a special pen, such as an active pen with built-in electronics. When the active pen contacts the screen, the pen sends a signal to the intelligent processing system layer, so the tablet knows that the current touch object has been switched to the active pen. The drawbacks of this mode are that an active pen with electronics raises costs and such a special pen is hard to maintain and manage; in addition, when switching to the active pen, the signal it sends to the intelligent processing system layer may be delayed or lost, so the tablet cannot sense in time that the touch object has changed.
Based on the above, the touch response method provided by this embodiment is applied to an interactive tablet integrating a touch frame whose touch response precision reaches the set precision range; such a touch frame can feed finer touch information back to the intelligent processing system at the application level. Based on that finer touch information, the intelligent processing system can actively recognize whether the touch object in contact with the display screen has changed, and, using the touch response method provided by the embodiments of this application, give an accurate touch response to the user's touch interaction with the tablet.
As shown in fig. 1, the touch response method provided in the first embodiment specifically includes the following operations:
s101, receiving a first touch operation, wherein the first touch operation is generated based on touch point information fed back by the touch frame when a user switches a touch object and controls the touch object to be contacted with the display screen after switching.
In this embodiment, the first touch operation can be considered to be generated when the user touches the display screen of the tablet with a touch object different from the one used before, so as to interact with the tablet; that is, compared with the touch object the user previously used, the touch object currently touching the display screen has been switched. Here the interactive tablet can be regarded as the execution subject of the method provided by this embodiment.
In this embodiment, the touch object may specifically be a user's finger, a stylus, or the like, and the user can bring the touch object into contact with the surface of the display screen integrated in the tablet. The contact behavior performed with a touch object may include sliding across the surface of the display screen, pressing the display screen, dragging an interface shown on the display screen, and so on. These contact behaviors are determined through joint analysis by the execution subject's related hardware and software, which establishes, from the data associated with the contact behavior, what operation the user actually performed on the display screen; through this step, the touch operation triggered by the user is received.
The data associated with the touch object's contact with the display screen can be determined by the touch frame on the tablet and fed back to the tablet's application level. In this embodiment, the way the execution subject analyzes and determines that the user has performed the first touch operation on the screen can be described as follows: the execution subject is fitted with a touch frame whose touch response precision reaches the set precision range, so it can respond to the touch signal generated when the user brings the touch object into contact with the display screen, obtain the touch point information generated by the contact, and feed it back to the upper intelligent processing system. The touch point information includes at least the touch width and touch height of the object's contact with the display screen, from which the intelligent processing system can determine the object's contact area on the display screen; if that area differs from the contact area of the previous touch object, the system can determine that the touch object currently in contact with the display screen has been switched, and can determine the object's specific medium attribute from the range in which the contact area falls.
The intelligent processing system can further determine, by analyzing the other data in the touch point information fed back by the touch frame, what contact behavior the user performed with the touch object, such as sliding, clicking, or dragging. Finally, by combining the determined medium attribute with the specific contact behavior, the execution subject can determine that the user performed the first touch operation on the tablet, so the first touch operation can be generated and, through this step, received.
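The switch-detection idea described here — comparing the medium inferred from the current contact area with that of the previous contact — can be sketched as follows. This is illustrative only; the class and method names are assumptions, not the patent's implementation:

```python
class SwitchDetector:
    """Detect whether the touch object acting on the screen has been
    switched by comparing the medium inferred from successive contacts."""

    def __init__(self):
        self.last_medium = None  # no previous contact yet

    def on_contact(self, medium: str) -> bool:
        """Return True when the inferred medium differs from the previous
        one, i.e. the contact should yield a 'first touch operation'."""
        switched = self.last_medium is not None and medium != self.last_medium
        self.last_medium = medium
        return switched

d = SwitchDetector()
d.on_contact("stylus")          # first contact: nothing to compare against
assert d.on_contact("finger")   # medium changed -> touch object switched
```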
Regarding the first touch operation received in this step, one key aspect of its generation is that the interactive tablet adopts a high-precision touch frame with high touch response precision. The touch frame adopted in this embodiment can provide finer touch information to the upper application layer, for example the touch area of the touch object, more accurate touch point coordinates, and the rotation angle of the touch object during the touch. The other key aspect is that the analysis of the touch point information fed back by the touch frame is performed mainly by the intelligent processing system.
Specifically, the intelligent processing system may comprise a host processor, which is a processor of the interactive tablet; software built into the host processor can implement different functional applications and, through the display screen, present vivid video and audio effects. The host processor is a computing module with relatively high performance.
For example, the host processor may be an Android module, i.e. a module on which an Android system is installed, configured with components such as a CPU (central processing unit), a GPU (graphics processing unit), a RAM (random access memory) and a ROM (read-only memory). For example, for an Android 7.0 version, the CPU may be a dual-core A72 plus quad-core A53, the GPU a Mali-T860, the RAM 4 GB, and the ROM 32 GB.
For another example, the host processor may be a PC (personal computer) module configured with components such as a CPU, a GPU, a memory and a hard disk. For example, for a plug-in Intel Core series modular computer, the CPU may be an Intel Core i5/i7, the GPU the integrated Intel HD Graphics, the memory DDR4 8 GB/16 GB, and the hard disk 128 GB/256 GB.
Further, the touch object controlled by the user in this embodiment is a first touch object or a second touch object, where the first touch object is specifically a passive stylus, and the second touch object is specifically a limb part of the user, including a finger, the palm and the back of the hand.
In this embodiment, from the user's point of view, the user effectively performs a switch of the touch object and then controls the switched touch object to contact the display screen, thereby generating the first touch operation at the interactive tablet level. The touch object controlled by the user may be a first touch object or a second touch object, and the switch may be regarded as switching from the first touch object to the second touch object, or from the second touch object to the first touch object; the first and second touch objects are touch objects with different medium attributes. In this embodiment, the first touch object is preferably a passive stylus, i.e. a stylus that does not actively send signals to the interactive tablet, and the second touch object is preferably a limb part of the user, mainly a part of the hand such as a finger, the palm or the back of the hand.
S102, executing first functional logic corresponding to the first touch operation, wherein the first functional logic is related to the current use scene.
It can be seen that in some usage scenarios, once the touch object is switched, the same contact behavior with the display screen represents different execution actions at the application level. For example, in a document-editing scene on an electronic whiteboard, when the controlled touch object is a stylus and the contact behavior with the display screen is a sliding behavior, the execution action represented in the editing scene may be writing on the electronic whiteboard so as to present writing handwriting; when the touch object controlled in the same editing scene is a finger and its contact behavior with the display screen is likewise a sliding behavior, the represented execution action may be drag roaming of the electronic whiteboard.
Therefore, after the execution body determines that the touch object controlled by the user has been switched and generates the first touch operation based on the analysis of the contact behavior of the touch object on the display screen, it can determine that the first touch operation may correspond to different functional logic in different usage scenarios.
In this embodiment, this step may specifically determine, by analyzing the current usage scenario, what touch response should be given to the first touch operation, so as to execute the first functional logic matching that response. Functional logic may be understood as the execution logic by which the interactive tablet implements a certain application function at the application layer; in this embodiment, the functional logic corresponding to the first touch operation is denoted as the first functional logic.
In this embodiment, because the interactive operations executable by the interactive tablet at the application level are diverse, the interactive tablet can set, in advance and separately for different application software, application modes or application scenarios, touch responses with different functional logic for the different touch behaviors generated when the user controls a touch object to touch the display screen. Thus, once the application software is running, or a particular application mode or scenario has been entered, the functional logic of any received touch operation can be matched within that usage scenario.
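The pre-configured correspondence described above can be sketched as a simple lookup keyed by usage scenario, medium attribute and contact behavior. The scene, attribute and behavior names below are illustrative assumptions, not part of the disclosure.

```python
# Sketch of a pre-configured mapping from (usage scenario, medium
# attribute, contact behavior) to the functional logic to execute.
# All key and value names are illustrative assumptions.

FUNCTION_TABLE = {
    ("whiteboard", "stylus", "slide"): "write_handwriting",
    ("whiteboard", "finger", "slide"): "drag_canvas",
}

def match_functional_logic(scene, medium, behavior):
    # Return the functional logic registered for this combination,
    # or None when the scenario defines no response for it.
    return FUNCTION_TABLE.get((scene, medium, behavior))

assert match_functional_logic("whiteboard", "stylus", "slide") == "write_handwriting"
assert match_functional_logic("whiteboard", "finger", "slide") == "drag_canvas"
```

In practice each entry would reference an executable handler rather than a string, but the table lookup captures how a received touch operation is matched to its functional logic in the current scenario.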
Taking teaching through the interactive tablet as an example, one scenario is lecturing: the lecturer displays an edited teaching document to give the lesson, and may meanwhile trigger the start of an electronic whiteboard (commonly called a small blackboard) with a fixed area. In this whiteboard usage scenario, writing is performed with a stylus, handwriting attributes are selected by hand, and operations such as dragging and scaling are performed on the current display area of the electronic whiteboard. This usage scenario involves, first, the switching of touch objects during interaction, while at the same time different touch objects correspond to different functional logic.
With the method provided by this embodiment, after the touch object is switched, the first touch operation generated when the switched touch object contacts the display screen with a certain contact behavior can be received through step S101, and step S102 then determines the functional logic that the first touch operation specifically corresponds to in the usage scenario, such as element editing on the electronic whiteboard, and executes it, thereby realizing the touch response to the first touch operation.
By way of example, one execution of the first functional logic may be described as follows. Assume that the first touch operation is that, after the touch object is switched from a finger to a stylus, the user controls the stylus to slide on the display screen, and assume that the current usage scenario of the first touch operation is writing on an electronic whiteboard. The functional logic corresponding to the first touch operation in this scenario can then be directly matched as writing on the electronic whiteboard, and the operation specifically executed in S102 is to present handwriting according to the set writing presentation strategy as the stylus slides on the display screen.
Fig. 1a is a schematic diagram of an exemplary execution of the first functional logic in the touch response method according to the first embodiment of the present application. As shown in Fig. 1a, the line 11 in the figure may be regarded as the writing handwriting presented, after the user switches the touch object from a finger to a stylus, by the sliding of the stylus on the electronic whiteboard interface displayed on the interactive tablet.
Following the above description, another execution of the first functional logic may be described as follows. Assume that the first touch operation is that, after the touch object is switched from a stylus to a finger, the user controls the finger to slide on the display screen, and assume again that the current usage scenario is writing on the electronic whiteboard. The functional logic corresponding to the first touch operation in this scenario can then be directly matched as moving the display area of the electronic whiteboard. The operation specifically executed in S102 is to drag the display area of the electronic whiteboard as the finger slides on the display screen.
Fig. 1b is a schematic diagram of another exemplary execution of the first functional logic in the touch response method according to the first embodiment of the present application. Taking the position of line 11 on the electronic whiteboard in Fig. 1a, at the center of the canvas, as the reference position, the position of line 11 in Fig. 1b is closer to the upper left corner of the canvas, as shown in Fig. 1b. That is, after the user switches the touch object from the stylus to a finger, the canvas containing line 11 moves on the electronic whiteboard in response to the sliding of the finger.
Further, it can be seen that as the application software started on the interactive tablet or the function being executed differs, the usage scenario corresponding to the current (first) touch operation differs, and so does the functional logic executed in response to it. The elements involved in the execution of the functional logic may be, but are not limited to, lines, characters, images, graphics, video, the canvas, and so on.
In the touch response method provided by this embodiment, the execution body is first equipped, in its hardware structure, with a high-precision touch frame, so it can autonomously distinguish whether the touch object currently acting on the display screen has been switched, and can sensitively and rapidly determine the touch operation corresponding, in the current usage scenario, to the contact of the switched touch object with the display screen, thereby effectively responding to that touch operation. Compared with existing implementations, the method of this embodiment achieves active recognition of touch object switching by the interactive tablet and a rapid, effective response to it, thereby improving the touch response efficiency of the interactive tablet.
It should be noted that the key point of the method provided in this embodiment is that the touch response precision of the touch frame equipped on the adopted interactive tablet reaches the set precision range, so that the interactive tablet can autonomously recognize the switching of the touch object, avoiding the interruption of the original interactive operation and the high recognition cost incurred by existing methods for recognizing a switched touch object.
Correspondingly, on the basis that the interactive tablet can autonomously recognize whether the touch object has been switched and directly respond to the corresponding touch operation after the switch, the interactive tablet can also realize the following: it can recognize all touch objects in contact with the display screen (whose medium attributes may be the same or different) and respond concurrently to the touch operations associated with each of them. For example, while controlling a first touch object in contact with the display screen, the user additionally controls a second touch object to contact the display screen; the interactive tablet can recognize each touch object in contact and give a touch response to the touch operation triggered by each. For another example, while controlling one first touch object in contact with the display screen, the user controls a second and even a third first touch object to contact the display screen; the interactive tablet can recognize each first touch object in contact and respond to the touch operation each of them triggers.
In the above implementation, the interaction between the interactive tablet and the user may present some new problems, for example conflicts between the functional logic of several touch operations performed simultaneously. For instance, the interactive tablet presents writing handwriting in response to a touch operation of a first touch object; in response to a touch operation of a second touch object, it is then asked to adjust the stroke thickness of the handwriting being presented. If the interactive tablet cannot adjust the stroke thickness of handwriting that is still being written, the adjustment cannot be responded to normally, which amounts to a failed touch response.
Based on this, the present embodiment introduces, through the following alternative embodiments, the execution of each step of the improved touch response method.
Specifically, as a first alternative embodiment of the first embodiment, the following steps are further provided on the basis of the above embodiment:
a1, receiving at least two second touch operations, where the second touch operations are generated, when the number of touch objects controlled by the user is greater than one, based on the touch point information fed back by the touch frame for each touch object respectively.
The case corresponding to the second touch operations may be described as follows: the user controls several touch objects with the same medium attribute, such as several first touch objects, to contact the display screen, i.e. each first touch object produces a contact behavior with the display screen, and these contact behaviors are mutually independent. The intelligent processing system of the interactive tablet can then determine, based on the touch point information fed back by the touch frame for each first touch object, what contact behavior each first touch object performs on the display screen, and generate a corresponding second touch operation for each. This step receives each of these second touch operations.
b1, executing second functional logic corresponding to each second touch operation, wherein the second functional logic is related to the current use scene.
After each second touch operation is received, the second functional logic corresponding to each of them can be executed in the known current usage scenario. For example, assume the first touch object is a stylus; the user in effect controls several styluses to contact the display screen, and analysis determines that the contact behavior of each stylus is sliding on the electronic whiteboard. The second functional logic corresponding to each generated second touch operation in this scenario may then be regarded as handwriting writing, so this first alternative embodiment responds to the second touch operation of each stylus by executing the handwriting writing logic.
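Concurrent response to several independent contacts can be sketched by keying each stroke on a per-contact identifier. The touch-id field is assumed to be part of the touch point information fed back by the touch frame; this is a sketch under that assumption, not a definitive implementation.

```python
# Sketch of concurrent handwriting from several styluses: each contact
# is tracked by its touch id, so the strokes remain independent even
# when their touch point reports arrive interleaved.

strokes = {}

def on_touch_point(touch_id, x, y):
    # Append the point to the stroke belonging to this touch id,
    # creating a new stroke the first time the id is seen.
    strokes.setdefault(touch_id, []).append((x, y))

# Two styluses writing at the same time, events interleaved:
on_touch_point(1, 0, 0)
on_touch_point(2, 100, 100)
on_touch_point(1, 1, 1)
on_touch_point(2, 101, 101)
assert strokes[1] == [(0, 0), (1, 1)]
assert strokes[2] == [(100, 100), (101, 101)]
```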
It can be seen that this first alternative embodiment mainly describes the case where several touch objects with the same medium attribute are in contact and the touch behaviors exhibited by the controlled touch objects do not affect one another. As an extension of this first alternative embodiment, when the controlled touch objects have different medium attributes but their touch behaviors on the display screen still do not affect one another, the steps of the first alternative embodiment may likewise be adopted to realize the touch response.
As a second alternative embodiment of the first embodiment, the following steps are further provided on the basis of the first embodiment:
a2, receiving a third touch operation, where the third touch operation is generated based on the touch point information fed back by the touch frame for the touch object when the touch object controlled by the user makes single-point contact with the display screen.
When the touch object controlled by the user is a hand or a stylus, the touch object can make single-point contact with the display screen. Single-point contact means that at any one time the touch object has only one point of contact with the display screen, for example the nib of a stylus, or a single finger touching the display screen. In this case the touch frame feeds back the corresponding touch point information, and the intelligent processing system determines the corresponding contact behavior by analyzing that information, thereby generating the third touch operation.
b2, executing third functional logic corresponding to the third touch operation, wherein the third functional logic is related to the current use scene.
For example, still taking content editing on the electronic whiteboard as an example: the functional logic corresponding to a single-point sliding operation of the stylus is handwriting writing, while for a single-point sliding operation of a finger whose sliding track forms a closed area, the corresponding functional logic is the selection of the element content on the canvas.
Further, on the basis of the second alternative embodiment, the third alternative embodiment further comprises the following steps:
before executing the third functional logic corresponding to each third touch operation, if the first execution object contained in the third functional logic exists in a fourth functional logic and the fourth functional logic is in an execution state, ignoring the response to the third touch operation and canceling the execution of the third functional logic.
The fourth function logic is executed when a fourth touch operation is received, and the fourth touch operation is generated based on touch point information fed back by the touch frame when a touch object controlled by a user is contacted with the display screen.
This third alternative embodiment in effect provides an optimized processing mode of the interactive tablet in the implementation of touch response. According to the second alternative embodiment, the third touch operation is generated when a stylus or a finger makes single-point contact on the display screen; one of the third functional logics matched by the third touch operation in the usage scenario is the selection of element content to be edited, where that element content is the execution object of the third functional logic.
If, when the interactive tablet is about to respond by executing the third functional logic, it is detected that some functional logic is already in an execution state and that the execution object(s) of that logic (denoted here the first execution object, which may be one or several) are also execution objects of the third functional logic, then, since one execution object cannot respond to two execution actions at the same time, the execution objects of the third and fourth functional logic are considered to conflict, and through the operation of this third alternative embodiment the execution of the third functional logic is first ignored.
For example, assume the fourth touch operation is that the user controls the stylus to slide on the display screen, the corresponding fourth functional logic being the presentation of written handwriting on the electronic whiteboard, which continues executing until the user finishes the sliding operation of the stylus. If, while the fourth functional logic is in the execution state, i.e. while the interactive tablet is still presenting the handwriting, a third touch operation is received whose analyzed third functional logic is to select the handwriting being written, then through this third alternative embodiment the selection operation on that handwriting can be ignored.
Fig. 1c is a diagram of the response effect of the third functional logic in the touch response method according to the first embodiment. As shown in Fig. 1c, assume the first line 12 presented on the canvas is handwriting written by a stylus and is still in a continuing writing state, and assume the presented second line 13 is the sliding track of a finger on the display screen whose execution object is the selection of the first line 12. Since the first line 12 is still in the writing state, this alternative embodiment ignores the third touch operation that forms the second line 13 and cancels the execution of the third functional logic of selecting the first line 12.
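The conflict rule of this third alternative embodiment can be sketched as a set of busy execution objects: a new functional logic is refused while any of its execution objects is held by logic still in the execution state. The function and object names are illustrative assumptions.

```python
# Sketch of the execution-object conflict rule: one execution object
# cannot respond to two execution actions at the same time, so a new
# functional logic is ignored while any of its objects is busy.

busy_objects = set()

def try_execute(logic_name, execution_objects):
    # Refuse the logic if any of its execution objects is already
    # held by a functional logic that is still executing.
    if busy_objects & set(execution_objects):
        return False          # response ignored, execution cancelled
    busy_objects.update(execution_objects)
    return True

# Fourth functional logic: the stylus is still writing line 12.
assert try_execute("write_line_12", {"line_12"})
# Third functional logic: a finger tries to select line 12 mid-write.
assert not try_execute("select_line_12", {"line_12"})
```

A complete implementation would release the objects when the owning logic finishes (e.g. when the stylus lifts); the sketch shows only the acquire-and-refuse decision.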
On the basis of the third alternative embodiment, this fourth alternative embodiment further provides the following steps:
a3, receiving a fifth touch operation, where the fifth touch operation is generated based on the touch point information fed back by the touch frame when the touch object controlled by the user makes multi-point contact with the display screen.
This fourth alternative embodiment mainly concerns the case of multi-point contact. Generally, multi-point contact with the display screen occurs when the touch object operated by the user is a hand, for example when at least two fingers contact the display screen at the same time. Multi-point contact means that at the same time the touch object produces several points of contact with the display screen, such as several fingers of one hand. In this case the touch frame feeds back touch point information corresponding to each contact point, and the intelligent processing system determines, by analyzing that information, the touch behavior corresponding to the several points together, thereby generating the fifth touch operation.
b3, executing fifth functional logic corresponding to the fifth touch operation, wherein the fifth functional logic is related to the current use scene.
For example, still taking content editing on the electronic whiteboard as an example, the functional logic corresponding to a multi-point sliding operation of several fingers may be dragging the canvas when the fingers slide in the same direction, or scaling the canvas when the fingers slide in opposite directions.
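The same-direction versus opposite-direction distinction can be sketched with the dot product of the two fingers' displacement vectors; thresholding at zero and the gesture names are illustrative assumptions.

```python
# Sketch of distinguishing a same-direction two-finger drag from a
# pinch/zoom via the dot product of the fingers' movement vectors:
# a positive dot product means the fingers move roughly together.

def classify_two_finger_gesture(move_a, move_b):
    # move_a, move_b: (dx, dy) displacement of each finger.
    dot = move_a[0] * move_b[0] + move_a[1] * move_b[1]
    return "drag_canvas" if dot > 0 else "zoom_canvas"

assert classify_two_finger_gesture((5, 0), (6, 1)) == "drag_canvas"
assert classify_two_finger_gesture((5, 0), (-4, 0)) == "zoom_canvas"
```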
On the basis of the fourth alternative embodiment, this fifth alternative embodiment further provides the following step:
before executing the fifth functional logic corresponding to the fifth touch operation, if a second execution object contained in the fifth functional logic exists in the fourth functional logic and the fourth functional logic is in an execution state, ignoring a response to the fifth touch operation and canceling the execution of the fifth functional logic.
This fifth alternative embodiment likewise provides an optimized processing mode of the interactive tablet in the implementation of touch response. According to the third alternative embodiment, the fourth touch operation is triggered when the user controls a touch object to perform a certain contact behavior with the display screen, and under normal response conditions the fourth functional logic corresponding to it can be responded to.
If this alternative embodiment also detects, when the interactive tablet is about to respond by executing the fifth functional logic, that some functional logic is currently already in an execution state, and that the execution object(s) of that logic (here the second execution object, which may be one or several) also serve as execution objects of the fifth functional logic, then, again because one execution object cannot respond to two execution actions at the same time, the execution objects of the fourth and fifth functional logic are considered to conflict, so through the operation of this fifth alternative embodiment the execution of the fifth functional logic is first ignored.
In an exemplary usage scenario of content editing on the electronic whiteboard, suppose the fourth functional logic is handwriting writing and presentation through touch control of the stylus, and the fifth touch operation is fingers sliding in the same direction at multiple points on the canvas, the corresponding fifth functional logic being dragging the canvas. Since the canvas to be dragged contains the handwriting, the fifth functional logic may be regarded as containing an execution object (the handwriting) that is in the execution state in the fourth functional logic; the interactive tablet therefore cannot normally respond to the dragging of the canvas, and through this fifth alternative embodiment the fifth touch operation can be ignored and the execution of its corresponding fifth functional logic cancelled.
On the basis of the foregoing embodiment, a sixth alternative embodiment provides another implementation: specifically, if the fourth touch operation is received during the execution of the fifth functional logic, and the fourth functional logic corresponding to the fourth touch operation contains the second execution object in the fifth functional logic, the response to the fourth touch operation is ignored and the execution of the fourth functional logic is cancelled.
It should be noted that this sixth alternative embodiment corresponds to the converse of the interaction scenario mentioned in the fifth alternative embodiment: when the interactive tablet finds no response conflict after receiving the fifth touch operation, the corresponding fifth functional logic is executed normally. If, during the execution of the fifth functional logic, the fourth touch operation is received, and the fourth functional logic corresponding to it contains an execution object in the fifth functional logic currently being executed, the response to the fourth touch operation can likewise be ignored.
By way of example, suppose the fifth functional logic is determined to be dragging the canvas, and the canvas contains handwriting. If a fourth touch operation for continuing to write the handwriting is received during the drag, the handwriting is in an execution state as the canvas is dragged and cannot directly respond to the fourth functional logic at that moment; the response to the fourth touch operation can therefore be ignored and the execution logic of writing the handwriting cancelled.
The above alternative embodiments of this embodiment describe various possible interaction scenarios encountered by the interactive tablet at the visual level and provide matching execution strategies.
It is understood that any functional logic at the visual level of the interactive tablet is executed with the support of underlying algorithms. This embodiment therefore further gives a technical description of the implementation of the visual-level functional logic from the perspective of the bottom layer.
Specifically, on the basis of the foregoing embodiment or any one of the foregoing alternative embodiments, the seventh alternative embodiment further optimizes the step of generating a touch operation based on the touch point information fed back by the touch frame after the touch object contacts the display screen. For example, this embodiment optimizes the generation step of the touch operation as follows:
a4, acquiring the touch point information fed back through the touch frame, and extracting the touch point height and width included in the touch point information.
Regarding the contact of the touch object with the display screen and the information feedback: from the description above in the embodiments of the present application, it can be seen that the interactive tablet is further configured with a touch frame combined with the display screen, where the touch frame may specifically be a frame formed by optical touch sensors nested at the edge of the display screen. In this step, the touch frame generates a touch signal through its optical touch sensors when the touch object moves on the display screen, and identifies the corresponding touch point information by responding to that touch signal.
In this embodiment, the process of responding to a touch object through the touch frame provided on the interactive tablet may be described as follows. First, one or more optical touch sensors are arranged on the edges of the display screen of the interactive tablet, forming the touch frame. The contact between the touch object controlled by the user and the display screen is acquired by the touch frame: during the start-up and running of the interactive tablet, the processor starts the optical touch sensors, which scan optical signals across the surface of the display screen and detect, from the transmission of those signals, whether a touch object appears on the surface; when a touch object is detected, a touch signal for the contact of the touch object with the display screen is generated in real time. The touch frame then responds to the generated touch signal and feeds the identified touch point data back to the upper layer of the interactive tablet (such as the host processor in the intelligent processing system); this touch point data is recorded as the touch point information in this embodiment.
Given that the touch response precision of the touch frame arranged on the interactive tablet reaches the set precision range with respect to the touch object, the touch frame used in this embodiment may be regarded as a high-precision touch frame. The touch point information fed back through it in the above step is superior, in accuracy and level of detail, to that fed back by a conventional touch frame. The touch point information fed back by the touch frame to the upper layer of the interactive tablet includes at least the touch point coordinates, the touch point height and width and the touch area of the touch signal generated by the touch object, as well as the touch rotation information corresponding to the touch object when it rotates; through this step the interactive tablet extracts the touch point height and width included in the touch point information.
Step b4: determine the medium attribute of the touch object from the width-height range in which the touch point height and width fall.
This embodiment may pre-establish an association between touch point height and width and the medium attribute of the touch object, built from width-height ranges and medium attribute types. For example, under the association created in this embodiment, when the touch point height and width fall within a range of 1-3 pixels, the touch object is taken to be a stylus with a thin nib, such as a ballpoint-style pen; when they fall within a range of 3-15 pixels, the touch object is taken to be a stylus with a thick nib, such as a writing brush; and when they exceed 15 pixels, the medium attribute of the touch object is taken to be a body part, such as a finger.
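A minimal sketch of such an association, using the example pixel ranges above (the function name and the choice of taking the larger of the two dimensions are assumptions):

```python
def classify_touch_medium(width_px, height_px):
    """Classify the touch object from the touch point width and height.

    Thresholds (in pixels) follow the example ranges in the text:
    1-3 px -> thin-nib stylus, 3-15 px -> thick-nib stylus, >15 px -> body part.
    """
    size = max(width_px, height_px)
    if size <= 3:
        return "thin_stylus"   # e.g. a ballpoint-style pen
    if size <= 15:
        return "thick_stylus"  # e.g. a writing-brush-style pen
    return "body_part"         # e.g. a finger

print(classify_touch_medium(2, 2))    # thin_stylus
print(classify_touch_medium(20, 18))  # body_part
```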
In actual operation, when this step is executed, the medium attribute of the touch object currently operated by the user can be determined directly from the acquired touch point height and width, combined with the preset association.
Step c4: determine the touch operation matching the contact behavior of the touch object according to the medium attribute and the other information in the touch point information.
Through the preceding steps, the medium attribute of the touch object can be determined, identifying which touch object the user is currently using; this step then determines which touch operation the user has actually triggered. Specifically, from the other information in the touch point information — whether the touch point is moving, in which direction, at what speed, and with what contact pressure — the interactive tablet can accurately determine the contact behavior the user is currently performing. Once the touch object and its contact behavior are known, the touch operation they represent in the current usage scenario can be determined, the touch operation is generated, and the relevant module of this embodiment receives it.
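A hypothetical sketch of this decision — the operation names and rules are illustrative, since the actual mapping depends on the current usage scenario:

```python
def resolve_touch_operation(medium, moving, point_count):
    """Map a medium attribute plus contact behaviour to a touch operation.

    Illustrative rules only: a stylus that moves writes handwriting, a
    multi-point body contact scales an element, a moving single-point
    body contact drags an element, and a stationary one selects it.
    """
    if medium in ("thin_stylus", "thick_stylus"):
        return "handwriting" if moving else "pen_down"
    if point_count > 1:
        return "element_scaling"
    return "element_dragging" if moving else "element_selection"

print(resolve_touch_operation("thin_stylus", True, 1))  # handwriting
print(resolve_touch_operation("body_part", False, 2))   # element_scaling
```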
On the basis of the seventh optional embodiment, the eighth optional embodiment further includes, after obtaining the touch point information fed back through the touch frame:
and processing each piece of touch point information so that each piece of touch point information has a unified unit format and data structure.
It should be noted that, in this embodiment, the presentation of handwriting is mainly performed by the intelligent processing system on the upper layer of the interactive tablet — specifically, the host processor may perform the operation — while the touch point information required for presenting the handwriting is mainly fed back by the touch frame in the hardware layer of the interactive tablet. The touch point information fed back by the touch frame can therefore be regarded as the input required by the upper layer.
As for the touch frame configured on the interactive tablet, frames from different manufacturers have different execution parameters, which is likely to produce differences in the representation of the touch information they feed back and to interfere with the normal execution of the touch response method. To ensure that the data remains uniform throughout the handwriting presentation flow, this optional embodiment adds the information processing operation on the basis of the first embodiment.
By way of example, this optional embodiment may analyze the production and batch information of the touch frame, determine the original information format of the touch point information it feeds back, and then process the unit format and data structure of that information, so as to ensure that the data input to the upper layer of the interactive tablet has a uniform format. The processed touch point information is stripped of unit formats tied to the frame's manufacturer or batch: for instance, in the original information format the touch area is essentially reported in touch width and height units derived from the number of optical sensors blocked on the frame, and this optional embodiment converts those into abstract units unified in software, such as pixel units.
This optional embodiment may then further optimize the processing of each piece of touch point information as:
according to the acquired size information of the touch frame and screen resolution information, converting units of various data information in the touch point information into a unified set unit format;
and recording the touch point information by adopting a data structure corresponding to the set unit format.
In a specific implementation of this optional embodiment, in order to obtain relatively accurate data from the touch frame, the size of the touch frame currently fitted to the interactive tablet, the screen resolution of the display screen, and similar parameters must be known; this information may be obtained through hardware communication with the touch frame or read from the intelligent processing system, respectively.
As for the specific processing of the touch point information, this optional embodiment may uniformly convert the data identified by the touch frame in its original information format — such as the touch point coordinates, the touch point height and width, or the vertices of the geometric figure formed during the touch — into relatively abstract software-layer values, such as coordinate points and width or height values expressed in pixels.
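A sketch of this unit conversion under the assumption of a linear mapping from touch-frame sensor units to screen pixels (the field names and dict structure are illustrative); the rotation angle is likewise normalized here, to radians:

```python
import math

def frame_units_to_pixels(value, frame_size_units, screen_resolution_px):
    """Linearly rescale a value reported in touch-frame units to pixels."""
    return value / frame_size_units * screen_resolution_px

def normalize_point(raw, frame_w, frame_h, res_w, res_h):
    """Record one touch point in a unified, frame-independent structure."""
    return {
        "x_px": frame_units_to_pixels(raw["x"], frame_w, res_w),
        "y_px": frame_units_to_pixels(raw["y"], frame_h, res_h),
        "width_px": frame_units_to_pixels(raw["w"], frame_w, res_w),
        "height_px": frame_units_to_pixels(raw["h"], frame_h, res_h),
        "rotation_rad": math.radians(raw.get("angle_deg", 0.0)),
    }

# A frame spanning 100x100 sensor units over a 1920x1080 screen:
point = normalize_point({"x": 50, "y": 50, "w": 2, "h": 2, "angle_deg": 180},
                        100, 100, 1920, 1080)
print(point["x_px"], point["y_px"])  # 960.0 540.0
```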
Similarly, another advantage of the high-precision touch frame is that it can capture a rotation of the touch object during contact and determine the rotation angle; with the processing described in this optional embodiment, the initially obtained rotation angle can then be expressed in a uniform radian unit.
The above optional embodiments provide technical support for the touch response method of the first embodiment of the present application from the bottom-layer perspective, and additionally optimize the processing of the touch point information fed back by the touch frame.
Example two
Fig. 2 is a block diagram of a touch response device according to the second embodiment of the present application. The device is integrated in an interactive tablet, and the touch response precision of the touch frame equipped in the interactive tablet is within the set precision range. The device specifically comprises the following modules:
the first receiving module 21 is configured to receive a first touch operation, where the first touch operation is generated based on touch point information fed back by the touch frame when a user switches a touch object and controls the switched touch object to contact with the display screen;
the first execution module 22 is configured to execute a first function logic corresponding to the first touch operation, where the first function logic is related to a current usage scenario.
The touch response device provided in the second embodiment is integrated in an interactive tablet whose hardware is configured with a high-precision touch frame. It can actively distinguish whether the touch object currently acting on the display screen has been switched and, when the switched object contacts the screen, sensitively and rapidly determine the corresponding touch operation under the current usage scenario, so that the touch operation is responded to effectively. Compared with existing implementations, the device realizes active identification of touch object switching and a rapid, effective response by the interactive tablet, thereby improving touch response efficiency on the interactive tablet.
Further, the apparatus may further include:
the second receiving module is used for receiving at least two second touch operations, and when the number of the touch objects controlled by the user is greater than 1, the second touch operations are generated based on touch point information fed back by the touch frame relative to the touch objects respectively;
and the second execution module is used for executing second functional logic corresponding to each second touch operation, and the second functional logic is related to the current use scene.
Further, the apparatus may further include:
The third receiving module is used for receiving a third touch operation, and the third touch operation is generated based on touch point information fed back by the touch frame relative to the touch object when the touch object controlled by the user is in single-point contact with the display screen;
and the third execution module is used for executing a third functional logic corresponding to the third touch operation, and the third functional logic is related to the current use scene.
Further, the apparatus may further include:
the fourth execution module is used for, before the third execution module executes the third functional logic corresponding to the third touch operation, ignoring the response to the third touch operation and canceling the execution of the third functional logic when a first execution object contained in the third functional logic exists in the fourth functional logic and the fourth functional logic is in an execution state;
the fourth function logic is executed when a fourth receiving module included in the device receives a fourth touch operation, and the fourth touch operation is generated based on touch point information fed back by the touch frame when a touch object controlled by a user is contacted with the display screen.
Further, the apparatus may further include:
The fifth receiving module is used for receiving a fifth touch operation, and the fifth touch operation is generated based on touch point information fed back by the touch frame when a touch object controlled by a user is in multi-point contact with the display screen;
and the fifth execution module is used for executing fifth functional logic corresponding to the fifth touch operation, and the fifth functional logic is related to the current use scene.
Further, the apparatus may further include:
and the sixth execution module is used for ignoring the response to the fifth touch operation and canceling the execution of the fifth functional logic when a second execution object contained in the fifth functional logic exists in the fourth functional logic and the fourth functional logic is in an execution state before the fifth execution module executes the fifth functional logic corresponding to the fifth touch operation.
Further, the apparatus may further include:
and the seventh execution module is used for ignoring the response to the fourth touch operation and canceling the execution of the fourth functional logic when the fourth touch operation is received by the fourth receiving module in the process of executing the fifth functional logic and the fourth functional logic corresponding to the fourth touch operation contains the second execution object in the fifth functional logic.
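The conflict rule running through the fourth-through-seventh execution modules can be sketched as follows (the `objects` and `executing` fields are assumed names for a functional logic's execution objects and execution state):

```python
def should_ignore_new_logic(new_logic, running_logic):
    """Return True when a newly triggered functional logic must be ignored.

    Rule from the text: if an execution object of the new logic already
    belongs to a functional logic that is currently in an execution state,
    the new touch operation is ignored and its logic is not executed.
    """
    if not running_logic.get("executing"):
        return False
    return bool(set(new_logic["objects"]) & set(running_logic["objects"]))

# Handwriting (the fourth functional logic) already running on element "e1"
# blocks a scaling gesture (the fifth functional logic) on the same element:
running = {"name": "handwriting", "objects": ["e1"], "executing": True}
incoming = {"name": "scaling", "objects": ["e1"]}
print(should_ignore_new_logic(incoming, running))  # True
```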
Further, the manipulated touch object is a first touch object or a second touch object,
the first touch object is specifically a passive stylus;
the second touch object is specifically a body part of the user, and the body part comprises a finger, a palm and a back of a hand.
Further, in the current usage scenario, the functional logic executed in response to the touch operation includes: handwriting writing, element selection, element dragging movement, and element scaling;
the elements include: lines, text, images, graphics, video, and canvas.
Further, after the touch object is contacted with the display screen, an operation generating module included in the device may specifically include:
an information acquisition unit for acquiring touch point information fed back through the touch frame;
an information extraction unit for extracting a touch point height and width included in the touch point information;
the information determining unit is used for determining the medium attribute of the touch object through the wide height range where the height and the width of the touch point are located;
and the operation generating unit is used for determining touch operation matched with the contact behavior of the touch object according to the medium attribute and other information in the touch point information.
Further, the information acquisition unit may specifically be configured to:
identifying each touch signal through a hardware circuit in the touch frame, wherein the touch signal is generated when the touch object moves on the display screen;
obtaining touch point information fed back by the touch frame for each touch signal through a human-computer interaction HID standard protocol,
wherein one touch point information corresponds to one touch point, and the touch point information includes: touch point coordinates, touch point height and width, and touch rotation angle.
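A minimal record type for this fed-back touch point information might look like the following (the field names are assumptions; the actual HID report layout is device-specific and not specified by the source):

```python
from dataclasses import dataclass

@dataclass
class TouchPoint:
    """One touch point as fed back by the touch frame over the HID protocol."""
    x: float         # touch point coordinates
    y: float
    width: float     # touch point width
    height: float    # touch point height
    rotation: float  # touch rotation angle

pt = TouchPoint(x=120.0, y=340.0, width=2.0, height=2.0, rotation=0.0)
print(pt.width)  # 2.0
```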
Further, the operation generating module included in the apparatus may further include:
and the information processing unit is used for processing each piece of touch point information after the touch point information fed back by the touch frame is obtained by the information obtaining unit so that each piece of touch point information has a unified unit format and data structure.
Further, the information processing unit may specifically be configured to:
according to the acquired size information of the touch frame and screen resolution information, converting units of various data information in the touch point information into a unified set unit format;
and recording the touch point information by adopting a data structure corresponding to the set unit format.
Example III
Fig. 3 is a schematic structural diagram of an interactive tablet according to the third embodiment of the present application. The interactive tablet includes: a processor 40, a memory 41, a display 42, an input device 43, an output device 44, and a touch frame 45. The number of processors 40 in the interactive tablet may be one or more; one processor 40 is taken as an example in fig. 3. The number of memories 41 in the interactive tablet may be one or more; one memory 41 is taken as an example in fig. 3. The processor 40, memory 41, display 42, input device 43, output device 44, and touch frame 45 of the interactive tablet may be connected by a bus or other means; a bus connection is taken as an example in fig. 3.
The memory 41, as a computer-readable storage medium, is used for storing software programs, computer-executable programs, and modules, such as the program instructions/modules corresponding to the interactive tablet of any embodiment of the present application (e.g., the first receiving module 21 and the first execution module 22 in the touch response device). The memory 41 may mainly include a program storage area and a data storage area, wherein the program storage area may store an operating system and at least one application program required for functions, and the data storage area may store data created according to the use of the device, etc. In addition, the memory 41 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage device. In some examples, the memory 41 may further include memory located remotely from the processor 40, which may be connected to the device via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The display 42 is covered by the touch frame 45 (the covering relationship is not shown in fig. 3); together they may form a touch screen for displaying interactive content. In general, the display 42 displays data according to the instructions of the processor 40, and the touch screen also receives touch operations acting on it and sends corresponding signals to the processor 40 or other devices.
The input device 43 may be used to receive input digital or character information and to generate key signal inputs related to user settings and function control of the display device; it may also include a camera for capturing images and a sound pick-up for capturing audio data. The output device 44 may include audio apparatus such as a speaker. The specific composition of the input device 43 and the output device 44 may be set according to the actual situation.
The touch frame 45 has a touch response precision within the set precision range and is used for responding, through a hardware circuit, to the touch operation of a touch object.
The processor 40 executes the various functional applications of the device and performs data processing — that is, implements the touch response method described above — by running the software programs, instructions, and modules stored in the memory 41.
The interactive tablet provided above can be used to execute the touch response method provided by any embodiment, and has the corresponding functions and beneficial effects.
Example IV
A fourth embodiment of the present application also provides a storage medium containing computer-executable instructions, which when executed by a computer processor, are configured to perform a touch response method, comprising:
receiving a first touch operation, wherein the first touch operation is generated based on touch point information fed back by the touch frame when a user switches a touch object and controls the touch object to be contacted with the display screen after switching;
and executing first functional logic corresponding to the first touch operation, wherein the first functional logic is related to the current use scene.
Of course, the storage medium containing the computer executable instructions provided in the embodiments of the present application is not limited to the operation of the touch response method described above, but may also perform the related operations in the touch response method provided in any embodiment of the present invention, and has corresponding functions and beneficial effects.
From the above description of the embodiments, it will be clear to those skilled in the art that the present application may be implemented by software plus necessary general-purpose hardware, or by hardware alone, although in many cases the former is the preferred embodiment. Based on this understanding, the technical solution of the present application, or the part of it contributing to the prior art, may be embodied in the form of a software product stored in a computer-readable storage medium — such as a floppy disk, read-only memory (ROM), random access memory (RAM), flash memory (FLASH), hard disk, or optical disk — and including several instructions for causing an interactive tablet (which may be a robot, a personal computer, a server, or a network device, etc.) to perform the touch response method according to any embodiment of the present application.
It should be noted that, in the above touch response device, the units and modules included are divided only according to functional logic but are not limited to the above division, so long as the corresponding functions can be implemented; in addition, the specific names of the functional units are only for distinguishing them from each other and are not used to limit the protection scope of the present application.
It is to be understood that portions of the present application may be implemented in hardware, software, firmware, or a combination thereof. In the above embodiments, the various steps or methods may be implemented in software or firmware stored in a memory and executed by a suitable instruction execution system. If implemented in hardware, as in another embodiment, they may be implemented using any one or a combination of the following techniques well known in the art: discrete logic circuits having logic gates for implementing logic functions on data signals, application-specific integrated circuits having suitable combinational logic gates, programmable gate arrays (PGA), field-programmable gate arrays (FPGA), and the like.
Note that the above is only a preferred embodiment of the present application and the technical principle applied. Those skilled in the art will appreciate that the present application is not limited to the particular embodiments described herein, but is capable of numerous obvious changes, rearrangements and substitutions as will now become apparent to those skilled in the art without departing from the scope of the present application. Therefore, while the present application has been described in connection with the above embodiments, the present application is not limited to the above embodiments, but may include many other equivalent embodiments without departing from the spirit of the present application, the scope of which is defined by the scope of the appended claims.

Claims (16)

1. A touch response method, characterized in that the method is applied to an interactive tablet, the touch response precision of a touch frame equipped on the interactive tablet reaches a set precision range, and the method comprises:
receiving a first touch operation, wherein the first touch operation is generated based on touch point information fed back by the touch frame when a user switches a touch object and controls the touch object to be contacted with the display screen after switching;
and executing first functional logic corresponding to the first touch operation, wherein the first functional logic is related to the current use scene.
2. The method as recited in claim 1, further comprising:
receiving at least two second touch operations, wherein the second touch operations are generated based on touch point information fed back by the touch frame relative to each touch object respectively when the number of the touch objects controlled by a user is greater than 1;
and executing second functional logic corresponding to each second touch operation, wherein the second functional logic is related to the current use scene.
3. The method as recited in claim 1, further comprising:
receiving a third touch operation, wherein the third touch operation is generated based on touch point information fed back by the touch frame relative to the touch object when the touch object controlled by a user is in single-point contact with the display screen;
And executing third functional logic corresponding to the third touch operation, wherein the third functional logic is related to the current use scene.
4. The method of claim 3, further comprising, prior to executing the third functional logic corresponding to each of the third touch operations:
if the first execution object contained in the third functional logic exists in the fourth functional logic and the fourth functional logic is in an execution state, ignoring the response to the third touch operation and canceling the execution of the third functional logic;
the fourth function logic is executed when a fourth touch operation is received, and the fourth touch operation is generated based on touch point information fed back by the touch frame when a touch object controlled by a user is contacted with the display screen.
5. The method as recited in claim 4, further comprising:
receiving a fifth touch operation, wherein the fifth touch operation is generated based on touch point information fed back by the touch frame when a touch object controlled by a user is in multi-point contact with the display screen;
and executing fifth functional logic corresponding to the fifth touch operation, wherein the fifth functional logic is related to the current use scene.
6. The method of claim 5, further comprising, prior to executing the fifth functional logic corresponding to the fifth touch operation:
and if the second execution object contained in the fifth functional logic exists in the fourth functional logic and the fourth functional logic is in an execution state, ignoring the response to the fifth touch operation and canceling the execution of the fifth functional logic.
7. The method as recited in claim 5, further comprising:
and if the fourth touch operation is received in the process of executing the fifth functional logic and the fourth functional logic corresponding to the fourth touch operation contains a second execution object in the fifth functional logic, ignoring the response to the fourth touch operation and canceling the execution of the fourth functional logic.
8. The method of any of claims 2-7, wherein the manipulated touch object is a first touch object or a second touch object,
the first touch object is specifically a passive stylus;
the second touch object is specifically a body part of the user, and the body part comprises a finger, a palm and a back of a hand.
9. The method of any of claims 1-7, wherein in a current usage scenario, the functional logic performed in response to the touch operation comprises: handwriting writing, element selection, element dragging movement, and element scaling;
the elements include: lines, text, images, graphics, video, and canvas.
10. The method according to any one of claims 1 to 7, wherein the step of generating the touch operation based on the touch point information fed back by the touch frame after the touch object is in contact with the display screen includes:
acquiring touch point information fed back through the touch frame, and extracting the height and width of the touch point included in the touch point information;
determining the medium attribute of the touch object through the wide height range where the height and the width of the touch point are positioned;
and determining touch operation matched with the contact behavior of the touch object according to the medium attribute and other information in the touch point information.
11. The method of claim 10, wherein the obtaining touch point information fed back through the touch frame comprises:
identifying each touch signal through a hardware circuit in the touch frame, wherein the touch signal is generated when the touch object moves on the display screen;
Obtaining touch point information fed back by the touch frame for each touch signal through a human-computer interaction HID standard protocol,
wherein one touch point information corresponds to one touch point, and the touch point information includes: touch point coordinates, touch point height and width, and touch rotation angle.
12. The method of claim 11, further comprising, after obtaining touch point information fed back through the touch frame:
and processing each piece of touch point information so that each piece of touch point information has a unified unit format and data structure.
13. The method of claim 12, wherein said processing each of said touch point information comprises:
according to the acquired size information of the touch frame and screen resolution information, converting units of various data information in the touch point information into a unified set unit format;
and recording the touch point information by adopting a data structure corresponding to the set unit format.
14. A touch response device, characterized in that it is configured on an interactive tablet, and the response accuracy of a touch frame equipped on the interactive tablet reaches a set accuracy range, the device comprising:
The first receiving module is used for receiving a first touch operation, and the first touch operation is generated based on touch point information fed back by the touch frame when a user switches a touch object and controls the touch object to be contacted with the display screen after switching;
and the first execution module is used for executing first functional logic corresponding to the first touch operation, and the first functional logic is related to the current use scene.
15. An interactive tablet, comprising:
the touch frame is provided with touch response precision reaching a set precision range and is used for responding to the touch operation of a touch object through a hardware circuit;
the display screen is covered with the touch frame to form a touch screen for displaying interactive contents;
one or more processors;
a storage means for storing one or more programs;
when executed by the one or more processors, causes the one or more processors to implement the method of any of claims 1-13.
16. A storage medium containing computer executable instructions which, when executed by a computer processor, are for performing the method of any of claims 1-13.
CN202111209550.0A 2021-10-18 2021-10-18 Touch response method and device, interaction panel and storage medium Pending CN115993894A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202111209550.0A CN115993894A (en) 2021-10-18 2021-10-18 Touch response method and device, interaction panel and storage medium
PCT/CN2022/120117 WO2023065939A1 (en) 2021-10-18 2022-09-21 Touch response method and apparatus, interactive panel, and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111209550.0A CN115993894A (en) 2021-10-18 2021-10-18 Touch response method and device, interaction panel and storage medium

Publications (1)

Publication Number Publication Date
CN115993894A true CN115993894A (en) 2023-04-21

Family

ID=85992662

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111209550.0A Pending CN115993894A (en) 2021-10-18 2021-10-18 Touch response method and device, interaction panel and storage medium

Country Status (2)

Country Link
CN (1) CN115993894A (en)
WO (1) WO2023065939A1 (en)

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TW200844813A (en) * 2007-05-15 2008-11-16 Htc Corp Delay judgment systems and methods, and machine readable medium and related electronic devices thereof
US8514187B2 (en) * 2009-09-30 2013-08-20 Motorola Mobility Llc Methods and apparatus for distinguishing between touch system manipulators
CN103324430B (en) * 2012-03-19 2017-06-30 宏达国际电子股份有限公司 The objects operating method and device of many fingers
CN105760019B (en) * 2016-02-22 2019-04-09 广州视睿电子科技有限公司 Touch operation method and system based on interactive electronic whiteboard
CN109947300A (en) * 2019-03-18 2019-06-28 深圳市康冠商用科技有限公司 A kind of method, apparatus and medium for adjusting infrared touch-control machine and writing color
CN113126835A (en) * 2021-04-30 2021-07-16 北京钛方科技有限责任公司 Detection method, touch device and storage medium

Also Published As

Publication number Publication date
WO2023065939A1 (en) 2023-04-27

Similar Documents

Publication Publication Date Title
US8836649B2 (en) Information processing apparatus, information processing method, and program
CN103729055B (en) Multi-display equipment, input pen, more display control methods and multidisplay system
AU2014208041B2 (en) Portable terminal and method for providing haptic effect to input unit
CN110058782B (en) Touch operation method and system based on interactive electronic whiteboard
CN101278251B (en) Interactive large scale touch surface system
JP6233314B2 (en) Information processing apparatus, information processing method, and computer-readable recording medium
CN111475097B (en) Handwriting selection method and device, computer equipment and storage medium
CN108733296B (en) Method, device and equipment for erasing handwriting
CN104011629A (en) Enhanced target selection for a touch-based input enabled user interface
CN104040470A (en) Proximity-aware multi-touch tabletop
JP2001265523A (en) Information input/output system, information input/ output method and program storage medium
CN106502667B (en) Rendering method and device
CN104199552A (en) Multi-screen display method, device and system
CN106716493A (en) Method of styling content and touch screen device for styling content
CN114690853B (en) Interaction method and interaction panel
CN105247463B (en) The painting canvas environment of enhancing
CN114327064A (en) Plotting method, system, equipment and storage medium based on gesture control
CN106325726B (en) Touch interaction method
CN114115637A (en) Display device and electronic drawing board optimization method
CN113515228A (en) Virtual scale display method and related equipment
WO2021068405A1 (en) Element transfer method, apparatus and device, and storage medium
CN110333780A (en) Function triggering method, device, equipment and storage medium
CN110688190A (en) Control method and device of intelligent interactive panel
CN115993894A (en) Touch response method and device, interaction panel and storage medium
CN113485590A (en) Touch operation method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination