CN114690977B - Interactive calling method and device based on elastic waves - Google Patents

Interactive calling method and device based on elastic waves

Info

Publication number
CN114690977B
CN114690977B CN202110436252.9A CN202110436252A
Authority
CN
China
Prior art keywords
touch
windows
screen
elastic wave
split
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110436252.9A
Other languages
Chinese (zh)
Other versions
CN114690977A (en)
Inventor
陈玉香
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangzhou Chuangzhi Technology Co ltd
Original Assignee
Guangzhou Chuangzhi Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangzhou Chuangzhi Technology Co ltd
Priority to CN202110436252.9A
Publication of CN114690977A
Application granted
Publication of CN114690977B
Legal status: Active
Anticipated expiration

Classifications

    • G06F3/0481 — Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object, e.g. windows or icons
    • G06F3/0414 — Digitisers characterised by the transducing means, using force sensing means to determine a position
    • G06F3/04162 — Control or interface arrangements specially adapted for digitisers, for exchanging data with external devices, e.g. smart pens, via the digitiser sensing hardware
    • G06F3/0421 — Digitisers using opto-electronic means, by interrupting or reflecting a light beam, e.g. optical touch-screen
    • G06F3/0485 — Scrolling or panning
    • G06F3/04883 — Interaction techniques using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text
    • G06F9/451 — Execution arrangements for user interfaces
    • G06F9/4418 — Suspend and resume; Hibernate and awake

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The embodiments of the present application disclose an elastic wave-based interaction invocation method and device. In the technical scheme provided by the embodiments, a touch sensing signal corresponding to a touch operation performed on a screen is acquired, where the touch sensing signal includes a touch position signal and an elastic wave signal, and the object type corresponding to each touch object performing the touch operation is determined from the elastic wave signal. When multiple object types exist and the combination of the different object types, or the corresponding touch position signals, satisfies a preset split-screen instruction condition, it is further determined whether multiple windows exist in the current system environment; when they do, several target windows are determined from among them for split-screen display. In a single interaction scene, split-screen operation can thus be triggered by touching the screen with objects of different types, providing the user with richer interaction modes and an optimized user experience.

Description

Interactive calling method and device based on elastic waves
Technical Field
The embodiments of the present application relate to the technical field of electronic device control, and in particular to an elastic wave-based interaction invocation method and device.
Background
With continuous technological progress, many electronic devices with handwriting input systems have come onto the market, such as mobile phones, e-book readers, tablet computers and other intelligent terminals with handwriting functions. To enrich the handwriting applications of such intelligent terminals, electronic whiteboard applications have emerged: an electronic whiteboard displays input characters or pictures according to the user's handwriting on the screen.
In typical usage scenarios of existing interactive tablets, interactive operations such as writing, selecting, erasing and dragging can be performed on the tablet with tools such as fingers or styluses. The interactive tablet generally recognizes the contact area, position, movement speed and other information of the touch object by infrared, capacitive or similar means, and implements the corresponding interactive operation accordingly.
Because different touch objects operating on the tablet produce the same interaction, only one interaction mode can be realized in a given scene, and the user experience is poor.
Disclosure of Invention
The embodiments of the present application provide an elastic wave-based interaction invocation method and device, which provide the user with richer interaction modes and optimize the user experience.
In a first aspect, an embodiment of the present application provides an elastic wave-based interaction invocation method, including:
acquiring a touch sensing signal corresponding to a touch operation performed on a screen, where the touch sensing signal includes a touch position signal and an elastic wave signal;
determining, according to the elastic wave signal, the object type corresponding to the touch object touching the screen;
when multiple different object types exist and the combination of the different object types, or the corresponding touch position signals, satisfies a preset split-screen instruction condition, determining whether multiple windows exist in the current system environment;
if multiple windows exist, determining several target windows from among them and displaying the target windows in split screen.
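To make the claimed flow concrete, the four steps above can be sketched in Python. This is a minimal illustration, not the patent's implementation: the names `classify`, `split_condition` and `open_windows`, and the choice of the first two windows as targets, are all assumptions.

```python
# Hypothetical sketch of the four claimed steps. The classify and
# split_condition callables stand in for the trained model and the
# preset split-screen instruction condition, which the patent does
# not specify in detail.

def handle_touch_event(touch_events, classify, split_condition, open_windows):
    """touch_events: list of (position_signal, elastic_wave_signal) pairs."""
    # Steps 1-2: determine the object type from each elastic wave signal.
    object_types = [classify(wave) for _, wave in touch_events]
    positions = [pos for pos, _ in touch_events]

    # Step 3: multiple distinct object types whose combination (or whose
    # touch positions) satisfies the preset split-screen condition?
    if len(set(object_types)) > 1 and split_condition(object_types, positions):
        # Step 4: split-screen only if several windows are open.
        if len(open_windows) > 1:
            targets = open_windows[:2]  # illustrative target selection
            return ("split_screen", targets)
    return ("no_action", [])
```

A usage example with a finger and a stylus touching at the same time would pass two events whose wave signals classify to different types, triggering the split-screen branch.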
In a second aspect, an embodiment of the present application provides an elastic wave-based interaction invocation device, including a touch detection module, a type detection module, a window detection module and a split-screen display module, where:
the touch detection module is configured to acquire a touch sensing signal corresponding to a touch operation performed on a screen, the touch sensing signal including a touch position signal and an elastic wave signal;
the type detection module is configured to determine, according to the elastic wave signal, the object type corresponding to the touch object touching the screen;
the window detection module is configured to determine whether multiple windows exist in the current system environment when multiple different object types exist and the combination of the different object types, or the corresponding touch position signals, satisfies a preset split-screen instruction condition;
and the split-screen display module is configured to determine several target windows from among the windows when multiple windows exist, and to display the target windows in split screen.
In a third aspect, an embodiment of the present application provides an interactive tablet, including: a memory and one or more processors;
the memory is used for storing one or more programs;
the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the elastic wave based interaction invocation method as described in the first aspect.
In a fourth aspect, embodiments of the present application provide a storage medium containing computer-executable instructions which, when executed by a computer processor, perform the elastic wave-based interaction invocation method of the first aspect.
According to the embodiments of the present application, a touch sensing signal corresponding to a touch operation performed on the screen is acquired, where the touch sensing signal includes a touch position signal and an elastic wave signal, and the object type corresponding to each touch object performing the touch operation is determined from the elastic wave signal. When multiple object types exist and the combination of the different object types, or the corresponding touch position signals, satisfies the preset split-screen instruction condition, it is further determined whether multiple windows exist in the current system environment; when they do, several target windows are determined from among them for split-screen display. In a single interaction scene, split-screen operation can thus be realized by touching the screen with objects of different types, providing the user with richer interaction modes and optimizing the user experience.
Drawings
FIG. 1 is a flowchart of an elastic wave-based interaction invocation method provided by an embodiment of the present application;
FIG. 2 is a first interface schematic of an interactive tablet screen provided by an embodiment of the present application;
FIG. 3 is a first operation schematic of a touch operation performed on an interactive tablet according to an embodiment of the present application;
FIG. 4 is a first display-effect schematic of an interactive tablet screen provided by an embodiment of the present application;
FIG. 5 is a flowchart of another elastic wave-based interaction invocation method provided by an embodiment of the present application;
FIG. 6 is a second interface schematic of an interactive tablet screen provided by an embodiment of the present application;
FIG. 7 is a second operation schematic of a touch operation performed on an interactive tablet according to an embodiment of the present application;
FIG. 8 is a second display-effect schematic of an interactive tablet screen provided by an embodiment of the present application;
FIG. 9 is a flowchart of another elastic wave-based interaction invocation method provided by an embodiment of the present application;
FIG. 10 is a third interface schematic of an interactive tablet screen provided by an embodiment of the present application;
FIG. 11 is a third operation schematic of a touch operation performed on an interactive tablet according to an embodiment of the present application;
FIG. 12 is a third display-effect schematic of an interactive tablet screen provided by an embodiment of the present application;
FIG. 13 is a flowchart of another elastic wave-based interaction invocation method provided by an embodiment of the present application;
FIG. 14 is a fourth interface schematic of an interactive tablet screen provided by an embodiment of the present application;
FIG. 15 is a fourth operation schematic of a touch operation performed on an interactive tablet according to an embodiment of the present application;
FIG. 16 is a fourth display-effect schematic of an interactive tablet screen provided by an embodiment of the present application;
FIG. 17 is a flowchart of another elastic wave-based interaction invocation method provided by an embodiment of the present application;
FIG. 18 is a fifth interface schematic of an interactive tablet screen provided by an embodiment of the present application;
FIG. 19 is a fifth operation schematic of a touch operation performed on an interactive tablet according to an embodiment of the present application;
FIG. 20 is a fifth display-effect schematic of an interactive tablet screen provided by an embodiment of the present application;
FIG. 21 is a flowchart of another elastic wave-based interaction invocation method provided by an embodiment of the present application;
FIG. 22 is a sixth interface schematic of an interactive tablet screen provided by an embodiment of the present application;
FIG. 23 is a sixth operation schematic of a touch operation performed on an interactive tablet according to an embodiment of the present application;
FIG. 24 is a sixth display-effect schematic of an interactive tablet screen provided by an embodiment of the present application;
FIG. 25 is a flowchart of another elastic wave-based interaction invocation method provided by an embodiment of the present application;
FIG. 26 is a seventh operation schematic of a touch operation performed on an interactive tablet according to an embodiment of the present application;
FIG. 27 is a seventh display-effect schematic of an interactive tablet screen provided by an embodiment of the present application;
FIG. 28 is a flowchart of another elastic wave-based interaction invocation method provided by an embodiment of the present application;
FIG. 29 is an eighth operation schematic of a touch operation performed on an interactive tablet according to an embodiment of the present application;
FIG. 30 is an eighth display-effect schematic of an interactive tablet screen provided by an embodiment of the present application;
FIG. 31 is a structural schematic of an elastic wave-based interaction invocation device provided by an embodiment of the present application;
FIG. 32 is a structural schematic of an interactive tablet provided by an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application clearer, specific embodiments of the present application are described in detail below with reference to the accompanying drawings. It should be understood that the specific embodiments described here merely illustrate the application and do not limit it. For ease of description, the drawings show only the parts relevant to the present application rather than the entire structure. Before the exemplary embodiments are discussed in more detail, it should be noted that some of them are described as processes or methods depicted as flowcharts. Although a flowchart describes the operations (or steps) as a sequential process, many of the operations can be performed in parallel or concurrently, and the order of the operations may be rearranged. A process may be terminated when its operations are completed, but it may also have additional steps not shown in the figures. A process may correspond to a method, a function, a procedure, a subroutine, and the like.
It should be noted that in the present application, relational terms such as "first" and "second" are used solely to distinguish one entity, action or object from another, and do not necessarily require or imply any actual relationship or order between them. For example, "first" and "second" in "first touch object" and "second touch object" distinguish touch objects that touch the screen at two different times. The specific meaning of these terms in the present application can be understood by those of ordinary skill in the art according to the context. Furthermore, in the description of the present application, unless otherwise indicated, "a plurality of" means two or more. "And/or" describes an association relationship between associated objects and indicates that three relationships may exist; for example, "A and/or B" may mean: A exists alone, A and B exist together, or B exists alone. The character "/" generally indicates an "or" relationship between the objects before and after it.
For ease of understanding, the embodiments take an interactive tablet as the exemplary elastic wave-based interaction invocation device. The interactive tablet may be an integrated device that operates on the content displayed on its display panel and implements human-machine interaction through touch technology, integrating one or more functions such as a projector, an electronic whiteboard, a projection screen, audio equipment, a television and a video conference terminal. In practice, the hardware of the interactive tablet consists of a screen, an intelligent processing system and other parts combined by integral structural members, supported by a dedicated software system; the screen has a touch control function. In an embodiment, an electronic whiteboard is displayed on the screen; the user can perform touch operations on the screen with a finger or a stylus, and the intelligent processing system either generates writing handwriting drawn on the electronic whiteboard according to the touch operation input by the user, or generates a control instruction from that touch operation to process the display content on the electronic whiteboard.
Typically, at least one operating system is installed on the interactive tablet, including but not limited to an Android system, a Linux system or a Windows system, and touch operations received through the screen are processed by that operating system. Further, the interactive tablet may install at least one application based on the operating system; the embodiments take an electronic whiteboard as an example. For instance, an electronic whiteboard application is installed on the interactive tablet. The application may be native to the operating system or downloaded from a third-party device or server. The elastic wave-based interaction invocation device may also be the electronic whiteboard application itself. Optionally, under the electronic whiteboard function, operations such as writing, inserting tables, inserting pictures, inserting multimedia, inserting files (such as PPT), playing multimedia, inserting graphics and drawing tables can be realized. It will be appreciated that when a user writes on the screen, the corresponding handwriting is displayed on the electronic whiteboard.
FIG. 1 is a flowchart of an elastic wave-based interaction invocation method according to an embodiment of the present application. The method may be executed by an elastic wave-based interaction invocation device, which may be implemented by hardware and/or software and integrated in an interactive tablet.
The following description takes the interaction invocation device executing the method as an example. Referring to FIG. 1, the elastic wave-based interaction invocation method includes:
s101: and acquiring a touch sensing signal corresponding to the touch operation performed on the screen, wherein the touch sensing signal comprises a touch position signal and an elastic wave signal.
To realize the touch function of the screen, optical touch sensors may be disposed along the edges of the screen surface. When an object touches the screen, the optical touch sensors scan it with optical signals, sense the operation performed on the screen, and output a corresponding touch position signal. Specifically, an optical touch sensor comprises infrared emitters and infrared receivers: the emitters emit infrared light, the receivers receive it, and the touch point is located using the grid of light beams formed by infrared light densely distributed in different directions. For example, one horizontal side of the screen is provided with M infrared emitters and one vertical side with N infrared emitters, while the opposite horizontal side is provided with M infrared receivers and the opposite vertical side with N infrared receivers; each emitter transmits infrared light at a certain frequency and the corresponding receiver receives it at that frequency. When a touch object touches the screen, it completely or partially blocks one or more vertical and horizontal infrared beams, yielding an M x N infrared light-intensity pixel map. First, positions whose values exceed a first light-intensity threshold are found on this pixel map. The first threshold represents a real, effective touch: the real touch position is the contact position of the touch object on the screen, as opposed to noise or an object that approaches but does not fully touch the screen and therefore only partially blocks the infrared light.
Further, positions near the real touch position whose values exceed a second light-intensity threshold are found. The second threshold is larger than the noise level and represents the extension of the real touch region; the positions near the real touch position that exceed it are marked as the effective touch region, and the area of this region gives the contact area of the touch object with the screen. When the touch object moves on the screen, consecutive frames of the infrared light-intensity pixel map record the contact position and contact area of the touch object, and its movement speed on the screen can be derived from the contact positions and areas recorded in those consecutive frames. When the touch object leaves the screen, its blocking of the infrared light decreases: if, at the position corresponding to the previous frame's contact position and in its neighborhood, no pixel of the current pixel map exceeds a third light-intensity threshold, the touch object is determined to have left the screen; the third threshold represents the maximum light intensity on touch release. It can thus be appreciated that when a user touches the screen with a touch object, the optical touch sensor measures the contact area of the object with the screen, its movement speed on the screen and its touch position, i.e. contact area, movement speed, touch position and the like can all be measured by the optical touch sensor.
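The two-threshold detection described above can be sketched as follows. This is a simplified illustration on a plain Python list-of-lists light-intensity map: the threshold values are assumptions, and for brevity the sketch counts every pixel above the second threshold as part of the effective touch region rather than restricting to the neighborhood of a real touch.

```python
# Illustrative two-threshold touch detection on an M x N infrared
# light-intensity map (values in [0, 1], higher = more light blocked).
# Threshold values are assumptions, not taken from the patent.
FIRST_THRESHOLD = 0.8   # exceeded only by a real, effective touch
SECOND_THRESHOLD = 0.3  # above noise: extension of the touch region

def detect_touch(intensity_map):
    """Return (real touch positions, effective touch area in pixels)."""
    positions = [(r, c)
                 for r, row in enumerate(intensity_map)
                 for c, value in enumerate(row)
                 if value > FIRST_THRESHOLD]
    if not positions:
        return [], 0
    # Simplification: count all pixels above the second threshold as the
    # effective touch region; its size approximates the contact area of
    # the touch object with the screen.
    area = sum(1 for row in intensity_map for value in row
               if value > SECOND_THRESHOLD)
    return positions, area
```

Running this per frame and comparing consecutive results would give the movement speed and touch-release detection described above.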
In addition, an elastic wave sensor is arranged on the screen to output an elastic wave signal corresponding to an object when that object touches the screen. For example, a piezoelectric elastic wave sensor is installed at the frame of the screen or on the inner side of the cover plate. When a touch object touches the screen, an elastic wave signal with characteristic features is generated; it propagates from the contact point along the screen towards the edges, or towards the inner side of the screen. The piezoelectric elastic wave sensor converts the elastic wave signal into a voltage signal, which is transmitted via a flexible circuit board to an IC chip with a temperature compensation circuit for amplification, and then converted into a digital elastic wave signal by an analog-to-digital conversion circuit. Because different touch objects generate different elastic wave signals when touching the screen, different touch objects can be distinguished by their elastic wave signals: the elastic wave signal is used as the input to a machine learning model, which outputs the identifier of the touch object, and the touch object corresponding to the signal is determined from that identifier. It can be understood that after the elastic wave signal is obtained, it is input into a pre-trained machine learning model, which outputs the identifier of the corresponding touch object.
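The patent states only that a pre-trained model maps an elastic wave signal to a touch-object identifier, without specifying the model. As a stand-in, the sketch below uses a nearest-template classifier over hypothetical reference signatures; the signature vectors and identifier values are invented for the example.

```python
# Hypothetical reference signatures per touch-object identifier; a real
# system would use a trained machine learning model instead.
REFERENCE_SIGNATURES = {
    1: [0.9, 0.1, 0.0],  # identifier 1: finger (assumed signature)
    2: [0.1, 0.8, 0.4],  # identifier 2: stylus (assumed signature)
}

def classify_elastic_wave(signal):
    """Return the identifier whose reference signature is closest."""
    def squared_distance(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(REFERENCE_SIGNATURES,
               key=lambda ident: squared_distance(signal,
                                                  REFERENCE_SIGNATURES[ident]))
```

The interface is the point here: elastic wave signal in, touch-object identifier out, matching the model described in the text.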
The touch sensing signal provided by this embodiment therefore comprises a touch position signal and an elastic wave signal: the touch position signal, detected by the optical touch sensor arranged on the screen, reflects the click position, click area, touch trajectory and the like of the touch operation on the screen, while the elastic wave signal is detected by the elastic wave sensor arranged on the screen.
In this embodiment, objects of different types are used as touch objects to perform touch operations on the screen. For example, a finger and a stylus each perform a touch operation on the screen, and both are detected by the optical touch sensor and the elastic wave sensor. The optical touch sensor outputs the corresponding touch position signals, reflecting the click positions, click areas, touch trajectories and so on of the finger and the stylus on the screen, while the elastic wave sensor outputs the elastic wave signals corresponding to the finger and the stylus respectively.
The interaction calling device based on the elastic wave receives touch sensing signals in real time through communication interfaces connected to the optical touch sensor and the elastic wave sensor, and identifies the touch position signals and elastic wave signals from the touch sensing signals. For example, the interaction calling device is connected to the optical touch sensor and the elastic wave sensor through a serial bus, receives the touch sensing signals over the serial bus, and recognizes the touch position signal and the elastic wave signal according to the vendor identification code, the product identification code and the report identification code recorded in the touch sensing signals.
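A minimal sketch of the recognition step above, assuming the two sensors share a bus and tag each report with an identifier. The report ID values and the `(report_id, payload)` tuple format are hypothetical; a real device would also check the vendor and product identification codes.

```python
# Hypothetical report identifiers; actual values depend on the sensor vendors.
TOUCH_REPORT_ID = 0x01         # optical touch sensor: position reports
ELASTIC_WAVE_REPORT_ID = 0x02  # piezoelectric sensor: digitized waveform frames

def demux_reports(reports):
    """Split a mixed stream of (report_id, payload) tuples into touch
    position signals and elastic wave signals, mirroring the report
    identifier based recognition described above."""
    positions, waves = [], []
    for report_id, payload in reports:
        if report_id == TOUCH_REPORT_ID:
            positions.append(payload)
        elif report_id == ELASTIC_WAVE_REPORT_ID:
            waves.append(payload)
        # unknown report IDs are ignored in this sketch
    return positions, waves
```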
S102: and determining the type of the object corresponding to the touch object touching the screen according to the elastic wave signal.
After the touch position signal and the elastic wave signal are obtained, the object type corresponding to the touch object touching the screen is further determined. Specifically, the elastic wave signal is used as an input signal to the machine learning model, the model outputs the identifier of the touch object, and the touch object corresponding to the elastic wave signal is determined from that identifier.
It can be understood that when the objects touching the screen are of the same type, the two elastic wave signals corresponding to two consecutive touch events, when input into the machine learning model, yield the same identifier; when the objects are of different types, the model yields different identifiers. Assume the identifier corresponding to a finger is identifier 1 and the identifier corresponding to a stylus is identifier 2. When different positions of the screen are clicked or slid with two fingers, the machine learning model outputs identifier 1 for both consecutive touch events, so the object types of the two touch objects can both be determined to be fingers. When a finger and a stylus click or slide at different positions of the screen, the model outputs identifier 1 and identifier 2 respectively for the two consecutive touch events, so the object types of the two touch objects can be determined to be a finger and a stylus respectively.
S103: and when a plurality of different object types exist and the combination mode of the plurality of different object types or the corresponding touch position signal meets the preset split screen instruction condition, determining whether a plurality of windows exist in the current system environment.
After the object type corresponding to each touch object is determined, it is judged whether a plurality of different object types exist. If only one object type exists, or the object types corresponding to the touch objects are the same, a response is made according to a preset response mode. For example, in a handwriting input scenario, writing is performed on the screen with a plurality of styluses; since the object types corresponding to the styluses are the same, the respective handwriting tracks are determined and displayed on the screen in response to the touch position signals corresponding to the different styluses.
If a plurality of different object types exist, it is further judged whether the combination of the different object types or the corresponding touch position signals meets the preset split screen instruction condition; when the condition is met, whether a plurality of windows exist in the current system environment is determined. Specifically, the split screen instruction condition indicates the conditions that must be met to trigger the split screen instruction, and includes an object combination condition and a touch position condition. The object combination condition indicates the different preset combinations of object types that satisfy it, and the touch position condition indicates the different preset position combinations that satisfy it. When the object combination condition or the touch position condition is met, or both are met simultaneously, the preset split screen instruction condition is considered met.
For example, for the object combination condition, the combination corresponding to the different object types is determined and matched against the preset combinations required by the object combination condition; if a preset combination is matched, the object combination condition is considered met. The preset combinations required by the object combination condition include, for example, finger + stylus. When a plurality of different object types are determined and their combination is finger + stylus, the same preset combination is matched among the preset combinations required by the object combination condition, and the condition is considered met. In one possible embodiment, the object combination condition may simply require that a plurality of different object types exist, without limiting the specific combination of object types; performing a touch operation with two or more different touch objects then satisfies the object combination condition.
In addition, for the touch position condition, the touch position signals corresponding to the touch objects are matched against the different preset position combinations required by the touch position condition; if a consistent preset position combination is matched, the touch position condition is considered met. For example, the preset position combinations include: click + click, click + slide, and the like. When the screen is clicked with a finger while a stylus slides on the screen, touch position signals reflecting a click operation and a slide operation are generated respectively; these touch position signals match the click + slide preset position combination, and the touch position condition is considered met.
It will be appreciated that, in order to be distinguishable from the originally preset predefined actions, the preset position combinations should differ from the position combinations corresponding to those predefined actions. For example, a predefined action may be provided as: sliding toward or away from each other, with the corresponding response action being zooming the window. When different touch objects slide toward or away from each other on the screen, the operation corresponds to the originally preset predefined action, and the response at that moment is shrinking or enlarging the window. In addition, when a touch position signal does not match any consistent preset position combination, a response is made according to the preset response mode. For example, in a handwriting input scenario, writing is performed on the screen with different touch objects such as a finger and a stylus; although a plurality of different object types exist, the touch position signals generated by the writing operations cannot be matched to a preset position combination, so the respective handwriting tracks are determined and displayed on the screen in response to the touch position signals corresponding to the finger and the stylus.
It can be understood that meeting the split screen instruction condition may require that both the object combination condition and the touch position condition be met simultaneously, or only that one of the two be met.
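The condition check described above might be sketched as follows, taking the variant in which meeting either the object combination condition or the touch position condition suffices. The concrete preset combinations are placeholders, since the embodiment leaves them configurable.

```python
# Illustrative presets; the embodiment treats these as configurable.
PRESET_OBJECT_COMBINATIONS = [{"finger", "stylus"}, {"finger", "phone"}]
PRESET_POSITION_COMBINATIONS = [("click", "click"), ("click", "slide")]

def meets_split_screen_condition(object_types, gestures):
    """True when the object combination condition or the touch position
    condition is satisfied (the 'either suffices' variant). A plurality
    of different object types is always required first."""
    if len(set(object_types)) < 2:
        return False  # only one object type: respond in the preset mode instead
    combo_ok = set(object_types) in PRESET_OBJECT_COMBINATIONS
    position_ok = tuple(sorted(gestures)) in PRESET_POSITION_COMBINATIONS
    return combo_ok or position_ok
```

Swapping `or` for `and` in the last line yields the stricter variant in which both conditions must hold simultaneously.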
When the preset split screen instruction condition is met, whether a plurality of windows exist in the current system environment is further determined. If no window or only one window exists in the current system environment, no response is made. In one possible embodiment, if no window or only one window exists, openable applications are provided for selection, and the selected application is launched, so that a plurality of windows then exist in the current system environment.
S104: if a plurality of windows exist, a plurality of target windows are determined from the windows, and split-screen display is carried out on the target windows.
When the preset split screen instruction condition is met and a plurality of windows exist in the current system environment, a plurality of the windows are selected as target windows, and the target windows are displayed in split screen.
The technical solutions provided in the embodiments are described below by way of example:
Fig. 2 is a schematic diagram of a first interface of an interactive flat panel screen according to an embodiment of the present application. Referring to fig. 2, assume two windows (a first window and a second window) exist in the current system environment, where the first window is shown on the screen and the second window runs in the background or is covered by the first window. Fig. 3 is a first operation schematic diagram of a touch operation on the interactive flat panel provided by this embodiment, and fig. 4 is a first display effect schematic diagram of the interactive flat panel screen. Referring to fig. 3, the screen is clicked with a finger and a stylus (the click positions are point A and point B in fig. 3). Based on the optical touch sensor and the elastic wave sensor, the corresponding touch position signals and elastic wave signals are obtained respectively; it can be determined that the touch objects correspond to a plurality of different object types and that the combination of the object types and the corresponding touch position signals satisfy the preset split screen instruction condition. The two windows of the current system environment are then determined as target windows and displayed in split screen, with the split screen display effect shown in fig. 4.
According to the method, a touch sensing signal corresponding to a touch operation performed on the screen is obtained, wherein the touch sensing signal comprises a touch position signal and an elastic wave signal, and the object type corresponding to each touch object performing the touch operation is determined according to the elastic wave signal. When a plurality of different object types exist and the combination of the different object types or the corresponding touch position signals meets the preset split screen instruction condition, whether a plurality of windows exist in the current system environment is further determined; when a plurality of windows exist, a plurality of target windows are determined from the windows for split screen display. In an interaction scenario, the split screen operation can thus be realized through touch operations performed on the screen with different object types, providing richer interaction modes for users and optimizing the user experience.
On the basis of the above embodiment, fig. 5 shows a flowchart of another interaction calling method based on elastic waves according to an embodiment of the present application; this method is a specific implementation of the interaction calling method based on elastic waves described above. Referring to fig. 5, the interaction calling method based on elastic waves includes:
S201: And acquiring a touch sensing signal corresponding to the touch operation performed on the screen, wherein the touch sensing signal comprises a touch position signal and an elastic wave signal.
S202: and determining the type of the object corresponding to the touch object touching the screen according to the elastic wave signal.
S203: and when a plurality of different object types exist and the combination mode of the plurality of different object types or the corresponding touch position signal meets the preset split screen instruction condition, determining whether a plurality of windows exist in the current system environment.
S204: and if a plurality of windows exist, determining the quantity of the objects corresponding to the touch objects.
When the combination of the plurality of different object types or the corresponding touch position signals meets the preset split screen instruction condition and a plurality of windows exist in the current system environment, the number of touch objects performing touch operations on the screen is further determined.
S205: and determining a plurality of target windows corresponding to the article quantity from the windows according to the window hierarchy relation, and carrying out split screen display on the target windows.
The window hierarchy relation of all windows in the current system environment is determined, and target windows are selected from the plurality of windows in order from the highest to the lowest hierarchy, so that the number of selected target windows is consistent with the number of touch objects. It will be appreciated that when the number of windows in the current system environment is less than the number of touch objects, all windows are selected as target windows. Further, after the target windows are determined, they are displayed in split screen.
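The hierarchy-based selection just described can be sketched directly; the window names and z-levels below are assumed for illustration.

```python
def select_targets_by_hierarchy(windows, num_objects):
    """windows: list of (window_name, z_level) pairs, where a higher
    z_level means higher in the window hierarchy. Selects up to
    num_objects windows, highest levels first; when fewer windows exist
    than touch objects, all windows are selected as target windows."""
    ordered = sorted(windows, key=lambda w: w[1], reverse=True)
    return [name for name, _ in ordered[:num_objects]]
```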
Fig. 6 is a schematic diagram of a second interface of an interactive flat panel screen provided by an embodiment of the present application. Assume the current system environment has five windows whose hierarchy decreases sequentially from the first window to the fifth window; the first window is shown on the screen, and the second through fifth windows run in the background or are covered by the first window. Fig. 7 is a second operation schematic diagram of a touch operation on the interactive flat panel provided by this embodiment, and fig. 8 is a second display effect schematic diagram of the interactive flat panel screen. Referring to fig. 7, the screen is clicked with three fingers (for example, the thumb, index finger and middle finger of one hand) and a stylus (the finger click positions are points C-E in fig. 7, and the stylus click position is point F in fig. 7). Based on the optical touch sensor and the elastic wave sensor, the corresponding touch position signals and elastic wave signals are obtained respectively; it can be determined that the touch objects correspond to a plurality of different object types (three fingers and one stylus) and that the combination of the object types and the corresponding touch position signals satisfy the preset split screen instruction condition. From the touch position signals or the elastic wave signals, the current number of touch objects can be determined to be 4, so the first through fourth windows are determined as target windows according to the window hierarchy relation, and the 4 target windows are displayed in split screen, as shown in fig. 8.
According to the method, a touch sensing signal corresponding to a touch operation performed on the screen is obtained, wherein the touch sensing signal comprises a touch position signal and an elastic wave signal, and the object type corresponding to each touch object performing the touch operation is determined according to the elastic wave signal. When a plurality of different object types exist and the combination of the different object types or the corresponding touch position signals meets the preset split screen instruction condition, whether a plurality of windows exist in the current system environment is further determined; when a plurality of windows exist, a plurality of target windows are determined from the windows for split screen display. In an interaction scenario, the split screen operation can thus be realized through touch operations performed on the screen with different object types, providing richer interaction modes for users and optimizing the user experience. Further, target windows corresponding to the number of touch objects are determined from the plurality of windows according to the window hierarchy relation and displayed in split screen, so a user can select a corresponding number of touch objects for the touch operation according to how many windows the split screen display requires, obtaining a split screen interaction effect with that number of target windows and a richer interaction mode.
On the basis of the above embodiment, fig. 9 shows a flowchart of another interaction calling method based on elastic waves according to an embodiment of the present application; this method is a specific implementation of the interaction calling method based on elastic waves described above. Referring to fig. 9, the interaction calling method based on elastic waves includes:
S301: And acquiring a touch sensing signal corresponding to the touch operation performed on the screen, wherein the touch sensing signal comprises a touch position signal and an elastic wave signal.
S302: and determining the type of the object corresponding to the touch object touching the screen according to the elastic wave signal.
S303: and when a plurality of different object types exist and the combination mode of the plurality of different object types or the corresponding touch position signal meets the preset split screen instruction condition, determining whether a plurality of windows exist in the current system environment.
S304: if a plurality of windows exist, determining an operation track corresponding to each touch object according to the touch position signals, and determining a split screen display strategy of a split screen area according to the operation track, wherein the split screen display strategy is used for indicating a determination mode of the windows required to be displayed in the corresponding split screen area.
In this embodiment, different split screen display strategies are designated in advance for different types of operation tracks, where a split screen display strategy indicates how the window to be displayed in the corresponding split screen region is determined. The split screen regions are regions divided on the screen for displaying windows; the number of split screen regions is consistent with the number of touch objects, and each region displays one window, so the number of windows displayed in split screen is consistent with the number of touch objects. For example, when a touch operation is performed with one finger and one stylus, two touch objects are detected and two split screen regions for displaying windows are divided on the screen; when a touch operation is performed with two fingers and one stylus, three touch objects are detected and three split screen regions for displaying windows are divided on the screen. The window to be displayed in each split screen region is determined according to the split screen display strategy.
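One possible way to divide the split screen regions is sketched below. The patent does not fix the geometry, so the vertical equal-width layout and the left-to-right ordering by touch x coordinate are assumptions.

```python
def divide_regions(screen_width, screen_height, touch_xs):
    """Divide the screen into one vertical region per touch object.
    touch_xs: x coordinate of each touch point, indexed by touch object.
    Returns {touch_object_index: (x, y, width, height)}; regions run
    left to right in order of the touch points' x coordinates (assumed)."""
    n = len(touch_xs)
    width = screen_width // n
    order = sorted(range(n), key=lambda i: touch_xs[i])
    regions = {}
    for rank, obj_index in enumerate(order):
        # region at slot `rank` (left to right) goes to the object whose
        # touch point is the rank-th leftmost
        regions[obj_index] = (rank * width, 0, width, screen_height)
    return regions
```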
Specifically, when the combination of the plurality of different object types or the corresponding touch position signals meets the preset split screen instruction condition and a plurality of windows exist in the current system environment, the number of touch objects is determined, and that many split screen regions are divided on the screen, each corresponding to one touch object. Further, the operation track corresponding to each touch object is determined from the touch position signals, and the split screen display strategy for each region is determined from the correspondence between operation tracks and split screen display strategies.
S305: and determining a plurality of target windows from the windows based on the split-screen display strategy, and performing split-screen display on the target windows in the corresponding split-screen areas.
As described above, the operation tracks provided by this scheme include clicking and sliding, and the corresponding split screen display strategies include determination according to the window hierarchy relation and determination according to a selected window. Determination according to the window hierarchy relation can be understood as selecting, according to the hierarchy, the highest-level window among the windows not yet selected as target windows; determination according to a selected window can be understood as displaying the windows available for split screen presentation on the screen and taking the window selected by the user as the target window. Optionally, the relative positional relationship between the split screen regions is determined based on the relative positional relationship between the corresponding touch position signals.
Specifically, a target window is selected from the plurality of windows for each split screen region according to the corresponding split screen display strategy. It can be understood that, because the touch operations of different touch objects on the screen are independent of each other, the operation tracks corresponding to different touch objects may be the same or different; correspondingly, the split screen display strategies corresponding to different split screen regions may be the same or different.
Further, after the target window corresponding to each split screen area is determined, the corresponding target window is displayed in each split screen area, so that split screen display of the windows is realized.
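The two strategies, keyed by operation track, might be resolved per region as follows. The strategy names and the `user_choice` parameter (the window picked from the displayed thumbnails) are illustrative.

```python
# Track-to-strategy mapping described in this embodiment:
# click -> select by window hierarchy; slide -> user selects from thumbnails.
STRATEGY_BY_TRACK = {"click": "by_hierarchy", "slide": "by_selection"}

def resolve_region_window(track, windows_by_level, user_choice=None, taken=()):
    """Pick the window for one split screen region.
    windows_by_level: window names ordered highest hierarchy first.
    taken: windows already assigned to other regions."""
    strategy = STRATEGY_BY_TRACK[track]
    if strategy == "by_selection":
        return user_choice  # window the user picked from the thumbnails
    for name in windows_by_level:
        if name not in taken:
            return name  # highest-level window not yet selected
    return None
```

This reproduces the fig. 10-12 walkthrough: the sliding finger's region takes the user-selected third window, while the clicking stylus's region falls back to the highest-level first window.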
Fig. 10 is a schematic diagram of a third interface of an interactive flat panel screen provided by an embodiment of the present application. Assume four windows exist in the current system environment, with the hierarchy decreasing sequentially from the first window to the fourth window; the first window is displayed on the screen, and the second through fourth windows run in the background or are covered by the first window. Fig. 11 is a third operation schematic diagram of a touch operation on the interactive flat panel provided by this embodiment, and fig. 12 is a third display effect schematic diagram of the interactive flat panel screen. Referring to fig. 11, the screen is clicked with a stylus (the click position is point G in fig. 11) while a finger slides on the screen in the direction of the arrow shown (the operation track is shown by arrow H in fig. 11). Based on the optical touch sensor and the elastic wave sensor, the corresponding touch position signals and elastic wave signals are obtained respectively; it can be determined that the touch objects correspond to a plurality of different object types and that the combination of the object types and the corresponding touch position signals satisfy the preset split screen instruction condition. From the touch position signals or the elastic wave signals, the current number of touch objects can be determined to be 2, and two split screen regions are accordingly divided on the screen (the regions themselves are not shown in the actual display).
At this time, the operation tracks corresponding to the finger and the stylus are sliding and clicking respectively, so the split screen display strategies corresponding to the two split screen regions are determined according to the selected window and the window hierarchy relation respectively. Thumbnails of the windows available for split screen are then displayed in the region corresponding to the finger's operation track (thumbnail I in fig. 11, showing the first through fourth windows for the user's selection; a thumbnail may present information such as the display interface of the corresponding window and a related text description, so the user can tell the window's type, running program or displayed content). If the third window is selected, it is determined as the target window of the split screen region corresponding to the finger, and according to the window hierarchy relation, the target window of the split screen region corresponding to the stylus is determined to be the first window. Further, the 2 target windows are displayed in the two split screen regions, with the split screen display effect shown in fig. 12.
According to the method, a touch sensing signal corresponding to a touch operation performed on the screen is obtained, wherein the touch sensing signal comprises a touch position signal and an elastic wave signal, and the object type corresponding to each touch object performing the touch operation is determined according to the elastic wave signal. When a plurality of different object types exist and the combination of the different object types or the corresponding touch position signals meets the preset split screen instruction condition, whether a plurality of windows exist in the current system environment is further determined; when a plurality of windows exist, a plurality of target windows are determined from the windows for split screen display. In an interaction scenario, the split screen operation can thus be realized through touch operations performed on the screen with different object types, providing richer interaction modes for users and optimizing the user experience. Further, the split screen display strategy of each split screen region is determined according to the operation track corresponding to each touch object, a plurality of target windows are determined from the plurality of windows according to the split screen display strategies, and the target windows are displayed in the corresponding split screen regions, so a user can perform the corresponding touch operation according to the specific windows the split screen display requires, obtaining the split screen interaction effect of the corresponding windows and a richer interaction mode.
On the basis of the above embodiment, fig. 13 shows a flowchart of another interaction calling method based on elastic waves according to an embodiment of the present application; this method is a specific implementation of the interaction calling method based on elastic waves described above. Referring to fig. 13, the interaction calling method based on elastic waves includes:
S401: And acquiring a touch sensing signal corresponding to the touch operation performed on the screen, wherein the touch sensing signal comprises a touch position signal and an elastic wave signal.
S402: and determining the type of the object corresponding to the touch object touching the screen according to the elastic wave signal.
S403: and when a plurality of different object types exist and the combination mode of the plurality of different object types or the corresponding touch position signal meets the preset split screen instruction condition, determining whether a plurality of windows exist in the current system environment.
S404: if a plurality of windows exist and the window which meets the preset corresponding relation with the object type exists, determining a plurality of target windows from the plurality of windows, carrying out split screen display on the plurality of target windows, and enabling the plurality of target windows to have the window which meets the preset corresponding relation.
This scheme presets correspondences between different object types and different types of windows, for example: a phone corresponds to a screen casting application window, an eraser corresponds to a whiteboard window, and chalk corresponds to a slideshow window.
Specifically, when the combination of the plurality of different object types or the corresponding touch position signals meets the preset split screen instruction condition and a plurality of windows exist in the current system environment, it is further judged whether any detected object type is consistent with an object type recorded in the preset correspondences. If not, a number of target windows corresponding to the number of touch objects is determined from the plurality of windows according to the window hierarchy relation, and the target windows are displayed in split screen.
If a detected object type is consistent with an object type recorded in the preset correspondences, the window corresponding to that object type is determined from the preset correspondences and taken as the target window of the corresponding touch object. For object types not recorded in the preset correspondences, the target windows of the corresponding touch objects are determined from the plurality of windows according to the window hierarchy relation. Further, after the target windows are determined, they are displayed in split screen.
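A sketch of the correspondence-first, hierarchy-fallback assignment just described. The correspondence table entries come from the examples above, and the sketch assumes at most one touch object per object type.

```python
# Assumed preset correspondences between object types and window kinds,
# taken from the examples in this embodiment.
PRESET_WINDOW_FOR_OBJECT = {
    "phone": "screen_cast_window",
    "eraser": "whiteboard_window",
    "chalk": "slideshow_window",
}

def assign_target_windows(object_types, windows_by_level):
    """windows_by_level: open window names ordered highest hierarchy first.
    Each object type gets its preset window when one exists and is open;
    the rest fall back to the highest-level windows not yet assigned."""
    assigned, targets = set(), {}
    # pass 1: honor preset correspondences
    for obj in object_types:
        preset = PRESET_WINDOW_FOR_OBJECT.get(obj)
        if preset in windows_by_level and preset not in assigned:
            targets[obj] = preset
            assigned.add(preset)
    # pass 2: hierarchy fallback for object types without a correspondence
    for obj in object_types:
        if obj in targets:
            continue
        for name in windows_by_level:
            if name not in assigned:
                targets[obj] = name
                assigned.add(name)
                break
    return targets
```

This matches the fig. 14-16 walkthrough: the phone takes the screen casting window via the correspondence, while the finger falls back to the highest-level window.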
Fig. 14 is a schematic diagram of a fourth interface of an interactive flat panel screen provided by an embodiment of the present application. Assume three windows exist in the current system environment, with the hierarchy decreasing sequentially from the first window to the third window; the first window is displayed on the screen, the second and third windows run in the background or are covered by the first window, the third window is a screen casting application window, and the preset correspondences include a phone corresponding to the screen casting application window. Fig. 15 is a fourth operation schematic diagram of a touch operation on the interactive flat panel provided by this embodiment, and fig. 16 is a fourth display effect schematic diagram of the interactive flat panel screen. Referring to fig. 15, the screen is clicked with a finger and a phone (the click positions are points J and K in fig. 15). Based on the optical touch sensor and the elastic wave sensor, the corresponding touch position signals and elastic wave signals are obtained respectively; it can be determined that the object types corresponding to the touch objects are a finger and a phone, that the combination of the different object types and the corresponding touch position signals satisfy the preset split screen instruction condition, and that the phone satisfies the preset correspondence with the screen casting application window. The third window of the current system environment is therefore determined as the target window corresponding to the phone, the first window is determined as the target window corresponding to the finger according to the window hierarchy relation, and the two target windows are displayed in split screen, with the split screen display effect shown in fig. 16.
As described above, a touch sensing signal corresponding to a touch operation performed on the screen is acquired, where the touch sensing signal includes a touch position signal and an elastic wave signal, and the object type corresponding to each touch object performing the touch operation is determined according to the elastic wave signal. When a plurality of different object types exist and the combination of the different object types or the corresponding touch position signals satisfies the preset split-screen instruction condition, it is further determined whether a plurality of windows exist in the current system environment; if so, a plurality of target windows are determined from the windows for split-screen display. In an interaction scene, the split-screen operation can thus be triggered by touch operations performed on the screen with different object types, which provides richer interaction modes for users and optimizes the user experience. Moreover, for an object type that satisfies the preset correspondence, the window associated with it in the preset correspondence is used as a target window, while for object types that do not satisfy the preset correspondence, the corresponding target windows are determined from the plurality of windows according to the window hierarchy relationship, and the target windows are then displayed in split-screen mode. The user can therefore select the appropriate touch object according to the specific windows required for split-screen display, so that the target windows are displayed in split-screen mode quickly, providing richer interaction modes for users.
On the basis of the above embodiments, Fig. 17 is a flowchart of another elastic wave-based interaction evoked method according to an embodiment of the present application, which further refines the method described above. Referring to Fig. 17, the elastic wave-based interaction evoked method includes:
s501: and acquiring a touch sensing signal corresponding to the touch operation performed on the screen, wherein the touch sensing signal comprises a touch position signal and an elastic wave signal.
S502: and determining the type of the object corresponding to the touch object touching the screen according to the elastic wave signal.
S503: and when a plurality of different object types exist and the combination mode of the plurality of different object types or the corresponding touch position signal meets the preset split screen instruction condition, determining whether a plurality of windows exist in the current system environment.
S504: if a plurality of windows exist, a plurality of target windows are determined from the windows, and split-screen display is carried out on the target windows.
S505: and when the type of the object touching the object is the cut object type, and the touch position signal indicates that the falling point position of the touch object is on the target picture, determining a cutting path according to the touch position signal.
Specifically, when the object type of the touch object is determined to be the cut object type according to the elastic wave signal, and the touch position signal indicates that the drop point of the touch object is on the target picture, it is determined that a cutting operation is to be performed on the target picture. The cut object type can be understood as the object type corresponding to a thin, hard object, such as scissors or a utility knife.
After it is determined that the target picture needs to be cut, the cutting path on the target picture is determined according to the touch position signal. Optionally, the cutting path is displayed on the screen while it is being determined, for example as a dotted line, and is updated synchronously as the touch position signal is updated.
It can be understood that after the object type corresponding to the touch object touching the screen is determined according to the elastic wave signal, if only one object type exists, the preset split-screen instruction condition is not satisfied, or only one window exists, the conditions for split-screen display are not met. In that case, it is determined whether the object type is a cut object type and whether the drop point position is on the target picture; if so, the cutting operation is performed according to the touch position signal.
S506: and determining a cutting range of the target picture according to the cutting path, and cutting the target picture along the cutting range.
Specifically, when it is determined that the touch object has been lifted (that is, no new touch position signal is received), the cutting range of the target picture is determined according to the determined cutting path, the target picture is cut along the cutting range, and the cut target picture is displayed, completing the cutting operation. Optionally, after the cutting is completed, a first partial picture inside the cutting range and a second partial picture outside the cutting range are obtained from the target picture. The two partial pictures can be displayed at the same time for the user to choose between, or only the first partial picture is displayed so that the cut target picture is provided to the user directly.
It can be understood that when the cutting range of the target picture is determined according to the cutting path, if the cutting path is a closed path, or can be extended into one, the closed area corresponding to the cutting path is determined as the cutting range of the target picture; if the cutting path is not a closed path, the cutting operation on the target picture is cancelled.
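As an illustrative sketch of the closure check just described (not the application's actual implementation): a path can be treated as closed when its endpoints nearly coincide, and the cut range is approximated here by the path's bounding box for brevity. The pixel tolerance is an assumed value.

```python
import math

def is_closed(path, tol=10.0):
    """path: list of (x, y) touch positions; tol: closure tolerance in px (assumed)."""
    (x0, y0), (x1, y1) = path[0], path[-1]
    return math.hypot(x1 - x0, y1 - y0) <= tol

def cut_range(path):
    """Return the bounding box of a closed cutting path, or None to cancel."""
    if not is_closed(path):
        return None  # not a closed path: cancel the cutting operation
    xs = [p[0] for p in path]
    ys = [p[1] for p in path]
    return (min(xs), min(ys), max(xs), max(ys))

closed_path = [(0, 0), (100, 0), (100, 80), (0, 80), (2, 3)]
print(cut_range(closed_path))   # (0, 0, 100, 80)
open_path = [(0, 0), (100, 0), (100, 80)]
print(cut_range(open_path))     # None
```

A production version would use the exact polygon enclosed by the path (e.g. point-in-polygon tests) rather than a bounding box.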
Fig. 18 is a fifth schematic interface diagram of an interactive flat-panel screen according to an embodiment of the present application, where it is assumed that one window exists in the current system environment and that two pictures (picture 1 and picture 2) are displayed in the window. Fig. 19 is a fifth schematic operation diagram of a touch operation on the interactive flat panel, and Fig. 20 is a fifth schematic display-effect diagram of the interactive flat-panel screen. Referring to Fig. 19, when a cutting region of picture 1 needs to be customized, a touch operation is performed on the screen with scissors, the operation path moving along L1 in Fig. 19 and forming a closed path whose enclosed range covers a portion of picture 1. The corresponding touch position signal and elastic wave signal are acquired based on the optical touch sensor and the elastic wave sensor, from which it is determined that the object type of the touch object is a cut object type. Meanwhile, the touch position signal indicates that the drop point of the touch object is on picture 1, so picture 1 is determined as the target picture to be cut. The cutting path corresponding to the operation path L1 is determined according to the touch position signal, the cutting range of the target picture is determined according to the cutting path, and the cut target picture is displayed; the cutting effect is shown in Fig. 20.
As described above, when the object type of the touch object is the cut object type and the touch position signal indicates that the drop point of the touch object is on the target picture, a cutting path is determined according to the touch position signal, the cutting range of the target picture is determined according to the cutting path, the target picture is cut along the cutting range, and the cut target picture is displayed. A picture can thus be cut out quickly, providing a richer interaction mode for users.
On the basis of the above embodiments, Fig. 21 shows a flowchart of another elastic wave-based interaction evoked method according to an embodiment of the present application, which further refines the method described above. Referring to Fig. 21, the elastic wave-based interaction evoked method includes:
s601: and acquiring a touch sensing signal corresponding to the touch operation performed on the screen, wherein the touch sensing signal comprises a touch position signal and an elastic wave signal.
S602: and determining the type of the object corresponding to the touch object touching the screen according to the elastic wave signal.
S603: and when a plurality of different object types exist and the combination mode of the plurality of different object types or the corresponding touch position signal meets the preset split screen instruction condition, determining whether a plurality of windows exist in the current system environment.
S604: if a plurality of windows exist, a plurality of target windows are determined from the windows, and split-screen display is carried out on the target windows.
S605: and when the object type of the touch object is a mobile terminal type and a screen transfer application and/or a quick transfer application exist in the current system environment, the screen transfer application and/or the quick transfer application are unfolded.
Specifically, when the object type of the touch object touching the screen is determined to be a mobile terminal type (such as a mobile phone or a tablet computer) according to the elastic wave signal, it is determined whether a screen-transfer application and/or quick-transfer application exists in the current system environment. If so, the screen-transfer application and/or quick-transfer application is expanded on the screen.
It can be understood that after the object type corresponding to the touch object touching the screen is determined according to the elastic wave signal, if only one object type exists, the preset split-screen instruction condition is not satisfied, or only one window exists, the conditions for split-screen display are not met. In that case, it is determined whether the object type is a mobile terminal type; if so, and a screen-transfer application and/or quick-transfer application exists in the current system environment, the screen-transfer application and/or quick-transfer application is expanded.
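A minimal dispatch sketch of the step above, under the assumption that the classified object type and the names of the running applications are available as plain strings (all names here are hypothetical):

```python
# Assumed set of object-type labels that count as mobile terminals.
MOBILE_TERMINAL_TYPES = {"phone", "tablet"}

def apps_to_expand(object_type, running_apps):
    """Return the transfer applications to expand for this touch.

    running_apps: names of applications in the current system environment.
    """
    if object_type not in MOBILE_TERMINAL_TYPES:
        return []
    # Expand every matching transfer application that is currently running.
    return [app for app in running_apps
            if app in ("screen_transfer", "quick_transfer")]

print(apps_to_expand("phone", ["notes", "screen_transfer"]))  # ['screen_transfer']
print(apps_to_expand("finger", ["screen_transfer"]))          # []
```

If neither transfer application is running, the empty list means no window is expanded, matching the "if so" condition in the text.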
Fig. 22 is a sixth schematic interface diagram of an interactive flat-panel screen provided by an embodiment of the present application, where it is assumed that two windows exist in the current system environment: the first window is shown on the screen, the second window runs in the background or is covered by the first window, and the second window is a screen-transfer application window. Fig. 23 is a sixth schematic operation diagram of a touch operation on the interactive flat panel, and Fig. 24 is a sixth schematic display-effect diagram of the interactive flat-panel screen. Referring to Fig. 23, the screen is tapped with a mobile phone, and the corresponding touch position signal and elastic wave signal are obtained based on the optical touch sensor and the elastic wave sensor, from which it is determined that the object type of the touch object is a mobile terminal type and that the tap position is on the screen. Since a screen-transfer application window exists in the current system environment, the second window running the screen-transfer application is expanded; the display effect is shown in Fig. 24.
When a mobile terminal such as a mobile phone touches the screen, if the system is running a screen-transfer application and/or quick-transfer application at that moment, the corresponding application is expanded, so that the user can quickly interact with the interactive panel through it, providing a richer interaction mode for users.
On the basis of the above embodiments, Fig. 25 is a flowchart of another elastic wave-based interaction evoked method according to an embodiment of the present application, which further refines the method described above. Referring to Fig. 25, the elastic wave-based interaction evoked method includes:
s701: and acquiring a touch sensing signal corresponding to the touch operation performed on the screen, wherein the touch sensing signal comprises a touch position signal and an elastic wave signal.
S702: and determining the type of the object corresponding to the touch object touching the screen according to the elastic wave signal.
S703: and when a plurality of different object types exist and the combination mode of the plurality of different object types or the corresponding touch position signal meets the preset split screen instruction condition, determining whether a plurality of windows exist in the current system environment.
S704: if a plurality of windows exist, a plurality of target windows are determined from the windows, and split-screen display is carried out on the target windows.
S705: and when the object type of the touch object is a graph drawing type and the pressure sensing type indicated by the elastic wave signal is a heavy press, determining a drawing track according to the touch position signal.
In this scheme, besides outputting the object type corresponding to the elastic wave signal, the machine learning model also outputs pressure information corresponding to the touch object when it touches the screen. The pressure type of the touch operation is determined according to the pressure value range corresponding to the pressure information, where the pressure types include a heavy press and a light press. For example, when the pressure value reflected by the pressure information is greater than a set pressure threshold, the pressure type is determined to be a heavy press; otherwise, it is determined to be a light press.
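The thresholding just described can be sketched in a few lines; the threshold value below is an arbitrary placeholder, not a value taken from the application.

```python
PRESSURE_THRESHOLD = 0.6  # assumed normalized pressure threshold

def classify_press(pressure_value):
    """Map the model's pressure reading to a pressure type."""
    return "heavy" if pressure_value > PRESSURE_THRESHOLD else "light"

print(classify_press(0.9))  # heavy
print(classify_press(0.2))  # light
```

In practice the threshold would be calibrated per device, since elastic wave amplitude depends on the panel material and the touch object.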
Specifically, when the object type of the touch object touching the screen is determined to be a graphic drawing type according to the elastic wave signal, it is determined whether the pressure type indicated by the elastic wave signal is a heavy press. The graphic drawing type can be a drawing pen, chalk, or other objects that can be used to draw graphics.
If the pressure type is a light press, the current touch operation is determined to be an ordinary writing operation and is responded to as normal writing, without graphic recognition processing. If the pressure type is a heavy press, the current touch operation is determined to be a recognition writing operation, and the drawing track is determined in real time according to the touch position signal. Optionally, the drawing track is displayed on the screen as a dotted line and updated synchronously in real time.
S706: and carrying out pattern recognition according to the drawing track to obtain an input pattern, and displaying the input pattern.
Specifically, when it is determined that the touch object has been lifted (for example, when no new touch position signal has been received for 300 ms), graphic recognition is performed according to the drawing track to obtain an input graphic, and the input graphic is displayed on the screen.
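The patent does not specify the recognizer, so the following is only an illustrative heuristic: a drawn track is classified as a circle when the distances from its centroid vary little. A real system would likely use a trained shape recognizer; the 20% spread tolerance is an assumption.

```python
import math

def recognize(track):
    """track: list of (x, y) points sampled along the drawing track."""
    cx = sum(p[0] for p in track) / len(track)
    cy = sum(p[1] for p in track) / len(track)
    dists = [math.hypot(x - cx, y - cy) for x, y in track]
    mean = sum(dists) / len(dists)
    spread = max(dists) - min(dists)
    # Nearly constant distance from the centroid: treat as a circle.
    return "circle" if spread < 0.2 * mean else "unknown"

# Points sampled on a radius-50 circular outline, as in the Fig. 26 example.
circle = [(50 * math.cos(t / 10), 50 * math.sin(t / 10)) for t in range(63)]
print(recognize(circle))  # circle
```

The same structure extends to other shapes (e.g. corner counting for rectangles and triangles), each replacing the hand-drawn track with a clean template graphic for display.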
Fig. 26 is a seventh schematic operation diagram of a touch operation on an interactive flat panel according to this embodiment, and Fig. 27 is a seventh schematic display-effect diagram of an interactive flat-panel screen according to this embodiment. Referring to Fig. 26, a drawing pen is pressed heavily on the screen to draw a graphic, assumed here to be drawn along a circular outline. The corresponding touch position signal and elastic wave signal are obtained based on the optical touch sensor and the elastic wave sensor, from which it is determined that the object type of the touch object is a graphic drawing type and the pressure type is a heavy press, so the current touch operation is determined to be a recognition writing operation. The drawing track is determined according to the touch position signal, graphic recognition is performed on the drawing track, and the input graphic corresponding to the drawing track is recognized as a circle. The input graphic is then displayed, with the effect shown in Fig. 27.
As described above, by heavily pressing a graphic drawing object on the screen and drawing the required graphic, rapid graphic recognition is performed and the corresponding input graphic is displayed, which speeds up the user's graphic input and provides a richer interaction mode for users.
On the basis of the above embodiments, Fig. 28 is a flowchart of another elastic wave-based interaction evoked method according to an embodiment of the present application, which further refines the method described above. Referring to Fig. 28, the elastic wave-based interaction evoked method includes:
s801: and acquiring a touch sensing signal corresponding to the touch operation performed on the screen, wherein the touch sensing signal comprises a touch position signal and an elastic wave signal.
S802: and determining the type of the object corresponding to the touch object touching the screen according to the elastic wave signal.
S803: and when a plurality of different object types exist and the combination mode of the plurality of different object types or the corresponding touch position signal meets the preset split screen instruction condition, determining whether a plurality of windows exist in the current system environment.
S804: if a plurality of windows exist, a plurality of target windows are determined from the windows, and split-screen display is carried out on the target windows.
S805: when the object type of the touch object is a human joint type and the touch position signal indicates that the touch position of the touch object is at the side of the screen, displaying a side rail at the side of the screen corresponding to the touch position.
Specifically, when the object type of the touch object touching the screen is determined to be a human joint type (such as a finger knuckle) according to the elastic wave signal, the touch position of the touch object on the screen is determined according to the touch position signal. If the touch position is not at a side of the screen, no processing is performed. If the touch position is at a side of the screen (either side), a sidebar is displayed at the side of the screen corresponding to the touch position.
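A sketch of the side-area test, matching the S1/S0/S2 division used in Figs. 29-30: a knuckle touch in a side area evokes a sidebar on that side, and a touch in the middle area does nothing. The screen and side-area widths are assumed values.

```python
SCREEN_WIDTH = 1920
SIDE_WIDTH = 300  # assumed width of each side area (the L2/L3 boundaries)

def sidebar_side(object_type, touch_x):
    """Return which side's sidebar to show, or None for no action."""
    if object_type != "knuckle":
        return None
    if touch_x < SIDE_WIDTH:
        return "left"                       # first side area (S1)
    if touch_x > SCREEN_WIDTH - SIDE_WIDTH:
        return "right"                      # second side area (S2)
    return None                             # middle area (S0): no sidebar

print(sidebar_side("knuckle", 1800))  # right
print(sidebar_side("knuckle", 960))   # None
print(sidebar_side("finger", 1800))   # None
```

Gating on the object type first means an ordinary finger touch near the edge still behaves normally, which is the point of using the elastic wave classification here.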
Fig. 29 is an eighth schematic operation diagram of a touch operation on an interactive flat panel according to this embodiment, and Fig. 30 is an eighth schematic display-effect diagram of an interactive flat-panel screen according to this embodiment. Referring to Fig. 29, the screen is divided by dotted lines L2 and L3 into a first side area S1, a middle area S0, and a second side area S2 (the dotted lines L2 and L3 do not appear in the actual display effect; they merely illustrate the area division). The second side area S2 of the screen is tapped with a finger knuckle, and the corresponding touch position signal and elastic wave signal are acquired based on the optical touch sensor and the elastic wave sensor, from which it is determined that the object type of the touch object is a human joint type and that the touch position is at the side of the screen. A sidebar is then displayed in the second side area S2 of the screen, and the corresponding interactive operations can be performed through the interactive buttons provided in the sidebar, as shown in Fig. 30.
As described above, by performing a touch operation with a human joint at the side of the screen, the sidebar is quickly evoked so that the user can quickly perform the corresponding sidebar interactions, providing a richer interaction mode for users.
Fig. 31 is a schematic structural diagram of an elastic wave-based interaction evoked device according to an embodiment of the present application. Referring to Fig. 31, the elastic wave-based interaction evoked device includes a touch detection module 31, a type detection module 32, a window detection module 33, and a split-screen display module 34.
The touch detection module 31 is configured to acquire a touch sensing signal corresponding to a touch operation performed on the screen, where the touch sensing signal includes a touch position signal and an elastic wave signal; the type detection module 32 is configured to determine the object type corresponding to the touch object touching the screen according to the elastic wave signal; the window detection module 33 is configured to determine whether a plurality of windows exist in the current system environment when a plurality of different object types exist and the combination of the plurality of different object types or the corresponding touch position signals satisfies the preset split-screen instruction condition; the split-screen display module 34 is configured to determine, when a plurality of windows exist, a plurality of target windows from the windows and display the target windows in split-screen mode.
As described above, a touch sensing signal corresponding to a touch operation performed on the screen is acquired, where the touch sensing signal includes a touch position signal and an elastic wave signal, and the object type corresponding to each touch object performing the touch operation is determined according to the elastic wave signal. When a plurality of different object types exist and the combination of the different object types or the corresponding touch position signals satisfies the preset split-screen instruction condition, it is further determined whether a plurality of windows exist in the current system environment; if so, a plurality of target windows are determined from the windows for split-screen display. In an interaction scene, the split-screen operation can thus be triggered by touch operations performed on the screen with different object types, which provides richer interaction modes for users and optimizes the user experience.
In one possible embodiment, the split-screen instruction condition includes an object combination condition and a touch position condition, where the object combination condition indicates the different preset object combinations required to satisfy it, and the touch position condition indicates the different preset position combinations required to satisfy it.
In one possible embodiment, the split-screen display module 34 is specifically configured to:
when a plurality of windows exist, determining the number of touch objects;
and determining, from the windows according to the window hierarchy relationship, a plurality of target windows corresponding to the number of touch objects, and displaying the target windows in split-screen mode.
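A minimal sketch of the count-based selection above: with N touch objects, the top N windows in the hierarchy become the target windows. Representing the hierarchy as a list ordered from highest to lowest level, and requiring at least two windows and two objects for a split, are both assumptions for illustration.

```python
def targets_by_count(windows, object_count):
    """windows: hierarchy-ordered window list, highest level first."""
    if len(windows) < 2 or object_count < 2:
        return None  # split-screen needs at least two windows and two objects
    return windows[:min(object_count, len(windows))]

print(targets_by_count(["w1", "w2", "w3"], 2))  # ['w1', 'w2']
```

Clamping with `min` handles the case where more touch objects are present than windows, so at most every existing window is shown.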
In one possible embodiment, the split-screen display module 34 is specifically configured to:
when a plurality of windows exist, determining the operation track corresponding to each touch object according to the touch position signals, and determining the split-screen display strategy of each split-screen area according to the operation track, where the split-screen display strategy indicates how the window to be displayed in the corresponding split-screen area is determined;
and determining a plurality of target windows from the windows based on the split-screen display strategy, and displaying the target windows in split-screen mode in the corresponding split-screen areas.
In one possible embodiment, the operation track includes clicking and sliding, and the corresponding split-screen display strategy includes determining according to a window hierarchy relationship and determining according to the selected window.
In a possible embodiment, the relative positional relationship between the split screen areas is determined based on the relative positional relationship between the corresponding touch position signals.
In one possible embodiment, the split-screen display module 34 is specifically configured to:
when a plurality of windows exist and a window satisfying the preset correspondence with an object type exists, determining a plurality of target windows from the plurality of windows and displaying them in split-screen mode, where the target windows include the window satisfying the preset correspondence.
In a possible embodiment, the apparatus further comprises a picture cutting module, and the picture cutting module is specifically configured to:
when the object type of the touch object is a cut object type, and the touch position signal indicates that the falling point position of the touch object is on a target picture, determining a cutting path according to the touch position signal;
and determining a cutting range of the target picture according to the cutting path, and cutting the target picture along the cutting range.
In a possible embodiment, the apparatus further includes a terminal interaction module, where the terminal interaction module is specifically configured to:
and when the object type of the touch object is a mobile terminal type and a screen-transfer application and/or quick-transfer application exists in the current system environment, expanding the screen-transfer application and/or quick-transfer application.
In a possible embodiment, the device further comprises a pattern recognition module, the pattern recognition module being specifically configured to:
when the object type of the touch object is a graph drawing type and the pressure sensing type indicated by the elastic wave signal is a heavy press, determining a drawing track according to the touch position signal;
and carrying out pattern recognition according to the drawing track to obtain an input pattern, and displaying the input pattern.
In a possible embodiment, the apparatus further includes a sidebar interaction module, where the sidebar interaction module is specifically configured to:
when the object type of the touch object is a human joint type and the touch position signal indicates that the touch position of the touch object is at the side of the screen, displaying a sidebar at the side of the screen corresponding to the touch position.
The embodiment of the application also provides an interactive panel which can integrate the elastic wave-based interaction evoked device. Fig. 32 is a schematic structural diagram of an interactive tablet provided in an embodiment of the present application. Referring to Fig. 32, the interactive tablet includes: an input device 43, an output device 44, a memory 42, and one or more processors 41. The memory 42 is configured to store one or more programs; the one or more programs, when executed by the one or more processors 41, cause the one or more processors 41 to implement the elastic wave-based interaction evoked method provided by the above embodiments. The input device 43, the output device 44, the memory 42, and the processor 41 may be connected by a bus or in other ways; in Fig. 32, connection by a bus is taken as an example.
The memory 42 is used as a computer readable storage medium for storing software programs, computer executable programs and modules, and is configured to store program instructions/modules corresponding to the method for invoking interaction based on elastic waves according to any embodiment of the present application (e.g., the touch detection module 31, the type detection module 32, the window detection module 33 and the split screen display module 34 in the device for invoking interaction based on elastic waves). The memory 42 may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, at least one application program required for functions; the storage data area may store data created according to the use of the device, etc. In addition, memory 42 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage device. In some examples, memory 42 may further comprise memory located remotely from processor 41, which may be connected to the device via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The input means 43 may be used to receive entered numeric or character information and to generate key signal inputs related to user settings and function control of the device. The output means 44 may comprise a display device such as a screen.
The processor 41 executes various functional applications of the device and data processing by running software programs, instructions and modules stored in the memory 42, i.e. implements the above-described elastic wave-based interaction evoked method.
The interaction calling device and the interaction panel based on the elastic wave can be used for executing the interaction calling method based on the elastic wave provided by any embodiment, and have corresponding functions and beneficial effects.
Embodiments of the present application also provide a storage medium containing computer-executable instructions for performing the elastic wave-based interaction evoking method as provided by the above embodiments when executed by a computer processor.
Storage media are any of various types of memory devices or storage devices. The term "storage medium" is intended to include: installation media such as CD-ROM, floppy disk, or tape devices; computer system memory or random access memory such as DRAM, DDR RAM, SRAM, EDO RAM, Rambus RAM, etc.; non-volatile memory such as flash memory or magnetic media (e.g., a hard disk), or optical storage; registers or other similar types of memory elements, etc. The storage medium may also include other types of memory or combinations thereof. In addition, the storage medium may be located in the first computer system in which the program is executed, or may be located in a different, second computer system connected to the first computer system through a network such as the Internet. The second computer system may provide program instructions to the first computer for execution. The term "storage medium" may include two or more storage media that may reside in different locations (e.g., in different computer systems connected by a network). The storage medium may store program instructions (e.g., embodied as a computer program) executable by one or more processors.
Of course, the storage medium containing the computer executable instructions provided in the embodiments of the present application is not limited to the method for interaction initiation based on elastic waves as described above, and may also perform the related operations in the method for interaction initiation based on elastic waves provided in any embodiment of the present application.
The elastic wave-based interaction calling device, the interaction panel and the storage medium provided in the above embodiments may perform the elastic wave-based interaction calling method provided in any embodiment of the present application, and technical details not described in detail in the above embodiments may be referred to the elastic wave-based interaction calling method provided in any embodiment of the present application.
The foregoing description is only of the preferred embodiments of the application and the technical principles employed. The present application is not limited to the specific embodiments described herein, but is capable of numerous modifications, rearrangements and substitutions as will now become apparent to those skilled in the art without departing from the scope of the application. Therefore, while the application has been described in connection with the above embodiments, the application is not limited to the embodiments, but may be embodied in many other equivalent forms without departing from the spirit of the application, the scope of which is set forth in the following claims.

Claims (14)

1. An elastic wave-based interaction calling method, comprising:
acquiring a touch sensing signal corresponding to a touch operation performed on a screen, wherein the touch sensing signal comprises a touch position signal and an elastic wave signal;
determining, according to the elastic wave signal, an object type corresponding to each touch object touching the screen;
when a plurality of different object types exist, and the combination of the plurality of different object types or the corresponding touch position signals satisfies a preset split-screen instruction condition, determining whether a plurality of windows exist in the current system environment; and
if a plurality of windows exist, determining a plurality of target windows from the plurality of windows, and performing split-screen display on the plurality of target windows.
2. The elastic wave-based interaction calling method according to claim 1, wherein the split-screen instruction condition comprises an object combination condition and a touch position condition, the object combination condition indicating the preset combinations of object types that must be matched, and the touch position condition indicating the preset combinations of touch positions that must be matched.
3. The elastic wave-based interaction calling method according to claim 1, wherein determining a plurality of target windows from the plurality of windows and performing split-screen display on the plurality of target windows if a plurality of windows exist comprises:
if a plurality of windows exist, determining the object quantity corresponding to the touch objects; and
determining, according to the window hierarchy relationship, a plurality of target windows corresponding to the object quantity from the plurality of windows, and performing split-screen display on the plurality of target windows.
4. The elastic wave-based interaction calling method according to claim 1, wherein determining a plurality of target windows from the plurality of windows and performing split-screen display on the plurality of target windows if a plurality of windows exist comprises:
if a plurality of windows exist, determining an operation track corresponding to each touch object according to the touch position signals, and determining a split-screen display strategy for each split-screen area according to the operation track, wherein the split-screen display strategy indicates how the window to be displayed in the corresponding split-screen area is determined; and
determining a plurality of target windows from the plurality of windows based on the split-screen display strategy, and performing split-screen display on the plurality of target windows in the corresponding split-screen areas.
5. The elastic wave-based interaction calling method according to claim 4, wherein the operation track comprises clicking and sliding, and the split-screen display strategy comprises determination according to the window hierarchy relationship and determination according to a selected window.
6. The elastic wave-based interaction calling method according to claim 4, wherein the relative positional relationship between the split-screen areas is determined based on the relative positional relationship between the corresponding touch position signals.
7. The elastic wave-based interaction calling method according to claim 1, wherein determining a plurality of target windows from the plurality of windows and performing split-screen display on the plurality of target windows if a plurality of windows exist comprises:
if a plurality of windows exist and a window satisfying a preset correspondence with an object type exists among them, determining a plurality of target windows from the plurality of windows and performing split-screen display on the plurality of target windows, the plurality of target windows including the window satisfying the preset correspondence.
8. The elastic wave-based interaction calling method according to claim 1, further comprising:
when the object type of the touch object is a cutting object type and the touch position signal indicates that the landing point of the touch object is on a target picture, determining a cutting path according to the touch position signal; and
determining a cutting range of the target picture according to the cutting path, and cutting the target picture along the cutting range.
9. The elastic wave-based interaction calling method according to claim 1, further comprising:
when the object type of the touch object is a mobile terminal type and a screen-casting application and/or a quick-transfer application exists in the current system environment, opening the screen-casting application and/or the quick-transfer application.
10. The elastic wave-based interaction calling method according to claim 1, further comprising:
when the object type of the touch object is a graphic drawing type and the pressure type indicated by the elastic wave signal is a heavy press, determining a drawing track according to the touch position signal; and
performing pattern recognition on the drawing track to obtain an input pattern, and displaying the input pattern.
11. The elastic wave-based interaction calling method according to claim 1, further comprising:
when the object type of the touch object is a human joint type and the touch position signal indicates that the touch position of the touch object is at an edge of the screen, displaying a sidebar at the screen edge corresponding to the touch position.
12. An elastic wave-based interaction calling device, comprising a touch detection module, a type detection module, a window detection module, and a split-screen display module, wherein:
the touch detection module is configured to acquire a touch sensing signal corresponding to a touch operation performed on a screen, wherein the touch sensing signal comprises a touch position signal and an elastic wave signal;
the type detection module is configured to determine, according to the elastic wave signal, an object type corresponding to the touch object touching the screen;
the window detection module is configured to determine whether a plurality of windows exist in the current system environment when a plurality of different object types exist and the combination of the plurality of different object types or the corresponding touch position signals satisfies a preset split-screen instruction condition; and
the split-screen display module is configured to determine a plurality of target windows from the plurality of windows when a plurality of windows exist, and to perform split-screen display on the plurality of target windows.
13. An interactive tablet, comprising: a memory and one or more processors;
the memory is configured to store one or more programs;
the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the elastic wave-based interaction calling method of any one of claims 1-11.
14. A storage medium containing computer-executable instructions which, when executed by a computer processor, perform the elastic wave-based interaction calling method according to any one of claims 1-11.
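The split-screen flow of claims 1-4 can be sketched in Python. This is an illustrative reconstruction, not code disclosed by the patent: the `Touch` type, the object-type labels, and the window representation are all hypothetical, and classifying an object type from a raw elastic wave signal would in practice be done by a separate signal-processing stage.

```python
from dataclasses import dataclass

@dataclass
class Touch:
    """One touch object after signal processing (hypothetical representation)."""
    object_type: str   # classified from the elastic wave signal, e.g. "finger"
    position: tuple    # (x, y) taken from the touch position signal

def meets_split_screen_condition(touches, preset_combinations):
    """Claim 2 (object combination condition): a plurality of different
    object types must exist, and their combination must match a preset one."""
    combination = frozenset(t.object_type for t in touches)
    return len(combination) > 1 and combination in preset_combinations

def pick_target_windows(windows, touches):
    """Claim 3: take as many target windows as there are touch objects,
    ordered by the window hierarchy (topmost first, here modeled as a z value)."""
    return sorted(windows, key=lambda w: w["z"], reverse=True)[:len(touches)]

def split_screen(windows, touches, preset_combinations):
    """Claim 1: split-screen display happens only when the split-screen
    instruction condition holds and several windows exist in the system."""
    if not meets_split_screen_condition(touches, preset_combinations):
        return None   # the gesture does not request a split screen
    if len(windows) < 2:
        return None   # fewer than two windows: nothing to split
    return pick_target_windows(windows, touches)
```

For example, with a finger plus a knuckle touching a screen that shows three windows, `split_screen` returns the two topmost windows for side-by-side display; a single-type touch, or a system with one window, returns `None`.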
CN202110436252.9A 2021-04-22 2021-04-22 Interactive calling method and device based on elastic waves Active CN114690977B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110436252.9A CN114690977B (en) 2021-04-22 2021-04-22 Interactive calling method and device based on elastic waves


Publications (2)

Publication Number Publication Date
CN114690977A CN114690977A (en) 2022-07-01
CN114690977B true CN114690977B (en) 2023-11-21

Family

ID=82136279

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110436252.9A Active CN114690977B (en) 2021-04-22 2021-04-22 Interactive calling method and device based on elastic waves

Country Status (1)

Country Link
CN (1) CN114690977B (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105577913A * 2014-10-31 2016-05-11 LG Electronics Inc. Mobile terminal and method of controlling the same
CN106527825A * 2016-09-30 2017-03-22 Nanjing Renguang Electronic Technology Co., Ltd. Large-screen remote control interaction system and interaction method thereof
CN109960446A * 2017-12-25 2019-07-02 Huawei Device Co., Ltd. Method and terminal device for controlling display of a selected object in an application interface
CN209562594U * 2019-01-18 2019-10-29 Beijing Taifang Technology Co., Ltd. Smart device key control device
CN110413188A * 2018-04-28 2019-11-05 Beijing Taifang Technology Co., Ltd. Smart device control method and device
CN110543210A * 2005-03-04 2019-12-06 Apple Inc. Touch and force sensing device and system and method for sensing touch and force
WO2021043223A1 * 2019-09-06 2021-03-11 Huawei Technologies Co., Ltd. Split-screen display method and electronic device
WO2021063074A1 * 2019-09-30 2021-04-08 Huawei Technologies Co., Ltd. Method for split-screen display and electronic apparatus

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10296206B2 (en) * 2014-09-23 2019-05-21 Microsoft Technology Licensing, Llc Multi-finger touchpad gestures
KR102444920B1 (en) * 2014-11-20 2022-09-19 삼성전자주식회사 Device and control method thereof for resizing a window




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant