CN116661669A - Touch control method and electronic equipment - Google Patents

Touch control method and electronic equipment

Info

Publication number
CN116661669A
CN116661669A (application CN202210145207.2A)
Authority
CN
China
Prior art keywords
display screen
contact
contact pattern
electronic device
determining
Prior art date
Legal status
Pending
Application number
CN202210145207.2A
Other languages
Chinese (zh)
Inventor
丘林
张磊
王培鑫
Current Assignee
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd
Priority to CN202210145207.2A
Publication of CN116661669A

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14Digital output to display device ; Cooperation and interconnection of the display device with other functional units
    • G06F3/1454Digital output to display device ; Cooperation and interconnection of the display device with other functional units involving copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay

Abstract

A touch control method and an electronic device are provided. The method comprises the following steps: the electronic device detects that a first object is in contact with its display screen; while the first object remains in contact with the display screen, the electronic device detects that a second object is also in contact with the display screen; the electronic device determines a control command according to a first contact pattern formed by the first object on the display screen and a second contact pattern formed by the second object on the display screen; and the electronic device executes the control command. In other words, the user can control the electronic device by bringing two objects into contact with the display screen, which makes operation convenient.

Description

Touch control method and electronic equipment
Technical Field
The present application relates to the field of terminal technologies, and in particular, to a touch method and an electronic device.
Background
With the development of information technology, interactive terminal devices appear increasingly in scenarios such as conferences and content creation, and show a clear trend toward large screens. For example, in a conference joined by multiple large-screen terminal devices, multimedia information can be shared remotely among the devices.
At present, users mainly interact with large-screen terminal devices through a mobile phone or a tablet. However, this interaction method is limited by the hardware of the phone or tablet (such as insufficient battery power and a small screen), and the experience is poor. How to interact with large-screen terminal devices flexibly and quickly is therefore a problem worth considering.
Disclosure of Invention
The application aims to provide a touch control method and an electronic device, which help improve the user's interaction experience with large-screen terminal devices.
In a first aspect, a touch method is provided, applied to an electronic device that includes a display screen. By way of example, the electronic device may be a large-screen device, such as a smart screen. The method comprises: detecting that a first object is in contact with the display screen; while the first object remains in contact with the display screen, detecting that a second object is in contact with the display screen; determining a control command for the electronic device according to a first contact pattern of the first object on the display screen and a second contact pattern of the second object on the display screen; and executing the control command.
Taking a large-screen device (such as a smart screen) as an example, such a device is usually controlled with a remote control, which is not convenient enough. With the touch method provided by the application, a user can bring a first object and a second object into contact with the display screen of the electronic device to control it, which is convenient to operate.
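The method steps above can be sketched in code. The following Python sketch is illustrative only: the patent does not define an API, so all names here (`Contact`, `FIRST_PATTERN_TO_INSTRUCTION`, `on_touch`) and the pattern-to-instruction mapping are hypothetical, following the palm-to-Ctrl example used later in this document.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Contact:
    obj: str      # what is touching, e.g. "hand" or "stylus"
    pattern: str  # the contact pattern it forms, e.g. "palm", "fist", or a drawn letter

# Assumed mapping from the first object's contact pattern to a modifier-style
# first instruction (hypothetical values taken from the examples in this text).
FIRST_PATTERN_TO_INSTRUCTION = {"palm": "Ctrl", "fist": "Alt"}

def determine_control_command(first: Contact, second: Contact) -> Optional[str]:
    """Combine the held first-object pattern with the pattern drawn by the second."""
    instruction = FIRST_PATTERN_TO_INSTRUCTION.get(first.pattern)
    if instruction is None:
        return None  # unrecognized first contact pattern: no command
    return f"{instruction}+{second.pattern}"

def on_touch(first: Contact, second: Contact) -> Optional[str]:
    # The second contact is assumed to have occurred while the first object
    # was still on the screen; the touch driver would enforce that ordering.
    command = determine_control_command(first, second)
    if command is not None:
        execute(command)
    return command

def execute(command: str) -> None:
    print(f"executing {command}")  # stand-in for the real command handler
```

A palm held on the screen plus a drawn letter C would thus yield the command "Ctrl+C"; an unrecognized first pattern yields no command at all.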
Illustratively, the first object may include at least one of: a palm, a wrist, an arm, a finger joint, and an elbow joint. The first object may also be a stylus.
Illustratively, the second object is a finger or a stylus.
The first object and the second object may be of the same type or of different types. For example, the first object is the left palm and the second object is a right-hand finger or a stylus: the user's left palm contacts the display screen while the right hand holds the stylus and draws a figure on it. The electronic device can then determine a control command from the first contact pattern of the palm on the display screen and the second contact pattern drawn by the stylus, and execute that command. This operation mode needs no remote control device and is convenient.
In one possible design, executing the control command includes performing at least one of the following operations on the information displayed on the screen: copy, paste, cut, select all, partial selection, content search, window minimization, and window maximization.
For example, the user's left palm contacts the display screen while the right hand holds a stylus and draws on the screen. The electronic device determines that the first instruction corresponding to the palm's contact pattern is Ctrl and that the figure drawn by the stylus is the letter C, so the control command is determined to be "Ctrl+C" and the electronic device copies the information displayed on the screen. Thus, taking a large-screen device as an example, even when no keyboard is connected, a user can keep keyboard habits on the large-screen device, which gives a good experience.
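The mapping from a combined command to an editing action can be sketched as a dispatch table. This is a minimal sketch: the set of commands and the action names (`COMMAND_ACTIONS`, `dispatch`) are assumed for illustration and are not specified by the patent.

```python
# Hypothetical dispatch table from combined commands to editing actions
# performed on the currently displayed information.
COMMAND_ACTIONS = {
    "Ctrl+C": "copy",
    "Ctrl+V": "paste",
    "Ctrl+X": "cut",
    "Ctrl+A": "select_all",
    "Ctrl+F": "search",
}

def dispatch(first_instruction: str, drawn_letter: str) -> str:
    """Build the combined command and look up the corresponding action."""
    command = f"{first_instruction}+{drawn_letter.upper()}"
    return COMMAND_ACTIONS.get(command, "unknown")
```

With a palm mapped to Ctrl and a drawn letter c, `dispatch("Ctrl", "c")` resolves to the copy action, matching the Ctrl+C example above.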
In one possible design, determining the control command includes: when it is determined that the first object is a preset object and the duration of its contact with the display screen exceeds a threshold, determining the control command for the electronic device according to the first contact pattern of the first object and the second contact pattern of the second object. That is, the electronic device interprets the two contact patterns as a control command only when the first object is a preset object and its contact duration reaches the threshold. This avoids accidental control caused by the first object and the second object touching the display screen by mistake.
In one possible design, determining the control command includes: determining that the first object is a preset object, and, when at least one of the contact position, contact area, and contact gesture of the first object remains unchanged between a first moment and a second moment, determining the control command for the electronic device according to the first contact pattern and the second contact pattern; the first moment is when the second object starts contacting the display screen, and the second moment is when that contact ends. That is, the first object must remain stationary while the second object draws its pattern, because the electronic device recognizes the first contact pattern during this period; if the first object moved, the first contact pattern might change and the control command could not be determined accurately. Keeping the first object stationary while the second object draws therefore improves the accuracy of the control command.
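The stationarity check can be sketched as a comparison of contact samples taken between the first and second moments. The tolerances and the sample format below are assumptions for illustration; the patent does not specify them.

```python
def contact_unchanged(samples, tol_position=5.0, tol_area_frac=0.1):
    """Return True if the first object's contact stayed stable from the moment
    the second object touched down (first sample) to when it lifted (last sample).

    Each sample is a dict with assumed keys: x, y (contact position) and area.
    Tolerances absorb small sensor jitter rather than requiring exact equality.
    """
    first = samples[0]
    return all(
        abs(s["x"] - first["x"]) <= tol_position
        and abs(s["y"] - first["y"]) <= tol_position
        and abs(s["area"] - first["area"]) <= tol_area_frac * first["area"]
        for s in samples[1:]
    )
```

If the check fails, the first contact pattern is considered unreliable and no control command would be issued.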
In one possible design, determining the control command includes: determining an instruction set corresponding to the first object according to its object type; determining, in that instruction set, a first instruction corresponding to the first contact pattern; and determining the control command for the electronic device according to the first instruction and the second contact pattern. For example, the object types of the first object include two types, hand and stylus, and the two have different instruction sets. The instruction set for the hand includes: the contact pattern of a palm corresponds to the first instruction Ctrl, and the contact pattern of a fist corresponds to Alt. The instruction set for the stylus includes: a stylus lying flat on the display screen at a 90-degree angle to the screen edge corresponds to Ctrl, and a stylus lying flat at a 0-degree angle to the screen edge corresponds to Alt. In this way, the user can trigger different control functions by bringing different types of objects into contact with the display screen.
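The two-level lookup (object type, then contact pattern) can be sketched as nested dictionaries. The table contents mirror the hand/stylus example above; the structure and names are hypothetical.

```python
# Hypothetical per-object-type instruction sets. For a hand, the key is the
# contact-pattern name; for a stylus, the key is the angle (in degrees)
# between the flat-lying stylus and the screen edge.
INSTRUCTION_SETS = {
    "hand": {"palm": "Ctrl", "fist": "Alt"},
    "stylus": {90: "Ctrl", 0: "Alt"},
}

def first_instruction(object_type, contact_pattern):
    """Look up the instruction set for the object type, then the pattern."""
    instruction_set = INSTRUCTION_SETS.get(object_type, {})
    return instruction_set.get(contact_pattern)
```

Two different object types can thus trigger the same instruction through different contact patterns.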
In one possible design, the first object is a body part of the user, and determining the control command includes: identifying the user's identity; determining the instruction set corresponding to that user according to the identity; determining, in that instruction set, the first instruction corresponding to the first contact pattern; and determining the control command for the electronic device according to the first instruction and the second contact pattern.
For example, user A and user B have different instruction sets. User A's instruction set includes: the contact pattern of a palm on the display screen corresponds to Ctrl. User B's instruction set includes: the contact pattern of a palm corresponds to Alt, and the contact pattern of a fist corresponds to Ctrl. Suppose user A wants the electronic device to execute a "copy" command: user A places the left palm on the display screen and draws the letter C with the right hand. The electronic device recognizes the palm's contact pattern, looks up the corresponding first instruction Ctrl in user A's instruction set, determines the control command "Ctrl+C", and executes it to perform the copy. Suppose user B also wants a "copy" command: user B places the left fist on the display screen and draws the letter C with the right hand. The electronic device recognizes the fist's contact pattern, looks up Ctrl in user B's instruction set, and likewise determines and executes "Ctrl+C". Thus, for the same control function (e.g., copy), different users can touch the display screen in different ways.
In one possible design, the method further comprises: receiving an input operation of a user, and setting the instruction set corresponding to that user according to the input operation. That is, different users can configure their own instruction sets according to their operating habits.
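Per-user instruction sets and user customization can be sketched together. The user identifiers, the table contents (taken from the user A/user B example above), and the function names are all assumptions for illustration.

```python
# Hypothetical per-user instruction sets, mirroring the user A / user B example.
USER_INSTRUCTION_SETS = {
    "user_a": {"palm": "Ctrl"},
    "user_b": {"palm": "Alt", "fist": "Ctrl"},
}

def set_instruction_set(user_id, mapping):
    """Let a user register a pattern-to-instruction mapping of their own."""
    USER_INSTRUCTION_SETS[user_id] = dict(mapping)

def control_command(user_id, first_pattern, drawn_letter):
    """Resolve the first instruction via the identified user, then combine."""
    instruction = USER_INSTRUCTION_SETS.get(user_id, {}).get(first_pattern)
    if instruction is None:
        return None
    return f"{instruction}+{drawn_letter.upper()}"
```

Both users reach "Ctrl+C" through different contact patterns, and a new user can register a mapping matching their own habits.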
In one possible design, the first object is a body part of the user, and determining the control command includes: determining, from the first contact pattern, the user's contact gesture on the display screen, the gesture including at least one of making a fist, opening the palm, closing the palm, and bending a finger; determining the first instruction according to the contact gesture; and determining the control command for the electronic device according to the first instruction and the second contact pattern. That is, different contact gestures correspond to different first instructions. For example, a palm on the display screen corresponds to Ctrl, while a fist corresponds to Alt.
In one possible design, the first object is a stylus, and determining the control command includes: determining from the first contact pattern that the stylus lies flat against the display screen at a first angle to the screen edge; determining the first instruction according to the first angle; and determining the control command for the electronic device according to the first instruction and the second contact pattern. That is, when the stylus lies flat on the display screen, different first angles between the stylus and the screen edge correspond to different first instructions. For example, when the first angle is 180 degrees, the first instruction is Ctrl; when the first angle is 90 degrees, the first instruction is Alt.
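The angle-to-instruction mapping can be sketched with a tolerance, since a user will rarely place a stylus at an exact angle. The angle values come from the example above; the tolerance and function name are assumptions.

```python
# Hypothetical angle-to-instruction table, from the 180/90-degree example.
ANGLE_TO_INSTRUCTION = [(180, "Ctrl"), (90, "Alt")]

def stylus_instruction(angle_degrees, tolerance=10.0):
    """Map the angle between a flat-lying stylus and the screen edge to a
    first instruction, within an assumed tolerance band around each target."""
    for target, instruction in ANGLE_TO_INSTRUCTION:
        if abs(angle_degrees - target) <= tolerance:
            return instruction
    return None  # angle matches no configured instruction
```

An angle of 178 degrees thus still maps to Ctrl, while an angle far from any target maps to nothing.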
In one possible design, the method further comprises: displaying an icon corresponding to the first instruction. It should be appreciated that different first contact patterns correspond to different first instructions: for example, to trigger Ctrl the user must place a palm on the display screen, because the palm's contact pattern corresponds to Ctrl, while to trigger Alt the user must place a fist on the screen, because the fist's contact pattern corresponds to Alt. The user would therefore have to memorize the contact pattern for each instruction, which demands a lot of the user. In the embodiment of the application, after the first object contacts the electronic device, the device displays an icon for the first instruction corresponding to the first contact pattern, and the user can check from the icon whether the triggered instruction is the intended one, which relieves the user's memory burden.
In one possible design, when the first contact pattern corresponds to a plurality of first instructions, the method further includes: displaying a plurality of icons corresponding to those first instructions. Determining the control command then includes: receiving an operation selecting a first icon among the plurality of icons; and determining the control command based on the first instruction corresponding to the selected icon and the second contact pattern. That is, when several first instructions match the first contact pattern, an icon is displayed for each, the user selects one, and the instruction behind the selected icon becomes the final first instruction. In this way, first contact patterns and first instructions need not correspond one-to-one, so the user does not have to memorize a distinct contact pattern for every first instruction, which relieves the memory burden.
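The disambiguation flow (ambiguous pattern, candidate icons, user selection) can be sketched as follows. The candidate table and function names are hypothetical; the patent specifies only the behavior, not the data structures.

```python
# Hypothetical table where one contact pattern may match several instructions.
PATTERN_TO_INSTRUCTIONS = {
    "palm": ["Ctrl", "Shift"],  # ambiguous: two candidate icons are shown
    "fist": ["Alt"],
}

def icons_to_display(first_pattern):
    """Candidate first instructions whose icons should be displayed."""
    return PATTERN_TO_INSTRUCTIONS.get(first_pattern, [])

def resolve_command(first_pattern, selected_index, drawn_letter):
    """Combine the user-selected candidate with the drawn second pattern."""
    candidates = icons_to_display(first_pattern)
    if not candidates:
        return None
    return f"{candidates[selected_index]}+{drawn_letter.upper()}"
```

If the pattern matches only one instruction, the selection step is trivial; if it matches several, the user's icon choice picks the final first instruction.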
In one possible design, the method further comprises: when it is determined that the duration of the first object's contact with the display screen exceeds a preset duration, waking up a pattern recognition engine in the electronic device, where the engine is used to recognize the first contact pattern of the first object and the second contact pattern of the second object. In this way, the pattern recognition engine need not stay in a working state at all times, which saves power.
In one possible design, the method further comprises: when it is detected that the first object has stopped contacting the display screen, controlling the pattern recognition engine to enter a dormant state; or, if the first object does not contact the display screen again within a preset time after it stops contacting the screen, controlling the pattern recognition engine to enter the dormant state. That is, the engine may be put to sleep immediately when the first object leaves the screen, or only after a waiting period during which the first object has not touched the screen again.
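The wake/sleep policy described in the last two designs can be sketched as a small state machine driven by touch events and a periodic tick. The class name, timing constants, and event interface are all assumptions for illustration.

```python
class PatternRecognitionEngine:
    """Sketch of the wake/sleep policy: wake once the first object has been
    held on the screen longer than wake_after seconds; sleep once it has
    been off the screen for sleep_after seconds (the delayed variant)."""

    def __init__(self, wake_after=0.5, sleep_after=2.0):
        self.wake_after = wake_after
        self.sleep_after = sleep_after
        self.awake = False
        self._touch_start = None
        self._touch_end = None

    def on_contact(self, now):
        # First object touched down: start timing the hold.
        self._touch_start = now
        self._touch_end = None

    def on_release(self, now):
        # First object lifted: start timing the absence.
        self._touch_end = now

    def tick(self, now):
        """Re-evaluate the engine state; returns whether it is awake."""
        if self._touch_start is not None and self._touch_end is None:
            if now - self._touch_start >= self.wake_after:
                self.awake = True
        elif self._touch_end is not None:
            if now - self._touch_end >= self.sleep_after:
                self.awake = False
        return self.awake
```

Setting `sleep_after=0` would give the immediate-sleep variant; a positive value gives the grace period in which a returning first object keeps the engine awake.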
In a second aspect, an electronic device is also provided, comprising:
a detection unit, configured to detect that a first object contacts a display screen of the electronic device;
the detection unit being further configured to detect, while the first object is in contact with the display screen, that a second object contacts the display screen; and
a processing unit, configured to determine a control command for the electronic device according to the first contact pattern of the first object and the second contact pattern of the second object, and to execute the control command.
Optionally, the detection unit may be a touch display screen on the electronic device, such as an infrared or capacitive touch display screen; the application does not limit this. The processing unit may be a processor in the electronic device: a central processing unit (CPU), a graphics processor (GPU), or a processor other than a CPU or GPU, for example a pattern recognition engine implemented as a device independent of the CPU and GPU. For a description of the pattern recognition engine, refer to the related description of the first aspect; details are not repeated here.
In a third aspect, there is also provided an electronic device, comprising:
a processor and a memory;
the memory is for storing one or more computer programs, the one or more computer programs comprising computer-executable instructions that, when executed by the electronic device, cause the electronic device to perform the method as provided in the first aspect above.
Optionally, the electronic device further comprises other components, such as an antenna, an input-output module, an interface, etc. These components may be hardware, software, or a combination of software and hardware.
In a fourth aspect, there is also provided a computer readable storage medium for storing a computer program which, when run on a computer, causes the computer to perform the method as described in the first aspect above.
In a fifth aspect, embodiments of the present application provide a computer program product storing a computer program comprising program instructions which, when executed by a computer, cause the computer to perform the method as described in the first aspect above.
In a sixth aspect, the present application provides a chip system including a processor and a memory, configured to implement the method according to the first aspect. The chip system may consist of a chip, or may include a chip and other discrete devices.
For the advantageous effects of the second to sixth aspects and their implementations, refer to the description of the advantageous effects of the method of the first aspect and its implementations.
Drawings
Fig. 1 is a schematic diagram of an application scenario provided in an embodiment of the present application;
FIG. 2 is a schematic diagram illustrating an operation of a large screen device according to an embodiment of the present application;
FIG. 3 is a schematic diagram illustrating another operation of a large screen device according to an embodiment of the present application;
fig. 4 is a schematic flow chart of a touch method according to an embodiment of the application;
FIG. 5 is a schematic view of a first object contacting a large screen device according to an embodiment of the present application;
FIG. 6 is another schematic view of a first object contacting a large screen device according to an embodiment of the present application;
FIGS. 7-8 are schematic diagrams illustrating a first object contacting a large screen device according to an embodiment of the present application;
fig. 9 is a schematic flow chart of a touch method according to an embodiment of the application;
FIG. 10 is a schematic diagram of a display interface when a large-screen device according to an embodiment of the present application cannot recognize graphics;
fig. 11 and fig. 12 are schematic diagrams illustrating an exemplary user touch procedure according to an embodiment of the application;
fig. 13 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
fig. 14 is a schematic structural diagram of an electronic device according to an embodiment of the application.
Detailed Description
In the following, some terms in the embodiments of the present application are explained for easy understanding by those skilled in the art.
In the embodiments of the present application, "at least one" means one or more, and "a plurality" means two or more. In addition, the words "first," "second," and the like are used solely to distinguish between descriptions and do not indicate or imply relative importance or order. For example, "first object" and "second object" do not express the importance or ordering of the two objects; they merely distinguish them. In the embodiments of the present application, "and/or" describes an association relationship covering three cases; for example, "A and/or B" may mean: A alone, both A and B, or B alone. The character "/" generally indicates an "or" relationship between the associated objects.
In describing the embodiments of the present application, it should be noted that, unless explicitly stated and limited otherwise, the terms "mounted" and "connected" should be construed broadly; for example, "connected" may mean removably or non-removably connected, and directly connected or indirectly connected through an intermediate medium. Directional terms such as "upper," "lower," "left," "right," "inner," and "outer" refer only to the orientation of the drawings; they are used to describe the embodiments more clearly and do not indicate or imply that the devices or elements must have a specific orientation or be constructed and operated in a specific orientation, and therefore should not be construed as limiting the embodiments. "Plurality" means at least two.
Reference in the specification to "one embodiment" or "some embodiments" or the like means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the specification. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," and the like in the specification are not necessarily all referring to the same embodiment, but mean "one or more but not all embodiments" unless expressly specified otherwise. The terms "comprising," "including," "having," and variations thereof mean "including but not limited to," unless expressly specified otherwise.
The touch method provided by the embodiments of the present application can be applied to an electronic device with a touch screen: for example, a portable electronic device such as a notebook or tablet computer, a smart-home device such as a television, or a screen-casting client such as a smart screen. The embodiments do not limit the type of electronic device. In some embodiments, the method may be applied to a large-screen device (also called a large-screen terminal device), such as a television, a projector, or a smart screen.
The application scenario provided by the embodiment of the application is described below with reference to the accompanying drawings.
As an example, fig. 1 is a schematic diagram of an application scenario provided by an embodiment of the present application. Fig. 1 shows a conference scenario that includes an electronic device 100, an electronic device 200, a participant 300, and other participants. As an example, the electronic device 100 is a notebook computer and the electronic device 200 is a screen-casting client (e.g., a smart screen).
In one possible implementation, the electronic device 100 (the notebook computer of participant 300) can cast its screen to the electronic device 200, so that the electronic device 200 displays the content shown on the electronic device 100 and participant 300 can present the conference material on the larger screen of the electronic device 200.
Generally, participant 300 needs to operate the electronic device 100 to control the electronic device 200. In one possible implementation, the electronic device 200 has a touch screen, and participant 300 can then perform touch operations directly on it without using the electronic device 100.
For example, referring to fig. 2, a plurality of icons are displayed on the electronic device 200, and the user can click an icon to trigger the corresponding function. For instance, when the electronic device 200 detects that the user clicks a meeting icon, a meeting is created or joined; when it detects a click on a whiteboard icon, the user can write or draw on the display screen; and when it detects a click on the screen-casting icon, the device casts its screen to other devices or receives a screen cast from them. Although fig. 2 shows only three icons on the display screen of the electronic device 200, more may in fact be included. If there are many icons, a user who wants a certain function must first find the target icon among them and then click it, which is cumbersome and hurts the user experience.
For another example, referring to fig. 3, a user may perform a gesture operation on the display screen of the electronic device, such as a zoom-in operation (a sliding operation in which two fingers move apart), a zoom-out operation (a sliding operation in which two fingers move together), a multi-finger touch operation, a finger-joint touch operation, and so on. However, such single-handed gesture operations can realize only a limited set of functions.
It should be noted that, with continued reference to fig. 1, the user (i.e., the participant 300) normally uses the notebook computer (i.e., the electronic device 100) to control the large-screen device (i.e., the electronic device 200). If the user instead operates directly on the display screen of the large-screen device, the user will inevitably follow the operation habits formed on the notebook computer. For example, users are accustomed to combination commands such as Ctrl+V and Ctrl+C when using a notebook computer. These combination commands are not available when operating on the display screen of a large-screen device.
In view of this, an embodiment of the application provides a touch control method. In the method, the electronic device detects that a first object contacts the display screen and, while the first object is in contact with the display screen, detects that a second object contacts the display screen. Then, the electronic device determines a control command based on the first contact pattern of the first object with the display screen and the second contact pattern of the second object with the display screen, and executes the control command. In this way, the user can bring two objects into contact with the display screen, forming two contact patterns (a first contact pattern and a second contact pattern), and control of the electronic device is achieved based on the two contact patterns. For example, if the first contact pattern corresponds to Ctrl and the second contact pattern is the letter C drawn by the user, the control command is determined to be Ctrl+C, and the electronic device then executes Ctrl+C to copy the displayed information. It can be understood that, taking a large-screen device (e.g., a smart screen) as an example, with the touch method provided by the application, a user can carry over operation habits from the notebook computer (e.g., the habit of using combination commands such as Ctrl+V and Ctrl+C) to the large-screen device.
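The two-pattern flow described above can be sketched as follows. This is a minimal illustration, not the patented implementation; the pattern names and the mapping table are assumptions for the example.

```python
# Hypothetical mapping from the first contact pattern to a modifier instruction,
# e.g. a held palm selects Ctrl, a held fist selects Alt.
FIRST_PATTERN_TO_INSTRUCTION = {"palm": "Ctrl", "fist": "Alt"}

def determine_control_command(first_contact_pattern, second_contact_pattern):
    """Combine the first instruction with the drawn pattern, e.g. palm + C -> Ctrl+C."""
    instruction = FIRST_PATTERN_TO_INSTRUCTION.get(first_contact_pattern)
    if instruction is None:
        return None  # no matching first instruction; the device may prompt the user
    return f"{instruction}+{second_contact_pattern}"

print(determine_control_command("palm", "C"))  # Ctrl+C
```

In this sketch the held contact acts like a modifier key, and the drawn letter acts like the key press that completes the shortcut.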
Example 1
Fig. 4 is a schematic flow chart of a touch method according to an embodiment of the application. The method may be performed by an electronic device, which may be a device having a touch display screen, and may be a large screen device, such as electronic device 200 in fig. 1. As shown in fig. 4, the process includes:
S1, detecting that a first object contacts the display screen.
The first object may be a body part of a person or a stylus.
Taking the example in which the first object is a body part of a person, the first object may include at least one of a palm, a wrist, an arm, a knuckle, an elbow joint, and the like. Taking the palm as an example, the palm contacting the display screen may include: the palm side (or part of it) contacting the display screen, the back of the hand (or part of it) contacting the display screen, or the side of the hand (or part of it) contacting the display screen.
Taking the example in which the first object is a stylus, the first object may be a common stylus or an electronic stylus. A common stylus is, for example, a 2B pencil. The electronic stylus may include an inertial measurement unit (Inertial measurement unit, IMU) capable of detecting the gesture of the electronic stylus.
There are a number of ways in which the electronic device may detect that the first object is in contact with the display screen, such as at least one of modes 1 and 2. Mode 1: the display screen of the electronic device is an infrared touch display screen. In an infrared touch display screen, infrared matrices are densely arranged in different directions close to the front of the screen, and the screen continuously scans whether the infrared rays are blocked by an object. If it is detected that the infrared rays are blocked by an object, it is determined that the object touches the display screen. The infrared touch screen can also recognize the circumscribed rectangle of the portion where the object contacts the screen, so as to identify what kind of object is in contact. Mode 2: the display screen of the electronic device is a capacitive touch display screen. For example, when a user's palm touches a capacitive touch screen, the user and the touch screen surface form a coupling capacitance due to the conductivity of the human body. The finger touching the display screen draws a small current from the contact point; this current flows out through the electrodes at the four corners of the touch screen, and the current flowing through each electrode is related to the distance from the finger to that corner. The electronic device can calculate the distances from the finger to the four corners from the four currents, thereby obtaining the position of the touch point. It will be appreciated that there may be other ways of detecting that the first object is in contact with the display screen, which the application does not enumerate.
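The four-corner current computation of mode 2 can be illustrated as follows. This is a simplified sketch under the assumption that the position is proportional to the ratio of currents flowing to each pair of corners; a real touch controller applies calibration and filtering on top of this.

```python
# Illustrative only: estimate the touch position on a capacitive screen from the
# four corner currents. More current drawn toward a corner means the touch point
# is closer to that corner.
def touch_position(i_tl, i_tr, i_bl, i_br, width, height):
    total = i_tl + i_tr + i_bl + i_br
    x = width * (i_tr + i_br) / total   # more current to the right corners -> larger x
    y = height * (i_bl + i_br) / total  # more current to the bottom corners -> larger y
    return x, y

print(touch_position(1.0, 1.0, 1.0, 1.0, 100, 100))  # equal currents -> screen center
```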
S2, in the process that the first object is contacted with the display screen, the second object is detected to be contacted with the display screen.
In this case, the term "during the contact of the first object with the display screen" is understood to mean that the first object remains in continuous contact with the display screen and does not leave it.
In some embodiments, the second object may be a body part of a person, or may be a stylus. Taking a human body part as an example, it may include a finger, a finger joint, a wrist, an elbow joint, and the like. Taking a stylus as an example, a normal stylus or an electronic stylus may be included. It should be noted that the first object and the second object may be different objects. Taking the example that the first object is a hand and the second object is a handwriting pen, namely, one hand of a user is contacted with the display screen, and the other hand holds the handwriting pen to be contacted with the display screen in the process of contacting the hand with the display screen. Alternatively, taking the first object as a stylus and the second object as a hand, i.e., the user holds the stylus with one hand to bring the stylus into contact with the display, during which the finger of the other hand contacts the display (e.g., draws a graphic on the display). Alternatively, taking the example that the first object is one hand and the second object is the other hand, for example, the left hand of the user is in contact with the display screen, and the finger of the right hand is in contact with the display screen during the contact of the left hand with the display screen. Alternatively, taking the first object as a stylus and the second object as another stylus, for example, the user holds one stylus with his left hand to make the stylus contact with the display, and holds the other stylus with his right hand to make the stylus contact with the display in the process of making the stylus contact with the display.
S3, determining a control command for the electronic equipment according to the first contact pattern of the first object and the display screen and the second contact pattern of the second object and the display screen.
Taking the example that the second contact pattern is a pattern (for example, letter C or letter V) drawn on the display screen by the second object, S3 may include: the electronic device determines a first instruction according to the first contact pattern and determines a control command according to the first instruction and the second contact pattern. For example, it is determined that the first instruction is Ctrl, the second contact pattern is C, and the control command is ctrl+c according to the first contact pattern.
The manner of determining the first instruction by the electronic device according to the first contact pattern includes, but is not limited to, at least one of the following manners one to four.
The first mode is that the electronic equipment determines an instruction set corresponding to a first object according to the object type of the first object; and determining a first instruction corresponding to the first contact pattern in the instruction set.
That is, the instruction sets corresponding to different object types may be different. Taking two kinds of objects, i.e., a hand and a stylus, as examples, the instruction set corresponding to the hand includes: the first instruction corresponding to the contact pattern of the palm is Ctrl, and the first instruction corresponding to the contact pattern of the fist is Alt. The instruction set corresponding to the stylus includes: the first instruction corresponding to the stylus being in parallel contact with the display screen at an included angle of 90 degrees with the edge line of the display screen is Ctrl, and the first instruction corresponding to the stylus being in parallel contact with the display screen at an included angle of 0 degrees with the edge line of the display screen is Alt. After the electronic device detects that the first object contacts the display screen, it can determine the type of the first object, then determine the instruction set corresponding to that type, and determine the first instruction corresponding to the first contact pattern in the instruction set.
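Mode one can be sketched as a two-level lookup: the object type selects an instruction set, and the contact pattern selects the instruction within it. The keys below are hypothetical labels for the patterns described in the text.

```python
# Hypothetical per-object-type instruction sets, as in mode one.
INSTRUCTION_SETS = {
    "hand":   {"palm": "Ctrl", "fist": "Alt"},
    "stylus": {"parallel_90deg": "Ctrl", "parallel_0deg": "Alt"},
}

def first_instruction(object_type, contact_pattern):
    """Look up the first instruction by object type, then by contact pattern."""
    return INSTRUCTION_SETS.get(object_type, {}).get(contact_pattern)

print(first_instruction("hand", "palm"))  # Ctrl
```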
In other words, the first instruction is related to an object type of the first object. The different kinds of objects are different from the first instructions corresponding to the first contact patterns of the display screen. For example, the first instruction corresponding to the first contact pattern of the display screen by the hand is ctrl, and the first instruction corresponding to the first contact pattern of the display screen by the stylus is Alt. Thus, the user may implement different control functions by contacting different types of objects with the display screen.
In a second mode, taking the example in which the first object is a body part of the user, the electronic device can identify the user identity, and determine the instruction set corresponding to the user according to the user identity; a first instruction corresponding to the first contact pattern is then determined in the instruction set.
That is, the instruction sets corresponding to different users are different. Taking a user A and a user B as an example, the instruction set corresponding to user A includes: the first instruction corresponding to the contact pattern of the palm with the display screen is Ctrl. The instruction set corresponding to user B includes: the first instruction corresponding to the contact pattern of the palm with the display screen is Alt, and the first instruction corresponding to the contact pattern of the fist with the display screen is Ctrl. Therefore, after the electronic device detects that the first object contacts the display screen, it can identify the user's identity, determine the instruction set corresponding to that user, and determine the first instruction corresponding to the first contact pattern in the instruction set. There are various ways of identifying the user, for example, biometric identification, user-habit identification, and the like. Biometric features include fingerprints, faces, and the length, width, shape, and size of the palm. Taking face recognition as an example, a camera may be arranged on the electronic device; the camera collects images, the faces in the images are recognized, and the user identity corresponding to a face is looked up in a face database. It should be understood that other ways of identification are possible, which the application does not enumerate.
In some embodiments, the electronic device may receive an input operation by a user, and set an instruction set corresponding to the user according to the input operation. That is, different users can set corresponding instruction sets according to their own operation habits. For example, user a and user B may set their instruction sets, respectively.
Continuing with the example of user A and user B, assuming that user A wants the electronic device to execute a "copy" control command, the left hand palm of user A may be brought into contact with the display screen, and the right hand draws letter C on the display screen. Because the electronic device identifies the first contact pattern between the palm of the left hand of the user a and the display screen, and determines that the first instruction corresponding to the first contact pattern is Ctrl in the instruction set corresponding to the user a, the electronic device can determine that the control command is "ctrl+c", and execute the "ctrl+c" to implement the copy function. Assuming that user B also wants the electronic device to execute a "copy" control command, the left hand fist of user B may be brought into contact with the display screen, and the right hand drawing letter C on the display screen. Because the electronic device identifies the first contact pattern of the fist of the left hand of the user B and the display screen, and determines that the first instruction corresponding to the first contact pattern is Ctrl in the instruction set corresponding to the user B, the control command may be determined to be "ctrl+c", and the "ctrl+c" is executed to implement the copy function. Thus, for the same control function (e.g., copy function), the manner in which different users contact the display screen is different.
In a third mode, taking the first object as an example of a body part of the user, the electronic device may determine a contact gesture between the user and the display screen according to the first contact pattern, and determine the first instruction according to the contact gesture.
That is, the first instruction relates to a gesture of the first object when in contact with the display screen. When the first object contacts the display screen in different postures, the generated first contact patterns are different, and the corresponding first instructions are different. Illustratively, the contact gesture includes at least one of a fist, palm opening, palm merging, finger bending. For example, referring to fig. 5 (a), when the user touches the display screen in the palm open state, the corresponding first instruction is ctrl. For another example, referring to fig. 5 (b), the first instruction corresponding to the touch of the display screen in the palm merging state is Alt. For another example, the corresponding first instruction when the user touches the display screen in a fist-making state (not shown in the figure) is shift.
In a fourth mode, taking the case that the first object is a handwriting pen as an example, the electronic device can determine that the handwriting pen is in parallel contact with the display screen according to the first contact pattern, and an included angle between the handwriting pen and an edge line of the display screen is a first angle; the first instruction is determined according to the first angle.
That is, when the stylus is in parallel contact with the display screen, different included angles between the stylus and the edge line of the display screen correspond to different first instructions. Here, "the stylus is in parallel contact with the display screen" means that the barrel of the stylus lies flat against the display screen. For example, referring to fig. 6 (a), the stylus is in parallel contact with the display screen and the angle between the stylus and the edge 601 of the display screen (i.e., the first angle) is 90 degrees, and the corresponding first instruction is Ctrl. For another example, referring to fig. 6 (b), the stylus is in parallel contact with the display screen and the angle between the stylus and the edge 601 of the display screen (i.e., the first angle) is 0 degrees, and the corresponding first instruction is Alt.
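Mode four can be sketched by deriving the first angle from the two endpoints of the stylus's contact region and mapping it to an instruction. The tolerance and the endpoint representation are assumptions for illustration; the patent does not specify them.

```python
import math

# Angle-to-instruction mapping per figs. 6(a) and 6(b): 90 degrees -> Ctrl, 0 -> Alt.
ANGLE_TO_INSTRUCTION = {90: "Ctrl", 0: "Alt"}

def stylus_first_instruction(tip, tail, tolerance_deg=10):
    """Compute the angle of the flat-lying stylus relative to the screen edge
    (assumed horizontal) and map it to a first instruction."""
    dx, dy = tail[0] - tip[0], tail[1] - tip[1]
    angle = abs(math.degrees(math.atan2(dy, dx))) % 180  # fold to 0..180
    for target, instruction in ANGLE_TO_INSTRUCTION.items():
        if abs(angle - target) <= tolerance_deg:
            return instruction
    return None  # angle not close to any configured value

print(stylus_first_instruction((0, 0), (0, 5)))  # vertical stylus -> Ctrl
```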
In some embodiments, before performing S3, the method may further include the following step: displaying an icon corresponding to the first instruction. For example, referring to fig. 7, if the electronic device detects that the user's hand touches the display screen and determines that the first instruction is Ctrl, a Ctrl icon may be displayed. In this way, the user can confirm whether the invoked instruction is accurate; if it is not the desired instruction, the user can touch the display screen again (e.g., in another gesture) to invoke the desired instruction. This reduces the burden on the user of memorizing the contact pattern corresponding to each first instruction.
It is understood that there may be one or more first instructions; if there are a plurality of first instructions, for example Ctrl and Alt, a plurality of icons corresponding to the plurality of first instructions may be displayed. For example, referring to fig. 8, when the electronic device detects that the user's hand touches the display screen and determines that the corresponding first instructions include Ctrl and Alt, a Ctrl icon and an Alt icon are displayed. The user may click one of the icons; for example, if the electronic device detects that the user clicks on the Ctrl icon, it determines that the first instruction is Ctrl.
S4, executing the control command.
Illustratively, the control commands include at least one of the following examples:
for example, the first instruction is ctrl, the second contact pattern is letter C, the control command is ctrl+c, and when the electronic device executes the control command, the display information on the display screen is copied.
For another example, the first instruction is ctrl, the second contact pattern is letter V, the control command is ctrl+v, and when the electronic device executes the control command, the display information is pasted on the display screen.
For another example, the first instruction is ctrl, the second contact pattern is letter X, the control command is ctrl+x, and when the electronic device executes the control command, the display information on the display screen is clipped.
For another example, the first instruction is windows, the second contact pattern is letter F, and the control command is windows+f, and when the electronic device executes the control command, a content search is performed on the display screen, such as searching for a file, searching for a keyword, and the like.
For another example, the first instruction is Alt, the second contact pattern is letter N, and the control command is alt+n, and when the electronic device executes the control command, the window on the display screen is minimized.
For another example, the first instruction is Alt, the second contact pattern is the letter X, the control command is alt+x, and the window on the display screen is maximized when the electronic device executes the control command.
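The command-to-action examples above can be summarized as a dispatch table. The handler descriptions below restate the source's examples; the table structure and the fallback prompt are illustrative, not the patented implementation.

```python
# Dispatch table for the combined control commands listed above.
COMMAND_ACTIONS = {
    "Ctrl+C":    "copy display information",
    "Ctrl+V":    "paste display information",
    "Ctrl+X":    "cut display information",
    "Windows+F": "search content on the display screen",
    "Alt+N":     "minimize the window",
    "Alt+X":     "maximize the window",
}

def execute(command):
    """Execute a recognized command, or prompt the user as in S914."""
    action = COMMAND_ACTIONS.get(command)
    if action is None:
        return "unable to recognize, please re-enter"
    return action

print(execute("Ctrl+C"))  # copy display information
```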
Example 2
The second example is a refinement of the first. Specifically, the electronic device includes a pattern recognition engine. The pattern recognition engine is used for recognizing the first contact pattern of the first object with the display screen and the second contact pattern of the second object with the display screen. In order to save power consumption, the pattern recognition engine is typically in a sleep state, and is awakened to recognize the first contact pattern and the second contact pattern when a wake-up condition (introduced in fig. 9) is satisfied.
By way of example, fig. 9 shows another flow chart of the touch method according to an embodiment of the application. As shown in fig. 9, the flow of the method includes:
S901, judging whether a first object is in contact with a display screen; if so, S902 is performed.
S902, it is determined whether or not a wake-up condition is satisfied, and if so, S903 is executed.
As an example, S902 includes: judging whether the first object is a preset object or not; if so, S903 is performed. The electronic device may determine whether the first object is a preset object through the camera. The preset object is, for example, a stylus or a hand.
As another example, S902 includes: judging whether the duration of continuous contact between the first object and the display screen is greater than a threshold; if so, S903 is performed. For example, the electronic device records the contact start time as 12:00:00. Taking the threshold as 10 s as an example, if the first object is still in contact at 12:00:10, the contact duration is determined to be greater than the threshold, and S903 is performed. The threshold may be any duration such as 3 s, 5 s, or 10 s, and may be set at the factory or customized by the user.
That is, the wake-up conditions of the pattern recognition engine include: the first object is a preset object, and/or the duration for which the first object contacts the display screen is greater than a threshold. Optionally, the wake-up condition may further include the first object remaining stationary, e.g., at least one of the contact position, contact area, and contact gesture of the first object with the display screen being unchanged.
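The wake-up check of S902 can be sketched as follows. The preset-object set and the threshold value are assumed for illustration; as noted above, either condition alone may suffice to wake the engine.

```python
# Assumed set of preset objects that wake the pattern recognition engine.
PRESET_OBJECTS = {"hand", "stylus"}

def should_wake(object_type, contact_duration_s, threshold_s=5.0):
    """Wake if the object is a preset object and/or its contact duration
    exceeds the threshold (S902)."""
    return object_type in PRESET_OBJECTS or contact_duration_s > threshold_s

print(should_wake("hand", 0.0))  # True: preset object wakes the engine immediately
```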
S903, waking up the pattern recognition engine.
Optionally, in order to better enhance the interactive experience, after the electronic device wakes up the pattern recognition engine, it may output a prompt message. For example, the prompt may be a text message, such as: "please keep this object stationary and draw a graphic on the display screen with another object". The user can then perform an accurate operation based on the prompt. It should be understood that, besides text, the prompt may also take the form of image information, sound, vibration, etc.; the specific form is not limited.
S904, judging whether a second object contacts the display screen, and if so, executing S905.
S905, the second object contact start time T1 is recorded.
S906, recording a second contact pattern of the second object.
S907, it is determined whether the first object stops touching the screen, and if so, S908 is performed.
S908, the first object contact stop time T2 is recorded.
S907 to S908 above may be executed by the touch screen and/or the processor.
S909, determining whether the first object is stationary within the T2-T1 period, if so, executing S910.
For example, if at least one of the contact position, the contact area, and the contact posture of the first object with the display screen is unchanged, it is determined that the first object is stationary. One implementation is that the electronic device obtains a first contact pattern of the first object with the display screen, and determines that the first object is stationary if the pattern is unchanged.
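The stationary determination of S909 can be sketched by comparing snapshots of the first contact pattern taken during the period: if all snapshots are identical (unchanged position, area, and gesture), the first object is considered stationary. The snapshot representation is an assumption for the example.

```python
def is_stationary(pattern_snapshots):
    """First object is stationary if every sampled contact pattern is identical."""
    return len(set(pattern_snapshots)) <= 1

print(is_stationary(["palm", "palm", "palm"]))  # True: pattern unchanged
```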
It should be noted that, while the second object draws a pattern on the display screen, the first object needs to remain stationary. This is because the electronic device needs to identify the first contact pattern between the first object and the display screen during this period (i.e., while the second object draws the pattern); if the first contact pattern changes due to movement of the first object, the control command cannot be accurately determined.
S910, identifying a first contact pattern of the first object and a second contact pattern of the second object in the T2-T1 time period. For example, the first contact pattern may be a palm pattern. The second contact pattern is the letter C.
S911, matching the first instruction corresponding to the first contact pattern in the database.
S912, judging whether a control command corresponding to the first instruction and the second contact pattern exists. If yes, S913 is performed, otherwise S914 is performed.
For example, if the first instruction is Ctrl and the second contact pattern is the pattern C drawn by the second object on the display screen, the control command is Ctrl+C. For another example, if the first instruction is Ctrl and the second contact pattern is a wavy line drawn by the user's right hand on the display screen, as shown in fig. 10 (a), then there is no corresponding control command.
S913, a control command is executed.
S914, outputting prompt information.
For example, referring to fig. 10 (a), the user touches the display screen with the left hand and draws a wavy line with the right hand; the electronic device determines that there is no corresponding control command and outputs a prompt message, which may be: "unable to recognize, please re-enter". Of course, the prompt may also take other forms, such as a vibration or the display of a cross mark, which the application does not limit.
One implementation of fig. 9 is described below in exemplary fashion in connection with fig. 11 and 12.
As shown in fig. 11, at time T1, the electronic device detects that the left hand touches the display screen. Assuming that the electronic device determines that the wake-up condition is satisfied (see S902 of fig. 9), the pattern recognition engine is awakened at time T2. The period T1-T2 corresponds to (a) in fig. 12: while the electronic device displays interface A (which includes text such as "news: XXX"), a left-hand touch on the screen is detected and the pattern recognition engine is awakened.
Optionally, after the electronic device wakes up the pattern recognition engine, it may output a prompt message. For example, in fig. 12 (a), the prompt is: "please keep the left hand still and draw a graphic with the other hand". Optionally, the electronic device may display the prompt immediately when the pattern recognition engine wakes up, or display it when the right hand is detected to contact the display screen. In addition, the prompt may be displayed for a certain period of time and then dismissed; the application does not limit the specific value of this period. After the pattern recognition engine wakes up, it can be used to recognize the contact patterns of objects (left hand, right hand) on the display screen.
At time T3, the electronic device detects that the user's right hand is touching the display screen. The electronic device may record the start time of the right hand touching the display screen, i.e. the time T3, and may also record the second contact pattern drawn by the right hand on the display screen. Illustratively, the right-hand drawing process corresponds to (b) in fig. 12.
At the time T4, the electronic equipment detects that the left hand leaves the display screen, and records the left hand leaving time, namely the time T4. The electronic device may determine whether the left hand is stationary for a period of time T3 to T4, and if so, identify a first contact pattern for the left hand and a second contact pattern for the right hand for a period of time T3 to T4. For example, the electronic device recognizes that the first contact pattern is a palm and the second contact pattern is the letter C. The electronic device may also determine a first instruction corresponding to the first contact pattern, e.g., the first instruction is Ctrl. In this way, the electronic device determines the control command to be ctrl+c.
At time T5, the electronic device executes the control command ctrl+c. Illustratively, T5 corresponds to (c) in fig. 12.
At time T6, the electronic device turns off the pattern recognition engine. In some embodiments, T6 = T4, i.e., the pattern recognition engine is turned off immediately when the left hand leaves. Alternatively, after the left hand leaves, the electronic device may wait for a period of time P; if the left hand does not contact the display screen again within the period P, the pattern recognition engine is turned off; otherwise, it is not turned off, because the user may still need to control the electronic device. Thus, turning off the pattern recognition engine at time T6 in fig. 11 may not be performed (e.g., if T7-T4 is less than the period P), and it is therefore indicated by a dashed line.
At time T7, the electronic device detects that the left hand again touches the display. If the pattern recognition engine is closed, the pattern recognition engine can be awakened again at the time T8 under the condition that the awakening condition is met; if the pattern recognition engine is not turned off and does not need to wake up again, the "wake up pattern recognition engine" at time T8 in FIG. 11 may not be executed and is therefore indicated by the dashed line. Illustratively, the period T7-T8 corresponds to (d) in FIG. 12.
At the time T9, the electronic device detects that the right hand of the user touches the display screen, and the electronic device can record the starting touch time of the right hand touching the display screen, namely the time T9, and can also record a second touch pattern drawn by the right hand on the display screen. Illustratively, the right-hand drawing process corresponds to (e) in fig. 12.
At the time T10, the electronic equipment detects that the left hand leaves the display screen, and records the leaving time, namely the time T10. The electronic device may determine whether the left hand remains stationary within T9-T10 and if so, identify a first contact pattern for the left hand and a second contact pattern for the right hand within T9-T10. Assuming that the electronic device recognizes that the first contact pattern is a palm, the second contact pattern is the letter V. The electronic device may also determine a first instruction corresponding to the first contact pattern, e.g., the first instruction is Ctrl. Thus, the electronic device determines the control command to be ctrl+v.
At time T11, the electronic device executes ctrl+v, corresponding to (f) in fig. 12.
At time T12, the electronic device turns off the pattern recognition engine.
It should be noted that the spacing between adjacent moments in T1-T12 of fig. 11 does not represent the actual time intervals; it is merely an exemplary illustration.
It should be noted that, with continued reference to fig. 11, the left hand may move during the period T2-T3, but it needs to remain stationary during the period T3-T4, because the first contact pattern between the left hand and the display screen during T3-T4 needs to be identified; if the first object moves during T3-T4, the first contact pattern changes and an accurate control command cannot be determined. For example, assume that the user's left wrist contacts the display screen during T1-T2 and the wake-up condition is satisfied, so the pattern recognition engine is awakened; during T2-T3 the left hand is adjusted so that the palm contacts the display screen (the left hand does not leave the display screen), and during T3-T4 the palm remains in contact with the display screen. Assume further that during T3-T4 the right hand draws the letter C on the display screen. In this way, the electronic device determines that the first instruction is Ctrl based on the first contact pattern of the left hand during T3-T4 (i.e., the pattern of the palm contacting the display screen), and determines that the control command is Ctrl+C based on the first instruction and the C drawn by the right hand.
As is apparent from the above description, the user can bring two objects into contact with the display screen to form two contact patterns (a first contact pattern and a second contact pattern), and the electronic device determines a control command for the electronic device according to a first instruction corresponding to the first contact pattern and the second contact pattern, and executes the control command. In this process, the electronic device is controlled without a remote control device, and operation with the user's two hands is convenient and fast. Further, assuming that the first instruction is Ctrl and the second contact pattern is the letter C drawn by the user, the control command is Ctrl+C, and the electronic device executes Ctrl+C to copy the displayed information. Therefore, with the touch control method provided by the present application, the user can follow the operation habits of using a keyboard on the electronic device (e.g., a large-screen device), providing a better experience.
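The palm-to-Ctrl combination described above can be sketched as a lookup followed by string composition. This is a minimal illustration; only the palm-to-Ctrl pairing comes from the description, while the extra table entry and all names are assumptions.

```python
# Map a recognized first contact pattern to a first instruction.
# Only "palm" -> "Ctrl" is taken from the description above; the
# "fist" entry is a hypothetical placeholder for further mappings.
FIRST_INSTRUCTION = {
    "palm": "Ctrl",
    "fist": "Alt",  # hypothetical additional mapping
}

def determine_control_command(first_pattern: str, second_pattern: str):
    """Combine the first instruction derived from the first contact
    pattern with the second contact pattern, e.g. palm + 'V' -> 'Ctrl+V'."""
    instruction = FIRST_INSTRUCTION.get(first_pattern)
    if instruction is None:
        return None  # first contact pattern not recognized
    return f"{instruction}+{second_pattern}"
```

With this sketch, drawing the letter C with the right hand while the left palm rests on the screen yields Ctrl+C, matching the copy example in the description.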
Fig. 13 is a schematic structural diagram of an electronic device 1300 according to an embodiment of the present application. As shown in fig. 13, the electronic device 1300 may include a detection unit 1301 and a processing unit 1302. The detection unit 1301 is configured to detect that a first object contacts the display screen, and to detect that a second object contacts the display screen while the first object is in contact with the display screen; that is, the detection unit 1301 is configured to perform S1-S2 in fig. 4. The processing unit 1302 is configured to determine a control command for the electronic device according to a first contact pattern of the first object with the display screen and a second contact pattern of the second object with the display screen, and to execute the control command; that is, the processing unit 1302 is configured to perform S3-S4 in fig. 4.
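The division into a detection unit (S1-S2) and a processing unit (S3-S4) can be mirrored as two cooperating classes. The sketch below is schematic only; the event format and the recognition and execution stubs are invented for illustration and are not the patent's implementation.

```python
class DetectionUnit:
    """S1-S2: detect the first object's contact, then the second
    object's contact while the first is still on the screen."""
    def __init__(self, touch_events):
        self.touch_events = touch_events  # pre-captured events for the sketch

    def detect(self):
        first = next(e for e in self.touch_events if e["object"] == "first")
        second = next(e for e in self.touch_events if e["object"] == "second")
        return first["pattern"], second["pattern"]

class ProcessingUnit:
    """S3-S4: determine the control command and execute it."""
    def determine(self, first_pattern, second_pattern):
        instruction = {"palm": "Ctrl"}.get(first_pattern)
        return f"{instruction}+{second_pattern}" if instruction else None

    def execute(self, command):
        # Stand-in for actually dispatching the command to the system.
        return f"executed {command}"

# Wiring the two units together, as in fig. 13:
events = [{"object": "first", "pattern": "palm"},
          {"object": "second", "pattern": "V"}]
detector, processor = DetectionUnit(events), ProcessingUnit()
cmd = processor.determine(*detector.detect())
result = processor.execute(cmd)
```

In a real device the detection unit would be backed by the touch display hardware and the processing unit by a processor, as the following paragraph explains.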
By way of example, the detection unit 1301 may be a touch display in the electronic device, such as an infrared touch display or a capacitive touch display. The processing unit 1302 may be a processor in the electronic device. The processor may be a central processing unit (CPU), a graphics processing unit (GPU), or another device independent of the CPU or GPU, such as a pattern recognition engine; for the pattern recognition engine, reference may be made to the foregoing description.
Optionally, the electronic device 1300 may further include other units, such as a transceiver unit (not shown). The transceiver unit is used for the electronic device 1300 to communicate with other devices, and may be a circuit, a device, an interface, a bus, a software module, a transceiver, or any other means capable of enabling communication.
All relevant details of the steps in the above method embodiments may be found in the functional descriptions of the corresponding functional modules, and are not repeated here.
The functional modules in the embodiments of the present application may be integrated into one processor, may each exist physically alone, or two or more modules may be integrated into one module. The integrated module may be implemented in the form of hardware or in the form of a software functional module.
In addition, in the embodiments of the present application, the division into modules is schematic and is merely a division by logical function; another division manner may be adopted in actual implementation. For example, fig. 14 shows another manner of dividing the modules, which may also be understood as a schematic structural diagram of another electronic device.
Fig. 14 is a schematic diagram of another structure of an electronic device 1400 according to an embodiment of the present application. The electronic device 1400 may be a chip system; in the embodiments of the present application, the chip system may consist of a chip, or may include a chip and other discrete devices. The electronic device 1400 includes one or more processors 1401, one or more memories 1402, a communication interface 1403, and one or more computer programs 1404, which may be connected via one or more communication buses 1405. The buses may be classified as address buses, data buses, control buses, and the like; for ease of illustration, only one thick line is shown in fig. 14, but this does not mean that there is only one bus or one type of bus. The communication interface 1403 is used to enable communication with other devices, for example via a transceiver. The one or more computer programs 1404 are stored in the memory 1402 and configured to be executed by the one or more processors 1401; the one or more computer programs 1404 comprise instructions that can be used to perform the relevant steps provided in the respective embodiments above, e.g., S1 to S4 in fig. 4.
In the embodiments of the present application, the processor 1401 may be a general-purpose processor, a digital signal processor, an application-specific integrated circuit, a field-programmable gate array or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component, and may implement or perform the methods, steps, and logical blocks disclosed in the embodiments of the present application. The general-purpose processor may be a microprocessor or any conventional processor. The steps of a method disclosed in connection with the embodiments of the present application may be performed directly by a hardware processor, or by a combination of hardware in the processor and software modules.
In the embodiments of the present application, the memory 1402 may be a nonvolatile memory, such as a hard disk drive (HDD) or a solid-state drive (SSD), or may be a volatile memory, such as a random-access memory (RAM). The memory may be, but is not limited to, any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer. The memory in the embodiments of the present application may also be a circuit or any other device capable of implementing a storage function, for storing program instructions and/or data.
Embodiments of the present application also provide a computer readable storage medium storing a computer program which, when run on a computer, causes the computer to perform the touch control method of any of the embodiments shown in fig. 4 or fig. 9.
An embodiment of the present application further provides a computer program product storing a computer program, where the computer program includes program instructions which, when executed by a computer, cause the computer to perform the touch control method of any of the embodiments shown in fig. 4 or fig. 9.
In the embodiments of the present application described above, the method provided by the embodiments is described from the point of view of an electronic device (e.g., a large-screen device) as the execution subject. To implement the functions in the method provided by the embodiments of the present application, the electronic device may include a hardware structure and/or a software module, and implement the above functions in the form of a hardware structure, a software module, or a hardware structure plus a software module. Whether a given function is performed by a hardware structure, a software module, or a combination of the two depends on the specific application and design constraints of the technical solution.
As used in the above embodiments, the term "when …" or "after …" may be interpreted to mean "if …" or "after …" or "in response to determination …" or "in response to detection …" depending on the context. Similarly, the phrase "at the time of determination …" or "if detected (a stated condition or event)" may be interpreted to mean "if determined …" or "in response to determination …" or "at the time of detection (a stated condition or event)" or "in response to detection (a stated condition or event)" depending on the context. In addition, in the above-described embodiments, relational terms such as first and second are used to distinguish one entity from another entity without limiting any actual relationship or order between the entities.
Reference in the specification to "one embodiment" or "some embodiments" or the like means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," and the like in the specification are not necessarily all referring to the same embodiment, but mean "one or more but not all embodiments" unless expressly specified otherwise. The terms "comprising," "including," "having," and variations thereof mean "including but not limited to," unless expressly specified otherwise.
The above embodiments may be implemented in whole or in part by software, hardware, firmware, or any combination thereof. When implemented in software, they may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the flows or functions according to the embodiments of the present application are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or other programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another, for example, by wired means (e.g., coaxial cable, optical fiber, digital subscriber line (DSL)) or wireless means (e.g., infrared, radio, microwave). The computer-readable storage medium may be any available medium that can be accessed by a computer, or a data storage device such as a server or a data center that integrates one or more available media. The available medium may be a magnetic medium (e.g., floppy disk, hard disk, magnetic tape), an optical medium (e.g., DVD), or a semiconductor medium (e.g., solid-state drive (SSD)), etc. The schemes of the above embodiments may be used in combination where there is no conflict.
It will be apparent to those skilled in the art that various modifications and variations can be made to the present application without departing from the spirit or scope of the application. Thus, it is intended that the present application also include such modifications and alterations insofar as they come within the scope of the appended claims or the equivalents thereof.

Claims (12)

1. A touch control method, applied to an electronic device including a display screen, the method comprising:
detecting that a first object contacts the display screen;
detecting that a second object contacts the display screen while the first object is in contact with the display screen;
determining a control command for the electronic device according to a first contact pattern of the first object with the display screen and a second contact pattern of the second object with the display screen;
and executing the control command.
2. The method of claim 1, wherein determining the control command for the electronic device according to the first contact pattern of the first object with the display screen and the second contact pattern of the second object with the display screen comprises:
when it is determined that the first object is a preset object and the duration of contact between the first object and the display screen is greater than a threshold, determining a control command for the electronic device according to a first contact pattern of the first object with the display screen and a second contact pattern of the second object with the display screen.
3. The method of claim 1, wherein determining the control command for the electronic device according to the first contact pattern of the first object with the display screen and the second contact pattern of the second object with the display screen comprises:
determining that the first object is a preset object, and when at least one of a contact position, a contact area, and a contact gesture of the first object with the display screen remains unchanged between a first moment and a second moment, determining a control command for the electronic device according to a first contact pattern of the first object with the display screen and a second contact pattern of the second object with the display screen;
wherein the first moment is the moment at which the second object starts contacting the display screen, and the second moment is the moment at which the second object stops contacting the display screen.
4. The method of claim 1, wherein determining the control command for the electronic device according to the first contact pattern of the first object with the display screen and the second contact pattern of the second object with the display screen comprises:
determining, according to an object type of the first object, an instruction set corresponding to the first object;
determining, in the instruction set according to the first contact pattern, a first instruction corresponding to the first contact pattern;
and determining a control command for the electronic device according to the first instruction and the second contact pattern.
5. The method of claim 1, wherein the first object is a user body part, and wherein determining the control command for the electronic device according to the first contact pattern of the first object with the display screen and the second contact pattern of the second object with the display screen comprises:
identifying a user identity of the user;
determining, according to the user identity, an instruction set corresponding to the user;
determining, in the instruction set according to the first contact pattern, a first instruction corresponding to the first contact pattern;
and determining a control command for the electronic device according to the first instruction and the second contact pattern.
6. The method of claim 1, wherein the first object is a user body part, and wherein determining the control command for the electronic device according to the first contact pattern of the first object with the display screen and the second contact pattern of the second object with the display screen comprises:
determining, according to the first contact pattern, a contact gesture of the user with the display screen, wherein the contact gesture comprises at least one of making a fist, opening the palm, putting the palms together, and bending a finger;
determining the first instruction according to the contact gesture;
and determining a control command for the electronic device according to the first instruction and the second contact pattern.
7. The method of claim 1, wherein the first object is a stylus, and wherein determining the control command for the electronic device according to the first contact pattern of the first object with the display screen and the second contact pattern of the second object with the display screen comprises:
determining, according to the first contact pattern, that the stylus lies flat against the display screen and that an included angle between the stylus and an edge line of the display screen is a first angle;
determining the first instruction according to the first angle;
and determining a control command for the electronic device according to the first instruction and the second contact pattern.
8. The method according to any one of claims 4-7, wherein the method further comprises:
displaying an icon corresponding to the first instruction.
9. The method of any one of claims 4-7, wherein, when there are a plurality of first instructions, the method further comprises:
displaying a plurality of icons corresponding to the plurality of first instructions;
and wherein determining a control command for the electronic device according to the first instruction and the second contact pattern comprises:
receiving an operation of selecting a first icon among the plurality of icons;
and determining a control command for the electronic device based on the first instruction corresponding to the first icon and the second contact pattern.
10. The method according to any one of claims 1-9, wherein the method further comprises:
when it is determined that the duration of contact between the first object and the display screen is greater than a preset duration, waking up a pattern recognition engine in the electronic device, wherein the pattern recognition engine is configured to recognize the first contact pattern of the first object with the display screen and the second contact pattern of the second object with the display screen.
11. An electronic device, comprising:
a processor and a memory;
the memory is configured to store one or more computer programs, the one or more computer programs comprising computer-executable instructions which, when executed by the processor, cause the electronic device to perform the method of any one of claims 1-10.
12. A computer readable storage medium for storing a computer program which, when run on a computer, causes the computer to perform the method of any one of claims 1 to 10.
CN202210145207.2A 2022-02-17 2022-02-17 Touch control method and electronic equipment Pending CN116661669A (en)

Priority Applications (1)

Application Number: CN202210145207.2A · Priority Date: 2022-02-17 · Filing Date: 2022-02-17 · Title: Touch control method and electronic equipment


Publications (1)

Publication Number: CN116661669A · Publication Date: 2023-08-29

Family

ID=87726507


Country Status (1)

Country Link
CN (1) CN116661669A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination