CN110633044A - Control method, control device, electronic equipment and storage medium - Google Patents
Info
- Publication number
- CN110633044A (application CN201910796890.4A)
- Authority
- CN
- China
- Prior art keywords
- touch
- target
- information
- areas
- control
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
Embodiments of the present application disclose a control method, a control apparatus, an electronic device, and a storage medium. The method is applied to an electronic device that includes a touch component, and comprises the following steps: acquiring target touch information collected by the touch component, the target touch information comprising M touch areas and change information of N touch areas among the M touch areas, where M is a positive integer and N is an integer less than or equal to M; analyzing the target touch information and determining a target touch gesture corresponding to the target touch information; and determining a target control instruction corresponding to the target touch gesture based on a preset mapping relationship between touch gestures and control instructions, and executing the target control instruction. This increases the diversity of ways to control the electronic device and improves the user's control experience.
Description
Technical Field
The present disclosure relates to control technologies, and in particular, to a control method, an apparatus, an electronic device, and a storage medium.
Background
As electronic devices evolve, the ways of controlling them have diversified; current control modes mainly include touch, remote control, keyboard control, and mouse control. The mouse and keyboard remain the primary accessories through which people operate computers and complete all kinds of control work. However, because an external mouse and keyboard can only be used comfortably in certain environments, many electronic devices have, for the sake of portability, abandoned the external mouse and keyboard in favor of a touch panel or keys integrated into the device. In some usage scenarios, though, these mouse substitutes offer a worse control experience than a real mouse, and users remain more accustomed to completing control operations with a real mouse. The control methods available on current electronic devices therefore cannot deliver the best user experience.
Disclosure of Invention
To solve the foregoing technical problem, embodiments of the present application provide a control method, an apparatus, an electronic device, and a storage medium.
The technical solution of the present application is realized as follows:
in a first aspect, a control method is provided, which is applied to an electronic device that includes a touch component; the method comprises the following steps:
acquiring target touch information collected by the touch component; the target touch information comprises M touch areas and change information of N touch areas among the M touch areas, where M is a positive integer and N is an integer less than or equal to M;
analyzing the target touch information and determining a target touch gesture corresponding to the target touch information;
and determining a target control instruction corresponding to the target touch gesture based on a preset mapping relationship between touch gestures and control instructions, and executing the target control instruction.
In this scheme, M is an integer greater than 3; the change information of the N touch areas comprises at least one of the following: the movement tracks of the N touch areas and the number of touches of the N touch areas.
In the above scheme, analyzing the target touch information and determining a target touch gesture corresponding to the target touch information includes: determining a first touch area and a second touch area from the M touch areas, where the first touch area is formed by a first touch body contacting the touch component and the second touch area is formed by a second touch body contacting the touch component; determining change information of the first touch area and/or change information of the second touch area based on the change information of the N touch areas; and determining a target touch gesture corresponding to the target touch information based on the change information of the first touch area and/or the change information of the second touch area.
In the above scheme, determining the first touch area and the second touch area from the M touch areas includes: determining the first touch area and the second touch area based on the contact areas of the M touch areas and/or the spacing between adjacent touch areas. Alternatively, the method further comprises: acquiring first indication information and second indication information collected by the touch component, where the first indication information indicates the first touch area and the second indication information indicates the second touch area;
correspondingly, determining the first touch area and the second touch area from the M touch areas includes: determining the first touch area of the M touch areas based on the first indication information, and determining the second touch area of the M touch areas based on the second indication information.
In the above scheme, before the first indication information and the second indication information are acquired, the method further includes: outputting first prompt information and second prompt information; the first prompt message is used for prompting a user to input first indication information on the touch control component, and the second prompt message is used for prompting the user to input second indication information on the touch control component.
In the above scheme, the electronic device further includes a display component;
analyzing the target touch information and determining a target touch gesture corresponding to the target touch information includes: determining a target center position corresponding to the M touch areas based on the position distribution of the M touch areas; determining a cursor position corresponding to the target center position in the display area based on the coordinate mapping relationship between the whole touch area of the touch component and the whole display area of the display component; and determining a target control gesture based on the change information of the N touch areas, where the target control gesture is used to perform control operations on the cursor.
In the above scheme, the touch component has at least a first working mode and a second working mode, the first working mode being keyboard control and the second working mode being gesture control;
the method further comprises: acquiring a switching instruction, the switching instruction being either a switch-key instruction collected through a switch key or a switching-gesture instruction collected through the touch component; and controlling the touch component to switch between the first working mode and the second working mode based on the switching instruction.
In a second aspect, a control device is provided, which is applied to an electronic device, where the electronic device includes a touch component; the device includes:
the acquisition unit is used for acquiring target touch information acquired by the touch component; the target touch information comprises M touch areas and change information of N touch areas in the M touch areas; wherein M is a positive integer, and N is an integer less than or equal to M;
the processing unit is used for analyzing the target touch information and determining a target touch gesture corresponding to the target touch information;
and the processing unit is further used for determining a target control instruction corresponding to the target touch gesture based on the preset mapping relation between the touch gesture and the control instruction, and executing the target control instruction.
In a third aspect, an electronic device is provided, which includes: a processor and a memory configured to store a computer program operable on the processor, wherein the processor is configured to perform the steps of the aforementioned method when executing the computer program.
In a fourth aspect, a computer storage medium is provided, on which a computer program is stored, wherein the computer program, when executed by a processor, implements the steps of the aforementioned method.
With this technical solution, the touch component of the electronic device is used to recognize the user's touch gesture, and when the touch gesture is one in which the user simulates operating a real mouse, the same control operation as with a real mouse is performed on the electronic device based on that gesture. This increases the diversity of ways to control the electronic device and improves the user's control experience.
Drawings
FIG. 1 is a schematic flow chart of a control method in an embodiment of the present application;
FIG. 2 is a top view of a mouse gesture in an embodiment of the present application;
FIG. 3 is a side view of a mouse gesture in an embodiment of the present application;
FIG. 4 is a first diagram illustrating a distribution of touch area locations in an embodiment of the present application;
FIG. 5 is a second diagram illustrating a distribution of touch area locations in an embodiment of the present application;
FIG. 6 is a third schematic diagram of a touch area location distribution in the embodiment of the present application;
FIG. 7 is a fourth diagram illustrating a distribution of touch area locations in an embodiment of the present application;
FIG. 8 is a fifth diagram illustrating a distribution of touch area locations in an embodiment of the present application;
FIG. 9 is a schematic view of a first component structure of a notebook computer provided in the present application;
FIG. 10 is a schematic view of a second component structure of a notebook computer provided in the present application;
FIG. 11 is a schematic structural diagram of a control device according to an embodiment of the present application;
FIG. 12 is a schematic structural diagram of an electronic device in an embodiment of the present application.
Detailed Description
So that the features and elements of the present embodiments can be understood in detail, a more particular description of the embodiments, briefly summarized above, may be had by reference to the embodiments illustrated in the appended drawings.
Example one
An embodiment of the present application provides a control method applied to an electronic device. FIG. 1 is a schematic flowchart of the control method in an embodiment of the present application; as shown in FIG. 1, the control method may specifically include:
step 101: acquiring target touch information acquired by the touch component; the target touch information comprises M touch areas and change information of N touch areas in the M touch areas; wherein M is a positive integer, and N is an integer less than or equal to M;
step 102: analyzing the target touch information and determining a target touch gesture corresponding to the target touch information;
step 103: and determining a target control instruction corresponding to the target touch gesture based on a preset mapping relation between the touch gesture and the control instruction, and executing the target control instruction.
Here, the execution subject of steps 101 to 103 may be a processor of the electronic device. The electronic device may be a mobile terminal or a fixed terminal, for example a smart phone, a personal computer (such as a tablet, desktop, notebook, netbook, or palmtop computer), an e-book reader, a portable multimedia player, an audio/video player, a camera, or a virtual reality device.
Here, the touch component may be a touch screen that integrates display and touch functions. In that case the electronic device may include a main display screen and a secondary display screen, with the touch screen serving as the secondary display screen, so that controlling the electronic device through touch gestures does not block the content shown on the main display screen. The touch component may also be a touch panel with only a touch function. The main display screen may be a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED) display, or the like. The secondary display screen and the main display screen may be made of the same material or different materials; for example, the secondary display screen may be an electronic paper display (E-paper).
In practical applications, when the touch component detects that a touch area exists, it collects the M touch areas and the change information of N of those areas within a preset time period. For example, the user taps, long-presses, or slides on the touch component with at least one finger, or contacts the touch component with a preset gesture, and the electronic device is then controlled according to the control instruction corresponding to that gesture.
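As a concrete illustration of sampling within such a preset time period, the following minimal sketch accumulates per-area movement tracks from successive frames; the frame format, the tracked area IDs, and the single-contact simplification are assumptions made for illustration, not details taken from the patent.

```python
def change_info(frames):
    """frames: list of dicts mapping a tracked touch-area id -> (x, y),
    one dict per sample taken during the preset time period."""
    tracks = {}
    for frame in frames:
        for area_id, pos in frame.items():
            tracks.setdefault(area_id, []).append(pos)
    # A contact that lifts and lands again would raise the touch count;
    # for simplicity this sketch counts one touch per tracked id.
    return {aid: {"track": t, "touches": 1} for aid, t in tracks.items()}
```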
In some embodiments, whether the target touch information corresponds to a preset touch gesture is determined based on the position distribution of the M touch areas, and the target touch gesture is then determined from the preset touch gestures based on the change information of the N touch areas among the M touch areas. For example, when the M touch areas move as a whole, M equals N; when only some of the M touch areas move, N is smaller than M.
Specifically, when the target touch information includes only one touch area, the target touch gesture is determined from the preset touch gestures based on the change information of that touch area; for example, the preset touch gestures include tap, long-press, and slide gestures.
When the target touch information includes two or more touch areas, whether the target touch information corresponds to a preset control gesture is first determined based on the position distribution of the M touch areas, and the target touch gesture is then determined from the preset touch gestures based on the change information of the N touch areas. For example, the preset control gestures include simulated mouse control gestures, specifically move, click, double-click, scroll wheel, and the like.
In practical applications, different touch areas can be represented by coordinates: the touch component uploads the collected coordinate information and coordinate-change information to the processing unit, and the processing unit determines the touch gesture from that information.
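Under that coordinate representation, steps 101 to 103 can be pictured with a minimal sketch such as the one below. The TouchArea record, the gesture names, and the instruction names are illustrative assumptions rather than identifiers from the patent.

```python
from dataclasses import dataclass, field
from typing import Dict, List, Tuple

@dataclass
class TouchArea:
    centroid: Tuple[float, float]                   # coordinates reported by the touch component
    size: float = 1.0                               # contact area
    track: List[Tuple[float, float]] = field(default_factory=list)
    touches: int = 0                                # touch count in the sample window

# Step 103: a preset mapping relationship between touch gestures and control instructions.
GESTURE_TO_INSTRUCTION: Dict[str, str] = {
    "move_mouse": "MOVE_CURSOR",
    "click_left": "LEFT_CLICK",
    "click_right": "RIGHT_CLICK",
}

def classify(areas: List[TouchArea]) -> str:
    """Step 102 placeholder: a real classifier also checks position distribution."""
    if areas and all(len(a.track) > 1 for a in areas):
        return "move_mouse"                         # all N == M areas moved together
    return "click_left" if areas else ""

def control_step(areas: List[TouchArea]) -> str:
    gesture = classify(areas)                            # step 102
    return GESTURE_TO_INSTRUCTION.get(gesture, "NONE")   # step 103: look up the instruction
```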
In some embodiments, step 102 may specifically include: and determining a target touch gesture corresponding to the target touch information based on the change information of the N touch areas in the M touch areas.
Specifically, a first touch area and a second touch area are determined from the M touch areas, where the first touch area is formed by a first touch body contacting the touch component and the second touch area is formed by a second touch body contacting the touch component; change information of the first touch area and/or change information of the second touch area is determined based on the change information of the N touch areas; and the target touch gesture corresponding to the target touch information is determined based on the change information of the first touch area and/or the change information of the second touch area.
Here, the first touch area is a region formed by one specific finger contacting the touch component, such as the thumb or the index finger, and the second touch area is a region formed by another specific finger contacting the touch component, such as the middle finger, the ring finger, or the little finger.
Specifically, determining the first touch area and the second touch area from the M touch areas includes: determining the first touch area and the second touch area based on the contact areas of the M touch areas and/or the spacing between adjacent touch areas.
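One plausible reading of that rule is sketched below: drop the largest contact patch as the palm root, then pick fingertips by their left-to-right order. It assumes a right hand and the TouchArea record from the earlier sketch; the patent itself does not fix this exact heuristic.

```python
def find_first_and_second(areas):
    """Pick candidate index- and middle-finger areas from the M touch areas."""
    if len(areas) < 4:
        return None, None          # mouse-gesture control needs more than 3 areas
    # The palm root produces the largest contact patch; drop it.
    tips = sorted(areas, key=lambda a: a.size)[:-1]
    # Order the remaining fingertips left to right; for a right hand the
    # index finger is the second tip from the left (after the thumb) and
    # the middle finger sits immediately to its right.
    tips.sort(key=lambda a: a.centroid[0])
    return tips[1], tips[2]
```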
In some embodiments, the method further comprises: acquiring touch information corresponding to at least one touch gesture; establishing a touch gesture feature library by using the at least one touch gesture and the corresponding touch information;
correspondingly, determining a target touch gesture corresponding to the target touch information based on the change information of the N touch areas in the M touch areas includes: and determining a target touch gesture corresponding to the target touch information from a touch gesture feature library based on the change information of the N touch areas in the M touch areas.
Specifically, the touch gesture feature library includes at least one of the following: a first touch gesture, a second touch gesture, a third touch gesture, and a fourth touch gesture. The first touch information corresponding to the first touch gesture comprises a first position change generated in the first touch area; the second touch information corresponding to the second touch gesture comprises a second position change generated in the second touch area; the third touch information corresponding to the third touch gesture comprises a third position change generated in the first touch area and/or the second touch area; and the fourth touch information corresponding to the fourth touch gesture comprises a fourth position change generated by the N touch areas.
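The feature library can be pictured as a lookup table keyed by change pattern, as in the sketch below; the pattern strings are assumed names for the four position changes listed above, not identifiers from the patent.

```python
GESTURE_LIBRARY = {
    "first_position_change":  "first_touch_gesture",   # change in the first touch area
    "second_position_change": "second_touch_gesture",  # change in the second touch area
    "third_position_change":  "third_touch_gesture",   # change in the first and/or second area
    "fourth_position_change": "fourth_touch_gesture",  # all N touch areas change position
}

def match_gesture(change_pattern: str) -> str:
    # Once the change pattern is extracted, step 102 reduces to a lookup.
    return GESTURE_LIBRARY.get(change_pattern, "unknown")
```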
In some embodiments, M is an integer greater than 3; the change information of the N touch areas comprises at least one of the following: the movement tracks of the N touch areas and the number of touches of the N touch areas.
That is, when the electronic device is controlled through gestures, the gesture control function of the embodiments of the present application is started only when the number of touch areas is greater than 3; otherwise, the electronic device uses its ordinary touch function. Once the gesture control function is determined to be started, the target touch gesture is determined from the movement tracks and/or the touch counts of the N touch areas, as sketched below.
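The gating rule itself amounts to a single threshold check; a minimal sketch, under the same assumptions as the earlier snippets:

```python
def gesture_mode_enabled(areas) -> bool:
    # More than 3 simultaneous touch areas: treat the input as a mouse
    # gesture; otherwise fall back to the ordinary touch function.
    return len(areas) > 3
```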
Illustratively, when the touch gesture is a mouse gesture, the mouse gesture at least includes: a click left mouse button gesture, a click right mouse button gesture, a scroll wheel gesture, a move mouse gesture, and the like.
FIG. 2 is a top view of a mouse gesture in an embodiment of the present application. In FIG. 2 the mouse includes a left button at the upper left, a right button at the upper right, and a scroll wheel between them. When a user holds the mouse to control the electronic device, the areas where the hand contacts the desktop or the mouse surface comprise the palm root and the five fingertip positions. Most users click the left mouse button with the index finger, click the right mouse button with the middle finger, click or roll the scroll wheel with the index or middle finger, and move the cursor on the display screen by moving the mouse.
FIG. 3 is a side view of a mouse gesture in an embodiment of the present application. As FIG. 3 shows, the palm root and the little finger contact the desktop while the middle, index, and ring fingers contact the mouse surface; the thumb is not shown, but depending on personal habit it may contact either the desktop or the mouse surface.
In the embodiments of the present application, based on the real mouse gestures of FIGS. 2 and 3, the contact areas between the hand holding the mouse and the desktop or mouse surface are obtained, and the real mouse gesture is simulated from the size and position distribution of those contact areas.
FIG. 4 is a first diagram illustrating a distribution of touch area locations in an embodiment of the present application. As shown in FIG. 4, the touch areas include five spaced areas formed by the five fingertips contacting the touch component and a larger area at the bottom formed by the palm root. From the position distribution of these six touch areas, a first touch area formed by the index fingertip and a second touch area formed by the middle fingertip can be determined, so that left-button, right-button, and scroll-wheel operations are simulated from the change information of the first and second touch areas, while mouse movement is simulated from the change information of the N touch areas.
In practice, mouse gestures differ between users according to their habits: some users like to raise the little finger or the thumb, some are not used to resting the palm root on the touch component, and for some the palm root forms two or more touch areas. In all cases, at least 4 touch areas must be formed to control the electronic device with a mouse gesture.
FIG. 5 is a second diagram illustrating a distribution of touch area locations in an embodiment of the present application. As shown in FIG. 5, when the little finger is raised and does not contact the touch component, the touch areas include four spaced areas formed by four fingertips and the larger bottom area formed by the palm root contacting the touch component.
FIG. 6 is a third diagram illustrating a distribution of touch area locations in an embodiment of the present application. As shown in FIG. 6, when the palm root does not contact the touch component, the touch areas include only the five spaced areas formed by the five fingertips.
FIG. 7 is a fourth diagram illustrating a distribution of touch area locations in an embodiment of the present application. As shown in FIG. 7, the palm root contacting the touch component forms two touch areas, one under the thumb side of the palm and one under the little-finger side, and the touch areas further include the spaced areas formed by the fingertips contacting the touch component.
The above description merely gives four exemplary touch area distributions and is not intended to limit the mouse gestures of the embodiments of the present application; any other mouse gesture that matches a user's operating habits falls within the protection scope of the present application.
Here, following the habits of most users operating a real mouse, the first touch area is taken to be the area formed by the index finger contacting the touch component, and the second touch area the area formed by the middle finger. The first and second touch areas can be identified automatically by the electronic device from the distribution characteristics of the touch areas; in addition, the user may define which fingers form the first and second touch areas according to personal habit.
In some embodiments, the method further comprises: acquiring first indication information and second indication information collected by the touch component, where the first indication information indicates the first touch area and the second indication information indicates the second touch area;
correspondingly, determining the first touch area and the second touch area from the M touch areas includes: determining the first touch area of the M touch areas based on the first indication information, and determining the second touch area of the M touch areas based on the second indication information.
For example, a user may designate the area contacted by the thumb as the first touch area, controlling the left mouse button with the thumb, and set the area contacted by the index finger as the second touch area, controlling the right mouse button with the index finger.
In some embodiments, before the obtaining the first indication information and the second indication information, the method further comprises: outputting first prompt information and second prompt information; the first prompt message is used for prompting a user to input first indication information on the touch control component, and the second prompt message is used for prompting the user to input second indication information on the touch control component.
That is, before the user inputs the first indication information and the second indication information, the electronic device may output the first prompt information and the second prompt information to prompt the user to perform the indication operation. The prompt message may be one or more of a text prompt, an image prompt, and a voice prompt.
In practical applications, the electronic device further includes a display component. Analyzing the target touch information and determining the target touch gesture corresponding to it includes: determining a target center position corresponding to the M touch areas based on the position distribution of the M touch areas; determining the cursor position corresponding to the target center position in the display area based on the coordinate mapping relationship between the whole touch area of the touch component and the whole display area of the display component; and determining the target control gesture based on the change information of the N touch areas, where the target control gesture is used to perform control operations on the cursor.
That is, to control the movement of a cursor on the display component through touch gestures, the position of the cursor on the display component should first be determined from the touch information.
FIG. 8 is a fifth diagram illustrating a distribution of touch area locations in an embodiment of the present application. As shown in FIG. 8, the touch component includes a touch area in which the user's hand forms six touch areas, and the display component includes a display area in which a cursor is displayed. The six touch areas are distributed along the edge of a circle whose center O is the target center position corresponding to them. The movement track of the target center position O can be mapped to the movement track of the cursor on the display component, so the user controls the movement of the displayed cursor by moving the hand within the touch area, simulating the way a real mouse moves a cursor.
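A minimal sketch of this mapping follows: the target center position O is computed as the centroid of the M touch areas and scaled linearly from the whole touch area to the whole display area. The touch-surface coordinate range and display resolution are assumed values, and the TouchArea record is the one assumed in the earlier sketches.

```python
TOUCH_SIZE = (3000.0, 2000.0)    # coordinate range of the touch component (assumed)
DISPLAY_SIZE = (1920, 1080)      # resolution of the display component (assumed)

def target_center(areas):
    """Centroid of the M touch areas, i.e. the target center position O."""
    xs = [a.centroid[0] for a in areas]
    ys = [a.centroid[1] for a in areas]
    return sum(xs) / len(xs), sum(ys) / len(ys)

def cursor_position(areas):
    ox, oy = target_center(areas)
    # Linear coordinate mapping between the whole touch area and the whole display area.
    return (int(ox / TOUCH_SIZE[0] * DISPLAY_SIZE[0]),
            int(oy / TOUCH_SIZE[1] * DISPLAY_SIZE[1]))
```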
In some embodiments, the touch-sensitive member comprises at least a first operating mode and a second operating mode; the first working mode is a keyboard control mode, and the second working mode is a gesture control mode;
the method further comprises the following steps: acquiring a switching instruction; the switching instruction is a switching key instruction acquired through a switching key or a switching gesture instruction acquired through the touch control component; and controlling the touch control component to switch between the first working mode and the second working mode based on the switching instruction.
That is, the working mode of the touch component may be switched automatically by the electronic device when a switching gesture is detected, or switched when the switch key is detected to be pressed. For example, when the user contacts the touch component with a specific start gesture, the component switches to the second working mode; when the user's hand leaves the touch component, or the user contacts it with a specific close gesture, it returns to the first working mode. When the user presses the switch key, the first working mode switches to the second working mode, and pressing the key again switches back to the first working mode.
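This switching behavior can be summarized as a two-state machine, sketched below; the event names stand in for the switch key and the start/close gestures and are assumptions, not identifiers from the patent.

```python
FIRST_MODE = "keyboard_control"
SECOND_MODE = "gesture_control"

def next_mode(current: str, event: str) -> str:
    if event == "switch_key_pressed":
        # The switch key toggles between the two working modes.
        return SECOND_MODE if current == FIRST_MODE else FIRST_MODE
    if event == "start_gesture":
        return SECOND_MODE
    if event in ("hand_lifted", "close_gesture"):
        return FIRST_MODE
    return current
```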
An embodiment of the present application also provides a specific implementation scenario in which the electronic device is a notebook computer. FIG. 9 is a schematic view of a first composition structure of the notebook computer provided by the present application. As shown in FIG. 9, the notebook computer includes a main display 901 and a secondary display 902. The display area of the main display 901 shows the content to be controlled and updates that content according to the touch operations delivered from the secondary display; the main display 901 may be a touch screen or a non-touch screen. The secondary display 902 is a touch screen (i.e., the touch component) whose touch area displays a virtual keyboard; the user can input control instructions to the electronic device through the virtual keyboard, which removes the need for physical keyboard hardware. The screen-to-body ratio of the secondary display may be the same as or different from that of the main display.
FIG. 10 is a schematic view of a second composition structure of the notebook computer provided by the present application. As shown in FIG. 10, the secondary display 902 may also serve as a touch panel that obtains the user's touch information, so that the electronic device can be controlled with touch gestures.
Here, the secondary display may be controlled, through a switching instruction, to operate in the keyboard control mode shown in FIG. 9 or the gesture control mode shown in FIG. 10.
With this technical solution, the touch component of the electronic device is used to recognize the user's touch gesture, and when the touch gesture is one in which the user simulates operating a real mouse, the same control operation as with a real mouse is performed on the electronic device based on that gesture. This increases the diversity of ways to control the electronic device and improves the user's control experience.
An embodiment of the present application provides a control device, which is applied to an electronic device, where the electronic device includes a touch component, and as shown in fig. 11, the control device includes:
an obtaining unit 1101, configured to obtain target touch information acquired by the touch component; the target touch information comprises M touch areas and change information of N touch areas in the M touch areas; wherein M is a positive integer, and N is an integer less than or equal to M;
the processing unit 1102 is configured to analyze the target touch information and determine a target touch gesture corresponding to the target touch information;
the processing unit 1102 is further configured to determine a target control instruction corresponding to the target touch gesture based on a preset mapping relationship between the touch gesture and the control instruction, and execute the target control instruction.
In some embodiments, M is an integer greater than 3; the change information of the N touch areas comprises at least one of the following: the movement tracks of the N touch areas and the number of touches of the N touch areas.
In some embodiments, the processing unit 1102 is specifically configured to determine a first touch area and a second touch area from the M touch areas, where the first touch area is formed by a first touch body contacting the touch component and the second touch area is formed by a second touch body contacting the touch component; determine change information of the first touch area and/or change information of the second touch area based on the change information of the N touch areas; and determine a target touch gesture corresponding to the target touch information based on the change information of the first touch area and/or the change information of the second touch area.
In some embodiments, the processing unit 1102 is specifically configured to determine the first touch area and the second touch area based on the contact areas of the M touch areas and/or the distances between adjacent touch areas;
or, in some embodiments, the obtaining unit 1101 is further configured to obtain first indication information and second indication information acquired by the touch component; the first indication information is used for indicating a first touch area, and the second indication information is used for indicating a second touch area;
correspondingly, the processing unit 1102 is specifically configured to determine a first touch area of the M touch areas based on the first indication information; determining a second touch area of the M touch areas based on the second indication information.
In some embodiments, the electronic device further includes an output component, and the processing unit 1102 is further configured to control the output component to output the first prompt message and the second prompt message; the first prompt message is used for prompting a user to input first indication information on the touch control component, and the second prompt message is used for prompting the user to input second indication information on the touch control component.
In some embodiments, the electronic device further comprises a display component;
the processing unit 1102 is specifically configured to determine a target center position corresponding to the M touch areas based on the position distribution of the M touch areas; determining a cursor position corresponding to the target center position in the display area based on a coordinate mapping relation between the whole touch area of the touch control component and the whole display area of the display component; determining the target control gesture based on the change information of the N touch areas; the target control gesture is used for carrying out control operation on the cursor.
In some embodiments, the touch-sensitive member comprises at least a first operating mode and a second operating mode; the first working mode is a keyboard control mode, and the second working mode is a gesture control mode;
the processing unit 1102 is further configured to obtain a switching instruction; the switching instruction is a switching key instruction acquired through a switching key or a switching gesture instruction acquired through the touch control component; and controlling the touch control component to switch between the first working mode and the second working mode based on the switching instruction.
In practical application, the control device is applied to electronic equipment and used for realizing the gesture control function of the electronic equipment.
An embodiment of the present application further provides an electronic device, as shown in fig. 12, the electronic device includes: a processor 1201 and a memory 1202 configured to store a computer program capable of running on the processor; the processor 1201, when running the computer program in the memory 1202, performs the following steps:
acquiring target touch information acquired by the touch component; the target touch information comprises M touch areas and change information of N touch areas in the M touch areas; wherein M is a positive integer, and N is an integer less than or equal to M;
analyzing the target touch information and determining a target touch gesture corresponding to the target touch information;
and determining a target control instruction corresponding to the target touch gesture based on a preset mapping relation between the touch gesture and the control instruction, and executing the target control instruction.
In some embodiments, M is an integer greater than 3; the change information of the N touch areas comprises at least one of the following: the movement tracks of the N touch areas and the number of touches of the N touch areas.
In some embodiments, the processor 1201, when running the computer program in the memory 1202, implements the following steps: determining a first touch area and a second touch area from the M touch areas, where the first touch area is formed by a first touch body contacting the touch component and the second touch area is formed by a second touch body contacting the touch component; determining change information of the first touch area and/or change information of the second touch area based on the change information of the N touch areas; and determining a target touch gesture corresponding to the target touch information based on the change information of the first touch area and/or the change information of the second touch area.
In some embodiments, the processor 1201, when running the computer program in the memory 1202, implements the following steps: determining a first touch area and a second touch area based on the contact areas of the M touch areas and/or the distance between adjacent touch areas;
alternatively, the processor 1201, when running the computer program in the memory 1202, further performs the following steps: acquiring first indication information and second indication information acquired by the touch control component; the first indication information is used for indicating a first touch area, and the second indication information is used for indicating a second touch area; determining a first touch area of the M touch areas based on the first indication information; determining a second touch area of the M touch areas based on the second indication information.
In some embodiments, the processor 1201, when running the computer program in the memory 1202, further performs the steps of: outputting first prompt information and second prompt information; the first prompt message is used for prompting a user to input first indication information on the touch control component, and the second prompt message is used for prompting the user to input second indication information on the touch control component.
In some embodiments, the electronic device further comprises a display component; the processor 1201, when running the computer program in the memory 1202, specifically implements the following steps: determining a target center position corresponding to the M touch areas based on the position distribution of the M touch areas; determining a cursor position corresponding to the target center position in the display area based on a coordinate mapping relation between the whole touch area of the touch control component and the whole display area of the display component; determining the target control gesture based on the change information of the N touch areas; the target control gesture is used for carrying out control operation on the cursor.
In some embodiments, the touch-sensitive member comprises at least a first operating mode and a second operating mode; the first working mode is a keyboard control mode, and the second working mode is a gesture control mode;
the processor 1201, when running the computer program in the memory 1202, further performs the steps of: acquiring a switching instruction; the switching instruction is a switching key instruction acquired through a switching key or a switching gesture instruction acquired through the touch control component; and controlling the touch control component to switch between the first working mode and the second working mode based on the switching instruction.
Of course, in practice the various components of the electronic device are coupled together by a bus system 1203, as shown in FIG. 12. It will be appreciated that the bus system 1203 implements the connections and communication between these components. In addition to a data bus, the bus system 1203 includes a power bus, a control bus, and a status signal bus; for clarity, however, the various buses are all labeled as the bus system 1203 in FIG. 12.
In practical applications, the processor may be at least one of an Application Specific Integrated Circuit (ASIC), a Digital Signal Processing Device (DSPD), a Programmable Logic Device (PLD), a Field Programmable Gate Array (FPGA), a controller, a microcontroller, or a microprocessor. It is understood that other devices may also implement the above processor functions; the embodiments of the present application are not specifically limited in this respect.
The Memory may be a volatile Memory (volatile Memory), such as a Random-Access Memory (RAM); or a non-volatile Memory (non-volatile Memory), such as a Read-Only Memory (ROM), a flash Memory (flash Memory), a Hard Disk (HDD), or a Solid-State Drive (SSD); or a combination of the above types of memories and provides instructions and data to the processor.
With this technical solution, the touch component of the electronic device is used to recognize the user's touch gesture, and when the touch gesture is one in which the user simulates operating a real mouse, the same control operation as with a real mouse is performed on the electronic device based on that gesture. This increases the diversity of ways to control the electronic device and improves the user's control experience.
The embodiment of the application also provides a computer readable storage medium for storing the computer program.
Optionally, the computer-readable storage medium may be applied to any electronic device in the embodiments of the present application, and the computer program enables a computer to execute corresponding processes implemented by a processor in the methods in the embodiments of the present application, which are not described herein again for brevity.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. The above-described device embodiments are merely illustrative; for example, the division into units is only a logical functional division, and other divisions are possible in actual implementation, such as combining multiple units or components, integrating them into another system, or omitting or not implementing some features. In addition, the couplings, direct couplings, or communication connections between the components shown or discussed may run through interfaces, and indirect couplings or communication connections between devices or units may be electrical, mechanical, or of other forms.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, that is, may be located in one place, or may be distributed on a plurality of network units; some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, all functional units in the embodiments of the present invention may be integrated into one processing module, each unit may serve as a separate unit, or two or more units may be integrated into one unit; the integrated unit may be implemented in hardware or as hardware plus software functional units. Those of ordinary skill in the art will understand that all or part of the steps of the method embodiments may be implemented by hardware under the direction of program instructions; the program may be stored in a computer-readable storage medium and, when executed, performs the steps of the method embodiments. The aforementioned storage media include removable storage devices, Read-Only Memory (ROM), Random Access Memory (RAM), magnetic disks, optical disks, and other media capable of storing program code.
The methods disclosed in the several method embodiments provided in the present application may be combined arbitrarily without conflict to obtain new method embodiments.
Features disclosed in several of the product embodiments provided in the present application may be combined in any combination to yield new product embodiments without conflict.
The features disclosed in the several method or apparatus embodiments provided in the present application may be combined arbitrarily, without conflict, to arrive at new method embodiments or apparatus embodiments.
The above description is only for the specific embodiments of the present invention, but the scope of the present invention is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present invention, and all the changes or substitutions should be covered within the scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the appended claims.
Claims (10)
1. A control method is applied to electronic equipment, and the electronic equipment comprises a touch control component; the method comprises the following steps:
acquiring target touch information acquired by the touch component; the target touch information comprises M touch areas and change information of N touch areas in the M touch areas; wherein M is a positive integer, and N is an integer less than or equal to M;
analyzing the target touch information and determining a target touch gesture corresponding to the target touch information;
and determining a target control instruction corresponding to the target touch gesture based on a preset mapping relation between the touch gesture and the control instruction, and executing the target control instruction.
2. The method of claim 1, wherein M is an integer greater than 3; the change information of the N touch areas comprises at least one of the following: the movement tracks of the N touch areas and the number of touches of the N touch areas.
3. The method of claim 1, wherein the analyzing the target touch information and determining a target touch gesture corresponding to the target touch information comprises:
determining a first touch area and a second touch area from the M touch areas; wherein the first touch area is formed by a first touch body contacting the touch component, and the second touch area is formed by a second touch body contacting the touch component;
determining change information of the first touch area and/or change information of the second touch area based on the change information of the N touch areas;
and determining a target touch gesture corresponding to the target touch information based on the change information of the first touch area and/or the change information of the second touch area.
4. The method of claim 3, wherein determining a first touch area and a second touch area from the M touch areas comprises: determining a first touch area and a second touch area based on the contact areas of the M touch areas and/or the distance between adjacent touch areas;
alternatively, the method further comprises: acquiring first indication information and second indication information acquired by the touch control component; the first indication information is used for indicating a first touch area, and the second indication information is used for indicating a second touch area;
correspondingly, the determining a first touch area and a second touch area from the M touch areas includes: determining a first touch area of the M touch areas based on the first indication information; determining a second touch area of the M touch areas based on the second indication information.
5. The method of claim 4, wherein prior to obtaining the first indication information and the second indication information, the method further comprises:
outputting first prompt information and second prompt information; the first prompt message is used for prompting a user to input first indication information on the touch control component, and the second prompt message is used for prompting the user to input second indication information on the touch control component.
6. The method of claim 1, wherein the electronic device further comprises a display component;
the analyzing the target touch information and determining a target touch gesture corresponding to the target touch information includes:
determining a target center position corresponding to the M touch areas based on the position distribution of the M touch areas;
determining a cursor position corresponding to the target center position in the display area based on a coordinate mapping relation between the whole touch area of the touch control component and the whole display area of the display component;
determining the target control gesture based on the change information of the N touch areas; the target control gesture is used for carrying out control operation on the cursor.
7. The method of claim 1, wherein the touch-sensitive component comprises at least a first operating mode and a second operating mode; the first working mode is a keyboard control mode, and the second working mode is a gesture control mode;
the method further comprises the following steps:
acquiring a switching instruction; the switching instruction is a switching key instruction acquired through a switching key or a switching gesture instruction acquired through the touch control component;
and controlling the touch control component to switch between the first working mode and the second working mode based on the switching instruction.
8. A control device is applied to electronic equipment, and the electronic equipment comprises a touch control component; the device comprises:
the acquisition unit is used for acquiring target touch information acquired by the touch component; the target touch information comprises M touch areas and change information of N touch areas in the M touch areas; wherein M is a positive integer, and N is an integer less than or equal to M;
the processing unit is used for analyzing the target touch information and determining a target touch gesture corresponding to the target touch information;
the processing unit is further configured to determine a target control instruction corresponding to the target touch gesture based on a preset mapping relationship between the touch gesture and the control instruction, and execute the target control instruction.
9. An electronic device, the electronic device comprising: a processor and a memory configured to store a computer program capable of running on the processor,
wherein the processor is configured to perform the steps of the method of any one of claims 1 to 7 when running the computer program.
10. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the steps of the method of any one of claims 1 to 7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910796890.4A (granted as CN110633044B) | 2019-08-27 | 2019-08-27 | Control method, control device, electronic equipment and storage medium
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910796890.4A (granted as CN110633044B) | 2019-08-27 | 2019-08-27 | Control method, control device, electronic equipment and storage medium
Publications (2)
Publication Number | Publication Date |
---|---|
CN110633044A (en) | 2019-12-31
CN110633044B (en) | 2021-03-19
Family
ID=68969237
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910796890.4A (Active, granted as CN110633044B) | Control method, control device, electronic equipment and storage medium | 2019-08-27 | 2019-08-27
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110633044B (en) |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111638809A (en) * | 2020-05-22 | 2020-09-08 | 讯飞幻境(北京)科技有限公司 | Method, device, equipment and medium for acquiring touch information |
CN113138670A (en) * | 2021-05-07 | 2021-07-20 | 郑州捷安高科股份有限公司 | Touch screen interaction gesture control method and device, touch screen and storage medium |
CN113548061A (en) * | 2021-06-25 | 2021-10-26 | 北京百度网讯科技有限公司 | Man-machine interaction method and device, electronic equipment and storage medium |
CN113741782A (en) * | 2021-07-21 | 2021-12-03 | 西安闻泰信息技术有限公司 | Method and device for generating control instruction, electronic equipment and storage medium |
CN116198525A (en) * | 2023-02-21 | 2023-06-02 | 广州小鹏汽车科技有限公司 | Vehicle-mounted system control method, vehicle and storage medium |
Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20030222856A1 (en) * | 2002-01-29 | 2003-12-04 | Fedorak Mark V. | Computer pointer control |
CN101339453A (en) * | 2008-08-15 | 2009-01-07 | 广东威创视讯科技股份有限公司 | Simulated mouse input method based on interactive input apparatus |
CN101609388A (en) * | 2008-06-20 | 2009-12-23 | 义隆电子股份有限公司 | Touch sensitive surface module capable of interpreting multi-object gestures and operating method thereof
CN102591497A (en) * | 2012-03-16 | 2012-07-18 | 上海达龙信息科技有限公司 | Mouse simulation system and method on touch screen |
CN102830819A (en) * | 2012-08-21 | 2012-12-19 | 曾斌 | Method and equipment for simulating mouse input |
CN102955590A (en) * | 2011-08-19 | 2013-03-06 | 中国移动通信集团公司 | Device and method for positioning cursor displayed on touch screen |
CN103345312A (en) * | 2013-07-03 | 2013-10-09 | 张帆 | System and method with intelligent terminal as host, mouse and touch panel at the same time |
CN103472931A (en) * | 2012-06-08 | 2013-12-25 | 宏景科技股份有限公司 | Method for operating simulation touch screen by mouse |
CN105739781A (en) * | 2016-01-29 | 2016-07-06 | 深圳天珑无线科技有限公司 | Method and system for quickly and accurately positioning character cursor through pressure touch technology |
US10137363B2 (en) * | 2013-06-20 | 2018-11-27 | Uday Parshionikar | Gesture based user interfaces, apparatuses and control systems |
- 2019-08-27: application CN201910796890.4A filed in China; granted as patent CN110633044B (status: Active)
Patent Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20030222856A1 (en) * | 2002-01-29 | 2003-12-04 | Fedorak Mark V. | Computer pointer control |
CN101609388A (en) * | 2008-06-20 | 2009-12-23 | 义隆电子股份有限公司 | Touch sensitive surface module capable of interpreting multi-object gestures and operating method thereof
CN101339453A (en) * | 2008-08-15 | 2009-01-07 | 广东威创视讯科技股份有限公司 | Simulated mouse input method based on interactive input apparatus |
CN102955590A (en) * | 2011-08-19 | 2013-03-06 | 中国移动通信集团公司 | Device and method for positioning cursor displayed on touch screen |
CN102591497A (en) * | 2012-03-16 | 2012-07-18 | 上海达龙信息科技有限公司 | Mouse simulation system and method on touch screen |
CN103472931A (en) * | 2012-06-08 | 2013-12-25 | 宏景科技股份有限公司 | Method for operating simulation touch screen by mouse |
CN102830819A (en) * | 2012-08-21 | 2012-12-19 | 曾斌 | Method and equipment for simulating mouse input |
US10137363B2 (en) * | 2013-06-20 | 2018-11-27 | Uday Parshionikar | Gesture based user interfaces, apparatuses and control systems |
CN103345312A (en) * | 2013-07-03 | 2013-10-09 | 张帆 | System and method with intelligent terminal as host, mouse and touch panel at the same time |
CN105739781A (en) * | 2016-01-29 | 2016-07-06 | 深圳天珑无线科技有限公司 | Method and system for quickly and accurately positioning character cursor through pressure touch technology |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111638809A (en) * | 2020-05-22 | 2020-09-08 | 讯飞幻境(北京)科技有限公司 | Method, device, equipment and medium for acquiring touch information |
CN113138670A (en) * | 2021-05-07 | 2021-07-20 | 郑州捷安高科股份有限公司 | Touch screen interaction gesture control method and device, touch screen and storage medium |
CN113138670B (en) * | 2021-05-07 | 2022-11-18 | 郑州捷安高科股份有限公司 | Touch screen interaction gesture control method and device, touch screen and storage medium |
CN113548061A (en) * | 2021-06-25 | 2021-10-26 | 北京百度网讯科技有限公司 | Man-machine interaction method and device, electronic equipment and storage medium |
WO2022267354A1 (en) * | 2021-06-25 | 2022-12-29 | 北京百度网讯科技有限公司 | Human-computer interaction method and apparatus, and electronic device and storage medium |
CN113741782A (en) * | 2021-07-21 | 2021-12-03 | 西安闻泰信息技术有限公司 | Method and device for generating control instruction, electronic equipment and storage medium |
CN116198525A (en) * | 2023-02-21 | 2023-06-02 | 广州小鹏汽车科技有限公司 | Vehicle-mounted system control method, vehicle and storage medium |
Also Published As
Publication number | Publication date |
---|---|
CN110633044B (en) | 2021-03-19 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110633044B (en) | Control method, control device, electronic equipment and storage medium | |
US9035883B2 (en) | Systems and methods for modifying virtual keyboards on a user interface | |
CN105204744B (en) | Method and device for starting application program and electronic equipment | |
US20160210012A1 (en) | Terminal, and Method for Controlling Terminal Screen Display Information | |
US20090109187A1 (en) | Information processing apparatus, launcher, activation control method and computer program product | |
US20110060986A1 (en) | Method for Controlling the Display of a Touch Screen, User Interface of the Touch Screen, and an Electronic Device using The Same | |
US20050162402A1 (en) | Methods of interacting with a computer using a finger(s) touch sensing input device with visual feedback | |
US20100253630A1 (en) | Input device and an input processing method using the same | |
US8830192B2 (en) | Computing device for performing functions of multi-touch finger gesture and method of the same | |
MX2008014057A (en) | Multi-function key with scrolling. | |
CN106126034B (en) | A kind of keypress function setting method and mobile terminal | |
CN103246382A (en) | Control method and electronic equipment | |
JP2015022745A (en) | Determining input received via tactile input device | |
US20110285625A1 (en) | Information processing apparatus and input method | |
US20130278565A1 (en) | Method and apparatus for providing graphic keyboard in touch-screen terminal | |
CN101482799A (en) | Method for controlling electronic equipment through touching type screen and electronic equipment thereof | |
EP2846244A1 (en) | Information processing device with a touch screen, control method and program | |
TW201520882A (en) | Input device and input method thereof | |
JP2011159089A (en) | Information processor | |
JP2011134127A (en) | Information processor and key input method | |
US20150042585A1 (en) | System and electronic device of transiently switching operational status of touch panel | |
CN105930085A (en) | Input method and electronic device | |
CN105975832A (en) | Control method of mobile terminal and mobile terminal | |
KR101365595B1 (en) | Method for inputting of device containing display unit based on GUI and apparatus thereof | |
JP5414134B1 (en) | Touch-type input system and input control method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
 | PB01 | Publication | |
 | SE01 | Entry into force of request for substantive examination | |
 | GR01 | Patent grant | |