CN116483246A - Input control method and device, electronic equipment and storage medium - Google Patents


Info

Publication number
CN116483246A
Authority
CN
China
Prior art keywords
touch
display area
virtual mouse
set display
event
Prior art date
Legal status
Pending
Application number
CN202310413474.8A
Other languages
Chinese (zh)
Inventor
廖先翔
洪千茹
张菁惠
游惠雯
蒋嫒霞
高彦彬
Current Assignee
Hefei Lianbao Information Technology Co Ltd
Original Assignee
Hefei Lianbao Information Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Hefei Lianbao Information Technology Co Ltd filed Critical Hefei Lianbao Information Technology Co Ltd
Priority to CN202310413474.8A
Publication of CN116483246A

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883 Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, for inputting data by handwriting, e.g. gesture or text
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04886 Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus

Abstract

The application provides an input control method and device, an electronic device, and a storage medium. The method comprises the following steps: monitoring a touch gesture in a set display area of the electronic device screen; displaying a virtual mouse interface in the set display area when the touch gesture meets the trigger condition of the virtual mouse; detecting a touch event based on the set display area; and generating a control instruction corresponding to the touch event. Thus, when a user who is inputting text needs to select a text input position or perform another operation that requires the virtual mouse function, the virtual mouse interface is displayed directly in the set display area to provide that function. Compared with schemes in which the user's hands must leave the set display area, for example to select a text input position by touching another area or an additionally configured touch pad, this prevents the hands from repeatedly leaving the set display area, improves input efficiency, and improves user experience.

Description

Input control method and device, electronic equipment and storage medium
Technical Field
The present disclosure relates to the field of electronic device applications, and in particular, to an input control method, an input control device, an electronic device, and a storage medium.
Background
With the development and popularization of intelligent terminals, touch-control electronic devices such as folding touch PCs and tablet computers are becoming popular. When such a device needs to perform document processing or text input, it may be set to a mode in which the upper half of the display screen is a content display area and the lower half is a virtual keyboard interactive interface. However, document processing and text input often also require operations such as selecting the text input position or the font size, which cannot be performed with the virtual keyboard alone.
In view of the above, operations such as selecting a text input position or a font size may be performed by directly touching the content display area with a hand, or by using a functional interface that combines a virtual keyboard with a touch pad. However, whether touching the upright content display area with a finger or using the touch pad, the fingers and wrist must leave the virtual keyboard interactive interface, so document processing or text input efficiency is low, prolonged operation burdens the hand, and user experience suffers.
Disclosure of Invention
The embodiment of the application provides an input control method, an input control device, electronic equipment and a storage medium.
According to a first aspect of the present application, there is provided an input control method, the method comprising: monitoring touch gestures of a set display area of the screen of the electronic equipment; displaying a virtual mouse interface in the set display area under the condition that the touch gesture accords with the triggering condition of the virtual mouse; detecting a touch event based on the set display area; and generating a control instruction corresponding to the touch event.
According to an embodiment of the present application, the monitoring a touch gesture of a set display area of a screen of an electronic device includes: monitoring touch data of the set display area; determining the number of touch signals in the touch data; and determining the number of touch points generating touch operation according to the number of the touch signals.
According to an embodiment of the present application, the detecting a touch event based on the set display area includes: acquiring a plurality of touch points of the set display area according to the touch data, and determining a control area; detecting whether the control area has a position change; and determining that a movement event occurs in the case of a position change of the control area.
According to an embodiment of the present application, the generating a control instruction corresponding to the touch event includes: and generating a movement instruction corresponding to the movement event so as to control the cursor of the virtual mouse of the user display interface to move.
According to an embodiment of the present application, the touch event includes a key event; correspondingly, the detecting a touch event based on the set display area includes: judging, according to the touch data, whether the number of touch points of the set display area changes; determining that a key action occurs in the case that the number of touch points undergoes the set change; judging the action type of the key action according to the touch data at the set moment, the action type including a left click action and a right click action; and determining that a key event corresponding to the action type occurs.
According to an embodiment of the present application, the determining, according to the touch data at the set time, the action type of the key action includes: determining finger coordinates corresponding to a plurality of touch points of the set display area according to the touch data at the set moment; sorting the plurality of touch points according to the finger coordinates to obtain a finger coordinate sequence; determining a target touch point where a key action occurs according to the generation time of the touch signals of the plurality of touch points; determining the arrangement sequence of the target touch points in the finger coordinate sequence; and determining the key action type according to the arrangement sequence.
According to an embodiment of the present application, the generating a control instruction corresponding to the touch event includes: generating a left click command in the case that the action type is a left click action, so as to control the virtual mouse to perform a first processing operation corresponding to the left click command on a user display interface; and generating a right click command in the case that the action type is a right click action, so as to control the virtual mouse to perform a second processing operation corresponding to the right click command on the user display interface.
According to a second aspect of the present application, there is provided an input control device comprising: the monitoring module is used for monitoring touch gestures of a set display area of the screen of the electronic equipment; the switching module is used for displaying a virtual mouse interface in the set display area under the condition that the touch gesture accords with the triggering condition of the virtual mouse; the detection module is used for detecting a touch event based on the set display area; and the instruction generation module is used for generating a virtual mouse control instruction corresponding to the touch event.
According to a third aspect of the present application, there is provided an electronic device comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the methods described herein.
According to a fourth aspect of the present application, there is provided a non-transitory computer readable storage medium storing computer instructions for causing a computer to perform the method described herein.
According to the method, touch gestures of a set display area of the screen of the electronic equipment are monitored; displaying a virtual mouse interface in the set display area under the condition that the touch gesture accords with the triggering condition of the virtual mouse; detecting a touch event based on the set display area; and generating a control instruction corresponding to the touch event. By detecting the touch gesture, under the condition that the touch gesture meets the triggering condition, the set display area of the screen of the electronic equipment can directly display the virtual mouse interface to trigger the virtual mouse so as to provide a virtual mouse function, and a control instruction is generated based on a touch event detected by the virtual mouse interface so as to realize virtual mouse functions such as input and the like.
It should be understood that the teachings of the present application are not required to achieve all of the above-described benefits, but rather that certain technical solutions may achieve certain technical effects, and that other embodiments of the present application may also achieve benefits not mentioned above.
Drawings
The above, as well as additional purposes, features, and advantages of exemplary embodiments of the present application will become readily apparent from the following detailed description when read in conjunction with the accompanying drawings. Several embodiments of the present application are illustrated by way of example, and not by way of limitation, in the figures of the accompanying drawings, in which:
in the drawings, the same or corresponding reference numerals indicate the same or corresponding parts.
Fig. 1 is a schematic implementation flow chart of an input control method according to an embodiment of the present application;
fig. 2 is a schematic implementation flow diagram of a touch gesture detection method of the input control method according to the embodiment of the present application;
fig. 3 is a schematic implementation flow diagram of a touch event detection method of the input control method according to the embodiment of the present application;
fig. 4 shows a second implementation flow chart of a touch event detection method of the input control method provided in the embodiment of the present application;
fig. 5 is a schematic implementation flow chart of an action type judging method of the input control method according to the embodiment of the present application;
Fig. 6 shows a schematic structural diagram of an input control device according to an embodiment of the present application;
fig. 7 shows a schematic diagram of a composition structure of an electronic device according to an embodiment of the present application.
Detailed Description
In order to make the objects, features and advantages of the present application more obvious and understandable, the technical solutions of the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is apparent that the described embodiments are only some embodiments of the present application, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments herein without making any inventive effort, are intended to be within the scope of the present application.
Fig. 1 shows a schematic implementation flow chart of an input control method provided in an embodiment of the present application.
Referring to fig. 1, an embodiment of the present application provides an input control method, including: operation 101, monitoring touch gestures of a set display area of an electronic device screen; operation 102, displaying a virtual mouse interface in a set display area under the condition that the touch gesture accords with the triggering condition of the virtual mouse; operation 103, detecting a touch event based on the set display area; in operation 104, a control command corresponding to the touch event is generated.
In operation 101, a touch gesture of a set display area of a screen of an electronic device is monitored.
Specifically, the electronic device in the embodiment of the present application may be a touch electronic device such as a folding PC or a tablet PC, or other suitable electronic devices, which is not specifically limited in this application.
For the touch electronic device, when document processing or text input is required, the touch electronic device is configured in a mode that the upper half screen of the display screen is a content display area, and the lower half screen is a virtual keyboard interactive interface, so that operations such as document processing or text input are performed through the virtual keyboard, but operations such as text input position and text format selection cannot be completed through the virtual keyboard. Therefore, when operations such as text input position and text format selection are required, a virtual mouse is required to be called to complete the corresponding operations.
In this embodiment of the application, the triggering condition of the virtual mouse may be preconfigured; during use of the virtual keyboard, the touch gesture on the virtual keyboard interactive interface is monitored in real time, and the virtual mouse is activated to provide the virtual mouse function when the touch gesture meets the triggering condition. When document processing or text input is performed, the set display area of the electronic device first displays the virtual keyboard interactive interface; for convenience of description, the set display area is referred to below as the virtual keyboard interactive interface whenever the virtual mouse is not activated.
Further, the touch gesture may be represented by the number of touch points; accordingly, the trigger condition of the virtual mouse may be configured as detecting a set number of touch points.
In this embodiment of the application, the display screen of the electronic device is configured with a touch sensor, and the touch sensor can sense touch contact of the user on the display screen, and the number of touch points can be automatically detected by the electronic device by adopting a touch display screen detection method commonly used in the art. When a finger touches the electronic device screen, the electronic device can detect the number of touch points. In the embodiment of the application, one touch point represents the touch of one finger, and the number of fingers in the set display area can be determined through the number of touch points.
In operation 102, when the touch gesture meets the triggering condition of the virtual mouse, the virtual mouse interface is displayed in the set display area.
Specifically, when the touch gesture satisfies the triggering condition of the virtual mouse, the virtual mouse interface is displayed in the set display area to trigger the virtual mouse function.
For example, taking the triggering condition of 4 touch points as an example, in the process of word processing or word input by using the virtual keyboard, the number of touch points of the virtual keyboard interaction interface is monitored in real time, and under the condition that 4 touch points exist in the virtual keyboard interaction interface, the virtual mouse interface is directly displayed on the virtual keyboard interaction interface so as to provide a virtual mouse function for a user.
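For illustration only, this trigger check might be sketched as follows in Python. The polling loop, the read_touch_signals() feed, and the show_virtual_mouse_interface() callback are hypothetical stand-ins, not part of the patent or of any real API:

```python
import time

TRIGGER_TOUCH_COUNT = 4  # set number of touch points, as in the example above

def monitor_set_display_area(read_touch_signals, show_virtual_mouse_interface):
    """Operations 101-102: watch the set display area and display the
    virtual mouse interface once the touch gesture meets the trigger."""
    while True:
        signals = read_touch_signals()           # one touch signal per finger contact
        if len(signals) == TRIGGER_TOUCH_COUNT:  # touch gesture meets the trigger condition
            show_virtual_mouse_interface()       # operation 102
            return
        time.sleep(0.01)                         # assumed polling interval
```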
In operation 103, a touch event based on the set display area is detected.
Specifically, after the virtual mouse interface is displayed in the set display area, the user's operations on the virtual mouse interface need to be detected in real time, so that the result the user intends to achieve through the virtual mouse function can be carried out according to those operations.
Further, the touch electronic device is generally configured with a touch interaction function module, and multiple functional components in the touch interaction function module can detect an operation of a user on a screen of the electronic device to determine that a touch event corresponding to the user operation occurs, where the user operation may include a finger press or a finger movement, and accordingly, the touch event corresponding to the finger press is a key event and the touch event corresponding to the finger movement is a movement event.
In this embodiment of the present application, the detection of the touch point and the touch event is performed by the touch sensor and the touch interaction functional module of the electronic device, and the specific detection process may refer to the following description of the detection of the touch gesture and the touch event in fig. 2 to 4, which is not repeated herein.
In operation 104, a control command corresponding to the touch event is generated.
Specifically, after the touch event is detected, a control instruction corresponding to the touch event is generated according to the touch event, so as to control a cursor of the virtual mouse to perform an operation corresponding to the control instruction.
For example, taking a touch event as a movement event as an example, when the movement event is detected, the user is represented to perform a movement operation on the virtual mouse interface, and a control instruction for controlling the cursor of the virtual mouse to perform a movement operation corresponding to the movement event is generated according to the movement event, so that the cursor of the virtual mouse can be controlled to perform a corresponding movement operation.
Therefore, during text input, when the user needs to select a text input position or perform another operation that requires the virtual mouse function, the virtual mouse interface is displayed directly in the set display area once the trigger condition of the virtual mouse is met, and functions such as input can be completed directly within the set display area. Compared with schemes in which the hand leaves the set display area to tap another region or to operate an additionally configured touch pad, this avoids the user's hands repeatedly leaving the set display area, improves input efficiency, reduces the burden on the hands, and improves user experience.
Fig. 2 is a schematic implementation flow diagram of a touch gesture detection method of the input control method according to the embodiment of the present application.
In an embodiment of the present application, referring to fig. 2, the monitoring a touch gesture of a set display area of a screen of an electronic device in operation 101 includes: operation 201, monitoring touch data of a set display area; operation 202, determining a number of touch signals in the touch data; in operation 203, the number of touch points generating the touch operation is determined according to the number of touch signals.
In operation 201, touch data of a set display area is monitored.
Specifically, a touch sensor built into the electronic device senses the touch condition in the set display area in real time and transmits it to the touch interaction function module of the electronic device for recording, thereby obtaining the touch data. The touch sensor may be, for example, a pressure sensor or a capacitive touch sensor.
In operation 202, the number of touch signals in the touch data is determined.
Specifically, when a finger touches a set display area, a touch signal is generated, and the touch signal is recorded as touch data. The touch data includes touch data of a user on a set display area at a plurality of moments during the operation of the electronic device. The touch data corresponding to each moment may include information such as the number of touch signals, a touch position where a user touches the screen of the electronic device, and the touch position may be recorded in a coordinate form.
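As a sketch of how such per-moment touch data could be laid out in memory (the record and field names below are assumptions for illustration, not taken from the patent):

```python
from dataclasses import dataclass

@dataclass
class TouchSignal:
    x: float  # X-axis coordinate of the contact (screen horizontal)
    y: float  # Y-axis coordinate of the contact (screen vertical)
    t: float  # generation time of the touch signal

@dataclass
class TouchFrame:
    moment: float               # sampling moment
    signals: list[TouchSignal]  # one entry per finger currently in contact

    @property
    def touch_point_count(self) -> int:
        # one touch signal represents one touch point (operation 203)
        return len(self.signals)
```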
Further, the type of the touch signal is determined by the type of the touch sensor built in the electronic device, and the touch signal may be represented as a pressure signal, a current signal, or the like. For example, in the case where the built-in touch sensor of the electronic device is a pressure sensor, the touch signal is a pressure signal; in the case where the touch sensor built in the electronic device is a capacitive touch sensor, the touch signal is a current signal.
The touch signal is not particularly limited in the present application, and any touch signal that can respond to a touch point is within the scope of the present application.
In operation 203, the number of touch points generating the touch operation is determined according to the number of touch signals.
Specifically, one touch signal represents one touch point, and the number of touch points is determined by determining the number of touch signals.
Fig. 3 is a schematic implementation flow diagram of a touch event detection method of the input control method according to the embodiment of the present application.
In an embodiment of the present application, referring to fig. 3, the detecting a touch event based on the set display area in operation 103 includes: operation 301, acquiring a plurality of touch points of a set display area according to touch data, and determining a control area; an operation 302 of detecting whether a position change occurs in a control area; in operation 303, in the case where the control area is changed in position, it is determined that a movement event occurs.
In operation 301, a plurality of touch points of a set display area are acquired according to touch data, and a control area is determined.
Specifically, when the trigger condition is reached and the virtual mouse is activated, the touch data at the activation moment is first acquired; the positions of the plurality of touch points in the set display area are determined from the positions of the plurality of touch signals in that touch data, and the control area is determined from the positions of the touch points. The control area represents the combined position of the plurality of touch points, for example the center of the minimum circle enclosing them. The control area is not particularly limited; any area capable of representing the plurality of touch points may serve as the control area.
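A minimal sketch of operation 301 follows. The patent gives the center of the minimum enclosing circle as one example of a control area; the centroid used below is an assumed, simpler alternative that likewise represents the plurality of touch points:

```python
def control_area_center(points: list[tuple[float, float]]) -> tuple[float, float]:
    """Reduce the touch points at the activation moment to one control-area
    position (here the centroid; a minimum-enclosing-circle center also works)."""
    xs = [x for x, _ in points]
    ys = [y for _, y in points]
    return sum(xs) / len(xs), sum(ys) / len(ys)
```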
In operation 302, it is detected whether a change in position of the control area has occurred.
Specifically, the movement operation of the user in the set display area is detected by detecting the movement of the control area.
Further, whether the control area moves or not is detected by detecting whether the position of the control area changes or not.
In this manner, the position change of the control area may be detected by the touch interaction function module of the electronic device. Taking an electronic device running the Windows system as an example, such a device provides the Windows.UI.Input API (a touch interaction function module), whose functional components, such as manipulation processing (touch operation) and GestureRecognizer (gesture recognition), can recognize the position change of the control area.
The present application is not particularly limited to the identification of the change in the position of the control area, and may be applied as long as the change in the position of the control area can be identified.
In operation 303, in the event that a change in position of the control area occurs, it is determined that a movement event has occurred.
Specifically, a position change of the control area indicates that the fingers corresponding to the touch points have moved within the set display area, and on this basis it is determined that a movement event has occurred.
In an embodiment of the present application, after operations 301-303, a movement instruction corresponding to the movement event is also generated to control the cursor of the virtual mouse of the user display interface to move.
Specifically, the virtual mouse interface corresponds to a user display interface, the user display interface displays a cursor of the virtual mouse, and a control instruction is generated under the condition that a movement event is determined to occur, so as to control the cursor to move corresponding to the position change of the control area on the user display interface. The control command is used for controlling the cursor to rotate and move on the user display interface.
Continuing with the example of a Windows-system electronic device, the movement events include a gesture event and a manipulation event (a translation event). When the movement event is a gesture event, a control instruction corresponding to it is generated, which controls the cursor of the user display interface to perform a corresponding rotation according to the rotation angle recognized by the GestureRecognizer. When the movement event is a manipulation event, a control instruction corresponding to it is generated, which controls the cursor of the user display interface to perform a corresponding position movement according to the displacement value recognized by the manipulation processing.
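The following sketch combines operations 302-303 with the instruction generation described above. The threshold, the rotation-angle input (standing in for what a gesture-recognition layer would report), and the two cursor callbacks are illustrative assumptions:

```python
MOVE_THRESHOLD = 1.0  # assumed noise floor, in pixels

def handle_control_area(prev_center, curr_center, rotation_angle,
                        move_cursor, rotate_cursor):
    """Detect a position change of the control area (movement event) and
    generate the corresponding move/rotate instruction for the cursor."""
    dx = curr_center[0] - prev_center[0]
    dy = curr_center[1] - prev_center[1]
    if abs(dx) > MOVE_THRESHOLD or abs(dy) > MOVE_THRESHOLD:
        move_cursor(dx, dy)            # translation: cursor follows the displacement value
    if rotation_angle:                 # rotation recognized by the gesture layer
        rotate_cursor(rotation_angle)  # gesture (rotation) event branch
```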
Fig. 4 shows a second implementation flow chart of a touch event detection method of the input control method provided in the embodiment of the present application.
In an embodiment of the present application, referring to fig. 4, the detecting a touch event based on the set display area in operation 103 includes: operation 401, determining whether a set change occurs in the number of touch points of the set display area according to the touch data; operation 402, determining that a key action occurs in the case that the number of touch points is changed in a setting; operation 403, judging the action type of the key action according to the touch data at the set time, wherein the action type comprises a left click action and a right click action; operation 404 determines that a key event corresponding to the type of action has occurred.
In operation 401, it is determined whether or not a set change has occurred in the number of touch points of the set display area based on the touch data.
Specifically, when the user performs a click in the set display area, the fingers in contact with the set display area change, and the number of touch points changes accordingly.
For example, when using a mouse, a user usually places 4 or 5 fingers of one hand on it; the embodiment of the application may therefore set the triggering condition of the virtual mouse to a touch point count of 4 or 5, corresponding to 4 or 5 fingers. After the virtual mouse is activated, a press of a finger produces a click action, and the number of touch points in the set display area changes with the click.
Further, a set change can be configured in advance, and whether the set change in the number of touch points occurs can be determined in real time from the touch data of the set display area. In the case where the trigger condition is a touch point count of 4 or 5, the corresponding set change may be configured as the number of touch points changing from 4 to 5.
In operation 402, in the case where a set change occurs in the number of touch points, it is determined that a key operation has occurred.
Specifically, a change in the number of touch points indicates that a finger has performed a key action, and it is therefore determined that a key action has occurred.
In operation 403, the action type of the key action is determined according to the touch data at the set time, and the action type includes a left click action and a right click action.
Specifically, the mouse operation is divided into a left click and a right click, where the left click and the right click correspond to different functions, for example, the left click mouse supports a selection operation, and the right click mouse supports a list function selection, so that an action type of the key action needs to be determined to determine whether a function corresponding to the left click action or a function corresponding to the right click action is required by the user.
Further, when the number of touch points is judged to be changed, the touch point with the key action can be determined according to the touch data when the number of touch points is changed, and whether the action type of the key action is left click action or right click action is further determined according to the related information of the touch point. The time when the number of touch points is judged to be changed is the set time.
Operation 404 determines that a key event corresponding to the type of action has occurred.
Specifically, after the action type of the key action is determined, it is determined that a key event corresponding to that action type has occurred.
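Operations 401-404 might be sketched as follows, assuming the 4-to-5 set change discussed above; classify_action() stands in for the action-type judgment of Fig. 5:

```python
def detect_key_event(prev_count: int, curr_count: int, frame, classify_action):
    """Return "left_click"/"right_click" if the set change in the touch point
    count occurred at this moment, else None."""
    if (prev_count, curr_count) == (4, 5):    # the preconfigured set change
        return classify_action(frame)          # key event of the judged action type
    return None
```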
Fig. 5 is a schematic implementation flow chart of an action type judging method of the input control method according to the embodiment of the present application.
In an embodiment of the present application, referring to fig. 5, the determining, in operation 403, the action type of the key action according to the touch data at the set time includes: operation 501, determining finger coordinates corresponding to a plurality of touch points of a set display area according to touch data of a set time; an operation 502, sorting a plurality of touch points according to finger coordinates to obtain a finger coordinate sequence; operation 503, determining a target touch point where a key action occurs according to the generation time of the touch signals of the plurality of touch points; operation 504, determining an arrangement order of the target touch points in the finger coordinate sequence; in operation 505, a key action type is determined according to the arrangement order.
In operation 501, finger coordinates corresponding to a plurality of touch points of a set display area are determined according to touch data at a set time.
Specifically, the set time is the time when the key event is determined to occur, a plurality of touch signals in the touch data at the set time are determined as touch points according to the touch data at the set time, and the coordinates of the plurality of touch signals are determined as the coordinates of fingers of the plurality of touch points.
Further, the coordinates of the touch signal are determined by the touch interaction function module of the electronic device and are stored as touch data, the coordinates of the touch signal comprise X-axis coordinates and Y-axis coordinates, the X-axis is the transverse direction of the screen of the electronic device, the Y-axis is the longitudinal direction of the screen of the electronic device, and the origin of coordinates can be configured at the lower left corner of the screen of the electronic device. It should be noted that, the specific touch signal coordinate determining process may refer to a manner of determining the touch signal coordinate by using a common touch screen, which is not described herein.
In operation 502, a plurality of touch points are ordered according to finger coordinates, resulting in a finger coordinate sequence.
Specifically, users commonly click the left mouse button with the index finger and the right button with the middle finger, and the fingers of one hand have a fixed left-to-right arrangement; therefore, by detecting the relative position of the clicking finger among the plurality of fingers, it can be determined whether that finger is the index finger or the middle finger.
After a plurality of finger coordinates are obtained, the plurality of finger coordinates are arranged to obtain a finger coordinate sequence. The plurality of fingers are transversely arranged, so that the finger coordinates can be arranged from small to large according to the X-axis coordinate values of the finger coordinates.
In operation 503, a target touch point at which a key action occurs is determined according to the generation time of the touch signals of the plurality of touch points.
Specifically, when a key action occurs, the finger that touches the screen last is the finger performing the key press, and each finger corresponds to a touch point; the pressing finger can therefore be identified by comparing the generation times of the plurality of touch points.
Further, the touch point whose touch signal was generated last can be determined from the generation times of the touch signals corresponding to the plurality of touch points, and that touch point is taken as the target touch point at which the key action occurs.
In operation 504, a ranking order of the target touch points in the finger coordinate sequence is determined.
Specifically, the arrangement sequence of the target touch points in the finger coordinate sequence is determined according to the X-axis coordinates of the target touch points.
In operation 505, a key action type is determined according to the arrangement order.
Specifically, since the fingers of one hand are arranged transversely, whether the finger corresponding to the target touch point is the index finger or the middle finger can be determined from the position of the target touch point's X-axis coordinate in the finger coordinate sequence, and hence whether the action type of the key action is a left click action or a right click action.
For example, taking a right-hand virtual mouse: for a user accustomed to operating the virtual mouse with the right hand, the index finger and the middle finger occupy the 2nd and 3rd positions among the fingers. Accordingly, if the X-axis coordinate of the target touch point places it at the 2nd position, the corresponding finger is the index finger and a left click is determined; at the 3rd position, the finger is the middle finger and a right click is determined. The method for determining the key action type with a left-hand virtual mouse is similar and is not repeated here.
Further, whether the virtual mouse is operated with the left hand or the right hand can be configured in the electronic device in advance: a user accustomed to a left-hand virtual mouse can preconfigure the left-hand mode, and a user accustomed to a right-hand virtual mouse can preconfigure the right-hand mode.
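Putting operations 501-505 together, a hedged sketch (reusing the TouchSignal shape assumed earlier, with right-hand mode as the default) could read:

```python
def classify_action(signals, right_handed=True):
    """Judge the action type of a key action from the touch data at the set
    moment: left click (index finger) or right click (middle finger)."""
    ordered = sorted(signals, key=lambda s: s.x)   # finger coordinate sequence, left to right
    target = max(signals, key=lambda s: s.t)       # last-generated touch signal = target touch point
    rank = ordered.index(target) + 1               # 1-based position in the sequence
    if not right_handed:
        # mirror the ordering for left-hand mode (an assumption; the patent
        # only states that the left-hand case is handled similarly)
        rank = len(ordered) - rank + 1
    # position 2 -> index finger -> left click; position 3 -> middle finger -> right click
    return {2: "left_click", 3: "right_click"}.get(rank)
```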
In an embodiment of the present application, when the action type of the key action is a left click action, a left click command for controlling the virtual mouse to perform a first processing operation on the user display interface is generated; and when the action type is a right click action, a right click command for controlling the virtual mouse to perform a second processing operation on the user display interface is generated.
Specifically, in the case that the key action is a left click action, it is determined that the user needs an operation similar to the left button of the mouse, so that a left click instruction for controlling the virtual mouse to perform an operation similar to the left button of the mouse on the user display interface is generated. In the case that the key press action is a right click action, it is determined that the user needs an operation similar to the right click action of the mouse, and therefore a right click instruction for controlling the virtual mouse to perform an operation similar to the right click action of the mouse on the user display interface is generated.
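A short follow-on sketch of this instruction dispatch; the two callbacks are hypothetical placeholders for the first and second processing operations:

```python
def generate_click_instruction(action_type, do_left_click, do_right_click):
    if action_type == "left_click":
        do_left_click()    # left click command: first processing operation
    elif action_type == "right_click":
        do_right_click()   # right click command: second processing operation
```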
In an embodiment of the present application, when it is detected that the touch gesture of the set display area does not meet the triggering condition of the virtual mouse, the set display area is displayed as the virtual keyboard interactive interface.
Specifically, when the document is processed or the text is input, the set display area of the electronic device is displayed as a virtual keyboard interactive interface for the user to use the virtual keyboard function under the condition that the touch gesture does not accord with the triggering condition of the virtual mouse.
Further, based on detection of a touch gesture to the set display area, the touch gesture does not conform to a trigger condition of the virtual mouse, and the set display area displays the virtual keyboard interactive interface.
Therefore, by detecting the touch gesture, under the condition that the touch gesture meets the triggering condition, the set display area of the screen of the electronic device can directly display the virtual mouse interface to trigger the virtual mouse so as to provide the virtual mouse function, and a control instruction is generated based on the touch event detected by the virtual mouse interface so as to realize the virtual mouse function such as input according to the control instruction.
Fig. 6 shows a schematic diagram of the composition structure of an input control device provided in an embodiment of the present application.
Based on the above input control method, the embodiment of the present application further provides an input control device, where the device 60 includes: the monitoring module 601 is configured to monitor a touch gesture of a set display area of the electronic device screen; the switching module 602 is configured to display a virtual mouse interface in a set display area when the touch gesture meets a trigger condition of the virtual mouse; a detection module 603, configured to detect a touch event based on a set display area; the instruction generating module 604 is configured to generate a virtual mouse control instruction corresponding to the touch event.
In one embodiment of the present application, the monitoring module includes: a monitoring sub-module for monitoring touch data of the set display area; a first determining sub-module for determining the number of touch signals in the touch data; and a second determining sub-module for determining, according to the number of touch signals, the number of touch points generating the touch operation.
In one embodiment of the present application, the detection module includes: the acquisition sub-module is used for acquiring a plurality of touch points of the set display area according to the touch data and determining a control area; the detection submodule is used for detecting whether the control area has position change or not; and the event determination submodule is used for determining that a movement event occurs under the condition that the position of the control area changes.
In an embodiment of the present application, the instruction generating module is configured to generate a movement instruction corresponding to the movement event, so as to control a cursor of a virtual mouse of the user display interface to move.
In an embodiment of the present application, the touch event includes a key event; correspondingly, the detection module includes: a first judging sub-module for judging, according to the touch data, whether the number of touch points of the set display area changes; a first determining sub-module for determining that a key action occurs in the case that the number of touch points undergoes the set change; a second judging sub-module for judging the action type of the key action according to the touch data at the set moment, the action type including a left click action and a right click action; and a second determining sub-module for determining that a key event corresponding to the action type occurs.
In an embodiment of the present application, the second judging submodule includes: a first determining unit, configured to determine finger coordinates corresponding to a plurality of touch points of the set display area according to touch data at a set time; the sorting unit is used for sorting the plurality of touch points according to the finger coordinates to obtain a finger coordinate sequence; a second determining unit, configured to determine a target touch point at which a key action occurs according to generation times of touch signals of the plurality of touch points; the third determining unit is used for determining the arrangement sequence of the target touch points in the finger coordinate sequence; and the fourth determining unit is used for determining the key action type according to the arrangement sequence.
In one embodiment of the present application, the instruction generating module includes: a first generation sub-module for generating a left click command in the case that the action type is a left click action, so as to control the virtual mouse to perform a first processing operation corresponding to the left click command on the user display interface; and a second generation sub-module for generating a right click command in the case that the action type is a right click action, so as to control the virtual mouse to perform a second processing operation corresponding to the right click command on the user display interface.
It should be noted that the description of the apparatus in the embodiment of the present application is similar to the description of the method embodiments above and has similar beneficial effects, so a detailed description is omitted. Technical details of the input control device provided in the embodiment of the present application may be understood from the description of any one of fig. 1 to 5.
According to embodiments of the present application, there is also provided an electronic device and a non-transitory computer-readable storage medium.
Fig. 7 shows a schematic block diagram of an example electronic device 70 that may be used to implement embodiments of the present application. Electronic devices are intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. The electronic device may also represent various forms of mobile devices, such as personal digital processing, cellular telephones, smartphones, wearable devices, and other similar computing devices. The components shown herein, their connections and relationships, and their functions, are meant to be exemplary only, and are not meant to limit implementations of the application described and/or claimed herein.
As shown in fig. 7, the electronic device 70 includes a computing unit 701 that can perform various appropriate actions and processes according to a computer program stored in a Read Only Memory (ROM) 702 or a computer program loaded from a storage unit 708 into a Random Access Memory (RAM) 703. In the RAM 703, various programs and data required for the operation of the electronic device 70 may also be stored. The computing unit 701, the ROM 702, and the RAM 703 are connected to each other through a bus 704. An input/output (I/O) interface 705 is also connected to bus 704.
Various components in the electronic device 70 are connected to the I/O interface 705, including: an input unit 706 such as a keyboard, a mouse, etc.; an output unit 707 such as various types of displays, speakers, and the like; a storage unit 708 such as a magnetic disk, an optical disk, or the like; and a communication unit 709 such as a network card, modem, wireless communication transceiver, etc. The communication unit 709 allows the electronic device 70 to exchange information/data with other devices via a computer network, such as the internet, and/or various telecommunication networks.
The computing unit 701 may be a variety of general and/or special purpose processing components having processing and computing capabilities. Some examples of computing unit 701 include, but are not limited to, a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), various specialized Artificial Intelligence (AI) computing chips, various computing units running machine learning model algorithms, a Digital Signal Processor (DSP), and any suitable processor, controller, microcontroller, etc. The computing unit 701 performs the respective methods and processes described above, for example, an input control method. For example, in some embodiments, the input control method may be implemented as a computer software program tangibly embodied on a machine-readable medium, such as the storage unit 708. In some embodiments, part or all of the computer program may be loaded and/or installed onto the electronic device 70 via the ROM 702 and/or the communication unit 709. When the computer program is loaded into RAM 703 and executed by computing unit 701, one or more steps of the method described above may be performed. Alternatively, in other embodiments, the computing unit 701 may be configured to perform the method by any other suitable means (e.g., by means of firmware).
Various implementations of the systems and techniques described above may be implemented in digital electronic circuitry, integrated circuit systems, Field Programmable Gate Arrays (FPGAs), Application Specific Integrated Circuits (ASICs), Application Specific Standard Products (ASSPs), Systems on Chip (SOCs), Complex Programmable Logic Devices (CPLDs), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include implementation in one or more computer programs that may be executed and/or interpreted on a programmable system including at least one programmable processor, which may be a special purpose or general purpose programmable processor, and which may receive data and instructions from, and transmit data and instructions to, a storage system, at least one input device, and at least one output device.
Program code for carrying out methods of the present application may be written in any combination of one or more programming languages. These program code may be provided to a processor or controller of a general purpose computer, special purpose computer, or other programmable data processing apparatus such that the program code, when executed by the processor or controller, causes the functions/operations specified in the flowchart and/or block diagram to be implemented. The program code may execute entirely on the machine, partly on the machine, as a stand-alone software package, partly on the machine and partly on a remote machine or entirely on the remote machine or server.
In the context of this application, a machine-readable medium may be a tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. The machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
To provide for interaction with a user, the systems and techniques described here can be implemented on a computer having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and pointing device (e.g., a mouse or trackball) by which a user can provide input to the computer. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user may be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic input, speech input, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a background component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such background, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: local Area Networks (LANs), wide Area Networks (WANs), and the internet.
The computer system may include a client and a server. The client and server are typically remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. The server may be a cloud server, a server of a distributed system, or a server incorporating a blockchain.
It should be appreciated that various forms of the flows shown above may be used to reorder, add, or delete steps. For example, the steps described in the present application may be performed in parallel, sequentially, or in a different order, provided that the desired results of the technical solutions disclosed in the present application can be achieved, and are not limited herein.
The foregoing is merely specific embodiments of the present application, but the scope of the present application is not limited thereto, and any person skilled in the art can easily think about changes or substitutions within the technical scope of the present application, and the changes and substitutions are intended to be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (10)

1. An input control method, the method comprising:
monitoring a touch gesture in a set display area of a screen of an electronic device;
displaying a virtual mouse interface in the set display area in a case that the touch gesture meets a trigger condition of a virtual mouse;
detecting a touch event based on the set display area;
and generating a control instruction corresponding to the touch event.
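
To make the claimed flow concrete, the following minimal Python sketch walks the four steps of claim 1 for one frame of touch data. The three-finger long-press trigger, the 0.5-second hold threshold, and all class and function names are illustrative assumptions; the claims do not specify a particular trigger condition.

from dataclasses import dataclass
from typing import Callable, List, Optional

@dataclass
class TouchPoint:
    touch_id: int
    x: float   # coordinates within the set display area
    y: float
    t: float   # timestamp of the touch signal, in seconds

class VirtualMouseController:
    """One pass through the claim-1 flow: monitor, trigger, detect, instruct."""

    def __init__(self,
                 detect_event: Callable[[List[TouchPoint]], Optional[str]],
                 to_instruction: Callable[[str], dict]) -> None:
        self.active = False                   # is the virtual mouse interface shown?
        self.detect_event = detect_event      # sketched after claims 3 and 6
        self.to_instruction = to_instruction  # sketched after claim 7

    def on_touch_frame(self, points: List[TouchPoint], hold_s: float):
        if not self.active:
            # Steps 1-2: monitor the gesture; show the virtual mouse
            # interface once the (assumed) trigger condition is met.
            if len(points) == 3 and hold_s >= 0.5:
                self.active = True
                return "SHOW_VIRTUAL_MOUSE_INTERFACE"
            return None
        # Steps 3-4: detect a touch event in the set display area and
        # translate it into a control instruction.
        event = self.detect_event(points)
        return self.to_instruction(event) if event else None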
2. The method of claim 1, wherein the monitoring a touch gesture in a set display area of a screen of an electronic device comprises:
monitoring touch data of the set display area;
determining a number of touch signals in the touch data;
and determining, according to the number of the touch signals, a number of touch points at which a touch operation occurs.
3. The method of claim 2, wherein the detecting a touch event based on the set display area comprises:
acquiring a plurality of touch points in the set display area according to the touch data, and determining a control area;
detecting whether a position of the control area changes;
and determining that a movement event occurs in a case that the position of the control area changes.
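
One possible reading of claims 2-3 in the same sketch style: the touch-point count comes from counting touch signals, the "control area" is taken here as the centroid of the touch points, and a movement event fires when the centroid shifts beyond a small dead zone. The centroid interpretation and the 2-pixel dead zone are assumptions, not terms from the application.

from typing import List, Optional, Tuple

Point = Tuple[float, float]

def touch_point_count(touch_signals: List[dict]) -> int:
    # Claim 2: the number of touch points follows from the number of
    # touch signals present in the touch data.
    return len(touch_signals)

def control_area(points: List[Point]) -> Point:
    # Assumed interpretation of "determining a control area": the
    # centroid of the touch points in the set display area.
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    return (sum(xs) / len(xs), sum(ys) / len(ys))

def detect_move(prev: Point, curr: Point, dead_zone: float = 2.0) -> Optional[Point]:
    # Claim 3: a movement event occurs when the control area changes
    # position; the dead zone suppresses sensor jitter.
    dx, dy = curr[0] - prev[0], curr[1] - prev[1]
    if dx * dx + dy * dy >= dead_zone * dead_zone:
        return (dx, dy)   # delta that a movement instruction could carry
    return None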
4. The method of claim 3, wherein the generating a control instruction corresponding to the touch event comprises:
generating a movement instruction corresponding to the movement event, so as to control a cursor of the virtual mouse to move on a user display interface.
5. The method of claim 2, wherein the touch event comprises a key event; and correspondingly,
the detecting a touch event based on the set display area comprises:
determining, according to the touch data, whether the number of touch points in the set display area changes;
determining that a key action occurs in a case that the number of the touch points changes at a set moment;
determining an action type of the key action according to the touch data at the set moment, wherein the action type comprises a left click action and a right click action;
and determining that a key event corresponding to the action type occurs.
6. The method according to claim 5, wherein the determining an action type of the key action according to the touch data at the set moment comprises:
determining, according to the touch data at the set moment, finger coordinates corresponding to a plurality of touch points in the set display area;
sorting the plurality of touch points according to the finger coordinates to obtain a finger coordinate sequence;
determining, according to generation times of touch signals of the plurality of touch points, a target touch point at which the key action occurs;
determining an arrangement position of the target touch point in the finger coordinate sequence;
and determining the action type of the key action according to the arrangement position.
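
The left/right classification of claims 5-6 can be sketched as follows. Sorting by x coordinate to build the finger sequence, picking the newest touch signal as the target touch point, and mapping "left half of the sequence" to a left click are all assumptions; the claims only require that the action type follow from the target point's arrangement position in the sequence.

from typing import List, NamedTuple

class Touch(NamedTuple):
    touch_id: int
    x: float
    y: float
    t: float  # generation time of the touch signal

def key_action_occurred(prev_count: int, curr_count: int) -> bool:
    # Claim 5: a change in the number of touch points at the set moment
    # marks a key action (e.g., a finger tapping down or lifting).
    return curr_count != prev_count

def classify_key_action(points: List[Touch]) -> str:
    # Claim 6, step by step:
    # 1. finger coordinates -> sort into a finger coordinate sequence
    sequence = sorted(points, key=lambda p: p.x)
    # 2. the target touch point is the one with the newest touch signal
    target = max(points, key=lambda p: p.t)
    # 3. its arrangement position in the sequence decides the action type
    rank = sequence.index(target)
    # Assumed rule: left half of the finger sequence -> left click.
    return "LEFT_CLICK" if rank < len(sequence) / 2 else "RIGHT_CLICK"

For example, with two fingers resting and the leftmost finger tapping again, the target touch point ranks first in the sequence and the sketch reports a left click.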
7. The method of claim 6, wherein the generating a control instruction corresponding to the touch event comprises:
generating a left click instruction in a case that the action type is a left click action, so as to control the virtual mouse to perform, on a user display interface, a first processing operation corresponding to the left click instruction;
and generating a right click instruction in a case that the action type is a right click action, so as to control the virtual mouse to perform, on the user display interface, a second processing operation corresponding to the right click instruction.
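
Claims 4 and 7 then reduce to a small dispatch from detected events to virtual mouse instructions. The instruction names and the dictionary format below are placeholders, not terms from the application.

from typing import Optional, Tuple

def to_instruction(event: str, delta: Optional[Tuple[float, float]] = None) -> dict:
    # Claim 4: a movement event yields a movement instruction that moves
    # the virtual mouse cursor on the user display interface.
    if event == "MOVE":
        return {"cmd": "MOVE_CURSOR", "delta": delta}
    # Claim 7: click actions yield the corresponding click instructions,
    # driving the first / second processing operations.
    if event == "LEFT_CLICK":
        return {"cmd": "LEFT_CLICK"}
    if event == "RIGHT_CLICK":
        return {"cmd": "RIGHT_CLICK"}
    raise ValueError(f"unexpected touch event: {event}")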
8. An input control device, the device comprising:
a monitoring module configured to monitor a touch gesture in a set display area of a screen of an electronic device;
a switching module configured to display a virtual mouse interface in the set display area in a case that the touch gesture meets a trigger condition of a virtual mouse;
a detection module configured to detect a touch event based on the set display area;
and an instruction generation module configured to generate a virtual mouse control instruction corresponding to the touch event.
9. An electronic device, comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of any one of claims 1-7.
10. A non-transitory computer readable storage medium storing computer instructions for causing a computer to perform the method of any one of claims 1-7.
CN202310413474.8A 2023-04-12 2023-04-12 Input control method and device, electronic equipment and storage medium Pending CN116483246A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310413474.8A CN116483246A (en) 2023-04-12 2023-04-12 Input control method and device, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310413474.8A CN116483246A (en) 2023-04-12 2023-04-12 Input control method and device, electronic equipment and storage medium

Publications (1)

Publication Number Publication Date
CN116483246A true CN116483246A (en) 2023-07-25

Family

ID=87217125

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310413474.8A Pending CN116483246A (en) 2023-04-12 2023-04-12 Input control method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN116483246A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117369669A (en) * 2023-12-08 2024-01-09 深圳市华腾智能科技有限公司 Processing method and system for programming touch type energy switch panel
CN117369669B (en) * 2023-12-08 2024-03-08 深圳市华腾智能科技有限公司 Processing method and system for programming touch type energy switch panel

Similar Documents

Publication Publication Date Title
US9146672B2 (en) Multidirectional swipe key for virtual keyboard
US9152529B2 (en) Systems and methods for dynamically altering a user interface based on user interface actions
US8847904B2 (en) Gesture recognition method and touch system incorporating the same
US9195386B2 (en) Method and apparatus for text selection
US20140306897A1 (en) Virtual keyboard swipe gestures for cursor movement
US20140306898A1 (en) Key swipe gestures for touch sensitive ui virtual keyboard
EP2660697B1 (en) Method and apparatus for text selection
EP2660727B1 (en) Method and apparatus for text selection
WO2013164013A1 (en) Method and apparatus for text selection
US20170068374A1 (en) Changing an interaction layer on a graphical user interface
CN106445956B (en) Shopping data management method and device
US20200142582A1 (en) Disambiguating gesture input types using multiple heatmaps
JP2021192114A (en) Voice interaction method, device, electronic device, computer readable storage medium and computer program
CN116483246A (en) Input control method and device, electronic equipment and storage medium
US10345932B2 (en) Disambiguation of indirect input
JP2013114688A (en) Processing method of touch signal and electronic computer of the same
US20140105664A1 (en) Keyboard Modification to Increase Typing Speed by Gesturing Next Character
CN107092433B (en) Touch control method and device of touch control all-in-one machine
CN104750401A (en) Touch method and related device as well as terminal equipment
CN114510308B (en) Method, device, equipment and medium for storing application page by mobile terminal
KR101837830B1 (en) Short cut input device and method of mobile terminal using 3d touch input type in vdi environments
CN112162689B (en) Input method and device and electronic equipment
KR20150111651A (en) Control method of favorites mode and device including touch screen performing the same
JP2020160712A (en) Touch position detection system
WO2013164012A1 (en) Method and apparatus for text selection

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination