CN111840990B - Input control method and device and electronic equipment - Google Patents

Publication number
CN111840990B
Authority
CN
China
Prior art keywords
touch
event
peripheral input
application
input event
Prior art date
Legal status
Active
Application number
CN202010706568.0A
Other languages
Chinese (zh)
Other versions
CN111840990A (en)
Inventor
黄世光
张思航
冯荣峰
Current Assignee
Lenovo Beijing Ltd
Original Assignee
Lenovo Beijing Ltd
Priority date
Filing date
Publication date
Application filed by Lenovo Beijing Ltd filed Critical Lenovo Beijing Ltd
Priority to CN202010706568.0A priority Critical patent/CN111840990B/en
Publication of CN111840990A publication Critical patent/CN111840990A/en
Application granted granted Critical
Publication of CN111840990B publication Critical patent/CN111840990B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20: Input arrangements for video game devices
    • A63F13/21: Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/214: Input arrangements characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads
    • A63F13/2145: Input arrangements for locating contacts on a surface, the surface being also a display device, e.g. touch screens
    • A63F2300/00: Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/10: Features characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F2300/1068: Input arrangements specially adapted to detect the point of contact of the player on a surface, e.g. floor mat, touch pad
    • A63F2300/1075: Input arrangements specially adapted to detect the point of contact of the player on a surface, using a touch screen

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)

Abstract

The application discloses an input control method, an input control device, and an electronic device. The method includes: intercepting at least one first peripheral input event sent to an application by a peripheral input device of the electronic device; constructing, based on the at least one first peripheral input event, at least one touch event recognizable by the application, where the touch event includes a touch action and touch parameters of the touch action; and transmitting the at least one touch event to the application so that the application responds to the touch action based on the touch parameters of the touch event. This scheme can reduce the complexity of input operations and improve their convenience.

Description

Input control method and device and electronic equipment
Technical Field
The present disclosure relates to the field of control technologies, and in particular, to an input control method and apparatus, and an electronic device.
Background
In many scenarios it may be inconvenient for a user to perform input operations on the touch screen of an electronic device, and the user may instead wish to use an input component (such as a keyboard, mouse, or handle) externally connected to the device. For example, in a game scenario, the electronic device may be too large to hold comfortably for a long time, or its display may be too small to operate while the game is shown, so the user may prefer to control the game with an external keyboard or handle. As another example, in a virtual reality scenario, it may be desirable to control the scene through an externally connected handle.
Many applications, however, support only touch events. In that case the user cannot perform input operations through an externally connected input component, which makes input inconvenient and complex.
Disclosure of Invention
The application provides an input control method and device and electronic equipment.
The input control method comprises the following steps:
intercepting at least one first peripheral input event sent to an application by peripheral input equipment of electronic equipment;
constructing at least one touch event recognizable by the application based on the at least one first peripheral input event, wherein the touch event comprises a touch action and touch parameters of the touch action;
transmitting the at least one touch event to the application so that the application responds to the touch action of the touch event based on the touch parameters of the touch event.
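The three steps above can be sketched as follows; all names (`PeripheralEvent`, `TouchEvent`, `build_touch_events`) and the example mapping are illustrative assumptions, not structures defined by the patent.

```python
from dataclasses import dataclass

@dataclass
class PeripheralEvent:
    operation_object: str  # e.g. a key or button identifier
    input_action: str      # e.g. "press" or "lift"

@dataclass
class TouchEvent:
    touch_action: str      # e.g. "touch_down", "touch_up", "touch_slide"
    touch_params: dict     # at least the touch coordinates

def build_touch_events(peripheral_events, mapping):
    """Step 2: construct touch events the application can recognize."""
    touch_events = []
    for ev in peripheral_events:
        action, params = mapping[(ev.operation_object, ev.input_action)]
        touch_events.append(TouchEvent(action, dict(params)))
    return touch_events

# Step 1 (interception) yields peripheral events; step 3 delivers the result.
mapping = {("key_space", "press"): ("touch_down", {"x": 120, "y": 480}),
           ("key_space", "lift"):  ("touch_up",   {"x": 120, "y": 480})}
intercepted = [PeripheralEvent("key_space", "press"),
               PeripheralEvent("key_space", "lift")]
events = build_touch_events(intercepted, mapping)
```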
Preferably, the constructing at least one touch event recognizable by the application based on the at least one first peripheral input event includes:
according to the event relation mapping table corresponding to the application, at least one touch event which can be identified by the application and corresponds to the at least one first peripheral input event is established, wherein the event relation mapping table comprises: touch event information corresponding to different peripheral input event sets, wherein the peripheral input event sets comprise at least one peripheral input event, and the touch event information comprises touch parameters of touch actions in the touch events.
Preferably, before constructing, according to the event relation mapping table corresponding to the application, the at least one touch event that can be recognized by the application and corresponds to the at least one first peripheral input event, the method further includes:
determining a device type of a peripheral input device that sent the at least one first peripheral input event;
and determining an event relation mapping table corresponding to the equipment type and the application.
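Selecting a mapping table by device type and application, as described above, might look like this minimal sketch (the table contents and identifiers are assumed for illustration):

```python
# Event relation mapping tables keyed by (device type, application).
# All entries are hypothetical examples.
TABLES = {
    ("keyboard", "game_app"): {("key_w", "press"): ("touch_down", (50, 200))},
    ("handle", "game_app"): {("stick_up", "tilt"): ("touch_slide", (50, 150))},
}

def select_mapping_table(device_type, app_id):
    """Pick the table that matches both the peripheral's type and the app."""
    return TABLES[(device_type, app_id)]

table = select_mapping_table("keyboard", "game_app")
```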
Preferably, the first peripheral input event includes at least: an operation object and an input action aiming at the operation object, wherein the operation object belongs to an operable component in the peripheral input equipment;
the constructing at least one touch event which can be identified by the application and corresponds to the at least one first peripheral input event according to the event relation mapping table corresponding to the application comprises the following steps:
determining at least one touch action triggered to be executed by the at least one first peripheral input event based on the operation object of the at least one first peripheral input event and the input action;
according to the operation object of the at least one first peripheral input event, touch parameters corresponding to touch actions triggered and executed by the at least one first peripheral input event are inquired from an event relation mapping table corresponding to the application, the touch parameters of the at least one touch action corresponding to different operation object sets are stored in the event relation mapping table, and the operation object set is composed of at least one operation object corresponding to the at least one peripheral input event in the peripheral input event set;
and constructing at least one touch event according to each touch action triggered by the at least one first peripheral input event and the touch parameters of each touch action.
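The sub-steps above (determine the touch actions, query the touch parameters by operation-object set, then assemble the touch events) can be sketched as below; keying the table by frozensets of operation objects is an illustrative assumption.

```python
# Touch parameters stored per operation-object set (hypothetical values).
PARAM_TABLE = {
    frozenset({"key_up"}): [("touch_click", (100, 300))],
    frozenset({"key_ctrl", "key_up"}): [("touch_slide", (100, 260))],
}

def construct_touch_events(first_events):
    """first_events: list of (operation_object, input_action) tuples."""
    objects = frozenset(obj for obj, _action in first_events)
    return [{"touch_action": action, "touch_params": {"coords": coords}}
            for action, coords in PARAM_TABLE[objects]]
```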
Preferably, the determining, based on the operation object of the first peripheral input event and the input action, at least one touch action triggered to be performed by the first peripheral input event includes:
if at least two second peripheral input events of which the operation objects belong to a set type exist in the at least one first peripheral input event and a third peripheral input event does not exist currently, determining at least one touch action triggered and executed by the at least two second peripheral input events based on the operation objects and the input actions of the at least two second peripheral input events, wherein the third peripheral input event is a peripheral input event which is intercepted before the second peripheral input event, of which the operation objects belong to the set type and are not released;
if at least one second peripheral input event that the operation object belongs to the set type exists in the at least one first peripheral input event and the third peripheral input event exists currently, determining at least one touch action based on the third peripheral input event and the operation object and the input action of the at least one second peripheral input event;
and if the operation object of the first peripheral input event does not belong to the set type, or the operation object of the first peripheral input event belongs to the set type but does not have the third peripheral input event, determining at least one touch action triggered and executed by the first peripheral input event based on the operation object of the first peripheral input event and the input action.
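A hedged sketch of the three branches above. `SET_TYPE` membership, the event shape, and the return format are assumptions; the "third peripheral input event" is modeled as a list of earlier set-type objects that have not yet been released.

```python
SET_TYPE = {"key_ctrl", "key_shift", "key_w", "key_a"}  # assumed membership

def determine_actions(first_events, pending_unreleased):
    """first_events: list of (operation_object, input_action) tuples.
    pending_unreleased: set-type objects pressed earlier, not yet released."""
    second = [obj for obj, _action in first_events if obj in SET_TYPE]
    if len(second) >= 2 and not pending_unreleased:
        return ("combine", second)                       # branch 1
    if second and pending_unreleased:
        return ("combine", pending_unreleased + second)  # branch 2
    return ("single", [obj for obj, _action in first_events])  # branch 3
```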
Preferably, the querying, according to the operation object of the at least one first peripheral input event, the touch parameter of the touch action triggered and executed by the at least one first peripheral input event from the event relationship mapping table corresponding to the application includes: if at least one second peripheral input event with an operation object belonging to a set type exists in the at least one first peripheral input event and a third peripheral input event exists currently, based on the operation object which is not released in the at least one second peripheral input event and the third peripheral input event, touch parameters of a touch action triggered and executed by the at least one second peripheral input event are inquired from the event relation mapping table corresponding to the application.
Preferably, the touch parameters of the touch action in the event relation mapping table include touch coordinates of the touch action;
before intercepting at least one first peripheral input event sent by a peripheral input device of the electronic device to an application, the method further comprises the following steps:
intercepting interface data to be rendered of the application;
determining coordinate information of each operation control in the interface of the application based on the interface data;
and determining touch coordinates of touch actions corresponding to the operation objects in the event relation mapping table according to the corresponding relation between the operation controls and the operation objects of the peripheral input equipment and the determined coordinate information of each operation control.
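The coordinate-filling steps above can be sketched as follows (all structures and names are illustrative assumptions):

```python
def fill_touch_coordinates(control_coords, control_to_object, table):
    """Populate touch coordinates in an event relation mapping table.

    control_coords: control id -> (x, y), from the intercepted interface data.
    control_to_object: control id -> peripheral operation object.
    """
    for control, obj in control_to_object.items():
        table[obj] = {"touch_coords": control_coords[control]}
    return table

table = fill_touch_coordinates({"jump_btn": (100, 900)},
                               {"jump_btn": "key_space"}, {})
```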
Preferably, the method further comprises the following steps:
transmitting the interface data to the application;
when the application presents the interface corresponding to the interface data, marking, at the position of each operation control in the interface, the information of the operation object of the peripheral input device that corresponds to that operation control, according to the correspondence between the operation controls and the operation objects of the peripheral input device and the coordinate information of each operation control in the application.
Wherein, an input control device includes:
the event intercepting unit is used for intercepting at least one first peripheral input event sent to an application by peripheral input equipment of the electronic equipment;
the event construction unit is used for constructing at least one touch event which can be recognized by the application based on the at least one first peripheral input event, and the touch event comprises a touch action and touch parameters of the touch action;
and the event transmission unit is used for transmitting the at least one touch event to the application so that the application responds to the touch action of the touch event based on the touch parameters of the touch event.
Wherein, an electronic equipment includes:
a memory and a processor,
wherein the processor is configured to execute the input control method described in any one of the above, and
the memory is used for storing the programs required by the processor to perform its operations.
According to the above scheme, the electronic device intercepts the peripheral input events sent by the peripheral input device to the application, constructs at least one touch event that the application can recognize based on the intercepted events, and transmits the constructed touch events to the application. In this way, even an application that supports only touch events can respond to peripheral input.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed to be used in the description of the embodiments are briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present application, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without creative efforts.
Fig. 1 is a schematic flowchart of an embodiment of an input control method provided in the present application;
FIG. 2 is a schematic flow chart diagram illustrating another embodiment of an input control method provided in the present application;
FIG. 3 is a schematic diagram of an application interface used in the present application;
FIG. 4 is a schematic diagram of a touch sliding trajectory of a touch point generated based on a mouse wheel scroll in the present application;
FIG. 5 is a flow diagram illustrating one implementation of determining touch coordinates for a touch action in an event relationship table according to the present application;
FIG. 6 is a diagram illustrating information identifying an operation object in a peripheral input device at an operation control of an application interface;
FIG. 7 is a schematic diagram illustrating an exemplary input control device according to the present disclosure;
fig. 8 is a schematic diagram illustrating a structure of an electronic device according to the present application.
The terms "first," "second," "third," "fourth," and the like in the description and in the claims, as well as in the drawings described above, if any, are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It should be understood that the data so used may be interchanged under appropriate circumstances such that embodiments of the application described herein may be practiced otherwise than as specifically illustrated.
Detailed Description
The scheme of the present application is applicable to any scenario in which a peripheral input device of an electronic device is used to control an application on that device; through this scheme, even an application that does not support peripheral input events can respond to them appropriately.
The electronic device applicable to the application can be a mobile phone, a tablet computer, a notebook computer and the like, and is not limited thereto.
The application which is required to be manipulated based on the peripheral input device of the electronic device can be any application which determines to execute the action based on the external input event. For example, the application may be a game application, a virtual reality application, or an input application, etc., without limitation.
The technical solutions in the embodiments of the present application will be described clearly and completely with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only some embodiments of the present application, and not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without inventive step, are within the scope of the present disclosure.
Referring to fig. 1, fig. 1 is a schematic flowchart illustrating an input control method according to an embodiment of the present disclosure. The embodiment of the application can be applied to any electronic equipment mentioned above.
The method of the embodiment may include:
s101, intercepting at least one first peripheral input event sent to an application by peripheral input equipment of electronic equipment.
The peripheral input device is an input device which is arranged outside the electronic device, is connected with the electronic device in a wired or wireless mode and can realize input operation. For example, the peripheral input device may include: a mouse, a keyboard, an operating handle or a remote control device, etc.
Accordingly, the peripheral input event is an input event generated through the peripheral input device. For example, the peripheral input event may be an input event such as pressing or lifting a key in a keyboard, an input event such as clicking a keypad of a mouse or rolling and sliding the mouse, or an input event such as rotating a handle in an operation handle or pressing or lifting a key in an operation handle.
If the application is an application running on an operating system, the operating system of the electronic device may sense input events present on the electronic device. In the embodiment of the application, after detecting the peripheral input event sent to the application, the operating system of the electronic device does not directly forward the peripheral input event, but intercepts the peripheral input event first, so as to convert the peripheral input event into a touch event which can be recognized by the application.
It should be noted that, in the embodiment of the present application, each peripheral input event represents an input action for one operation object in the peripheral input device.
In this embodiment, in order to facilitate distinguishing a certain type of peripheral input event extracted from the intercepted peripheral input events, the intercepted peripheral input event is referred to as a first peripheral input event.
S102, at least one touch event which can be recognized by an application is constructed based on at least one first peripheral input event.
The touch event comprises a touch action and touch parameters of the touch action.
The touch action of the touch event may be a touch point press, a touch point lift, a touch slide, or the like. The touch parameters of the touch action are parameters which represent specific action characteristics of the touch action. And the touch parameters of the touch action at least comprise touch coordinates of the touch action. For example, if the touch action is a touch point press, the touch parameters may include touch coordinates of the touch point press. For another example, if the touch action is a touch slide, the touch parameter is a touch coordinate representing a touch slide direction and/or distance.
In one possible implementation, the operating system of the electronic device may be configured with conversion programs that convert different peripheral input events into touch events. On this basis, the system may invoke at least one conversion program matched with the at least one first peripheral input event and use it to construct at least one touch event corresponding to the at least one first peripheral input event.
For example, some applications may involve a smaller variety of input operations, in which case the conversion routines corresponding to different peripheral input events may be configured. Correspondingly, the conversion program can be respectively called according to the intercepted peripheral input event, so that the touch event corresponding to the peripheral input event is constructed through the conversion program corresponding to the peripheral input event. Certainly, a conversion program corresponding to a peripheral input event combination formed by a plurality of peripheral input events can be constructed, and when the plurality of intercepted peripheral input events belong to the peripheral input event combination, the conversion program corresponding to the peripheral input event combination can be called to construct a corresponding touch event.
For example, assuming the application is an input-method application, the operating system may configure in advance a conversion program that converts a press of the space bar into a touch press on the input-method switching button of the application. When the operating system intercepts a peripheral input event in which the space bar is pressed, it constructs, through that conversion program, a touch press event at the coordinate corresponding to the switching button; the coordinate may be preconfigured in the conversion program.
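The space-bar example above, sketched with assumed names and an assumed preconfigured coordinate:

```python
IME_SWITCH_COORD = (540, 1820)  # assumed position of the switching button

def convert_space_down(event):
    """Per-event conversion program: space bar press -> touch press on
    the input-method switching button."""
    if event == ("key_space", "press"):
        return ("touch_down", IME_SWITCH_COORD)
    return None  # not handled by this conversion program
```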
In yet another possible case, different event relation mapping tables may be configured for different applications. An event relation mapping table includes touch event information corresponding to different peripheral input event sets, where each peripheral input event set includes at least one peripheral input event. The touch event information includes at least the touch parameters of the touch action in the touch event; as before, the touch parameters include at least the touch coordinates of the touch action.
For example, each set of peripheral input events in the event map may be identified by characteristic information of the respective peripheral input events in the set of peripheral input events.
The feature information of the peripheral input event may be an operation object of the peripheral input event, or may be an operation object of the peripheral input event and an input action for the operation object. For example, the peripheral input event set may be characterized by an operation object of each peripheral input event in the peripheral input event set and an input action for the operation object, and the peripheral input event set 1 includes: the peripheral input event 1 (the operation object 1 and the input action 1) and the peripheral input event 2 (the operation object 2 and the input action 2), and the peripheral input event set 1 corresponds to the touch action being a touch click with touch coordinates of (x1, y 1).
Correspondingly, at least one touch event which can be identified by the application and corresponds to the at least one first peripheral input event can be constructed according to the event relation mapping table corresponding to the application.
For example, if for the application each peripheral input event is independent of the others, each peripheral input event set in the event relation mapping table contains only one peripheral input event. Correspondingly, for each first peripheral input event, the system either directly queries the touch event containing the specific touch action and touch parameters, or first determines the type of touch event from the peripheral input event and then constructs a touch event of that type using the touch parameters stored in the mapping table for that peripheral input event.
As another example, if some peripheral input event sets in the mapping table are combinations of multiple peripheral input events for the application, the system may first determine which first peripheral input events can be combined into a set present in the table, and then construct the corresponding touch event based on the touch parameters of the touch action stored for that set. For first peripheral input events that cannot form a set with other first peripheral input events, the touch event corresponding to the single event is constructed from the mapping table, as above.
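The grouping logic above — first try to match the intercepted events against a multi-event set in the table, then fall back to single-event sets — can be sketched as below (table contents assumed):

```python
TABLE = {
    frozenset({("key_ctrl", "press"), ("key_c", "press")}): ("touch_click", (30, 40)),
    frozenset({("key_c", "press")}): ("touch_click", (70, 80)),
    frozenset({("key_ctrl", "press")}): ("touch_click", (10, 20)),
}

def match_events(first_events):
    """Return touch event info for the intercepted first peripheral events."""
    combined = frozenset(first_events)
    if combined in TABLE:          # the events form a known combination
        return [TABLE[combined]]
    return [TABLE[frozenset({ev})] for ev in first_events]  # fall back
```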
It will be appreciated that there are many possible types of peripheral input devices to which the electronic device can be connected, for example, the peripheral input devices may be a keyboard and a mouse, or an operating handle. For the same application, the peripheral input events input by different peripheral input devices may be greatly different. Therefore, in order to conveniently construct the touch event corresponding to the peripheral input event, an event relation mapping table corresponding to the device types of different peripheral input devices can be constructed for the same application.
Correspondingly, the device type of the peripheral input device sending the at least one first peripheral input event can be determined. Then, an event map table corresponding to the device type and the application is determined.
And S103, transmitting at least one touch event to the application so that the application responds to the touch action of the touch event based on the touch parameters of the touch event.
It can be understood that, since the electronic device (for example, its operating system) reconstructs the peripheral input event as a touch event that the application can recognize, once the at least one touch event is transmitted to the application, the application responds to it; in effect, the application thereby responds to the peripheral input event.
It can be understood that, in the case that there are a plurality of touch events constructed based on at least one peripheral input event, the plurality of touch events may have an execution sequence, and accordingly, the plurality of touch events may be sequentially transmitted to the application according to the execution sequence of the plurality of touch events.
For example, assume that for an application, the operating system of the electronic device configures the pressing of the "up" key in the keyboard to correspond to a touch click and an upward touch swipe in the interface of the application. On the basis, when the operating system of the electronic equipment intercepts that the peripheral input event is the pressing of an 'up' key, touch clicking and upward touch sliding on corresponding coordinate positions in an application interface are constructed. The operating system sends a touch event of touch clicking of a corresponding coordinate position to the application, and then sends a touch event of upward touch sliding, so that the application responds to the touch clicking first and then responds to the touch sliding, and accordingly corresponding input is completed by combining the application running state and the corresponding touch event.
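The ordered delivery in the "up"-key example can be sketched as below; the click event reaches the application before the slide event (names are assumptions).

```python
def deliver_in_order(touch_events, app_handler):
    """Send touch events to the application in their execution order."""
    for ev in touch_events:
        app_handler(ev)

received = []
deliver_in_order([("touch_click", (100, 300)),   # constructed first
                  ("touch_slide", (100, 260))],  # then the upward slide
                 received.append)
```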
Therefore, in the present application, the electronic device intercepts the peripheral input events sent by the peripheral input device to the application, constructs at least one touch event recognizable by the application based on the intercepted events, and transmits the constructed touch events to the application, so that even an application supporting only touch events can respond to peripheral input, reducing the complexity of input operations and improving their convenience.
As can be appreciated from the foregoing, there are many possibilities for constructing a touch event based on at least one peripheral input event. For ease of understanding, the following description will be made in detail with reference to an event map table of an application to construct at least one touch event corresponding to at least one peripheral input event.
As shown in fig. 2, which shows a schematic flow chart of another embodiment of the input control method of the present application, the method of the present embodiment may include:
s201, intercepting at least one first peripheral input event sent to an application by peripheral input equipment of the electronic equipment.
In this embodiment, the first peripheral input event at least includes: an operation object and an input action for the operation object.
The operation object is an operable component of the peripheral input device. For example, when the peripheral input device is a keyboard or a mouse, the operable components may include the individual keys of the keyboard, and the left button, right button, scroll wheel, and so on of the mouse. When the peripheral input device is an operating handle, the operable components are the handle's rotating stick, its keys, and the like.
S202, determining at least one touch action triggered to be executed by at least one first peripheral input event based on the operation object and the input action of the at least one first peripheral input event.
In this embodiment, the touch action in the touch event triggered by a peripheral input event is determined from the specific characteristics of the peripheral input event itself, without consulting the event relation mapping table.
It can be understood that, for one type of peripheral input device, the types of the operation objects corresponding to the peripheral input events input by the peripheral input device are relatively few, and the types of the input actions are also limited. On the basis, the operating system of the electronic equipment can configure the conversion relation between different operation objects and input actions and touch actions, so that at least one touch action is determined based on the operation objects and the input actions of the intercepted at least one first peripheral input event.
In one possible case, an event conversion program may be run in an operating system of the electronic device, and the event conversion program may determine touch motions corresponding to different operation objects and input motions. On the basis, the corresponding at least one touch action can be determined by a touch conversion program in the operating system based on the operation object and the input action of the peripheral input event.
For example, taking peripheral input devices such as a keyboard and a mouse, the operation objects of the peripheral input events include only two major categories: keys (including mouse buttons and keyboard keys) and the mouse scroll wheel; accordingly, the input actions include three kinds: key press, key lift, and scroll wheel scrolling. On this basis, a configuration program may define a key press as the pressing of a touch point, or, depending on the operation object, as the two touch actions of touch point pressing and touch point sliding. Similarly, a key lift may be defined as touch point lifting, while scroll wheel scrolling may be defined as touch point pressing and touch point sliding.
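The conversion described above can be sketched as a small lookup from (operation object, input action) to the triggered touch actions. This is a minimal illustration; all names are invented, not from the patent:

```python
# Hypothetical event conversion table: maps (operation object category,
# input action) to the touch action(s) it triggers, per the text above.
CONVERSION = {
    ("key", "press"): ["touch_down"],
    ("key", "lift"): ["touch_up"],
    ("wheel", "scroll"): ["touch_down", "touch_move"],
}

def to_touch_actions(operation_object, input_action):
    """Return the touch actions triggered by one peripheral input event."""
    return CONVERSION.get((operation_object, input_action), [])
```

A real conversion program would also branch on the specific operation object (e.g., a direction key press yielding both a press and a slide), which the table form accommodates by adding more specific keys.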
The process of operating the handle as the peripheral input device is similar and will not be described herein again.
It is understood that the input operations involved in some applications are relatively simple; for example, the input operations may involve only single-point or multi-point touch click operations and relatively simple touch slide operations, and the electronic device may be configured so that each key individually triggers a touch action. For example, when the application is an input method application, the only input operations the application recognizes are clicks on the keys of a virtual keyboard or on an input-method switching identifier. In other words, the electronic device may be configured so that the touch actions triggered by different peripheral input events intercepted at the same time do not affect each other.
In this case, the different peripheral input events may individually trigger the corresponding touch actions. Accordingly, at least one touch action may be determined for each first peripheral input event based on the operation object and the input action of the peripheral input event. The process of specifically determining the touch action is the same as described above. Correspondingly, if a plurality of peripheral input events are intercepted, a plurality of touch actions are respectively determined.
In contrast, for more complex applications such as games, many types of touch input operations are involved. For example, in the case of a game application, the interface may include, in addition to an operation wheel for controlling direction, control buttons for triggering an attack, an explosion, and the like. On this basis, in order to better fit the user's operation habits for the application, it may be necessary to configure a correlation between certain types of operation objects; therefore, if at least two input events corresponding to such operation objects are intercepted at the same time, the touch action needs to be determined comprehensively by combining those input events.
For example, assuming that "W", "S", "A", and "D" in the keyboard respectively indicate control keys for the up, down, left, and right directions in an application such as a game, if pressing operations of two of the four keys are detected at the same time, the control direction cannot be determined for each key individually, but needs to be determined by combining the two detected pressing operations. For example, detecting the pressing of "W" and "A" should actually control movement to the upper left, and the corresponding touch action should be the sliding of one touch point to the upper left, rather than the sliding of two touch points in different directions.
The operation object belonging to the setting type may be set according to the application and the actual application scenario, and is not described herein again.
As can be seen from the above analysis, if some operation objects of the peripheral input device are designated as operation objects of a set type, then after at least one first peripheral input event is intercepted and multiple first peripheral input events exist, it is first necessary to detect whether at least two of them correspond to operation objects of the set type, and, if so, to determine the at least one touch action comprehensively by combining those events.
Specifically, the method comprises the following steps:
if at least two second peripheral input events of which the operation objects belong to the set type exist in the at least one first peripheral input event and no third peripheral input event exists currently, determining at least one touch action triggered and executed by the at least two second peripheral input events based on the operation objects and the input actions of the at least two second peripheral input events; the third peripheral input event is a peripheral input event which is intercepted before the second peripheral input event, an operation object belongs to a set type and is not released;
if at least one second peripheral input event that the operation object belongs to the set type exists in the at least one first peripheral input event and a third peripheral input event exists currently, determining at least one touch action based on the third peripheral input event and the operation object and the input action of the at least one second peripheral input event;
and if the operation object of a first peripheral input event does not belong to the set type, or its operation object belongs to the set type but there is neither another second peripheral input event nor a third peripheral input event, determining at least one touch action triggered and executed by that first peripheral input event based on its operation object and input action alone.
For convenience of distinguishing, the first peripheral input event of which the corresponding operation object belongs to the set type is called a second peripheral input event, and obviously, the second peripheral input event belongs to the intercepted first peripheral input event.
Accordingly, a peripheral input event which was intercepted before the second peripheral input event, whose operation object belongs to the set type, and which has not been released is referred to as a third peripheral input event. A peripheral input event that has not been released is one that has not yet ended, indicating that its operation object is still in the operated state. For example, after a key is pressed, as long as the key has not been lifted, the key has not been released, and accordingly the pressing event of the key has not been released.
As can be seen, for the case that there are multiple second peripheral input events among the multiple first peripheral input events and there is no third peripheral input event, it is necessary to determine at least one touch action in combination with the multiple second peripheral input events. For the first peripheral input event not belonging to the second peripheral input event, a corresponding touch action may be determined for each first peripheral input event.
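A minimal sketch of this classification, under the assumption that events are simple records and that `held_objects` stands in for the operation objects of unreleased third peripheral input events (all names are illustrative):

```python
# Hypothetical set-type membership, per the W/S/A/D example in the text.
SET_TYPE = {"W", "A", "S", "D"}

def classify(first_events, held_objects):
    """Split intercepted first events into set-type (second) events and
    the rest; combine set-type objects with any still-held third-event
    objects so the touch action can be determined jointly."""
    second = [e for e in first_events if e["object"] in SET_TYPE]
    other = [e for e in first_events if e["object"] not in SET_TYPE]
    combined = {e["object"] for e in second}
    if held_objects:
        # a third peripheral input event is still unreleased: combine
        combined |= set(held_objects)
    return combined, other
```

Events in `other` each trigger their touch action individually, while `combined` is resolved as one joint touch action, matching the three branches above.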
For example, still taking the game application as an example, assume that "W", "S", "A" and "D" in the keyboard are keys of the set type. On this basis, suppose the peripheral input events intercepted at the same time include three events: key "W" is pressed, key "A" is pressed, and the space key is pressed. The operating system of the electronic device may determine that the two events of pressing key "W" and pressing key "A" belong to second peripheral input events; then, combining the touch point corresponding to these two second input events with the key action being a press, the touch actions may be determined as: touch point pressing and touch point sliding. For the space key press, the determined touch action is a touch point pressing.
Similarly, in order to facilitate understanding of a case where there is at least one second peripheral input event in which the operation object belongs to the set type among the plurality of first peripheral input events, and there is a third peripheral input event currently, the above game application is still exemplified.
Assume that the pressing of key "W" is intercepted, that the peripheral input event of key "A" being pressed was intercepted before it, and that the lifting of key "A" has not yet been detected. Since the pressing of key "A" already corresponds to a touch point, after the pressing of key "W" is detected, the touch action can be determined as the touch sliding of that touch point by combining it with the pressing of key "A".
S203, according to the operation object of the at least one first peripheral input event, querying the touch parameters corresponding to the touch actions triggered by the at least one first peripheral input event from the event relation mapping table corresponding to the application.
The event relation mapping table stores touch parameters of at least one touch action corresponding to different operation object sets. That is, in this embodiment, the touch event information corresponding to the peripheral input event set is specifically touch event information of an operation object set corresponding to the peripheral input event set. The operation object set corresponding to the peripheral input event is composed of at least one operation object corresponding to at least one peripheral input event in the peripheral input event set.
For example, in a possible case, for each first peripheral input event, according to its operation object, touch event information matching that operation object may be queried from the event relation mapping table corresponding to the application, and then the touch parameters of the touch actions triggered by the first peripheral input event may be queried from the touch event information.
See, e.g., table 1 below, which shows a schematic diagram of the event relation mapping table in the case where the peripheral input devices are a mouse and a keyboard.
TABLE 1
[Table 1 is reproduced as an image in the original document. From the surrounding text, it maps sets of operation objects to the touch parameters of the touch actions they trigger, for example: key W: down (x0, y0), move (x0, 100); keys W+A: down (x0, y0), move (-100, 100); space key: down (x1, y1); mouse wheel: down (x3, y3), with each scroll adding (x4, y4) to the move coordinates.]
Assuming that the first peripheral input event is that key W is pressed, and it is determined that the touch actions triggered by this press include touch point pressing and touch point sliding, the touch parameters of each touch action corresponding to key W can be queried from table 1; on this basis, it can be obtained that the touch coordinates corresponding to the pressing of the touch point, that is, down, are (x0, y0), and the touch coordinates corresponding to the touch point slide, that is, move, are (x0, 100).
In yet another possible scenario, if the peripheral input device has operation objects of a set type, and at least two of the intercepted first peripheral input events are second peripheral input events whose operation objects belong to the set type, with no third peripheral input event present, then the at least two events actually constitute a peripheral input event set. Correspondingly, based on the operation objects of the at least two second peripheral input events, the touch parameters of at least one touch action triggered by them are queried from the event relation mapping table.
Described in conjunction with table 1: assuming that the second peripheral input events of key A being pressed and key W being pressed are intercepted, after it is determined that the touch actions include a touch point pressing and a touch point sliding, table 1 may be queried according to the two operation objects, key A and key W, so that the touch coordinate of the touch point pressing is found to be (x0, y0) and the touch coordinate of the touch point sliding is found to be (-100, 100).
Specifically, if at least one second peripheral input event of which the operation object belongs to the set type exists in the intercepted at least one first peripheral input event, and a third peripheral input event which is intercepted before the second peripheral input event and is not released currently exists, based on the at least one second peripheral input event and the operation object which is not released in the third peripheral input event, the touch parameter of the touch action which is triggered and executed by the at least one second peripheral input event is inquired from the event relation mapping table corresponding to the application.
For example, still taking the example that the third peripheral input event of key "A" being pressed is intercepted before the second peripheral input event of key "W" being pressed, and the lifting of key "A" has not yet been detected: after the touch action is determined to be a touch slide based on the pressing of "W" combined with the pressing of "A", table 1 may be queried based on the operation objects of the two events, the "W" and "A" keys, and the touch coordinate of the touch slide may be found to be (-100, 100).
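The table lookups in these examples can be modeled by keying the mapping table on the set of operation objects. The fragment below uses the placeholder coordinates from the text; the data structure and names are an assumption for illustration:

```python
# Hypothetical fragment of the event relation mapping table (Table 1):
# touch parameters keyed by the *set* of operation objects, so a single
# key and a key combination are looked up the same way.
EVENT_MAP = {
    frozenset({"W"}): {"down": ("x0", "y0"), "move": ("x0", 100)},
    frozenset({"W", "A"}): {"down": ("x0", "y0"), "move": (-100, 100)},
    frozenset({"space"}): {"down": ("x1", "y1")},
}

def query_touch_params(operation_objects):
    """Look up the touch parameters for a set of operation objects."""
    return EVENT_MAP.get(frozenset(operation_objects), {})
```

Keying on a `frozenset` makes the lookup order-independent, so "W then A" and "A then W" resolve to the same entry.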
S204, at least one touch event is constructed according to each touch action triggered by at least one first peripheral input event and the touch parameters of each touch action.
Each touch event is composed of one or more touch actions and touch parameters corresponding to the one or more touch actions.
It will be appreciated from the foregoing that the at least one first peripheral input event may be divided into at least one set of peripheral input events, each set of peripheral input events corresponding to at least one touch action being triggered. If each peripheral input event set only corresponds to one touch action, a touch event containing each touch action and corresponding touch parameters can be directly constructed.
If at least two touch actions triggered by at least one peripheral input event set exist in at least one peripheral input event set corresponding to the at least one first peripheral input event, the execution sequence of each touch action may also be determined in step S202. Wherein, there may be a plurality of touch actions in the same execution sequence. On the basis, a touch event can be formed based on at least one touch action and the touch parameter of the at least one touch action in the same execution sequence according to the execution sequence of the touch actions.
For example, for ease of understanding, assume as before that the first peripheral input events intercepted at the same time include: key "W" pressed, key "A" pressed, and the space key pressed. Based on the pressing of keys W and A, the touch actions are: touch point pressing and touch point sliding, where the touch point pressing is in the first execution order and the touch point sliding is in the second execution order. The touch action determined for the space key press is a touch point pressing, which, being the only action, is in the first execution order. On this basis, a first touch event may be constructed containing the two touch point pressings, the one triggered by keys W and A and the one triggered by the space key, together with their touch parameters, and a second touch event may be constructed containing the touch point sliding and its touch parameter. The execution order of the first touch event precedes that of the second touch event. Accordingly, the first touch event and the second touch event are sequentially sent to the application, so that the application responds to the first touch event and then to the second.
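The grouping of touch actions by execution order into touch events can be sketched as follows; the input triples and all names are illustrative:

```python
# Group (execution_order, action, params) triples into touch events:
# one touch event per execution order, emitted in ascending order.
def build_touch_events(actions):
    events = {}
    for order, action, params in actions:
        events.setdefault(order, []).append((action, params))
    return [events[o] for o in sorted(events)]
```

For the W + A + space example, both pressings share the first order and the slide has the second, yielding two touch events to send sequentially.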
Each touch action in a touch event corresponds to a touch point in the application, and each touch point has two attributes: a touch point identifier and an index number. In the application, in order to distinguish touch points, each touch point of a touch action may be assigned a unique touch point identifier, which uniquely identifies the touch point; before the touch point is lifted, its identifier remains fixed. Meanwhile, an index number is assigned to each touch point, the touch points being numbered sequentially according to the number of touch points currently existing in the application.
For example, assuming that there are three touch points in the application, the touch point identification of any one of the three touch points is not changed as long as the touch point is not lifted. If a certain touch point is lifted, the touch point identification of the touch point is vacant.
However, if one of the three touch points is lifted, the index numbers of the touch points still present need to be adjusted. For example, if 3 touch points are pressed, their index numbers are 0, 1, and 2 in order; if the touch point with index number 1 is lifted, the index number of the touch point previously numbered 2 is changed from 2 to 1, i.e., subsequent index numbers shift down to fill the vacancy.
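A possible illustration of the two touch-point attributes, a stable identifier assigned at press and an index number that is re-packed when an earlier touch point lifts. This tracker is a hypothetical sketch, not the patent's implementation:

```python
class TouchPoints:
    """Track touch points with stable identifiers and packed indices."""

    def __init__(self):
        self._next_id = 0
        self._ids = []  # ids of currently pressed points, in index order

    def press(self):
        tid = self._next_id  # identifier never reused while held
        self._next_id += 1
        self._ids.append(tid)
        return tid, self._ids.index(tid)  # (identifier, index number)

    def lift(self, tid):
        # removing the entry shifts later points down to fill the gap
        self._ids.remove(tid)

    def index_of(self, tid):
        return self._ids.index(tid)
```

The identifier stays constant for a point's whole lifetime, while `index_of` reflects the re-packed numbering described above.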
And S205, transmitting at least one touch event to the application so that the application responds to the touch action of the touch event based on the touch parameters of the touch event.
As previously described, if multiple touch events are constructed, which will have a corresponding execution order, the multiple touch events may be sequentially transmitted to the application in the execution order.
The specific implementation of step S205 may refer to the related description of the foregoing embodiments, and details thereof are not repeated.
It can be understood that fig. 2 is an implementation manner of constructing a touch event corresponding to at least one intercepted peripheral input event based on an event relation mapping table, in this embodiment, touch parameters of a touch action are stored in the event relation mapping table, and after determining the touch action of the touch event based on the peripheral input event, the electronic device obtains the touch parameters required by the touch action by querying the table. However, if the event relation mapping table includes touch actions and touch parameters of the touch actions corresponding to different peripheral input events or peripheral input event combinations (determined by the operation object and the input action), the touch actions and touch parameters corresponding to different peripheral input events or peripheral input event combinations may also be determined directly based on the event relation mapping table.
For the convenience of understanding the embodiments of the present application, the following description is made in conjunction with an application scenario.
Still taking an application as an example of a game application, as shown in fig. 3, some operational controls operable in an application interface of the game application of the present application are shown.
As can be seen from fig. 3, various operation controls are displayed in an application interface of the game application, for example, an operation wheel 301 and a plurality of skill buttons 302 are displayed in the interface.
Different skill keys are used for releasing different skills; for example, the skill keys in fig. 3 may include a shooting skill key for firing, an attack skill key for launching an attack, and the like.
For the game application, the correspondence between operation objects such as the various keys of the keyboard and mouse and the operation wheel and skill keys in the game application can be determined. Then, the event relation mapping table is constructed by combining this correspondence with the coordinate positions of the operation wheel and the skill keys.
Assume the keyboard as the peripheral input device, with keys W, S, A, D corresponding to the four directions of up, down, left, and right in the operation wheel, respectively, and the space key corresponding to the shooting skill key. Of course, other keys may correspond to other skill keys such as an attack skill key, in a manner similar to the relationship between the space key and the shooting skill key, which is not repeated here. Based on this correspondence, in order that clicking a key on the physical keyboard causes the application to obtain a touch operation on the wheel or a skill key in its application interface, the touch coordinates of the touch actions triggered by operation objects such as keyboard keys and the mouse wheel need to be determined according to the coordinate positions of the wheel and the skill keys in the application interface, and the event relation mapping table generated accordingly.
In fig. 3, it is assumed that the center coordinate position of the operation wheel in the game application interface is (x0, y0), and based on the center of the operation wheel, the position directly above the center is the positive direction of the y axis, the position directly below the wheel is the negative direction of the y axis, the position directly to the left of the wheel is the negative direction of the x axis, and the position directly to the right of the wheel is the positive direction of the x axis, as shown by the broken line coordinate system in fig. 3.
Therefore, if the user slides upward from the center of the operation wheel, a touch slide from the center of the wheel in the positive y-axis direction is detected in the application interface. Accordingly, in order for the physical key W to produce such a touch slide, the operating system of the electronic device may preset an event conversion program that converts the pressing action of the "W" key into two touch actions: a touch click and a touch slide.
On this basis, in order to map the converted touch click and touch slide into a touch slide from the center of the operation wheel in the positive y-axis direction in the game's application interface, the coordinate position of the converted touch click needs to be the coordinate position of the wheel center, i.e., the touch coordinate of the touch click is (x0, y0); the amplitude of the upward touch slide can be set as required, for example to 100, so the touch coordinate of the touch slide is (x0, 100), i.e., a slide of 100 coordinate points upward from the wheel center.
These are the touch parameters, namely the touch coordinates, of the two touch actions (touch click and touch slide) converted from the pressing of the W key. On this basis, the touch event information constructed for the operation object "W" key in the event relation mapping table includes: the touch coordinate (x0, y0) of the touch click and the touch coordinate (x0, 100) of the touch slide, as shown in the first row of table 1.
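The W-key conversion can be written as a small function. The amplitude 100 and the center (x0, y0) follow the text, while the function name and the interpretation of the slide target as the point 100 units above the center in the fig. 3 coordinate system are assumptions:

```python
# Hypothetical sketch: given the wheel center, produce the two touch
# actions converted from a W-key press, a click on the center and a
# slide of `amplitude` points upward (positive y in fig. 3's frame).
def convert_w_press(center, amplitude=100):
    x0, y0 = center
    return [("down", (x0, y0)), ("move", (x0, y0 + amplitude))]
```

The other direction keys differ only in which axis and sign the amplitude is applied to.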
Correspondingly, for single operation objects such as other physical keys or a scroll wheel, and possible key combinations of the four physical keys W, A, S and D, touch parameters of each touch action that may be triggered by different operation object sets in the event relation mapping table may be determined in a similar manner as described above, and are specifically shown in table 1.
On the basis of the above, the following description is divided into several cases:
in the first case:
the peripheral input event generated by a key on the keyboard that does not correspond to the operation wheel in the application (i.e., an operation object not of the set type) is introduced:
if the user presses the space key, the electronic equipment detects that the space key is pressed by a corresponding touch point, meanwhile, the event relation mapping table in the lookup table 1 knows that the touch coordinate of the pressed touch point is (x1, y1), a touch event down (x1, y1) can be constructed, on the basis, the operating system transmits the touch event down (x1, y1) to the application, and the application can respond to the input operation of the touch press of the shooting skill key based on the fact that the touch coordinate in the touch event is the press of the shooting skill key.
In the second case:
the case that there are input events of keys on the keyboard corresponding to the operation wheel and input events not belonging to the keys corresponding to the operation wheel is described:
if the user presses the W key and the space key, the electronic equipment can confirm that the W key is in two touch actions of pressing one touch point and sliding the touch point, the touch point is pressed in a first sequence, the touch point is slid in a second sequence, and the space key triggers the touch action of pressing the other touch point. Based on table 1, the electronic apparatus recognizes that the touch point coordinates corresponding to the down pressing of the touch point corresponding to the W key are (x0, y0), the touch point coordinates corresponding to the touch point sliding move are (x0,100), and the touch point coordinates corresponding to the space key are (x1, y 1).
Based on the above, the electronic device may construct a first touch event consisting of down (x0, y0) of touch point 1 and down (x1, y1) of touch point 2, and a second touch event consisting of move (x0, 100), and then sequentially send the two touch events to the application, so that the application first responds to the pressing of each touch point, i.e., touch point pressing operations on the center of the operation wheel and on the shooting skill key respectively, and then responds to the touch event of the touch point on the wheel sliding upward from the center.
For the case of pressing W and a simultaneously, W and D, S and D, or S and a simultaneously, reference may be made to the related descriptions in the foregoing examples, and details are not repeated herein.
It will be appreciated that if both W and S, or a and D, are pressed simultaneously, the electronic device will recognize an invalid input without performing an event transition process, since the two sets of combinations represent opposite directions in the operating wheel, respectively.
In addition, for the keys W and a, or S and D, if the key a (or the key W) is pressed before the key W (or the key a) is pressed, the processing manner can also be referred to the related description above, and similarly for the case where S and D are not pressed simultaneously.
In the third case:
adjusting the visual angle of a display picture in the game application by mouse rolling, and introducing the conversion process of a peripheral input event triggered by the rolling of a mouse roller:
when the electronic device detects that the mouse wheel rolls, it is determined that two touch actions, namely a touch point click and a touch point slide, need to be converted, and then the lookup table 1 can obtain that the coordinates corresponding to the touch point click are (x3, y3), and each time the touch point moves, x4 coordinates need to be added in the x-axis direction, and y4 coordinates need to be added in the y-axis direction.
On this basis, a down (x3, y3) touch event and a move (x3+x4, y3+y4) touch event are successively constructed and sequentially sent to the application; the application determines the touch point click from the pressed coordinates, and in response to move (x3+x4, y3+y4) slides the touch point in its display interface, thereby dragging the view angle of the application interface.
It will be appreciated that if the mouse continues to scroll before the touch point is released, the electronic device will continually detect scroll wheel scrolling. On this basis, according to table 1, the electronic device increases the x coordinate of the position where the wheel's touch point was last located by x4 and the y coordinate by y4, and so on repeatedly, so that the application continuously receives touch slide events and accordingly continuously adjusts the view angle of the application interface.
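The repeated wheel handling can be sketched as an event stream in which the first scroll emits a down at (x3, y3) and each scroll then advances the slide by the per-step increment (x4, y4). The names and event encoding are illustrative:

```python
# Hypothetical sketch of continuous wheel scrolling: one down event,
# then one move event per scroll, each offset by (x4, y4) from the
# previous touch position.
def wheel_events(x3, y3, x4, y4, scroll_count):
    events = [("down", (x3, y3))]
    x, y = x3, y3
    for _ in range(scroll_count):
        x, y = x + x4, y + y4
        events.append(("move", (x, y)))
    return events
```

The accumulating offsets produce the staircase-like touch trajectory that fig. 4 depicts.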
Fig. 4 is a schematic diagram showing a touch trajectory based on touch events that are continuously generated by mouse wheel scrolling.
Each point on the touch trajectory in fig. 4 represents the position of the touch point at a different touch slide event.
It can be understood that, in the case that the event map table stores touch parameters of at least one touch action corresponding to different operation object sets, and the touch parameters of the touch action include touch coordinates of the touch action, the event map table may be pre-constructed and stored in the electronic device.
In practical applications, coordinate positions of operation controls in the applications may be adjusted according to usage habits during use of the applications by users, for example, in a virtual reality scene, the users may adjust display positions of operation keys in a display interface. For another example, in a gaming application, the user may adjust the display position of the operating wheel, as shown in FIG. 3, the user may adjust the operating wheel from the left side to the right side of the application interface.
In order that the user can still input touch operations to the application with the peripheral input device after adjusting the coordinate position of an operable object in the application interface, the touch coordinates of the touch actions in the event relation mapping table may be adjusted according to the application interface data before the application displays the interface (at which time the electronic device has not yet intercepted a first peripheral input event).
For example, referring to fig. 5, which shows a schematic flow chart of an implementation of determining touch coordinates of a touch action in an event relationship table in the present application, the flow of this embodiment may include:
s501, intercepting interface data to be rendered of the application.
It will be appreciated that the interface data is loaded and rendered before the application presents the application interface. In the present application, the electronic device (such as the operating system of the electronic device) may intercept the interface data to be rendered by the application, so as to determine, through the subsequent steps, the touch coordinates corresponding to the touch actions converted from peripheral input events before the application loads the application interface.
And S502, determining coordinate information of each operation control in the application interface based on the interface data.
The operation control is a control used for carrying out operation control on the application in an application interface of the application. For example, the operation control can be various keys in an application interface of the application, an operation wheel or a control icon, and the like.
The coordinate information of the operation control refers to coordinate position information of the operation control in an application interface of the application.
It can be understood that if the user has adjusted the display coordinates of an operation control in the interface before the application loads the application interface, the coordinate information of the operation control determined from the interface data is the coordinate information as adjusted by the user.
And S503, determining touch coordinates of touch actions corresponding to the operation objects in the event relation mapping table according to the corresponding relation between the operation controls and the operation objects of the peripheral input equipment and the coordinate information of each operation control in the application.
In the embodiment of the present application, the correspondence between the operation controls in the application and the operation objects of the peripheral input device is fixed. As in the previous example of the game application, the space bar corresponds to the shooting skill button in the game, and the keys W, A, S and D correspond to the four control directions, up, down, left and right, of the operating wheel in the game application.
For example, when the correspondence between operation controls and operation objects of the peripheral input device is known, for any operation object set in the event relation mapping table, the touch coordinates in the touch parameters corresponding to that operation object set are determined according to the coordinate information of the operation control in the application that corresponds to the operation object set.
For ease of understanding, it is still assumed that the application is a game application, that an event relation mapping table has been constructed, and that this table is table 1.
In the game application, the electronic device sets the space key of the keyboard to correspond to the shooting skill key of the game application. On this basis, the electronic device determines from the interface data of the game application that the coordinates of the shooting skill key are (x6, y6). As can be seen from table 1, the touch-down coordinates of the touch point triggered by the space key are (x1, y1), which do not match the current actual coordinates (x6, y6) of the shooting skill key, so the touch-down coordinates of the space key in table 1 need to be adjusted to (x6, y6), that is, down(x6, y6).
Similarly, assuming that the center coordinates of the operating wheel are determined to be (x7, y7) based on the intercepted interface data, and that keys W, A, S and D of the keyboard correspond to the different directions of the operating wheel, the coordinates in table 1 that characterize the center of the operating wheel and correspond to keys W, A, S and D, or to combinations of these keys, need to be adjusted accordingly.
For example, in table 1 the touch-down coordinates of the touch point for key W are (x0, y0), which do not match the center coordinates of the operating wheel in the current application. The coordinates of the touch slide are (x0, 100), where the y-axis value of the touch slide is a set fixed value and the x-axis value is the x-axis coordinate x0 of the center of the operating wheel, which likewise does not coincide with x7. Therefore, the touch-down coordinates corresponding to key W need to be modified to (x7, y7); in the touch slide coordinates, the y-axis value is a set fixed value and needs no modification, while the x-axis value needs to be adjusted from x0 to x7, giving down(x7, y7), move(x7, 100).
Similarly, if the touch-down coordinates corresponding to the combination of key W and key A are (x0, y0), and the center coordinates of the operating wheel are (x7, y7), the touch-down coordinates corresponding to key W and key A are adjusted from (x0, y0) to (x7, y7).
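The coordinate retargeting of steps S502 and S503 can be sketched as below. All names, bindings, and coordinate values are assumptions for illustration: touch-down coordinates follow the control's new position, while a slide keeps its set fixed y value and takes the control's new x value, mirroring the key-W example above.

```python
# Hedged sketch of S502-S503: rewrite the touch coordinates stored in the
# event relation mapping table so they track the controls' current positions.

def retarget(event_map, set_to_control, control_coords):
    """Return a copy of the table whose coordinates match the current controls."""
    updated = {}
    for keys, actions in event_map.items():
        cx, cy = control_coords[set_to_control[keys]]
        rebuilt = []
        for action, (x, y) in actions:
            if action == "move":
                rebuilt.append((action, (cx, y)))   # y stays the set fixed value
            else:                                   # "down" / "up" follow the control
                rebuilt.append((action, (cx, cy)))
        updated[keys] = rebuilt
    return updated

old_map = {
    frozenset(["space"]): [("down", (100, 100)), ("up", (100, 100))],
    frozenset(["W"]):     [("down", (200, 600)), ("move", (200, 100))],
}
bindings = {frozenset(["space"]): "shoot", frozenset(["W"]): "wheel"}
new_coords = {"shoot": (640, 360), "wheel": (150, 500)}  # from the interface data
new_map = retarget(old_map, bindings, new_coords)
# The space tap now presses at (640, 360); the W slide becomes ("move", (150, 100)),
# i.e. new x, unchanged fixed y.
```

Because the key-to-control binding is fixed, only the coordinate side of the table changes, so the retargeted table can be rebuilt every time interface data is intercepted.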
It can be understood that, in this embodiment, by intercepting the interface data to be rendered by the application, the coordinate information of each operation control in the interface to be displayed can be determined. On this basis, combined with the correspondence between the operation objects of the peripheral input device and the operation controls in the application, the touch coordinates of the touch actions corresponding to the different operation objects in the event relation mapping table can be determined, so that these touch coordinates are consistent with the coordinates of the operation controls in the application interface. Thus, even if the user adjusts the positions of the operation controls in the application according to usage habits, the event relation mapping table can still be applied to the application through this embodiment, and touch events for controlling the application can be constructed from the event relation mapping table and subsequently intercepted peripheral input events.
S504, the interface data is transmitted to the application.
After the interface data is transmitted to the application, the application can normally load and render the interface of the application based on the interface data.
And S505, when the application presents the interface corresponding to the interface data, marking, at the position of each operation control in the interface, the information of the operation object of the peripheral input device corresponding to that operation control, according to the correspondence between the operation controls and the operation objects of the peripheral input device and the coordinate information of each operation control in the application.
That is, each operation control in the application interface is marked with the identifier of its corresponding operation object in the peripheral input device. For example, if the operation object is a key on a keyboard, the character displayed on that key may be marked.
There are various specific ways of marking the information of the corresponding operation object at an operation control. For example, a transparent layer may be constructed, and the information of the operation object corresponding to an operation control may be marked in the transparent layer at the coordinates corresponding to that operation control of the application interface. Of course, other implementations are possible, and this is not limited here.
For ease of understanding, still taking an application as an example of a game application, reference may be made to fig. 6, where fig. 6 is a schematic diagram illustrating information of an operation object in a peripheral input device marked at an operation control of an application interface.
As can be seen by comparing fig. 3 and fig. 6, an identifier of an operation object is marked at each operation control in fig. 6. For example, the four operating regions of the operating wheel of the game application, i.e., up, down, left and right, correspond to the W, S, A and D keys on the keyboard, and therefore W, S, A and D are marked in the four directions of the operating wheel 301, respectively. Similarly, "space" is marked next to the shooting skill button, and the other skill buttons are marked with the identifiers of different keys on the keyboard or mouse, respectively.
By displaying in the application interface the information of the operation objects of the peripheral input device corresponding to the operation controls of the application, the user can directly determine how to control each operation control in the application through the peripheral input device, so that the user can control the application more efficiently and conveniently.
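The overlay marking of step S505 amounts to pairing each control's coordinates with the identifier of its operation object. The sketch below assumes hypothetical control names and coordinates; a real implementation would draw these labels into a transparent layer over the interface.

```python
# Illustrative sketch of S505 (names assumed): from the control -> operation-object
# correspondence and the control coordinates, compute the labels a transparent
# overlay layer would draw on the application interface.
def overlay_labels(control_to_object, control_coords):
    """Return (position, label) pairs: one marker per operation control."""
    return [(control_coords[c], obj) for c, obj in control_to_object.items()]

labels = overlay_labels(
    {"shoot": "space", "wheel_up": "W", "wheel_left": "A"},
    {"shoot": (640, 360), "wheel_up": (150, 450), "wheel_left": (100, 500)},
)
```

Because the labels are computed from the same coordinate information as the mapping table, they stay correct even after the user repositions controls.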
It should be noted that step S505 is an optional step intended to let the user determine more intuitively how to control the application with the peripheral input device. In practical applications, if the user already knows, or learns in another way, the correspondence between the peripheral input device and the operation controls of the application, step S505 may be omitted.
The application also provides an input control device corresponding to the input control method. As shown in fig. 7, which shows a schematic structural diagram of an embodiment of an input control device according to the present application, the device of the present embodiment may include:
the event intercepting unit 701 is used for intercepting at least one first peripheral input event sent by peripheral input equipment of the electronic equipment to an application;
an event construction unit 702, configured to construct at least one touch event recognizable by the application based on the at least one first peripheral input event, where the touch event includes a touch action and a touch parameter of the touch action;
an event transmission unit 703 is configured to transmit the at least one touch event to the application, so that the application responds to a touch action of the touch event based on the touch parameter of the touch event.
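The cooperation of the three units can be sketched schematically as below. All interfaces are assumed for illustration: the interception unit collects the operation objects of the peripheral input events, the construction unit looks up the touch event information in the event relation mapping table, and the transmission unit delivers the touch events so the application responds as if the screen had been touched.

```python
# A schematic (assumed interfaces) of the three units of fig. 7.

class RecordingApp:
    """Stand-in application that only understands touch actions."""
    def __init__(self):
        self.received = []
    def handle_touch(self, action, params):
        self.received.append((action, params))

class InputController:
    def __init__(self, event_map, app):
        self.event_map = event_map
        self.app = app

    def intercept(self, peripheral_events):      # event interception unit 701
        return frozenset(e["object"] for e in peripheral_events)

    def construct(self, operation_objects):      # event construction unit 702
        return self.event_map[operation_objects]

    def transmit(self, touch_events):            # event transmission unit 703
        for action, params in touch_events:
            self.app.handle_touch(action, params)

app = RecordingApp()
ctl = InputController({frozenset(["space"]): [("down", (5, 5)), ("up", (5, 5))]}, app)
touch = ctl.construct(ctl.intercept([{"object": "space", "action": "press"}]))
ctl.transmit(touch)
# app.received now holds the synthetic down/up pair
```

The application never sees the keyboard event itself, only the synthesized touch events, which is why even an application that does not support peripheral input events can respond.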
In a possible implementation manner, the event constructing unit is specifically configured to construct, according to an event relation mapping table corresponding to the application, at least one touch event that can be identified by the application and corresponds to the at least one first peripheral input event, where the event relation mapping table includes: touch event information corresponding to different peripheral input event sets, wherein the peripheral input event sets comprise at least one peripheral input event, and the touch event information comprises touch parameters of touch actions in the touch events.
Optionally, the apparatus further comprises:
a type determination unit for determining a device type of a peripheral input device that transmitted the at least one first peripheral input event;
a table determining unit, configured to determine an event relationship mapping table corresponding to the device type and the application.
Optionally, the first peripheral input event intercepted by the event interception unit at least includes: an operation object and an input action aiming at the operation object, wherein the operation object belongs to an operable component in the peripheral input equipment;
the event construction unit comprises:
the action determining subunit is used for determining at least one touch action triggered and executed by the at least one first peripheral input event based on the operation object of the at least one first peripheral input event and the input action;
the parameter query subunit is configured to query, according to the operation object of the at least one first peripheral input event, a touch parameter corresponding to a touch action triggered and executed by the at least one first peripheral input event from an event relation mapping table corresponding to the application, where the touch parameter of the at least one touch action corresponding to different operation object sets is stored in the event relation mapping table, and the operation object set is composed of at least one operation object corresponding to the at least one peripheral input event in the peripheral input event set;
and the event construction subunit is used for constructing at least one touch event according to each touch action triggered by the at least one first peripheral input event and the touch parameters of each touch action.
Optionally, the action determining subunit includes:
a first action determining subunit, configured to determine, if at least two second peripheral input events, for which an operation object belongs to a set type, exist in the at least one first peripheral input event and a third peripheral input event does not exist currently, at least one touch action triggered and executed by the at least two second peripheral input events based on the operation object and the input action of the at least two second peripheral input events, where the third peripheral input event is a peripheral input event that is intercepted before the second peripheral input event and for which the operation object belongs to the set type and has not been released;
a second action determining subunit, configured to determine, if at least one second peripheral input event exists in the at least one first peripheral input event, where an operation object belongs to a set type, and the at least one second peripheral input event exists currently, at least one touch action based on the third peripheral input event and the operation object and the input action of the at least one second peripheral input event;
and the third action determining subunit is configured to determine, if the operation object of the first peripheral input event does not belong to the set type, or if the operation object of the first peripheral input event belongs to the set type but the third peripheral input event does not exist, at least one touch action triggered and executed by the first peripheral input event based on the operation object of the first peripheral input event and the input action.
Optionally, the event constructing subunit includes:
and the auxiliary construction subunit is configured to, if at least one second peripheral input event exists in the at least one first peripheral input event, where the operation object belongs to a set type, and a third peripheral input event exists currently, query, based on the operation object that has not been released in the at least one second peripheral input event and the third peripheral input event, a touch parameter of a touch action that is triggered and executed by the at least one second peripheral input event from the event relationship mapping table corresponding to the application.
In yet another possible implementation manner, the touch parameter of the touch action in the event relation mapping table includes a touch coordinate of the touch action;
the device also includes:
the interface data intercepting unit is used for intercepting interface data to be rendered of the application before the event intercepting unit intercepts at least one first peripheral input event sent to the application by peripheral input equipment of the electronic equipment;
the coordinate determination unit is used for determining the coordinate information of each operation control in the interface of the application based on the interface data;
and the table information determining unit is used for determining the touch coordinates of the touch action corresponding to the operation object in the event relation mapping table according to the corresponding relation between the operation control and the operation object of the peripheral input equipment and the determined coordinate information of each operation control.
Optionally, the apparatus further comprises:
the data transmission unit is used for transmitting the interface data to the application;
and the object identification unit is used for, when the application presents the interface corresponding to the interface data, marking the information of the operation object of the peripheral input device corresponding to each operation control in the interface, according to the correspondence between the operation controls and the operation objects of the peripheral input device and the coordinate information of each operation control in the application.
On the other hand, the application also provides an electronic device. As shown in fig. 8, which shows a schematic view of a composition structure of an electronic device according to the present application, the electronic device of the present embodiment at least includes: a processor 801 and a memory 802.
The memory stores data of an operating system and applications that the electronic device needs to run.
The processor is configured to perform the input control method as described in any of the above embodiments.
The memory is also used for storing programs needed by the processor to perform operations.
It will be appreciated that the electronic device may also include other components. As shown in fig. 8, these include a display 803, a peripheral input device 804 connected to the electronic device, and a communication bus 805. The processor, the memory, the display and the peripheral input device may be connected through the communication bus.
Of course, the electronic device may also include more or fewer components than those shown in fig. 8, which is not limiting.
The previous description of the disclosed embodiments is provided to enable any person skilled in the art to make or use the present application. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the application. Thus, the present application is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (10)

1. An input control method comprising:
intercepting at least one first peripheral input event sent to an application by peripheral input equipment of electronic equipment;
converting the at least one first peripheral input event into at least one touch event recognizable by the application based on the at least one first peripheral input event, wherein the touch event comprises a touch action and touch parameters of the touch action; the touch event is an event in an interface of the application;
and transmitting the at least one touch event to the application so that the application responds to the touch action of the touch event based on the touch parameters of the touch event, so that the application which does not support the peripheral input event can respond to the peripheral input event correspondingly.
2. The method of claim 1, said converting the at least one first peripheral input event to at least one touch event recognizable by the application based on the at least one first peripheral input event, comprising:
converting at least one touch event which can be identified by the application and corresponds to the at least one first peripheral input event according to an event relation mapping table corresponding to the application, wherein the event relation mapping table comprises: touch event information corresponding to different peripheral input event sets, wherein the peripheral input event sets comprise at least one peripheral input event, and the touch event information comprises touch parameters of touch actions in the touch events.
3. The method of claim 2, further comprising, prior to converting the at least one touch event recognizable by the application and corresponding to the at least one first peripheral input event according to the event map corresponding to the application:
determining a device type of a peripheral input device that sent the at least one first peripheral input event;
and determining an event relation mapping table corresponding to the equipment type and the application.
4. The method of claim 2, the first peripheral input event comprising at least: an operation object and an input action aiming at the operation object, wherein the operation object belongs to an operable component in the peripheral input equipment;
converting at least one touch event which can be identified by the application and corresponds to the at least one first peripheral input event according to the event relation mapping table corresponding to the application, wherein the converting comprises the following steps:
determining at least one touch action triggered to be executed by the at least one first peripheral input event based on the operation object of the at least one first peripheral input event and the input action;
according to the operation object of the at least one first peripheral input event, touch parameters corresponding to touch actions triggered and executed by the at least one first peripheral input event are inquired from an event relation mapping table corresponding to the application, the touch parameters of the at least one touch action corresponding to different operation object sets are stored in the event relation mapping table, and the operation object set is composed of the at least one operation object corresponding to the at least one peripheral input event in the peripheral input event set;
and obtaining at least one touch event according to each touch action triggered by the at least one first peripheral input event and the touch parameters of each touch action.
5. The method of claim 4, the determining, based on the operand of the first peripheral input event and the input action, at least one touch action triggered to be performed by the first peripheral input event, comprising:
if at least two second peripheral input events of which the operation objects belong to the set type exist in the at least one first peripheral input event and a third peripheral input event does not exist currently, determining at least one touch action triggered and executed by the at least two second peripheral input events based on the operation objects and the input actions of the at least two second peripheral input events, wherein the third peripheral input event is a peripheral input event which is intercepted before the second peripheral input event and is not released and of which the operation objects belong to the set type;
if at least one second peripheral input event that the operation object belongs to the set type exists in the at least one first peripheral input event and the third peripheral input event exists currently, determining at least one touch action based on the third peripheral input event and the operation object and the input action of the at least one second peripheral input event;
if the operation object of the first peripheral input event does not belong to the set type, or the operation object of the first peripheral input event belongs to the set type but does not have the third peripheral input event, determining at least one touch action triggered and executed by the first peripheral input event based on the operation object of the first peripheral input event and the input action.
6. The method of claim 5, wherein the querying, from the event relationship mapping table corresponding to the application, the touch parameter of the touch action triggered to be executed by the at least one first peripheral input event according to the operation object of the at least one first peripheral input event includes: if at least one second peripheral input event with an operation object belonging to a set type exists in the at least one first peripheral input event and a third peripheral input event exists currently, based on the operation object which is not released in the at least one second peripheral input event and the third peripheral input event, touch parameters of a touch action triggered and executed by the at least one second peripheral input event are inquired from the event relation mapping table corresponding to the application.
7. The method of claim 4, wherein the touch parameters of the touch action in the event map include touch coordinates of the touch action;
before intercepting at least one first peripheral input event sent by a peripheral input device of the electronic device to an application, the method further comprises the following steps:
intercepting interface data to be rendered of the application;
determining coordinate information of each operation control in the interface of the application based on the interface data;
and determining touch coordinates of touch actions corresponding to the operation objects in the event relation mapping table according to the corresponding relation between the operation controls and the operation objects of the peripheral input equipment and the determined coordinate information of each operation control.
8. The method of claim 7, further comprising:
transmitting the interface data to the application;
when the application shows the interface corresponding to the interface data, according to the corresponding relation between the operation control and the operation object of the peripheral input equipment and the coordinate information of each operation control in the application, marking the information of the operation object of the peripheral input equipment corresponding to the operation control at the position of the operation control in the interface.
9. An input control device comprising:
the event interception unit is used for intercepting at least one first peripheral input event sent to the application by peripheral input equipment of the electronic equipment;
the event construction unit is used for converting the at least one first peripheral input event into at least one touch event which can be recognized by the application based on the at least one first peripheral input event, wherein the touch event comprises a touch action and touch parameters of the touch action; the touch event is an event in an interface of the application;
the event transmission unit is used for transmitting the at least one touch event to the application so that the application responds to the touch action of the touch event based on the touch parameter of the touch event, and the application which does not support the peripheral input event can make a corresponding response to the peripheral input event.
10. An electronic device, comprising:
a memory and a processor, wherein the processor is capable of,
wherein the processor is configured to perform the input control method of any one of claims 1 to 8;
the memory is used for storing programs needed by the processor to perform operations.
CN202010706568.0A 2020-07-21 2020-07-21 Input control method and device and electronic equipment Active CN111840990B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010706568.0A CN111840990B (en) 2020-07-21 2020-07-21 Input control method and device and electronic equipment


Publications (2)

Publication Number Publication Date
CN111840990A CN111840990A (en) 2020-10-30
CN111840990B true CN111840990B (en) 2022-08-19

Family

ID=73001441

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010706568.0A Active CN111840990B (en) 2020-07-21 2020-07-21 Input control method and device and electronic equipment

Country Status (1)

Country Link
CN (1) CN111840990B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115686302A (en) * 2021-07-29 2023-02-03 华为技术有限公司 Input conversion method, electronic device and readable medium
CN117348785A (en) * 2022-06-29 2024-01-05 华为技术有限公司 Control method and electronic equipment

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106730820A (en) * 2016-12-12 2017-05-31 苏州蜗牛数字科技股份有限公司 A kind of method and android terminal device for being adapted to various game paddles
CN108854054A (en) * 2018-06-15 2018-11-23 苏州运智互动科技有限公司 Role and lens control method with touch plate type somatosensory handle
CN109568942A (en) * 2017-09-28 2019-04-05 腾讯科技(成都)有限公司 Handle peripheral hardware, virtual object control method and device
CN110109557A (en) * 2019-05-06 2019-08-09 原点显示(深圳)科技有限公司 The method that switching keyboard mode is adapted to mobile terminal
CN110368676A (en) * 2019-07-16 2019-10-25 Oppo广东移动通信有限公司 Control method, device, storage medium and the electronic equipment of touch information




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant