CN111857496A - Operation execution method and device and electronic equipment - Google Patents

Operation execution method and device and electronic equipment

Info

Publication number
CN111857496A
Authority
CN
China
Prior art keywords
input
target
touch operation
target area
sub
Prior art date
2020-06-30
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010615539.3A
Other languages
Chinese (zh)
Inventor
汤琦 (Tang Qi)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd
Priority to CN202010615539.3A
Publication of CN111857496A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04842 Selection of displayed objects or displayed text elements
    • G06F 3/04847 Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883 Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text

Abstract

The application discloses an operation execution method, an operation execution device, and an electronic device, belonging to the field of communications technology. The method includes: receiving a first input to a target area; in response to the first input, determining a target touch operation corresponding to the target area according to an association relation between the target area and the touch operation; and executing a function corresponding to the target touch operation. Because the corresponding function can be executed through the target area, no additional peripheral needs to be carried, which provides convenience for the user.

Description

Operation execution method and device and electronic equipment
Technical Field
The application belongs to the technical field of communication, and particularly relates to an operation execution method and device and electronic equipment.
Background
With the continuous development of scientific technology, electronic devices (such as mobile phones, tablet computers and the like) have gradually become indispensable tools in people's life and work.
When a user holds the electronic device with both hands to play a game in landscape orientation, only the thumbs can perform touch-screen operations; the index fingers, which cannot comfortably reach the screen, usually rest on the back of the phone or along its upper edge. For many games or applications, operating with the thumbs alone is inconvenient, while the index fingers cannot be put to use. On the other hand, in this holding posture the index fingers can easily press keys on the side of the phone (such as the volume and power keys), but those keys have dedicated purposes. A method is therefore needed that lets the index fingers operate the screen conveniently in this holding posture.
In the existing approach, a physical peripheral is clamped to the side of the phone, and the index finger presses a button of the peripheral at the upper edge of the phone, which is equivalent to tapping the screen at the clamped position.
However, a physical peripheral can only emulate a tap at the position where it clamps the screen, so only a few functions of an application program can be operated this way; moreover, the peripheral must be carried around, which is inconvenient for the user.
Disclosure of Invention
Embodiments of the present application aim to provide an operation execution method, an operation execution device, and an electronic device, so as to solve the problems that a physical peripheral can operate only a few functions of an application program and is inconvenient for the user to carry.
In order to solve the technical problem, the present application is implemented as follows:
in a first aspect, an embodiment of the present application provides an operation execution method, where the method includes:
receiving a first input to a target area;
in response to the first input, determining a target touch operation corresponding to the target area according to an association relation between the target area and the touch operation;
and executing a function corresponding to the target touch operation.
In a second aspect, an embodiment of the present application provides an operation execution apparatus, including:
The first input receiving module is used for receiving a first input to the target area;
the target operation determining module is used for responding to the first input and determining a target touch operation corresponding to the target area according to the association relation between the target area and the touch operation;
and the function execution module is used for executing the function corresponding to the target touch operation.
In a third aspect, an embodiment of the present application provides an electronic device, which includes a processor, a memory, and a program or instructions stored on the memory and executable on the processor, where the program or instructions, when executed by the processor, implement the steps of the operation execution method according to the first aspect.
In a fourth aspect, the present embodiments provide a readable storage medium, on which a program or instructions are stored, which when executed by a processor implement the steps of the operation execution method according to the first aspect.
In a fifth aspect, an embodiment of the present application provides a chip, where the chip includes a processor and a communication interface, where the communication interface is coupled to the processor, and the processor is configured to execute a program or instructions to implement the operation execution method according to the first aspect.
In the embodiments of the present application, a first input to a target area is received; in response to the first input, a target touch operation corresponding to the target area is determined according to the association relation between the target area and the touch operation; and a function corresponding to the target touch operation is executed. Because the association relation between the target area and the target touch operation is stored in advance, the function corresponding to the target touch operation can be executed once the target area is triggered; the corresponding function can thus be executed through the target area without carrying any additional peripheral, which provides convenience for the user.
Drawings
FIG. 1 is a flowchart illustrating steps of an operation execution method according to an embodiment of the present application;
fig. 2 is a schematic diagram of a customized touch screen area provided in an embodiment of the present application;
fig. 3 is a schematic diagram of a custom sliding track provided in an embodiment of the present application;
fig. 4 is a schematic structural diagram of an operation execution device according to an embodiment of the present application;
fig. 5 is a schematic structural diagram of an electronic device according to an embodiment of the present disclosure;
fig. 6 is a schematic structural diagram of another electronic device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some, but not all, embodiments of the present application. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The terms "first", "second", and the like in the description and claims of the present application are used to distinguish between similar objects and are not necessarily used to describe a particular order or sequence. It should be understood that data used in this way are interchangeable under appropriate circumstances, so that the embodiments of the application can be implemented in orders other than those illustrated or described herein. Objects distinguished by "first" and "second" are usually of one class, and their number is not limited; for example, a first object may be one object or more than one object. In addition, "and/or" in the description and claims denotes at least one of the connected objects, and the character "/" generally indicates an "or" relationship between the preceding and following objects.
The following describes in detail an operation execution scheme provided by the embodiments of the present application through specific embodiments and application scenarios thereof with reference to the accompanying drawings.
Referring to fig. 1, a flowchart illustrating steps of an operation execution method provided in an embodiment of the present application is shown, and as shown in fig. 1, the operation execution method may specifically include the following steps:
step 101: a first input to a target area is received.
The embodiment of the application can be applied to a scene that the corresponding function is executed by triggering the set area of the electronic equipment.
The target area refers to an area of the electronic device that can execute a corresponding function, in this embodiment, the target area may be a side area of the electronic device, such as a side area of a mobile phone, or an area corresponding to a side key, or the like.
The first input is an input performed on the target area when the user needs to execute the corresponding function. In this embodiment, the first input may be formed by a click operation on the target area, a double-click operation on the target area, a sliding track entered in the target area, or the like; the specific form may be determined according to business requirements, which is not limited in this embodiment.
It should be understood that the above examples are only examples for better understanding of the technical solutions of the embodiments of the present application, and are not to be taken as the only limitation to the embodiments.
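By way of illustration only, the sketch below shows how these first-input forms could be distinguished on Android with a GestureDetector. It is not part of the application: it assumes the side target area is backed by a view that receives MotionEvents (which depends on the device), and the class name and callback strings are invented for the example.

```kotlin
import android.content.Context
import android.view.GestureDetector
import android.view.MotionEvent
import android.view.View

// Sketch only: classifies a first input on the target area as a
// click, double click, or slide, as described above.
class TargetAreaInputDetector(context: Context, private val onFirstInput: (String) -> Unit) {

    private val detector = GestureDetector(context, object : GestureDetector.SimpleOnGestureListener() {
        override fun onDown(e: MotionEvent): Boolean = true // claim the gesture so later callbacks fire

        override fun onSingleTapConfirmed(e: MotionEvent): Boolean {
            onFirstInput("click"); return true              // single-click first input
        }

        override fun onDoubleTap(e: MotionEvent): Boolean {
            onFirstInput("double-click"); return true       // double-click first input
        }

        override fun onFling(e1: MotionEvent?, e2: MotionEvent, velocityX: Float, velocityY: Float): Boolean {
            onFirstInput("slide"); return true              // sliding-track first input
        }
    })

    // Attach to whatever view covers the side target area.
    fun attach(targetAreaView: View) {
        targetAreaView.setOnTouchListener { _, event -> detector.onTouchEvent(event) }
    }
}
```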
After receiving the first input to the target area, step 102 is performed.
Step 102: in response to the first input, determining a target touch operation corresponding to the target area according to the association relation between the target area and the touch operation.
The target touch operation refers to a touch operation corresponding to the target area.
The association relation between preset areas and touch operations is pre-stored in the electronic device system, as shown in Table 1 below:
Table 1
Preset area                      Touch operation
Side area (0, 10) - (0, 20)      Touch operation 1
Side area (0, 30) - (0, 40)      Touch operation 2, touch operation 3
In this example, a section of the side area of the electronic device may serve as a preset area. Specifically, a side vertex of the electronic device in its upright display orientation may be taken as the origin, and each preset area may be delimited by a coordinate interval. As shown in Table 1, the side area (0, 10) - (0, 20) is associated with touch operation 1, and the side area (0, 30) - (0, 40) is associated with touch operation 2 and touch operation 3. When the target area of the user's first input is the side area (0, 10) - (0, 20), the target touch operation corresponding to the target area is touch operation 1; when the target area of the user's first input is the side area (0, 30) - (0, 40), the target touch operations corresponding to the target area are touch operation 2 and touch operation 3.
It should be understood that the above examples are only examples for better understanding of the technical solutions of the embodiments of the present application, and are not to be taken as the only limitation to the embodiments.
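Purely as an illustration of Table 1, the association relation could be stored as a list of side-coordinate intervals mapped to touch operations. The data structure below is an assumption for the example, not something the application prescribes:

```kotlin
// Sketch: one way to store Table 1's association relation. The coordinate
// convention (origin at a side vertex in upright display orientation)
// follows the example above.
data class TouchOperation(val id: Int, val description: String)

data class PresetArea(val range: ClosedFloatingPointRange<Float>, val operations: List<TouchOperation>)

val associations = listOf(
    PresetArea(10f..20f, listOf(TouchOperation(1, "touch operation 1"))),
    PresetArea(30f..40f, listOf(TouchOperation(2, "touch operation 2"),
                                TouchOperation(3, "touch operation 3")))
)

// Step 102: resolve the target touch operation(s) for a first input at side coordinate y.
fun resolve(y: Float): List<TouchOperation> =
    associations.firstOrNull { y in it.range }?.operations ?: emptyList()
```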
In this embodiment, a target touch operation corresponding to the target area may be created in advance, and specifically, the detailed description may be described in conjunction with the following specific implementation manner.
In a specific implementation manner of this embodiment, before the step 101, the method may further include:
step A1: a second input by the user is received.
In this embodiment, the second input refers to an input performed by a user on a screen of the electronic device, and the second input may be a click operation performed by the user on the screen, an operation of a sliding track performed by the user on the screen, a combination of a series of operations performed by the user on the screen, or the like.
When the target touch operation corresponding to the target area needs to be set, the second input of the user may be received first, and then step A2 is executed.
Step A2: in response to the second input, establishing an association relation between the target area and the target touch operation based on the input parameters of the second input.
In this embodiment, the input parameters may include at least one of input position, input trajectory, input type, and input duration.
The input position may be used to determine the position corresponding to the touch operation. When the input parameter of the second input is an input position, the association relation between the target area and the target touch operation may be established according to that position. As shown in fig. 2, the target area is the area P1 where the side key is located; when the input position of the second input corresponds to clicking the circular area in fig. 2, the association relation between the target area and the target touch operation may be established, so that a subsequent input to P1 triggers the operation of clicking the circular area.
The input trajectory may be used to determine the trajectory corresponding to the touch operation. As shown in fig. 3, the target area is the area P2 where the side key is located; when the input parameter of the second input is a trajectory drawn on the screen of the electronic device (such as the trajectory shown in fig. 3), the association relation between the target area and the target touch operation may be established, so that a subsequent input to P2 triggers the operation of drawing the trajectory shown in fig. 3.
The input type may be used to determine the type of the touch operation, such as long press, single click, double click, or slide. For example, when the second input is a long press at a certain position on the screen of the electronic device, the association relation between that operation and the target area may be established, and when an input to the target area is subsequently received, the operation of long-pressing that position is triggered.
The input duration may be used to determine the duration of the touch operation. For example, when the touch operation is a long press at a certain position, the duration of the long press may be preset, and different functions may be triggered depending on that duration; when an input to the target area is subsequently received, the operation of pressing that position for the preset duration is triggered. For example, when the long press lasts 5 s, the function of sending a message may be executed, and when it lasts 10 s, the function of closing the session interface may be executed.
It should be understood that the above examples are only examples for better understanding of the technical solutions of the embodiments of the present application, and are not to be taken as the only limitation to the embodiments.
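To make the four input parameters concrete, the following sketch shows one way a second input might be reduced to a storable record before the association relation is established; every field and type name here is an assumption, not the application's terminology:

```kotlin
// Sketch: the input parameters of a second input (step A2) reduced to a
// storable record. Pairs of floats stand in for screen positions.
enum class InputType { CLICK, DOUBLE_CLICK, LONG_PRESS, SLIDE }

data class SecondInputRecord(
    val position: Pair<Float, Float>?,          // input position, e.g. the circular area in fig. 2
    val trajectory: List<Pair<Float, Float>>?,  // input trajectory, e.g. the track in fig. 3
    val type: InputType,                        // input type
    val durationMs: Long?                       // input duration, e.g. 5000 vs 10000 for a long press
)

// Association of a target (side) area with the recorded target touch operation.
data class Association(val targetAreaId: String, val operation: SecondInputRecord)
```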
When the second input includes at least two sub-inputs, the target touch operation corresponding to the target area may be determined based on the input parameters of the sub-inputs together with their input order and the time intervals between them, as described in detail in the following implementation manner.
In another specific implementation manner of the present application, before the step A2, the method may further include:
Step S1: acquiring the input parameter and input order of each sub-input, and the time interval between two adjacent sub-inputs.
In this embodiment, the input order refers to an order corresponding to each sub-input performed by the user.
While the target application program is running in the foreground, a start-recording button may be clicked to begin screen recording; after recording starts, the sub-inputs to be saved, such as clicks and slides, may be performed on the target application program.
While the target application program is running in the foreground, at least two sub-inputs performed on it can be obtained, together with the input parameters and input order of each sub-input and the time interval between two adjacent sub-inputs. For example, after the screen recording function of the electronic device is started, two sub-inputs performed by the user on the screen may be recorded: a click input and a slide input. The input parameters of the two sub-inputs, such as their input positions, are recorded, and the time interval between them is acquired, for example a 5 s interval between the click input and the slide input. Of course, when the second input includes more than two sub-inputs, the time interval between every pair of adjacent sub-inputs needs to be recorded. For example, when the second input includes three sub-inputs whose input order is the click input, the long-press input, and then the slide input, the time interval between the click input and the long-press input and the time interval between the long-press input and the slide input both need to be recorded.
The step A2 may include:
Sub-step M1: determining a target touch operation corresponding to the target area based on the input parameter and input order of each sub-input, and the time interval between two adjacent sub-inputs.
After the input parameter and input order of each sub-input and the time interval between two adjacent sub-inputs are obtained, the target touch operation corresponding to the target area may be determined from them. For example, the second input includes two sub-inputs: a click input and a slide input. The target touch operation can be determined according to the input parameters (input positions), the input order, and the time interval of these two sub-inputs. Suppose the input position of the click input is a first position on the screen, the input position of the slide input is a second position on the screen, the input order is the click input first and then the slide input, and the time interval between them is 5 s. Then, after an input by the user in the target area is received, the click input is executed at the first position, and after an interval of 5 s, the slide input is executed at the second position on the screen.
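A sketch of step S1 under assumed names: capturing each sub-input together with its input order and the time interval to the previous sub-input. It reuses the SecondInputRecord type from the earlier sketch and is illustrative only:

```kotlin
// Sketch: capturing the sub-inputs of a second input in order, with the
// time interval to the previous sub-input (step S1).
data class SubInput(
    val order: Int,                  // input order (1-based)
    val record: SecondInputRecord,   // input parameters (see earlier sketch)
    val intervalFromPreviousMs: Long // 0 for the first sub-input
)

class SubInputRecorder {
    private val subInputs = mutableListOf<SubInput>()
    private var lastTimestamp = 0L

    fun onSubInput(record: SecondInputRecord, nowMs: Long) {
        val interval = if (subInputs.isEmpty()) 0L else nowMs - lastTimestamp
        subInputs += SubInput(subInputs.size + 1, record, interval)
        lastTimestamp = nowMs
    }

    fun result(): List<SubInput> = subInputs.toList()
}
```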
In this embodiment, an association relationship between the target application and the target touch operation may also be established, and specifically, the detailed description may be described in conjunction with the following specific implementation manner.
In a specific implementation manner of the present application, before the step 101, the method may further include:
step B1: a target application is determined.
In the present embodiment, the target application refers to an application installed in the electronic device.
When the association relation between the target application program and the target touch operation needs to be established, the user may select, from the application programs installed on the electronic device, the application program for which the association relation is to be established, as the target application program.
It can be understood that, in this embodiment, the target application may be one application or may be multiple applications, and specifically, the target application may be determined according to business requirements, and this embodiment is not limited thereto.
After the target application is determined, step B2 is performed.
Step B2: establishing an association relation between the target touch operation and the target application program, where the target touch operation corresponds to a target function of the target application program.
After the target application program is determined, an association relation between the target touch operation and the target application program may be established, where the target touch operation corresponds to a target function of the target application program. Once this association is established, the target function can be executed on the target application program through a first input to the target area: during the running of the target application program, a first input to the target area may be received, and the target function of the target application program corresponding to the target touch operation is then executed. Specifically, when setting the association relation between the target area and the touch operation for the target application program, a target area may be selected on the corresponding setting interface, and the correspondence between the target area and the touch operation of the target application program is then set. For example, as shown in fig. 2, the user may draw a circle while the application program is running on the electronic device, and may drag the circular area or adjust its size. After the setting is completed, clicking the side button P1 is equivalent to clicking the screen within the circular area; that is, an association between the side area and the operation of clicking the circular area has been set. As another example, as shown in fig. 3, the user performs a sliding operation on the screen; the sliding trajectory is displayed after the action is completed, and the user can move and adjust the position of the trajectory. After the setting is completed, clicking the side button P2 is equivalent to performing the sliding operation at the trajectory position on the screen; that is, an association between the side area and the sliding trajectory has been set.
Of course, the same target touch operation may trigger different functions in different application programs. For example, if the target touch operation is clicking an area in the center of the screen, it may trigger the function of calling contact 1 in application program 1, the return function in application program 2, and the function of sending a message in application program 3.
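The per-application behaviour just described could be modelled as a two-level map from application to operation to target function. This is a sketch under invented package names, not the application's prescribed design:

```kotlin
// Sketch: the same target touch operation (here, operation id 1) mapped
// to different target functions in different applications (step B2).
typealias PackageName = String

data class TargetFunction(val name: String, val action: () -> Unit)

val perAppFunctions: Map<PackageName, Map<Int, TargetFunction>> = mapOf(
    "com.example.app1" to mapOf(1 to TargetFunction("call contact 1") { println("dialing contact 1") }),
    "com.example.app2" to mapOf(1 to TargetFunction("return") { println("navigating back") }),
    "com.example.app3" to mapOf(1 to TargetFunction("send message") { println("sending message") })
)

// Execute the target function of the foreground application for a resolved operation.
fun executeFor(foregroundApp: PackageName, operationId: Int) {
    perAppFunctions[foregroundApp]?.get(operationId)?.action?.invoke()
}
```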
After determining the target touch operation corresponding to the target area according to the association relationship between the target area and the touch operation in response to the first input, step 103 is executed.
Step 103: and executing a function corresponding to the target touch operation.
In this embodiment, the target touch operation corresponds to a set function in the electronic device, for example, the target touch operation may correspond to a function of starting a certain application program, or may correspond to one or more functions of a certain application program, and specifically, the present embodiment is not limited thereto.
After determining the target touch operation corresponding to the target area, a function corresponding to the target touch operation may be executed.
Specifically, after the association relationship between the target touch operation and the target application is established in advance, a corresponding function may be executed on the target application according to the input to the target area, and specifically, the detailed description may be described in conjunction with the following specific implementation manner.
In a specific implementation manner of the present application, the step 103 may include:
Sub-step D1: in response to the first input, executing a target function of the target application program corresponding to the target touch operation.
In the process of running the target application program, after receiving the first input to the target area, the target function of the target application program corresponding to the target touch operation may be executed, for example, a return operation executed on a certain interface of the target application program, a click operation executed on a certain interface of the target application program, or the like.
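The application does not name an API for executing the stored touch operation; as one possible realisation on Android, an AccessibilityService can synthesize a tap or a slide at the recorded position with dispatchGesture (available since API 24). This is a hedged sketch, not the method prescribed by the application:

```kotlin
import android.accessibilityservice.AccessibilityService
import android.accessibilityservice.GestureDescription
import android.graphics.Path

// Sketch: injecting the target touch operation as a synthetic gesture.
// Requires an enabled AccessibilityService on the device.
fun AccessibilityService.performTap(x: Float, y: Float) {
    val path = Path().apply { moveTo(x, y) }            // a tap is a near-zero-length stroke
    val stroke = GestureDescription.StrokeDescription(path, 0L, 50L)
    dispatchGesture(GestureDescription.Builder().addStroke(stroke).build(), null, null)
}

fun AccessibilityService.performSlide(from: Pair<Float, Float>, to: Pair<Float, Float>, durationMs: Long = 300L) {
    val path = Path().apply {
        moveTo(from.first, from.second)
        lineTo(to.first, to.second)                     // recorded trajectory, simplified to a line
    }
    val stroke = GestureDescription.StrokeDescription(path, 0L, durationMs)
    dispatchGesture(GestureDescription.Builder().addStroke(stroke).build(), null, null)
}
```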
In this embodiment, the first input may be a single input, in which case one or more functions may be triggered by the user's first input in the target area. For example, a first input in the target area may trigger the function of sending a message; alternatively, it may trigger both the function of sending a message and the function of displaying the expression browsing interface. When multiple functions are triggered, the functions may be executed in sequence according to the pre-recorded input order and input time intervals of the touch operations.
Of course, the first input may also include a plurality of first sub-inputs, each of which may trigger a corresponding function. In a specific implementation, the corresponding functions may be executed on the target application program according to the input parameters, input order, and input time intervals of the first sub-inputs. In this case, the user may execute the first sub-inputs in the target area one after another, and the corresponding functions are then triggered according to those sub-inputs.
In another specific implementation manner of the present application, the target touch operation includes at least two sub-operations, the sub-operations correspond to the sub-inputs one to one, and each sub-operation is determined based on an input parameter of one sub-input, where the step 103 may include:
substep N1: and sequentially executing the functions corresponding to the sub-operations according to the input sequence and the time interval.
In this embodiment, when the target touch operation includes at least two sub-operations, the sub-operations correspond one-to-one to the sub-inputs, and each sub-operation is determined based on the input parameter of one sub-input. Taking a social application program as the target application program for example, suppose the at least two sub-inputs include a click input at a first position, a click input at a second position, and a click input at a third position. In the target interface of the target application program, the click input at the first position corresponds to the operation of opening the expression interface, the click input at the second position corresponds to the operation of selecting a target expression, and the click input at the third position corresponds to the operation of sending the message. When the user's first input in the target area is received, the operation of opening the expression interface is triggered, the target expression is selected, and after the preset time interval, the operation of sending the message is performed.
When the first input includes a plurality of first sub-inputs, the input parameter of each first sub-input, the input order, and the time interval between two adjacent first sub-inputs may be obtained; the sub-operations corresponding to the first sub-inputs are determined according to the input parameters, and the functions corresponding to the sub-operations are then executed in sequence according to the input order and the time intervals. For example, the at least two first sub-inputs performed by the user in the target area include a long-press input and a click input, where the long-press input corresponds to the operation of clicking a first position on the screen of the electronic device and the click input corresponds to the operation of drawing a trajectory in a certain area of the screen. When the long-press input and the click input of the user in the target area are received, the operation of clicking the first position and the operation of drawing the trajectory in that area are executed in sequence.
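A sketch of sub-step N1 under assumed names: replaying the stored sub-operations in their input order while honouring the recorded time intervals. Coroutines are used here only to express the delays:

```kotlin
import kotlinx.coroutines.delay

// Sketch: executing the functions of the sub-operations in input order,
// waiting the recorded time interval before each one (sub-step N1).
data class SubOperation(val intervalFromPreviousMs: Long, val executeFunction: () -> Unit)

suspend fun replay(subOperations: List<SubOperation>) {
    for (op in subOperations) {              // list is already sorted by input order
        delay(op.intervalFromPreviousMs)     // e.g. the 5 s between click and slide above
        op.executeFunction()
    }
}
```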
According to the operation execution method provided by the embodiments of the present application, a first input to a target area is received; in response to the first input, a target touch operation corresponding to the target area is determined according to the association relation between the target area and the touch operation; and a function corresponding to the target touch operation is executed. Because the association relation between the target area and the target touch operation is stored in advance, the function corresponding to the target touch operation can be executed once the target area is triggered; the corresponding function can thus be executed through the target area without carrying any additional peripheral, which provides convenience for the user.
It should be noted that, for the operation execution method provided in the embodiments of the present application, the execution subject may be an operation execution device, or a control module in the operation execution device for executing the operation execution method. In the embodiments of the present application, an operation execution device executing the operation execution method is taken as an example to describe the operation execution device provided herein.
Referring to fig. 4, a schematic structural diagram of an operation execution device provided in an embodiment of the present application is shown, and as shown in fig. 4, the operation execution device may specifically include the following modules:
a first input receiving module 410 for receiving a first input to the target area;
a target operation determining module 420, configured to determine, in response to the first input, a target touch operation corresponding to a target area according to an association relationship between the target area and the touch operation;
the function executing module 430 is configured to execute a function corresponding to the target touch operation.
Optionally, the method further comprises:
the second input receiving module is used for receiving a second input of the user;
and the first association relation establishing module is used for responding to the second input and establishing an association relation between the target area and the target touch operation based on the input parameters of the second input.
Optionally, the input parameters include at least one of: input position, input trajectory, input type, and input duration.
Optionally, the method further comprises:
the target application determining module is used for determining a target application program;
a second association relation establishing module, configured to establish an association relation between the target touch operation and the target application program, where the target touch operation corresponds to a target function of the target application program;
the first input receiving module 410 includes;
a first input receiving unit, configured to receive a first input to the target area when the target application is running;
the function execution module 430 includes:
and the target function execution unit is used for responding to the first input and executing a target function of the target application program corresponding to the target touch operation.
Optionally, the second input includes at least two sub-inputs, and the device further includes:
the input parameter acquisition module is used for acquiring the input parameter and the input sequence of each sub-input and the time interval between two adjacent sub-inputs;
the touch operation determination module includes:
and the touch operation determining unit is used for determining a target touch operation corresponding to the target area based on the input parameter and input order of each sub-input, and the time interval between two adjacent sub-inputs.
Optionally, the target touch operation includes at least two sub-operations, the sub-operations are in one-to-one correspondence with the sub-inputs, and each sub-operation is determined based on an input parameter of one of the sub-inputs;
the function execution module 430 includes:
and the function execution unit is used for sequentially executing the functions corresponding to the sub-operations according to the input sequence and the time interval.
The operation execution device provided by the embodiments of the present application receives a first input to a target area, determines, in response to the first input, a target touch operation corresponding to the target area according to the association relation between the target area and the touch operation, and executes a function corresponding to the target touch operation. Because the association relation between the target area and the target touch operation is stored in advance, the function corresponding to the target touch operation can be executed once the target area is triggered; the corresponding function can thus be executed through the target area without carrying any additional peripheral, which provides convenience for the user.
The operation execution device in the embodiment of the present application may be a device, and may also be a component, an integrated circuit, or a chip in a terminal. The device can be mobile electronic equipment or non-mobile electronic equipment. By way of example, the mobile electronic device may be a mobile phone, a tablet computer, a notebook computer, a palm top computer, a vehicle-mounted electronic device, a wearable device, an ultra-mobile personal computer (UMPC), a netbook or a Personal Digital Assistant (PDA), and the like, and the non-mobile electronic device may be a server, a Network Attached Storage (NAS), a Personal Computer (PC), a Television (TV), a teller machine, a self-service machine, and the like, and the embodiments of the present application are not particularly limited.
The operation execution device in the embodiments of the present application may be a device having an operating system. The operating system may be an Android operating system, an iOS operating system, or another possible operating system, which is not specifically limited in the embodiments of the present application.
The operation execution device provided in the embodiment of the present application can implement each process implemented in the embodiment of the method in fig. 1, and is not described here again to avoid repetition.
Optionally, as shown in fig. 5, an electronic device 500 is further provided in this embodiment of the present application, and includes a processor 501, a memory 502, and a program or an instruction stored in the memory 502 and executable on the processor 501, where the program or the instruction is executed by the processor 501 to implement each process of the operation execution method embodiment, and can achieve the same technical effect, and no further description is provided here to avoid repetition.
It should be noted that the electronic device in the embodiment of the present application includes the mobile electronic device and the non-mobile electronic device described above.
Fig. 6 is a schematic diagram of a hardware structure of an electronic device implementing an embodiment of the present application.
The electronic device 600 includes, but is not limited to: a radio frequency unit 601, a network module 602, an audio output unit 603, an input unit 604, a sensor 605, a display unit 606, a user input unit 607, an interface unit 608, a memory 609, a processor 610, and the like.
Those skilled in the art will appreciate that the electronic device 600 may further comprise a power source (e.g., a battery) for supplying power to the various components, and the power source may be logically connected to the processor 610 through a power management system, so as to implement functions of managing charging, discharging, and power consumption through the power management system. The electronic device structure shown in fig. 6 does not constitute a limitation of the electronic device, and the electronic device may include more or less components than those shown, or combine some components, or arrange different components, and thus, the description is omitted here.
The radio frequency unit 601 is configured to: receive a first input to a target area;
in response to the first input, determine a target touch operation corresponding to the target area according to the association relation between the target area and the touch operation;
and execute a function corresponding to the target touch operation.
According to this embodiment, the association relation between the target area and the target touch operation is stored in advance, and once the target area is triggered, the function corresponding to the target touch operation can be executed; the corresponding function can thus be executed through the target area without carrying any additional peripheral, which provides convenience for the user.
Optionally, before receiving the first input to the target area, the radio frequency unit 601 is further configured to:
receive a second input of the user;
and in response to the second input, establish an association relation between the target area and the target touch operation based on the input parameters of the second input.
Optionally, the input parameters include at least one of: input position, input trajectory, input type, and input duration.
Optionally, before the receiving the first input to the target region, further comprising:
determining a target application program;
establishing an association relation between the target touch operation and the target application program, wherein the target touch operation corresponds to a target function of the target application program;
the receiving a first input to the target area comprises:
receiving a first input to the target area while the target application is running;
the executing the function corresponding to the target touch operation includes:
and responding to the first input, and executing a target function of the target application program corresponding to the target touch operation.
Optionally, the second input includes at least two sub-inputs, and before determining the target touch operation corresponding to the target area based on the input parameter of the second input, the method further includes:
Acquiring an input parameter and an input sequence of each sub-input and a time interval between two adjacent sub-inputs;
the determining, based on the input parameter of the second input, a target touch operation corresponding to the target area includes:
and determining a target touch operation corresponding to the target area based on the input parameter and the input sequence of each sub-input and the time interval between two adjacent sub-inputs.
Optionally, the target touch operation includes at least two sub-operations, the sub-operations are in one-to-one correspondence with the sub-inputs, and each sub-operation is determined based on an input parameter of one of the sub-inputs;
the executing the function corresponding to the target touch operation includes:
and sequentially executing the functions corresponding to the sub-operations according to the input sequence and the time interval.
In this embodiment, by pre-establishing the target touch operation that is associated with the sub-inputs and corresponds to the target area, a plurality of sub-inputs can be entered in the target area to execute the functions corresponding to a plurality of sub-operations, which further simplifies user operation and allows a single area to control a plurality of functions.
It is to be understood that, in the embodiment of the present application, the input unit 604 may include a Graphics Processing Unit (GPU) 6041 and a microphone 6042, and the graphics processing unit 6041 processes image data of a still picture or a video obtained by an image capturing apparatus (such as a camera) in a video capturing mode or an image capturing mode. The display unit 606 may include a display panel 6061, and the display panel 6061 may be configured in the form of a liquid crystal display, an organic light emitting diode, or the like. The user input unit 607 includes a touch panel 6071 and other input devices 6072. A touch panel 6071, also referred to as a touch screen. The touch panel 6071 may include two parts of a touch detection device and a touch controller. Other input devices 6072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, and a joystick, which are not described in detail herein. The memory 609 may be used to store software programs as well as various data including, but not limited to, application programs and an operating system. The processor 610 may integrate an application processor, which primarily handles operating systems, user interfaces, applications, etc., and a modem processor, which primarily handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor 610.
The embodiment of the present application further provides a readable storage medium, where a program or an instruction is stored on the readable storage medium, and when the program or the instruction is executed by a processor, the program or the instruction implements each process of the operation execution method embodiment, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here.
The processor is the processor in the electronic device described in the above embodiment. The readable storage medium includes a computer readable storage medium, such as a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and the like.
The embodiment of the present application further provides a chip, where the chip includes a processor and a communication interface, the communication interface is coupled to the processor, and the processor is configured to run a program or an instruction to implement each process of the above operation execution method embodiment, and can achieve the same technical effect, and the details are not repeated here to avoid repetition.
It should be understood that the chips mentioned in the embodiments of the present application may also be referred to as system-on-chip, system-on-chip or system-on-chip, etc.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element. Further, it should be noted that the scope of the methods and apparatus of the embodiments of the present application is not limited to performing the functions in the order illustrated or discussed, but may include performing the functions in a substantially simultaneous manner or in a reverse order based on the functions involved, e.g., the methods described may be performed in an order different than that described, and various steps may be added, omitted, or combined. In addition, features described with reference to certain examples may be combined in other examples.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present application may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present application.
While the present embodiments have been described with reference to the accompanying drawings, it is to be understood that the invention is not limited to the precise embodiments described above, which are meant to be illustrative and not restrictive, and that various changes may be made therein by those skilled in the art without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (10)

1. An operation execution method, comprising:
receiving a first input to a target area;
in response to the first input, determining a target touch operation corresponding to the target area according to an association relation between the target area and the touch operation;
and executing a function corresponding to the target touch operation.
2. The method of claim 1, further comprising, prior to said receiving a first input to a target area:
receiving a second input of the user;
and in response to the second input, establishing an association relation between the target area and the target touch operation based on the input parameters of the second input.
3. The method of claim 2, wherein the input parameters comprise at least one of: input position, input trajectory, input type, and input duration.
4. The method of claim 1, further comprising, prior to said receiving a first input to said target region:
determining a target application program;
establishing an association relation between the target touch operation and the target application program, wherein the target touch operation corresponds to a target function of the target application program;
the receiving a first input to the target area comprises:
receiving a first input to the target area while the target application is running;
the executing the function corresponding to the target touch operation includes:
and executing a target function of the target application program corresponding to the target touch operation.
5. The method of claim 2, wherein the second input comprises at least two sub-inputs, and before establishing the association between the target area and the target touch operation based on the input parameters of the second input, the method further comprises:
acquiring an input parameter and an input sequence of each sub-input and a time interval between two adjacent sub-inputs;
the determining, based on the input parameter of the second input, a target touch operation corresponding to the target area includes:
and determining a target touch operation corresponding to the target area based on the input parameter and the input sequence of each sub-input and the time interval between two adjacent sub-inputs.
6. The method according to claim 5, wherein the target touch operation comprises at least two sub-operations, the sub-operations are in one-to-one correspondence with the sub-inputs, and each sub-operation is determined based on an input parameter of one sub-input;
The executing the function corresponding to the target touch operation includes:
and sequentially executing the functions corresponding to the sub-operations according to the input sequence and the time interval.
7. An operation execution apparatus, comprising:
the first input receiving module is used for receiving a first input to the target area;
the target operation determining module is used for responding to the first input and determining a target touch operation corresponding to the target area according to the association relation between the target area and the touch operation;
and the function execution module is used for executing the function corresponding to the target touch operation.
8. The apparatus of claim 7, further comprising:
the second input receiving module is used for receiving a second input of the user;
and the first association relation establishing module is used for responding to the second input and establishing an association relation between the target area and the target touch operation based on the input parameters of the second input.
9. The apparatus of claim 8, wherein the input parameters comprise at least one of: input position, input trajectory, input type, and input duration.
10. An electronic device comprising a processor, a memory, and a program or instructions stored on the memory and executable on the processor, the program or instructions when executed by the processor implementing the steps of the operation execution method of any one of claims 1-6.
CN202010615539.3A 2020-06-30 2020-06-30 Operation execution method and device and electronic equipment Pending CN111857496A

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010615539.3A 2020-06-30 2020-06-30 Operation execution method and device and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010615539.3A 2020-06-30 2020-06-30 Operation execution method and device and electronic equipment

Publications (1)

Publication Number Publication Date
CN111857496A 2020-10-30

Family

ID=72988773

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010615539.3A Pending 2020-06-30 2020-06-30 Operation execution method and device and electronic equipment

Country Status (1)

Country Link
CN (1) CN111857496A


Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130149964A1 (en) * 2011-12-07 2013-06-13 At&T Intellectual Property I, L.P. Extending the Functionality of a Mobile Device
CN202748768U (en) * 2012-09-24 2013-02-20 广东欧珀移动通信有限公司 Communication terminal for realizing customized operation events
CN104699408A (en) * 2015-04-01 2015-06-10 广东欧珀移动通信有限公司 Operation method and device of touch screen and touch device
CN105487803A (en) * 2015-11-27 2016-04-13 努比亚技术有限公司 Touch response method and mobile terminal
CN105892910A (en) * 2016-03-28 2016-08-24 努比亚技术有限公司 Mobile terminal control method and device
CN109800135A (en) * 2017-11-17 2019-05-24 腾讯科技(深圳)有限公司 A kind of information processing method and terminal
CN108984093A (en) * 2018-06-28 2018-12-11 Oppo广东移动通信有限公司 touch operation method, device, storage medium and electronic equipment

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112764618A (en) * 2021-01-22 2021-05-07 维沃移动通信有限公司 Interface operation method, device, equipment and storage medium

Similar Documents

Publication Publication Date Title
WO2023005920A1 (en) Screen splitting method and apparatus, and electronic device
CN112269508B (en) Display method and device and electronic equipment
WO2022121790A1 (en) Split-screen display method and apparatus, electronic device, and readable storage medium
CN112433693B (en) Split screen display method and device and electronic equipment
CN112486444A (en) Screen projection method, device, equipment and readable storage medium
CN113655929A (en) Interface display adaptation processing method and device and electronic equipment
CN112911147A (en) Display control method, display control device and electronic equipment
CN112099702A (en) Application running method and device and electronic equipment
CN111857496A (en) Operation execution method and device and electronic equipment
CN112162689B (en) Input method and device and electronic equipment
CN111913617B (en) Interface display method and device and electronic equipment
CN111752428A (en) Icon arrangement method and device, electronic equipment and medium
CN113885981A (en) Desktop editing method and device and electronic equipment
CN113515216A (en) Application program switching method and device and electronic equipment
CN113253884A (en) Touch method, touch device and electronic equipment
CN112637407A (en) Voice input method and device and electronic equipment
CN112765508A (en) Information display method and device and electronic equipment
CN112596645A (en) Application identifier hiding method and device and electronic equipment
CN112269511A (en) Page display method and device and electronic equipment
CN112437196B (en) Page display method and device and electronic equipment
CN112035032B (en) Expression adding method and device
CN111666010B (en) Application display method and device
CN113778237A (en) Character display method and device, electronic equipment and medium
CN112114774A (en) Volume adjusting method and device and electronic equipment
CN113885765A (en) Screenshot picture association method and device and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 20201030)