CN111858451A - Intelligent computing method, terminal and storage medium - Google Patents
Intelligent computing method, terminal and storage medium
- Publication number
- CN111858451A (application number CN202010757610.1A)
- Authority
- CN
- China
- Prior art keywords
- gesture
- calculation
- terminal
- current interface
- interface
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F15/00—Digital computers in general; Data processing equipment in general
- G06F15/02—Digital computers in general; Data processing equipment in general manually operated with input through keyboard and computation using a built-in program, e.g. pocket calculators
- G06F15/0225—User interface arrangements, e.g. keyboard, display; Interfaces to other computer systems
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F9/00—Arrangements for program control, e.g. control units
- G06F9/06—Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
- G06F9/44—Arrangements for executing specific programs
- G06F9/451—Execution arrangements for user interfaces
Abstract
The embodiment of the application discloses an intelligent computing method, a terminal and a storage medium, wherein the method comprises the following steps: after the initial number is recognized in the current interface, detecting a first gesture action in the current interface; wherein the initial number is displayed in the current interface; determining a first target number and a first calculation mode according to the first gesture action; wherein the first target number is displayed in the current interface; calculating the initial number and the first target number according to a first calculation mode to obtain a first operation result; and calling a result display interface in a first area of the current interface, and displaying the first operation result in the result display interface.
Description
Technical Field
The present invention relates to the field of electronic devices, and in particular, to an intelligent computing method, a terminal, and a storage medium.
Background
With the development of science and technology, smart phones have gradually reached every user group and have become fully integrated into people's daily life, work and study, for example mobile office, digital information, mobile banking, digital payment and paperless transactions, providing a smarter, faster and more convenient digital life and network experience and bringing people into the digital intelligent era.
At present, when a user reads news or uses financial mobile phone software (APP), a large number of basic figures appearing on a page may need to be calculated at any time. Because the calculator APP in an intelligent terminal exists independently of the other functions of the system, when the user needs to calculate data, the user usually has to consciously record the data, jump to and open the calculator APP, input the recorded data for calculation, switch back to the previous page, record more data, and repeat these actions. This approach, in which the user must record data frequently and repeatedly switch to the calculator APP for data calculation, is not only inefficient and inaccurate; the frequent switching between functional software also greatly wastes system resources, and the terminal intelligence is poor.
Disclosure of Invention
The embodiment of the application provides a computing method, a terminal and a storage medium, which not only improve the calculation efficiency and calculation accuracy, but also effectively reduce system resource consumption and make the terminal more intelligent.
The technical scheme of the embodiment of the application is realized as follows:
in a first aspect, an embodiment of the present application provides a computing method, where the method includes:
after an initial number is recognized in a current interface, detecting a first gesture action in the current interface; wherein the initial number is displayed in the current interface;
determining a first target number and a first calculation mode according to the first gesture action; wherein the first target number is displayed in the current interface;
calculating the initial number and the first target number according to the first calculation mode to obtain a first operation result;
and calling a result display interface in a first area of the current interface, and displaying the first operation result in the result display interface.
In a second aspect, an embodiment of the present application provides a terminal, where the terminal includes: a detection unit, a determination unit, a calculation unit and a display unit,
the detection unit is used for detecting a first gesture action in a current interface after an initial number is recognized in the current interface; wherein the initial number is displayed in the current interface;
the determining unit is used for determining a first target number and a first calculation mode according to the first gesture action; wherein the first target number is displayed in the current interface;
the calculation unit is used for performing calculation processing on the initial number and the first target number according to the first calculation mode to obtain a first operation result;
the display unit is used for calling a result display interface in a first area of the current interface and displaying the first operation result in the result display interface.
In a third aspect, an embodiment of the present application provides a terminal, where the terminal includes a processor and a memory storing instructions executable by the processor, and when the instructions are executed by the processor, the intelligent computing method as described above is implemented.
In a fourth aspect, the present application provides a computer-readable storage medium, on which a program is stored, and the program is applied to a terminal, and when the program is executed by a processor, the intelligent computing method is implemented as described above.
The embodiment of the application provides a computing method, a terminal and a storage medium, wherein the terminal can detect a first gesture action in a current interface after recognizing an initial number in the current interface; wherein the initial number is displayed in the current interface; determining a first target number and a first calculation mode according to the first gesture action; wherein the first target number is displayed in the current interface; calculating the initial number and the first target number according to a first calculation mode to obtain a first operation result; and calling a result display interface in a first area of the current interface, and displaying the first operation result in the result display interface. That is to say, in the embodiment of the application, after the terminal determines the initial number, the terminal may determine, through the gesture motion detected in the current interface, a target number that needs to be calculated and a calculation mode between numbers, further perform calculation processing on the initial number and the target number according to the calculation mode, and display the calculation result on the current interface. Therefore, according to the intelligent computing method, when the terminal performs computing processing on the numbers in the current interface, the terminal does not need to be switched to the APP interface of the calculator for computing, the computing mode and the computing numbers can be directly determined based on gesture recognition processing in the current interface, and the digital computing processing process is completed in the current interface, so that the computing time is greatly shortened, the computing efficiency and the computing accuracy are improved, the system resource loss is effectively reduced, and the terminal is higher in intelligence.
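Purely as an illustrative aid, and not as part of the disclosed embodiments, the following self-contained Kotlin sketch shows one possible shape of this flow; every type and function name in it (CalcMode, ScreenNumber, detectGesture, showResultInterface and so on) is an assumption made for this example rather than terminology taken from the application.

```kotlin
// Illustrative sketch only: one possible structure of the in-page gesture calculation flow.
enum class CalcMode { ADD, SUBTRACT, MULTIPLY, DIVIDE }

data class ScreenNumber(val value: Double, val x: Float, val y: Float)

data class GestureResult(val target: ScreenNumber, val mode: CalcMode)

// Assumed hooks into the terminal's gesture-recognition and display layers (stubbed here).
fun detectGesture(): GestureResult? = null
fun showResultInterface(result: Double) = println("result = $result")

fun onInitialNumberSelected(initial: ScreenNumber) {
    val gesture = detectGesture() ?: return            // no preset calculation gesture detected
    val result = when (gesture.mode) {                 // apply the first calculation mode
        CalcMode.ADD      -> initial.value + gesture.target.value
        CalcMode.SUBTRACT -> initial.value - gesture.target.value
        CalcMode.MULTIPLY -> initial.value * gesture.target.value
        CalcMode.DIVIDE   -> initial.value / gesture.target.value
    }
    showResultInterface(result)                        // shown in the current interface, not a calculator APP
}
```

The only point of the sketch is that detection, resolution, calculation and display all happen against the current interface, without switching applications.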
Drawings
Fig. 1 is a first schematic flow chart illustrating an implementation process of an intelligent computing method according to an embodiment of the present application;
fig. 2 is a second schematic flow chart illustrating an implementation process of the intelligent computing method according to an embodiment of the present application;
fig. 3 is a first schematic view of a setting interface of an intelligent gesture calculating function according to an embodiment of the present disclosure;
fig. 4 is a second schematic view of a setting interface of the intelligent gesture calculation function according to an embodiment of the present application;
fig. 5A is a third schematic flow chart illustrating an implementation process of the intelligent computing method according to the embodiment of the present application;
fig. 5B is a fourth schematic flow chart illustrating an implementation process of the intelligent computing method according to the embodiment of the present application;
fig. 6 is a fifth schematic flow chart illustrating an implementation process of the intelligent computing method according to the embodiment of the present application;
fig. 7 is a schematic diagram of an effect of an intelligent gesture calculation processing interface according to an embodiment of the present application;
fig. 8 is a first schematic structural diagram of a terminal according to an embodiment of the present disclosure;
fig. 9 is a second schematic structural diagram of a terminal according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application. It is to be understood that the specific embodiments described herein are illustrative of the relevant application and are not limiting of the application. It should be noted that, for the convenience of description, only the parts related to the related applications are shown in the drawings.
With the development of science and technology, smart phones have gradually reached every user group and have become fully integrated into people's daily life, work and study, for example mobile office, digital information, mobile banking, digital payment and paperless transactions, providing a smarter, faster and more convenient digital life and network experience and bringing people into the digital intelligent era.
At present, when a user reads news or browses a financial APP on a mobile phone, a large number of basic figures appear, and the user may need to calculate them at any time. However, since the calculator APP is independent of the other functions of the system, the current mainstream calculation mode is that the user first memorizes the data or records it with pen and paper, exits the current application, opens the calculator APP to perform addition, subtraction, multiplication or division, then switches back to the previous application, and repeats this process for any subsequent calculation.
However, this approach, in which the user must record data frequently and repeatedly switch to the calculator APP for data calculation, has the following disadvantages: 1) calculation takes a long time, because the user usually has to record the data before calculating it, and the repeated operations waste the user's time; 2) calculation errors occur easily, because the user often memorizes the data and then types it into the calculator, so the data is frequently recorded inaccurately, the calculated results deviate significantly, and the calculation may even have to be redone; 3) the user has to switch frequently between the data-source APP and the calculator APP, and such high-frequency entry/exit switching between functional software easily consumes system resources.
In order to solve the problems of the existing terminal data computing mechanism, the embodiment of the application provides an intelligent computing method, a terminal and a storage medium. Specifically, after the initial number is determined, the terminal can determine a target number to be calculated and a calculation mode between numbers through a gesture motion detected in the current interface, further perform calculation processing on the initial number and the target number according to the calculation mode, and display an operation result on the current interface. Therefore, according to the intelligent computing method, when the terminal performs computing processing on the numbers in the current interface, the terminal does not need to be switched to the APP interface of the calculator for computing, the computing mode and the computing numbers can be directly determined based on gesture recognition processing in the current interface, and the digital computing processing process is completed in the current interface, so that the computing time is greatly shortened, the computing efficiency and the computing accuracy are improved, the system resource loss is effectively reduced, and the terminal is higher in intelligence.
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application.
An embodiment of the present application provides a computing method, fig. 1 is a schematic view illustrating an implementation flow of an intelligent computing method provided in an embodiment of the present application, and as shown in fig. 1, in an embodiment of the present application, a method for a terminal to perform intelligent computing may include the following steps:
Step 101, after an initial number is recognized in a current interface, detecting a first gesture action in the current interface.

In an embodiment of the application, the terminal may recognize an initial number in the current interface, and after the initial number is recognized, may detect a first gesture action in the current interface.
It should be noted that, in the embodiments of the present application, the terminal may be any electronic device, including but not limited to: tablet computers, mobile phones, electronic readers, Personal Computers (PCs), notebook computers, in-vehicle devices, network televisions, wearable devices, and the like. In particular, the terminal may be an electronic device configured with a touch screen display.
It should be noted that, in the embodiment of the present application, the current interface is a terminal device interface including a plurality of numbers, for example, a news page, a data table, or an operation interface in a financial APP, which is not specifically limited in this application. Further, the initial number may be any one of the plurality of numbers displayed in the current interface.
It should be noted that, in the embodiment of the present application, when the terminal performs the calculation processing, the terminal first determines an initial number, that is, the number before the operation symbol in the calculation processing, for example, the dividend in a division or the minuend in a subtraction.
Specifically, in the embodiment of the present application, the terminal may receive a selection instruction in the current interface first; the selection instruction carries a position parameter, namely second coordinate data, and the terminal can determine the initial number according to the second coordinate data. Specifically, a user performs an initial digit selection operation (for example, double-click and long-press) in the current interface, the terminal can receive a corresponding selection instruction carrying second coordinate data through the user selection operation, and then the terminal can determine a selection area in the current interface according to the second coordinate data, and take the digits displayed in the selection area as the initial digits.
For example: after a user performs double-click selection operation at a position A in the touch screen, the terminal receives a selection instruction carrying position A coordinate data, the terminal can determine a selection area in a current interface according to the corresponding coordinate data of the position A, and then the number a displayed in the selection area is determined as an initial number.
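As a minimal, self-contained Kotlin sketch of this selection step (the names and the selection radius are assumptions for illustration, not part of the application), the terminal could map the tapped coordinates to the nearest displayed number:

```kotlin
import kotlin.math.hypot

// Assumed representation of a number laid out on the current interface.
data class DisplayedNumber(val value: Double, val centerX: Float, val centerY: Float)

// Resolve the selection area around the tapped coordinates (the second coordinate data)
// and return the displayed number closest to it, or null if nothing numeric is nearby.
fun pickInitialNumber(
    tapX: Float,
    tapY: Float,
    displayed: List<DisplayedNumber>,
    selectionRadiusPx: Float = 48f          // assumed size of the selection area
): DisplayedNumber? =
    displayed
        .filter { hypot(it.centerX - tapX, it.centerY - tapY) <= selectionRadiusPx }
        .minByOrNull { hypot(it.centerX - tapX, it.centerY - tapY) }

fun main() {
    val onScreen = listOf(DisplayedNumber(128.0, 100f, 200f), DisplayedNumber(7.5, 400f, 620f))
    // A double-click at position A, roughly (105, 205), selects 128.0 as the initial number.
    println(pickInitialNumber(105f, 205f, onScreen)?.value)   // 128.0
}
```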
It will be appreciated that the arithmetic processing of the data requires at least two numbers; therefore, after determining the initial number from the received selection instruction, the terminal needs to further determine the number following the operation symbol, for example, the divisor in a division or the subtrahend in a subtraction.
Specifically, in the embodiment of the present application, after the terminal confirms the initial number, when further determining the number following the operator, the terminal may perform the recognition detection processing of the first gesture motion in the current interface.
Optionally, in an embodiment of the present application, the first gesture action detected by the terminal may be obtained based on a touch operation of a user on the display screen, and includes: a sliding or clicking operation performed on the display screen by the user using the body (finger, arm, toe, etc.) or using an electronic accessory (stylus pen). For example, when the terminal displays a payment interface and the user slides two fingers upward on the touch screen, the terminal can detect the gesture action corresponding to the sliding operation.
It should be understood that the gesture actions detected by the terminal are not necessarily all obtained from touch operations performed by the user on the touch screen. Alternatively, the user may input the first gesture action by moving or pressing a separate electronic accessory of the electronic device (for example, a mouse).
Optionally, in an embodiment of the application, the terminal may further be equipped with an image sensor, and is configured to perform information acquisition processing on operation information of the user on the display screen, including an operation direction and an operation trajectory, so as to determine a corresponding gesture action.
Further, in the embodiment of the application, the terminal may also perform recognition processing on an object touched on the touch screen by using the image sensor, so as to avoid misoperation. For example, only when the object of the touch operation on the touch screen is a user body or an electronic accessory (a touch pen), the terminal performs gesture motion detection and recognition processing on the operation information, and when the object is another object (such as an ordinary ball pen), the gesture motion corresponding to the operation information is not recognized, so that misoperation is avoided.
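A minimal sketch of this gating idea follows; the object categories reported by the image sensor are assumptions made purely for illustration.

```kotlin
// Assumed categories that the image sensor might report for the object touching the screen.
enum class TouchObject { FINGER, STYLUS, OTHER }

// Gesture detection and recognition are only performed for the user's body or a stylus;
// any other object (for example an ordinary ballpoint pen) is ignored to avoid misoperation.
fun shouldRecognizeGesture(touchObject: TouchObject): Boolean =
    touchObject == TouchObject.FINGER || touchObject == TouchObject.STYLUS

fun main() {
    println(shouldRecognizeGesture(TouchObject.STYLUS))   // true: gesture is recognized
    println(shouldRecognizeGesture(TouchObject.OTHER))    // false: gesture is not recognized
}
```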
Further, in an embodiment of the present application, after the terminal determines the initial number and detects the first gesture in the current interface, the terminal may further determine the target number and the calculation mode according to the first gesture.
Step 102, determining a first target number and a first calculation mode according to the first gesture action.

In an embodiment of the application, after the terminal determines the initial number and detects the first gesture action in the current interface, the terminal may further determine, according to the first gesture action, a first target number to be calculated and a first calculation mode.
It can be understood that, since the current interface may include a plurality of numbers, the user may not need to calculate all of the numbers on the page; therefore, the terminal needs to determine the number following the operation symbol, that is, the target number to be calculated together with the initial number, as well as the calculation manner used between the initial number and the target number.
In this embodiment, the terminal may determine, according to the detected first gesture motion, a first target number that is a number that needs to be calculated together with the initial number, and a first calculation mode that is a calculation method used between the initial number and the first target number. Specifically, the terminal can obtain track information and direction information corresponding to the first gesture, and then determine a first target number and a first calculation mode according to the track information and the direction information.
Optionally, the first target number may be the same number as the initial number, or may be a number different from the initial number. The first target number is any one of the plurality of numbers displayed in the current interface.
Further, in the embodiment of the present application, after the terminal determines the first target number and the first calculation mode according to the first gesture, the terminal may further perform calculation processing on the initial number and the target number according to the calculation mode.
Step 103, calculating the initial number and the first target number according to a first calculation mode to obtain a first operation result.
In an embodiment of the application, after the terminal determines the first target number and the first calculation mode according to the first gesture, the terminal may further perform calculation processing on the initial number and the first target number according to the first calculation mode, so as to obtain a first operation result.
Specifically, in the embodiment of the present application, based on step 101 and step 102, the initial number is always a number before the operator, the first target number is always a number after the operator, that is, a number to be subjected to calculation processing, and the first calculation mode is used to indicate an operation manner used between the initial number and the first target number. The terminal can perform corresponding calculation processing on the initial number and the target number according to the determined calculation mode, and then a first operation result is obtained. If the initial number is a and the first target number is b, when the calculation mode is division, a is dividend and b is divisor, and the first operation result is (a/b).
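Expressed as a small, self-contained Kotlin sketch (the enum and function names are assumed for illustration only), step 103 is simply the application of the resolved calculation mode to the two operands, with the initial number always taken as the number before the operator:

```kotlin
enum class CalcMode { ADD, SUBTRACT, MULTIPLY, DIVIDE }

// Step 103 as a pure function: the initial number precedes the operator, the target number follows it.
fun compute(initial: Double, target: Double, mode: CalcMode): Double = when (mode) {
    CalcMode.ADD      -> initial + target
    CalcMode.SUBTRACT -> initial - target
    CalcMode.MULTIPLY -> initial * target
    CalcMode.DIVIDE   -> initial / target   // initial is the dividend, target is the divisor
}

fun main() {
    println(compute(10.0, 4.0, CalcMode.DIVIDE))   // a / b = 2.5
}
```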
Further, in the embodiment of the present application, after the terminal performs calculation processing on the initial number and the first target number according to the first calculation mode to obtain the first calculation result, the terminal may display the first calculation result.
Step 104, calling a result display interface in the first area of the current interface, and displaying the first operation result in the result display interface.
In an embodiment of the application, after the terminal calculates the first operation result of the initial number and the first target number, the terminal may call a result display interface in the first area of the current interface to display the first operation result on the result display interface.
Optionally, in an embodiment of the present application, the first area may be a blank sub-area located in the current interface; the first area can also be a sub-area located at the peripheral edge of the terminal display interface.
Specifically, after the terminal obtains the first operation result, if a blank sub-region exists in the current interface, the terminal may call out a result display interface in the blank sub-region to display the operation result; that is, the results display interface is embedded in the first region of the current interface.
Specifically, after the terminal obtains the operation result, if the current interface has no blank sub-area, the terminal may call out an interface in the form of a floating window at the edge position of a display interface closest to the target number in the peripheral edge positions as a result display interface to display the operation result; that is, the results display interface is overlaid in a first area of the current interface.
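One way to read these two placement rules together is sketched below in Kotlin; the geometry types and the choice of edge metric are assumptions made for illustration, not part of the application.

```kotlin
// Assumed geometry helpers for choosing where to place the result display interface.
data class Region(val left: Float, val top: Float, val right: Float, val bottom: Float)

enum class Edge { LEFT, TOP, RIGHT, BOTTOM }

sealed class ResultPlacement {
    data class Embedded(val area: Region) : ResultPlacement()   // a blank sub-area exists
    data class Floating(val edge: Edge) : ResultPlacement()     // no blank area: float at the nearest edge
}

fun choosePlacement(
    blankRegions: List<Region>,
    targetX: Float,
    targetY: Float,
    screenWidth: Float,
    screenHeight: Float
): ResultPlacement {
    // Prefer embedding the result display interface into a blank sub-area of the current interface.
    blankRegions.firstOrNull()?.let { return ResultPlacement.Embedded(it) }
    // Otherwise overlay a floating window at the peripheral edge closest to the target number.
    val distances = mapOf(
        Edge.LEFT to targetX,
        Edge.RIGHT to screenWidth - targetX,
        Edge.TOP to targetY,
        Edge.BOTTOM to screenHeight - targetY
    )
    return ResultPlacement.Floating(distances.entries.minByOrNull { it.value }!!.key)
}
```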
Further, in the embodiment of the application, after the terminal calculates the first operation result, if the terminal needs to continue performing subsequent calculation processing on the first operation result, the terminal may use the first operation result as an initial number in a next-stage calculation processing process, then continue to detect the second gesture motion in the current interface, and further determine the second target number and the second calculation mode according to the second gesture motion. And then according to a second calculation mode, calculating the second target number and the first calculation result to obtain a second calculation result, and displaying the result display page by using the second calculation result.
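The chained case can be pictured as a fold in which each operation result becomes the initial number of the next calculation; the following Kotlin sketch is illustrative only and its names are assumptions.

```kotlin
enum class CalcMode { ADD, SUBTRACT, MULTIPLY, DIVIDE }

// One resolved gesture in the chain: the target number it selected and the calculation mode it encodes.
data class GestureStep(val target: Double, val mode: CalcMode)

// Each operation result is reused as the initial number of the next calculation.
fun runChain(initial: Double, steps: List<GestureStep>): Double =
    steps.fold(initial) { acc, step ->
        when (step.mode) {
            CalcMode.ADD      -> acc + step.target
            CalcMode.SUBTRACT -> acc - step.target
            CalcMode.MULTIPLY -> acc * step.target
            CalcMode.DIVIDE   -> acc / step.target
        }
    }

fun main() {
    // (100 - 30) / 2: the first operation result, 70, becomes the initial number of the division.
    println(runChain(100.0, listOf(GestureStep(30.0, CalcMode.SUBTRACT), GestureStep(2.0, CalcMode.DIVIDE))))
}
```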
The embodiment of the application provides an intelligent computing method, wherein a terminal can detect a first gesture action in a current interface after recognizing an initial number in the current interface; wherein the initial number is displayed in the current interface; determining a first target number and a first calculation mode according to the first gesture action; wherein the first target number is displayed in the current interface; calculating the initial number and the first target number according to a first calculation mode to obtain a first operation result; and calling a result display interface in a first area of the current interface, and displaying the first operation result in the result display interface. That is to say, in the embodiment of the application, after the terminal determines the initial number, the terminal may determine, through the gesture motion detected in the current interface, a target number that needs to be calculated and a calculation mode between numbers, further perform calculation processing on the initial number and the target number according to the calculation mode, and display the calculation result on the current interface. Therefore, according to the intelligent computing method, when the terminal performs computing processing on the numbers in the current interface, the terminal does not need to be switched to the APP interface of the calculator for computing, the computing mode and the computing numbers can be directly determined based on gesture recognition processing in the current interface, and the digital computing processing process is completed in the current interface, so that the computing time is greatly shortened, the computing efficiency and the computing accuracy are improved, the system resource loss is effectively reduced, and the terminal is higher in intelligence.
Based on the foregoing embodiment, in another embodiment of the present application, fig. 2 is a schematic flow chart illustrating an implementation process of an intelligent computing method provided in the embodiment of the present application, and as shown in fig. 2, in the embodiment of the present application, a method for a terminal to determine a first target number and a first computing mode according to the first gesture motion may include the following steps:
Step 201, determining whether the first gesture action is a preset calculation gesture according to direction information and trajectory information corresponding to the first gesture action.

In an embodiment of the application, after the terminal detects the first gesture action in the current interface, the terminal may first determine whether the first gesture action is a preset calculation gesture according to the direction information and the trajectory information corresponding to the gesture action.
It should be noted that, in the embodiment of the present application, when the terminal detects a corresponding gesture motion in the current interface, the terminal may first determine whether the gesture motion is a gesture for performing calculation processing on a number in the current interface. Specifically, the terminal may preset some calculation gestures, that is, different gesture actions corresponding to different operation modes, and after detecting the first gesture action, the terminal may compare the detected and recognized first gesture action with a locally preset calculation gesture, and then determine whether a calculation gesture matched with the detected gesture action is stored locally.
Optionally, in an embodiment of the application, before comparing the first gesture action with the preset calculation gesture, the terminal may determine direction information and trajectory information corresponding to the first gesture action, and then determine whether the first gesture action belongs to the preset calculation gesture by using the trajectory information and the direction information. Specifically, the terminal may pre-store a corresponding relationship between a plurality of calculation gestures and direction information and trajectory information, when the first gesture action is compared with the preset calculation gesture, compare the direction information and trajectory information corresponding to the first gesture action with the direction information and trajectory information corresponding to all the preset calculation gestures, and if a preset calculation gesture exists in which the direction information and trajectory information are consistent with the first gesture action, indicate that the first gesture action is the preset calculation gesture.
It should be noted that, in the embodiment of the present application, the terminal gesture setting mode may include a standard gesture setting mode and a custom gesture setting mode, where gesture information included in the standard gesture setting mode is set before the terminal leaves a factory, and gesture information included in the custom gesture mode is set by a user through a terminal preset management interface in a custom manner.
Further, in the embodiment of the present application, the correspondence between the calculation gestures and the direction information and trajectory information pre-stored in the terminal may be determined by the correspondence between the standard calculation gestures included in the standard gesture mode and their direction information and trajectory information, or by the correspondence between the custom calculation gestures included in the custom gesture setting mode and their direction information and trajectory information. For example, the trajectory information corresponding to an addition gesture may be the "crossed horizontal and vertical lines" of the standard addition gesture, or the "two column vertical lines" of a custom addition gesture.
Further, in the embodiment of the application, a user can also perform setting selection operation on a terminal preset management interface, and the terminal can determine whether a standard gesture mode or a custom gesture mode is currently adopted according to a received setting selection instruction. Wherein there is no repetitive gesture action between the standard computing gesture and the custom computing gesture.
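To make the matching step concrete, the sketch below encodes each preset calculation gesture as a trajectory plus a direction label and checks a detected gesture against the set of the currently selected gesture mode; the concrete signature strings are assumptions made for illustration only.

```kotlin
// Assumed, simplified encoding of a calculation gesture as a trajectory plus a direction label.
data class GestureSignature(val trajectory: String, val direction: String)

enum class GestureMode { STANDARD, CUSTOM }

// Example preset gesture sets; the concrete signatures are illustrative only.
val standardGestures = setOf(
    GestureSignature("crossed horizontal and vertical lines", "top-to-bottom, left-to-right"),
    GestureSignature("single horizontal line", "left-to-right")
)
val customGestures = setOf(
    GestureSignature("two column vertical lines", "top-to-bottom"),
    GestureSignature("single row horizontal line", "left-to-right")
)

// A detected gesture counts as a preset calculation gesture only when both its trajectory
// and its direction match an entry of the currently selected gesture mode.
fun isPresetCalculationGesture(detected: GestureSignature, mode: GestureMode): Boolean {
    val presets = if (mode == GestureMode.STANDARD) standardGestures else customGestures
    return detected in presets
}

// The standard and custom sets must not share a gesture action (no repetitive gesture between modes).
fun gestureSetsAreDisjoint(): Boolean = standardGestures.intersect(customGestures).isEmpty()
```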
Further, in the embodiment of the application, after the terminal performs the determination processing according to whether the first gesture motion is the preset calculation gesture, the terminal may further determine that the first gesture motion belongs to the preset calculation gesture, or that the first gesture motion does not belong to the preset calculation gesture.
Step 202, if the first gesture action is a preset calculation gesture, determining the first calculation mode according to the first gesture action and the corresponding relationship between the preset calculation gestures and the calculation modes.

In the embodiment of the application, after the terminal determines whether the first gesture action is a preset calculation gesture according to the trajectory information and the direction information, if the terminal determines that the first gesture action is a preset calculation gesture, the terminal may further determine the first calculation mode according to the first gesture action and the corresponding relationship between the preset calculation gestures and the calculation modes.
It should be noted that, in the embodiment of the present application, the terminal may pre-establish a corresponding relationship between the calculation gesture and the calculation mode. Optionally, the corresponding relationship between the calculation gesture and the calculation mode may be a corresponding relationship between a standard calculation gesture included in the standard gesture mode and the calculation mode, or a corresponding relationship between a custom calculation gesture included in the custom gesture mode and the calculation mode.
Optionally, in the same gesture mode, different computation gestures correspond to different computation modes. For example, in the standard gesture mode, the calculation mode corresponding to the calculation gesture with the trajectory information of "crossed horizontal lines and vertical lines" is the addition calculation, and the calculation mode corresponding to the calculation gesture with the trajectory information of "single horizontal line" is the subtraction calculation.
Optionally, in different gesture modes, different computation gestures may correspond to the same computation mode. For example, in the standard gesture mode, the computation mode corresponding to the computation gesture with the trajectory information of "horizontal and vertical lines that intersect" is the addition computation, and in the custom gesture mode, the computation mode corresponding to the computation gesture with the trajectory information of "vertical lines that are both columns" is the addition computation.
Specifically, in the embodiment of the application, if a gesture motion consistent with the first gesture motion trajectory information and the direction information is matched in a plurality of computing gestures stored locally by the terminal, the terminal may determine, according to a preset correspondence between the computing gesture and the computing mode, a computing mode corresponding to a preset computing gesture that is the same as the first gesture motion as the computing mode corresponding to the first gesture motion.
For example, if the trajectory information of the first gesture action is "crossed horizontal and vertical lines" and the direction information is "from top to bottom and from left to right", and this trajectory and direction information corresponds to a calculation gesture in the standard gesture mode, then, according to the correspondence between calculation gestures and calculation modes, the calculation mode corresponding to that calculation gesture is determined to be addition, and the calculation mode corresponding to the first gesture action is therefore also addition.
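The correspondence itself can be held as a simple lookup table, as in the following illustrative Kotlin sketch; the signature strings are assumptions, and only the shape of the mapping matters.

```kotlin
enum class CalcMode { ADD, SUBTRACT, MULTIPLY, DIVIDE }

data class GestureSignature(val trajectory: String, val direction: String)

// Example correspondence between preset calculation gestures and calculation modes (step 202).
val gestureToMode = mapOf(
    GestureSignature("crossed horizontal and vertical lines", "from top to bottom and from left to right") to CalcMode.ADD,
    GestureSignature("single horizontal line", "from left to right") to CalcMode.SUBTRACT
)

// Returns null when the detected gesture has no entry, i.e. it is not a preset calculation gesture.
fun resolveCalcMode(detected: GestureSignature): CalcMode? = gestureToMode[detected]

fun main() {
    val first = GestureSignature("crossed horizontal and vertical lines", "from top to bottom and from left to right")
    println(resolveCalcMode(first))   // ADD: the first gesture action maps to an addition calculation
}
```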
It should be noted that, if the terminal determines that the first gesture motion is not the preset calculation gesture, it indicates that a calculation gesture matching the first gesture motion is not stored locally, and at this time, the touch operation of the user on the touch screen may be an incorrect operation.
Further, in this embodiment of the application, after determining that the first gesture motion belongs to the preset calculation gesture and determining the first calculation mode corresponding to the first gesture motion, the terminal may further determine a target number corresponding to the first gesture motion.
Step 203, determining coordinate data corresponding to the first gesture action according to the trajectory information.
In an embodiment of the application, after determining that the first gesture motion belongs to the preset calculation gesture and determining the first calculation mode corresponding to the first gesture motion, the terminal needs to further determine the target number. Specifically, the terminal may determine the coordinate data corresponding to the first gesture according to the trajectory information corresponding to the first gesture.
In an embodiment of the present application, the coordinate data corresponding to the first gesture includes coordinate data corresponding to an area with a center of the first gesture as a center of a circle and a radius of a preset length threshold. Specifically, after the terminal collects track information of a first gesture through the image sensor, the center of the first gesture can be determined according to the track information, and then screen coordinate data, which takes the center of the gesture as the center of a circle and has a radius of a preset length threshold, is acquired as coordinate data corresponding to the first gesture.
For example, assuming that the preset length threshold is 0.3 cm, when the trajectory information of the first gesture action is "crossed horizontal and vertical lines", the crossing point of the horizontal and vertical lines is determined as the center of the gesture action, and the terminal acquires the screen coordinate data within a radius of 0.3 cm around the crossing point as the coordinate data corresponding to the first gesture action; similarly, when the trajectory information of the first gesture action is a "single horizontal line", the midpoint of the horizontal line is determined as the center of the gesture action, and the terminal acquires the screen coordinate data within a radius of 0.3 cm around that midpoint as the coordinate data corresponding to the first gesture action.
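A possible reading of this coordinate-data step is sketched below; using the centroid of the sampled trajectory as a stand-in for the gesture center, and converting centimeters to pixels through an assumed screen density, are both simplifications made for this example.

```kotlin
import kotlin.math.hypot

data class Point(val x: Float, val y: Float)

// Approximate the gesture center from the sampled trajectory points (the application instead
// uses, for example, the crossing point of the lines or the midpoint of a single line).
fun gestureCenter(trajectory: List<Point>): Point {
    val cx = trajectory.map { it.x }.average().toFloat()
    val cy = trajectory.map { it.y }.average().toFloat()
    return Point(cx, cy)
}

// Convert the preset length threshold (for example 0.3 cm) into pixels; the density value is assumed.
fun radiusInPixels(radiusCm: Float = 0.3f, dotsPerInch: Float = 400f): Float =
    radiusCm / 2.54f * dotsPerInch          // 1 inch = 2.54 cm

// A screen coordinate belongs to the gesture's coordinate data when it lies inside the circle
// centered at the gesture center with the converted radius.
fun isInsideGestureArea(p: Point, center: Point, radiusPx: Float): Boolean =
    hypot(p.x - center.x, p.y - center.y) <= radiusPx
```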
Further, in an embodiment of the application, after the terminal determines the coordinate data corresponding to the first gesture according to the trajectory information, the terminal may further determine the target number according to the coordinate data.
Step 204, determining a target area in the current interface based on the coordinate data, and determining the number displayed in the target area as a first target number.
In an embodiment of the application, after the terminal determines the coordinate data corresponding to the first gesture according to the trajectory information, the terminal may further determine a target area in the current interface based on the coordinate data, and further determine the number displayed in the target area as the first target number.
It should be noted that, in the embodiment of the present application, the target area is a sub-area that is not coincident with the first area in the current interface.
Specifically, in the embodiment of the application, when the terminal determines the target number according to the first gesture, a sub-region, that is, a target region, may be determined in the current interface according to coordinate data corresponding to the first gesture, that is, the first coordinate data, and then the number displayed in the target region may be determined as the target number. That is, the target area is a sub-area of the current interface that contains the target number.
For example, assuming that the current interface is financial news, the terminal determines, according to the coordinate data corresponding to the first gesture, that the target area is a circular sub-area with screen coordinates (5,10) as a center and a radius of 0.3cm, that is, the left side of the second line of the news, and further, the terminal determines a number b displayed in the circular sub-area, and further, the number b is used as the first target number.
It can be understood that, in the embodiment of the application, if a user wants to perform operation processing on an initial number and a target number in a current interface, the user may perform touch operation in a selection area where the initial number is located in a display screen to input a selection instruction, so that the terminal determines the initial number according to the received selection instruction; then, the user can perform gesture operation in a target area where the target digit is located in the display screen to input gesture action information, so that the terminal can determine the target digit which is subjected to calculation processing with the initial digit and an operation mode between the initial digit and the target digit according to the detected gesture action information, and the terminal can finish the operation processing of the initial digit and the target digit.
It should be noted that, in the embodiment of the present application, if the terminal determines that the first gesture action is a preset calculation gesture, the terminal may determine the target number after determining the calculation mode; the terminal may also determine the target number before determining the calculation mode; or the terminal may determine the calculation mode and the target number at the same time, which is not specifically limited in this application.
Therefore, in the embodiment of the application, the terminal can directly determine the target number to be calculated and the corresponding calculation mode thereof according to the calculation gesture received at the current interface, and then execute the data calculation processing process, so that when the terminal calculates the number in the current interface, the terminal does not need to switch to the calculator APP interface for calculation any more, the calculation mode and the calculation number can be directly determined based on the gesture recognition processing in the current interface, and the digital calculation processing process is completed in the current interface, so that the calculation time is greatly shortened, the calculation efficiency and the calculation accuracy are improved, the system resource loss is effectively reduced, and the terminal is higher in intelligence.
Based on the foregoing embodiment, in another embodiment of the present application, before the first gesture action is detected in the current interface after the initial number is recognized, that is, before step 101, the method for the terminal to perform the intelligent computation may further include the following steps:

Step 105, receiving a call-out instruction in the current interface.

Step 106, responding to the call-out instruction, and calling out a setting interface in a second area of the current interface.

Step 107, receiving a starting instruction in the setting interface, responding to the starting instruction, and starting the intelligent gesture calculation function.
It should be noted that, in the embodiment of the present application, the terminal is preset with an intelligent gesture calculation function trigger mechanism, and before determining an initial number, that is, before performing digital calculation processing by using a gesture action, the terminal needs to start the intelligent gesture calculation function first. That is, if the terminal starts the intelligent gesture calculation function, the terminal may perform a data calculation processing process based on gesture recognition processing in the current interface; if the terminal does not start the intelligent gesture calculation function, the terminal cannot perform the data calculation processing based on gesture recognition processing in the current interface, and only performs the data calculation processing through other calculation processing processes, for example, switching to a calculator APP.
Optionally, in an embodiment of the present application, the terminal may start an intelligent gesture calculation function in the current interface. Specifically, when the terminal needs to start the intelligent gesture calculation function, a user can perform touch operation of interface calling on the display screen when the terminal is located in the current interface, the terminal can receive a corresponding calling instruction in the current interface based on the touch operation, then the terminal responds to the calling instruction, and the calling is performed in the second area of the current interface to set the interface.
It should be noted that, in the embodiment of the present application, the second area is a sub-area of the current interface that does not overlap the first area. Optionally, the second area may be a sub-area located at the peripheral edge of the terminal display interface and different from the first area. After the terminal receives the call-out instruction, the terminal may call out, at one of the peripheral edge positions of the display interface that is different from the first area, a display interface in the form of a floating window as the setting interface of the intelligent gesture calculation function. Specifically, the setting interface may be an interface provided with buttons for turning the intelligent gesture calculation function on and off.
Fig. 3 is a schematic view of a setting interface of the intelligent gesture calculation function according to an embodiment of the present disclosure. As shown in fig. 3, assuming that the current interface is an economic news page, the terminal receives a call-out instruction corresponding to the setting interface on the current interface; in response to the call-out instruction, the terminal calls out, at the right edge of the current interface of the display screen, the setting interface, which is a floating window interface provided with ON and OFF buttons.
Furthermore, a user can perform opening operation in the setting interface, the terminal receives a corresponding opening instruction in the current interface through the opening operation, and then the opening instruction is responded, and the intelligent gesture calculation function is started. For example, the user may click ON an ON button in the setup interface shown in fig. 3, thereby initiating the smart gesture calculation function. That is, the setup interface is mainly used to initiate the smart gesture calculation function in the current interface.
It can be understood that, in the embodiment of the present application, the terminal may also receive a closing instruction in the setting interface, for example, the user clicks an OFF button in the setting interface shown in fig. 3, so as to close the smart gesture calculating function. Further, if the gesture calculation mode is turned off, the terminal may not perform data calculation processing in the current page using gesture recognition processing.
Further, in the embodiment of the application, when the terminal needs to start the intelligent gesture calculation function, the terminal may also start the intelligent gesture calculation function through the preset management interface. Specifically, before the current interface is opened, when the terminal performs gesture calculation processing on numbers in other interfaces, the intelligent gesture calculation function is started through the preset management interface; or when the intelligent gesture calculation function in the current interface is in a closed state, switching to a preset management interface from the current interface through a received page switching instruction, so that the intelligent gesture calculation function is started in the set management interface.
Specifically, in the embodiment of the application, the setting management interface of the terminal is a multifunctional parameter configuration interface, and a user can select and configure multiple functions on the setting management interface. For example, the activation and deactivation of smart gesture calculation functions; selecting a standard gesture mode and a custom gesture mode, and setting the corresponding relation between the custom calculation gesture and the calculation mode.
For example, fig. 4 is a schematic view of a second setting interface of the intelligent gesture calculating function provided in the embodiment of the present application, as shown in fig. 4, in a preset management interface, a first sub-region is provided with a configuration interface of the intelligent gesture calculating function, and a second sub-region is provided with a gesture mode, which includes configuration interfaces of a standard gesture mode and a custom gesture mode, wherein an on button and an off button of the intelligent gesture calculating function are provided in the first sub-region, a selection button of the standard gesture mode and the custom gesture mode is provided in the second sub-region, and a setting option of a corresponding relationship between the custom gesture calculating function and the calculating mode is provided. That is to say, in the preset management interface, the intelligent gesture calculation function can be turned on and off in the setting management interface, the selection processing of the standard gesture mode and the custom gesture mode can also be performed, and meanwhile, the corresponding relation between the custom calculation gesture and the calculation mode can also be set. As shown in fig. 4, the terminal starts the intelligent gesture calculation function, and the calculation mode is selected as the standard calculation mode, where in the standard calculation gesture, the trajectory information of the addition calculation gesture is "crossed horizontal line and vertical line", the trajectory information of the subtraction calculation gesture is "single horizontal line", and other calculation gestures are not repeated; in the user-defined calculation gesture, the trajectory information of the addition calculation gesture is a double-row vertical line, the trajectory information of the subtraction calculation gesture is a single-row horizontal line, and other calculation gestures are not repeated.
Further, if the terminal is switched to the preset management interface from the current page and the intelligent gesture calculation function is started, the terminal can be switched back to the current interface, and the gesture recognition processing is further utilized to perform data calculation processing on the current page.
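The state behind these two interfaces can be summarized by a small settings object, as in the following Kotlin sketch; the field names and the callback shape are assumptions made for illustration only.

```kotlin
enum class GestureMode { STANDARD, CUSTOM }

// Assumed in-memory state behind the setting interface and the preset management interface:
// the ON/OFF switch of the intelligent gesture calculation function and the selected gesture mode.
data class SmartCalcSettings(
    var enabled: Boolean = false,
    var gestureMode: GestureMode = GestureMode.STANDARD
)

// Gesture-based calculation only runs while the function is switched ON; otherwise the
// gesture is ignored and no in-page calculation is performed.
fun onCalculationGesture(settings: SmartCalcSettings, runCalculation: () -> Unit) {
    if (settings.enabled) runCalculation()
}

fun main() {
    val settings = SmartCalcSettings()
    onCalculationGesture(settings) { println("calculate") }   // ignored: the function is OFF
    settings.enabled = true                                    // the user taps the ON button
    settings.gestureMode = GestureMode.CUSTOM                  // the user selects the custom gesture mode
    onCalculationGesture(settings) { println("calculate") }   // prints "calculate"
}
```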
Optionally, in an embodiment of the present application, fig. 5A is a schematic view of an implementation flow of an intelligent computing method provided in the embodiment of the present application, and as shown in fig. 5A, after the terminal starts an intelligent gesture computing function, if computing processing is performed on different numbers, a computing processing procedure is as follows: after step 107, the terminal receives the selection command, determines an initial number according to the selection command (step 108), further detects the first gesture to determine a calculation mode and a target number, and performs calculation processing of the initial number and the target number according to the calculation mode, that is, continues to execute steps 101, 102, 103, and 104.
Optionally, in an embodiment of the present application, fig. 5B is a schematic diagram of an implementation flow of the intelligent computing method provided in the embodiment of the present application. As shown in fig. 5B, if the same number is subjected to calculation processing, that is, the number is subjected to a self-operation such as self-addition or self-multiplication, the calculation processing procedure is as follows: after step 107, the terminal directly detects the gesture action to obtain the first gesture action, and determines the calculation mode and the target number according to the gesture action; because the number is subjected to self-operation processing, the target number is the initial number, and the terminal then calculates the identical initial number and target number according to the calculation mode.
It can be understood that the terminal may also receive a close instruction in the preset management interface, for example, a user clicks a close button, so as to close the intelligent gesture calculation function. Further, if the intelligent gesture calculation function is turned off, the terminal cannot perform data calculation processing through corresponding gesture actions.
The embodiment of the application provides an intelligent calculation method, after an initial number is determined, a terminal can determine a target number to be calculated and a calculation mode among numbers through gesture actions detected in a current interface, further calculate and process the initial number and the target number according to the calculation mode, and display an operation result on the current interface. Therefore, according to the intelligent computing method, when the terminal performs computing processing on the numbers in the current interface, the terminal does not need to be switched to the APP interface of the calculator for computing, the computing mode and the computing numbers can be directly determined based on gesture recognition processing in the current interface, and the digital computing processing process is completed in the current interface, so that the computing time is greatly shortened, the computing efficiency and the computing accuracy are improved, the system resource loss is effectively reduced, and the terminal is higher in intelligence.
Based on the foregoing embodiment, in another embodiment of the present application, fig. 6 is a schematic view of an implementation flow of an intelligent computing method provided in the embodiment of the present application, as shown in fig. 6, in the embodiment of the present application, a method for a terminal to perform intelligent computing may include the following steps:
Step 301: receiving a call-out instruction in the current interface, and responding to the call-out instruction to call out a setting interface in a second area of the current interface.

Specifically, when the intelligent gesture calculation function needs to be started, the user can perform, on the display screen, a touch operation for calling out the interface while the terminal displays the current interface. Based on the touch operation, the terminal receives the corresponding call-out instruction in the current interface, and then responds to the call-out instruction by calling out the setting interface in the second area of the current interface.
Step 302: receiving an opening instruction in the setting interface, and responding to the opening instruction to start the intelligent gesture calculation function.

Specifically, the user can perform an opening operation in the setting interface. Through the opening operation, the terminal receives the corresponding opening instruction in the current interface and then responds to the opening instruction to start the intelligent gesture calculation function. For example, the user may click the ON button in the setting interface shown in fig. 3, thereby starting the intelligent gesture calculation function. That is, the setting interface is mainly used to start the intelligent gesture calculation function in the current interface.
Step 303: receiving a selection instruction in the current interface, and determining the initial number according to the second coordinate data carried by the selection instruction.

Specifically, the user performs an initial number selection operation (for example, a double-click or a long press) in the current interface. Through the selection operation, the terminal receives the corresponding selection instruction carrying second coordinate data, determines a selection area in the current interface according to the second coordinate data, and takes the number displayed in the selection area as the initial number.
Step 304: detecting a gesture action in the current interface, and determining direction information and trajectory information corresponding to the gesture action.
Optionally, the terminal may be equipped with an image sensor configured to collect the user's operation information on the display screen, including the operation direction and the operation trajectory, so as to determine the corresponding gesture action and store the direction information and trajectory information corresponding to that gesture action.
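As a rough illustration of how direction information might be derived from a sampled trajectory, the Kotlin sketch below reduces a stroke to its dominant screen direction (screen y grows downward). Point and dominantDirection are illustrative names only, not part of the application.

```kotlin
import kotlin.math.abs

// Illustrative names only: Point and dominantDirection are not part of the application.
data class Point(val x: Float, val y: Float)

// Reduce a sampled stroke to a coarse direction; on a touch screen, y grows downward.
fun dominantDirection(points: List<Point>): String {
    if (points.size < 2) return "none"
    val dx = points.last().x - points.first().x
    val dy = points.last().y - points.first().y
    return if (abs(dx) >= abs(dy)) {
        if (dx >= 0) "from left to right" else "from right to left"
    } else {
        if (dy >= 0) "from top to bottom" else "from bottom to top"
    }
}

fun main() {
    val stroke = listOf(Point(10f, 100f), Point(60f, 102f), Point(120f, 101f))
    println(dominantDirection(stroke))  // from left to right
}
```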
Step 305: judging whether the gesture action is a preset calculation gesture according to the direction information and the trajectory information.

Specifically, the terminal may pre-store the correspondence between a plurality of calculation gestures and their direction information and trajectory information. When comparing a gesture action with the preset calculation gestures, the terminal compares the direction information and trajectory information corresponding to the gesture action with the direction information and trajectory information corresponding to all the preset calculation gestures. If there is a preset calculation gesture whose direction information and trajectory information are consistent with those of the gesture action, the gesture action is a preset calculation gesture and step 306 is executed; if there is no consistent preset calculation gesture, the terminal continues to detect the next gesture action, that is, it jumps back to step 304.
Step 306: determining the calculation mode according to the correspondence among the gesture action, the preset calculation gesture and the calculation mode.
Further, if the terminal matches, among the plurality of locally stored calculation gestures, a preset calculation gesture whose trajectory information and direction information are consistent with those of the first gesture action, the terminal may determine, according to the preset correspondence between calculation gestures and calculation modes, the calculation mode corresponding to that preset calculation gesture as the calculation mode corresponding to the first gesture action.
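A minimal Kotlin sketch of this matching step, assuming that trajectory and direction information are reduced to textual descriptions: a detected gesture action is compared against the stored preset calculation gestures, and a match yields the associated calculation mode, otherwise null (so the terminal keeps detecting, as in step 305). All identifiers are assumptions of the sketch.

```kotlin
// Illustrative sketch; all identifiers are assumptions, not names from the application.
enum class CalculationMode { ADDITION, SUBTRACTION, MULTIPLICATION, DIVISION }

// A preset calculation gesture is identified by textual trajectory and direction descriptions.
data class PresetGesture(val trajectory: String, val direction: String, val mode: CalculationMode)

// Return the calculation mode of the first preset gesture consistent with the detected
// gesture action, or null when no preset matches (the terminal then keeps detecting).
fun matchCalculationMode(
    trajectory: String,
    direction: String,
    presets: List<PresetGesture>
): CalculationMode? =
    presets.firstOrNull { it.trajectory == trajectory && it.direction == direction }?.mode

fun main() {
    val presets = listOf(
        PresetGesture("crossed horizontal line and vertical line",
                      "from top to bottom and from left to right", CalculationMode.ADDITION),
        PresetGesture("single horizontal line", "from left to right", CalculationMode.SUBTRACTION)
    )
    println(matchCalculationMode("single horizontal line", "from left to right", presets))
}
```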
Step 307: determining the coordinate data corresponding to the gesture action according to the trajectory information.
Specifically, after the terminal collects the trajectory information of the gesture action through the image sensor, it can determine the center of the gesture action from the trajectory information, and then take the screen coordinate data within a circle centered on that point, with a radius equal to a preset length threshold, as the coordinate data corresponding to the gesture action.
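A possible reading of this step, sketched in Kotlin: the coordinate data of the gesture is a circle whose center is the centroid of the sampled trajectory points and whose radius is the preset length threshold. Names are illustrative assumptions only.

```kotlin
// Illustrative names only; the "coordinate data" of a gesture is approximated here as a
// circle around the centroid of the sampled trajectory, with a preset radius threshold.
data class Point(val x: Float, val y: Float)
data class Circle(val centerX: Float, val centerY: Float, val radius: Float)

fun gestureCoordinateData(trajectory: List<Point>, radiusThreshold: Float): Circle {
    val cx = trajectory.map { it.x }.average().toFloat()   // center of the gesture action
    val cy = trajectory.map { it.y }.average().toFloat()
    return Circle(cx, cy, radiusThreshold)                  // preset length threshold as radius
}

fun main() {
    val stroke = listOf(Point(100f, 200f), Point(140f, 200f), Point(120f, 180f), Point(120f, 220f))
    println(gestureCoordinateData(stroke, radiusThreshold = 60f))
}
```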
Step 308: determining a target area in the current interface based on the coordinate data, and determining the number displayed in the target area as the target number.
Specifically, when determining the target number according to the gesture action, the terminal may determine a sub-region, namely the target area, in the current interface according to the coordinate data corresponding to the gesture action, and then determine the number displayed in the target area as the target number. That is, the target area is the sub-region of the current interface that contains the target number.
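For illustration, the sketch below assumes the current interface can expose the numbers it displays together with their bounding boxes; the target number is then the first displayed number whose bounds intersect the circular coordinate data of the gesture. Rect, NumberOnScreen and findTargetNumber are assumed names, not an interface defined by the application.

```kotlin
// Assumed interface: the current page exposes each displayed number with its bounding box.
// Rect, NumberOnScreen and findTargetNumber are illustrative names only.
data class Rect(val left: Float, val top: Float, val right: Float, val bottom: Float) {
    // Approximate test: does the circular coordinate data of the gesture touch this box?
    fun intersectsCircle(cx: Float, cy: Float, r: Float): Boolean =
        cx + r >= left && cx - r <= right && cy + r >= top && cy - r <= bottom
}

data class NumberOnScreen(val value: Double, val bounds: Rect)

fun findTargetNumber(cx: Float, cy: Float, r: Float, displayed: List<NumberOnScreen>): Double? =
    displayed.firstOrNull { it.bounds.intersectsCircle(cx, cy, r) }?.value

fun main() {
    val displayed = listOf(
        NumberOnScreen(23207.0, Rect(40f, 100f, 160f, 140f)),
        NumberOnScreen(247743.0, Rect(40f, 300f, 180f, 340f))
    )
    println(findTargetNumber(cx = 110f, cy = 320f, r = 60f, displayed = displayed)) // 247743.0
}
```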
Step 309: performing calculation processing on the initial number and the target number according to the calculation mode to obtain an operation result.
As can be seen from steps 101 and 102, the initial number is always the number before the operator and the first target number is always the number after the operator, i.e. the number to be calculated, while the first calculation mode indicates the operation to be applied between the initial number and the first target number. The terminal performs the corresponding calculation processing on the initial number and the target number according to the determined calculation mode to obtain the first operation result. For example, if the initial number is a and the first target number is b, then when the calculation mode is division, a is the dividend, b is the divisor, and the first operation result is a/b.
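A minimal sketch of step 309 as a dispatch over the determined calculation mode; CalculationMode and calculate are assumed names of this sketch.

```kotlin
// Minimal sketch of step 309; CalculationMode and calculate are assumed names.
enum class CalculationMode { ADDITION, SUBTRACTION, MULTIPLICATION, DIVISION }

fun calculate(a: Double, b: Double, mode: CalculationMode): Double = when (mode) {
    CalculationMode.ADDITION -> a + b
    CalculationMode.SUBTRACTION -> a - b
    CalculationMode.MULTIPLICATION -> a * b
    CalculationMode.DIVISION -> a / b   // a is the dividend and b the divisor
}

fun main() {
    println(calculate(23207.0, 247743.0, CalculationMode.ADDITION)) // 270950.0
}
```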
Step 310: calling a result display interface in the first area of the current interface, and displaying the operation result in the result display interface.
Specifically, after the terminal obtains the first operation result, if a blank sub-region exists in the current interface, the terminal may call out the result display interface in that blank sub-region to display the operation result; in this case the result display interface is embedded in the first area of the current interface.
Specifically, after the terminal obtains the operation result, if there is no blank sub-region in the current interface, the terminal may call out, as the result display interface, a floating-window interface at the peripheral edge position of the display interface that is closest to the target number, and display the operation result there; in this case the result display interface is overlaid on the first area of the current interface.
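The placement decision of step 310 could be sketched as follows: prefer an embedded result display interface in a blank sub-region, otherwise fall back to a floating window at the peripheral edge closest to the target number. Placement, Rect and chooseResultPlacement are illustrative names of this sketch, not part of the application.

```kotlin
// Illustrative sketch of the placement decision; all identifiers are assumptions.
data class Rect(val left: Float, val top: Float, val right: Float, val bottom: Float)

sealed class Placement {
    data class Embedded(val region: Rect) : Placement()             // blank sub-region exists
    data class FloatingWindowAtEdge(val edge: String) : Placement() // overlaid at a screen edge
}

fun chooseResultPlacement(blankRegion: Rect?, target: Rect,
                          screenWidth: Float, screenHeight: Float): Placement {
    if (blankRegion != null) return Placement.Embedded(blankRegion)
    val cx = (target.left + target.right) / 2
    val cy = (target.top + target.bottom) / 2
    // distance from the target number to each peripheral edge of the display interface
    val edges = mapOf("left" to cx, "right" to screenWidth - cx,
                      "top" to cy, "bottom" to screenHeight - cy)
    return Placement.FloatingWindowAtEdge(edges.entries.minByOrNull { it.value }!!.key)
}

fun main() {
    val target = Rect(900f, 500f, 1000f, 540f)
    println(chooseResultPlacement(null, target, screenWidth = 1080f, screenHeight = 2340f))
}
```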
Based on the calculation method provided in steps 301 to 310 above, after the intelligent gesture calculation function is started, the terminal no longer needs to switch frequently to the calculator APP for calculation; the calculation processing of numbers in the current page can be carried out directly through gesture recognition processing in the current page. This not only greatly shortens the calculation duration, but also improves the calculation efficiency and accuracy, effectively reduces the consumption of system resources, and makes the terminal more intelligent.
For example, fig. 7 is a schematic diagram of the effect of an intelligent gesture calculation processing interface provided in the embodiment of the present application. As shown in fig. 7, assume that the current interface is an economic news page. The terminal receives, on the current interface, a call-out instruction corresponding to the setting interface, responds to the call-out instruction, and calls out, at the right edge of the current interface on the display screen, a setting interface in the form of a floating window provided with ON and OFF buttons. The terminal then receives an opening instruction in the setting interface and starts the intelligent gesture calculation function in response to the opening instruction. Further, the terminal receives a selection instruction in the current page and determines the initial number 23207 according to the coordinate data carried by the selection instruction. The terminal then detects, in the current page, a gesture action whose trajectory information is a "crossed horizontal line and vertical line" and whose direction information is "from top to bottom and from left to right", determines that the calculation mode is addition, and determines that the target number is 247743 according to the coordinate data of the gesture. Finally, the terminal performs the addition of 23207 and 247743, calls out a result display interface in the blank area at the lower right corner of the current interface, and displays the calculation result 270950 in that interface.
Further, in the embodiment of the application, after the terminal obtains the operation result, if subsequent calculation processing needs to be performed on that result, the terminal may take the previous operation result as the initial number of the next-stage calculation, continue to detect the next gesture action in the current interface (that is, jump back to step 304), and perform calculation processing on the operation result and the next target number according to that gesture action. In other words, the terminal detects a second gesture action in the current interface, determines a second target number and a second calculation mode according to the second gesture action, performs calculation processing on the second target number and the first operation result according to the second calculation mode to obtain a second operation result, and displays the second operation result in the result display interface. If, after the operation result is obtained, calculation processing of other initial numbers and target numbers needs to be performed instead, step 303 is executed again and a new round of calculation starts with a newly selected initial number.
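A compact sketch of this chained behaviour, in which each gesture contributes a calculation mode and a target number and the previous operation result is carried forward as the next initial number; all identifiers are assumptions of this sketch.

```kotlin
// Illustrative sketch of chained calculation; identifiers are assumptions of this sketch.
enum class CalculationMode { ADDITION, SUBTRACTION, MULTIPLICATION, DIVISION }

// Each detected gesture contributes one calculation mode and one target number.
data class GestureStep(val mode: CalculationMode, val target: Double)

// The previous operation result is carried forward as the initial number of the next step.
fun chainCalculate(initial: Double, steps: List<GestureStep>): Double =
    steps.fold(initial) { acc, step ->
        when (step.mode) {
            CalculationMode.ADDITION -> acc + step.target
            CalculationMode.SUBTRACTION -> acc - step.target
            CalculationMode.MULTIPLICATION -> acc * step.target
            CalculationMode.DIVISION -> acc / step.target
        }
    }

fun main() {
    // 23207 + 247743 = 270950, then 270950 / 2 = 135475
    println(chainCalculate(23207.0, listOf(
        GestureStep(CalculationMode.ADDITION, 247743.0),
        GestureStep(CalculationMode.DIVISION, 2.0)
    )))
}
```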
The embodiment of the application provides an intelligent calculation method. After the initial number is determined, the terminal can determine, through the gesture action detected in the current interface, the target number to be calculated and the calculation mode between the numbers, then perform calculation processing on the initial number and the target number according to the calculation mode, and display the operation result on the current interface. In other words, with this intelligent calculation method, when the terminal performs calculation processing on the numbers in the current interface, it no longer needs to switch to the calculator APP interface for calculation: the calculation mode and the numbers to be calculated can be determined directly through gesture recognition processing in the current interface, and the whole calculation process is completed there. This greatly shortens the calculation duration, improves the calculation efficiency and accuracy, effectively reduces the consumption of system resources, and makes the terminal more intelligent.
Based on the foregoing embodiments, in another embodiment of the present application, fig. 8 is a schematic diagram of a composition structure of a terminal according to an embodiment of the present application, and as shown in fig. 8, a terminal 10 according to an embodiment of the present application may include a detecting unit 11, a determining unit 12, a calculating unit 13, a displaying unit 14, a receiving unit 15 and a responding unit 16,
the detection unit 11 is configured to detect a first gesture in a current interface after an initial number is recognized in the current interface; wherein the initial number is displayed in the current interface;
the determining unit 12 is configured to determine a first target number and a first calculation mode according to the first gesture; wherein the first target number is displayed in the current interface;
the calculating unit 13 is configured to perform calculation processing on the initial number and the first target number according to the first calculation mode to obtain a first operation result;
the display unit 14 is configured to call a result display interface in the first area of the current interface, and display the first operation result in the result display interface.
Further, in the embodiment of the present application, the determining unit 12 is further configured to determine direction information and trajectory information corresponding to the first gesture motion before determining the first calculation mode and the first target number according to the first gesture motion.
Further, in an embodiment of the application, the determining unit 12 is specifically configured to determine whether the first gesture action is a preset calculation gesture according to the direction information and the trajectory information; and if the preset calculation gesture is judged, determining the first calculation mode according to the corresponding relation among the first gesture action, the preset calculation gesture and the calculation mode.
Further, in an embodiment of the present application, the determining unit 12 is specifically configured to determine, according to the trajectory information, coordinate data corresponding to the first gesture; determining a target area in the current interface based on the coordinate data, and determining the number displayed in the target area as the first target number; wherein the target region is not coincident with the first region.
Further, in an embodiment of the present application, the receiving unit 15 is configured to receive a selection instruction in the current interface; wherein the selection instruction carries second coordinate data.
Further, in the embodiment of the present application, the determining unit 12 is further configured to determine the initial number according to the second coordinate data.
Further, in an embodiment of the present application, the receiving unit 15 is further configured to receive a call-out instruction in the current interface before receiving a selection instruction in the current interface.
Further, in this embodiment of the application, the response unit 16 is further configured to respond to the call-out instruction and call out a setting interface in a second area of the current interface; wherein the second region is not coincident with the first region.
Further, in the embodiment of the present application, the receiving unit 15 is further configured to receive an opening instruction in the setting interface.
Further, in the embodiment of the present application, the response unit 16 is further configured to respond to the start instruction to start the smart gesture calculating function.
Further, in the embodiment of the present application, the detecting unit 11 is further configured to detect a second gesture motion in the current interface after the first operation result is displayed in the result display interface.
Further, in the embodiment of the present application, the determining unit 12 is further configured to determine a second target number and a second calculation mode according to the second gesture action; wherein the second target number is displayed in the current interface.

Further, in the embodiment of the present application, the calculating unit 13 is further configured to perform calculation processing on the second target number and the first operation result according to the second calculation mode to obtain a second operation result.
Further, in the embodiment of the present application, the display unit 14 is further configured to display the second operation result on the result display interface.
Further, in the embodiment of the present application, the receiving unit 15 is further configured to receive a closing instruction in the setting interface.
Further, in this embodiment of the application, the response unit 16 is further configured to respond to the closing instruction to close the smart gesture calculating function.
In an embodiment of the present application, further, fig. 9 is a schematic diagram of a composition structure of a terminal according to an embodiment of the present application, as shown in fig. 9, the terminal 10 according to the embodiment of the present application may further include a processor 17 and a memory 18 storing executable instructions of the processor 17, and further, the terminal 10 may further include a communication interface 19, and a bus 110 for connecting the processor 17, the memory 18, and the communication interface 19.
In an embodiment of the present application, the processor 17 may be at least one of an Application Specific Integrated Circuit (ASIC), a Digital Signal Processor (DSP), a Digital Signal Processing Device (DSPD), a Programmable Logic Device (PLD), a Field Programmable Gate Array (FPGA), a Central Processing Unit (CPU), a controller, a microcontroller, and a microprocessor. It is understood that the electronic devices for implementing the above processor functions may be other devices, and the embodiments of the present application are not limited in particular. The terminal 10 may further comprise a memory 18, which memory 18 may be connected to the processor 17, wherein the memory 18 is adapted to store executable program code comprising computer operating instructions, and wherein the memory 18 may comprise a high speed RAM memory and may further comprise a non-volatile memory, such as at least two disk memories.
In the embodiment of the present application, the bus 110 is used to connect the communication interface 19, the processor 17 and the memory 18, and to enable intercommunication among these devices.
In an embodiment of the present application, the memory 18 is used for storing instructions and data.
Further, in an embodiment of the present application, the processor 17 is configured to detect a first gesture in a current interface after identifying an initial number in the current interface; wherein the initial number is displayed in the current interface; determining a first target number and a first calculation mode according to the first gesture action; wherein the first target number is displayed in the current interface; calculating the initial number and the first target number according to the first calculation mode to obtain a first operation result; and calling a result display interface in a first area of the current interface, and displaying the first operation result in the result display interface.
In practical applications, the memory 18 may be a volatile memory (volatile memory), such as a Random-Access Memory (RAM); or a non-volatile memory (non-volatile memory), such as a Read-Only Memory (ROM), a flash memory, a Hard Disk Drive (HDD) or a Solid-State Drive (SSD); or a combination of the above types of memories, and provides instructions and data to the processor 17.
In addition, each functional module in this embodiment may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware or a form of a software functional module.
Based on such understanding, the technical solution of this embodiment, in essence, or the part that contributes to the prior art, or all or part of the technical solution, may be embodied in the form of a software product. The software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device or the like) or a processor to execute all or part of the steps of the method of this embodiment. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk.
The embodiment of the application provides a terminal. The terminal can detect a first gesture action in a current interface after an initial number is recognized in the current interface, wherein the initial number is displayed in the current interface; determine a first target number and a first calculation mode according to the first gesture action, wherein the first target number is displayed in the current interface; perform calculation processing on the initial number and the first target number according to the first calculation mode to obtain a first operation result; and call a result display interface in a first area of the current interface and display the first operation result in the result display interface. That is to say, in the embodiment of the application, the terminal may determine the corresponding calculation mode and target number through a designated gesture received on the current interface, then perform calculation processing on the initial number and the target number according to the calculation mode, and display the calculation result. Therefore, with this intelligent calculation method, when the terminal performs calculation processing on the numbers in the current interface, it no longer needs to switch frequently to the calculator APP: the data operation can be completed directly in the current interface by means of gesture actions, which greatly shortens the calculation duration, improves the calculation efficiency and accuracy, effectively reduces the consumption of system resources, and makes the terminal more intelligent.
An embodiment of the present application provides a computer-readable storage medium, on which a program is stored, which when executed by a processor implements the computing method as described above.
Specifically, the program instructions corresponding to the calculation method in this embodiment may be stored on a storage medium such as an optical disc, a hard disk or a USB flash disk. When the program instructions corresponding to the calculation method in the storage medium are read or executed by an electronic device, the method includes the following steps:
after an initial number is recognized in a current interface, detecting a first gesture action in the current interface; wherein the initial number is displayed in the current interface;
determining a first target number and a first calculation mode according to the first gesture action; wherein the first target number is displayed in the current interface;
calculating the initial number and the first target number according to the first calculation mode to obtain a first operation result;
and calling a result display interface in a first area of the current interface, and displaying the first operation result in the result display interface.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of a hardware embodiment, a software embodiment, or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flowcharts and/or block diagrams, and combinations of flows and/or blocks in the flowcharts and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, a special purpose computer, an embedded processor or other programmable data processing apparatus to produce a machine, such that the instructions executed by the processor of the computer or other programmable data processing apparatus create means for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus, so that a series of operational steps are performed on the computer or other programmable apparatus to produce a computer-implemented process, such that the instructions executed on the computer or other programmable apparatus provide steps for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams. The above description is only a preferred embodiment of the present application, and is not intended to limit the scope of the present application.
Claims (11)
1. An intelligent computing method, characterized in that the method comprises:
after an initial number is recognized in a current interface, detecting a first gesture action in the current interface; wherein the initial number is displayed in the current interface;
determining a first target number and a first calculation mode according to the first gesture action; wherein the first target number is displayed in the current interface;
calculating the initial number and the first target number according to the first calculation mode to obtain a first operation result;
and calling a result display interface in a first area of the current interface, and displaying the first operation result in the result display interface.
2. The method of claim 1, wherein prior to determining the first calculation mode and the first target number from the first gesture, the method further comprises:
determining direction information and trajectory information corresponding to the first gesture action.
3. The method of claim 2, wherein determining a first target number and a first calculation mode based on the first gesture comprises:
judging whether the first gesture action is a preset calculation gesture or not according to the direction information and the trajectory information;
and if the preset calculation gesture is judged, determining the first calculation mode according to the corresponding relation among the first gesture action, the preset calculation gesture and the calculation mode.
4. The method of claim 1, wherein determining a first target number and a first calculation mode based on the first gesture comprises:
determining first coordinate data corresponding to the first gesture action according to the trajectory information;
determining a target area in the current interface based on the first coordinate data, and determining a number displayed in the target area as the first target number; wherein the target region is not coincident with the first region.
5. The method of claim 1, wherein identifying an initial number in the current interface comprises:
receiving a selection instruction in the current interface; wherein the selection instruction carries second coordinate data;
determining the initial number from the second coordinate data.
6. The method of claim 5, wherein prior to receiving a selection instruction in the current interface, the method further comprises:
receiving a call-out instruction in the current interface;
responding to the call-out instruction, and calling out a setting interface in a second area of the current interface; wherein the second region is not coincident with the first region;
and receiving a starting instruction in the setting interface, responding to the starting instruction, and starting the intelligent gesture calculation function.
7. The method of claim 1, wherein after the displaying the first operation result in the result display interface, the method further comprises:
detecting a second gesture action in the current interface, and determining a second target number and a second calculation mode according to the second gesture action; wherein the second target number is displayed in the current interface;
according to the second calculation mode, performing the calculation processing on the second target number and the first operation result to obtain a second operation result;
and displaying the second operation result on the result display interface.
8. The method of claim 6, further comprising:
receiving a closing instruction in the setting interface;
and responding to the closing instruction, and closing the intelligent gesture calculation function.
9. A terminal, characterized in that the terminal comprises: a detection unit, a determination unit, a calculation unit and a display unit,
the detection unit is used for detecting a first gesture action in a current interface after an initial number is recognized in the current interface; wherein the initial number is displayed in the current interface;
the determining unit is used for determining a first target number and a first calculation mode according to the first gesture action; wherein the first target number is displayed in the current interface;
the calculation unit is used for performing calculation processing on the initial number and the first target number according to the first calculation mode to obtain a first operation result;
the display unit is used for calling a result display interface in a first area of the current interface and displaying the first operation result in the result display interface.
10. A terminal, characterized in that the terminal comprises a processor, a memory storing instructions executable by the processor, which instructions, when executed by the processor, implement the method according to any of claims 1-8.
11. A computer-readable storage medium, on which a program is stored, for use in a terminal, characterized in that the program, when executed by a processor, implements the method according to any one of claims 1-8.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010757610.1A | 2020-07-31 | 2020-07-31 | Intelligent computing method, terminal and storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
CN111858451A true CN111858451A (en) | 2020-10-30 |
Family
ID=72953755
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010757610.1A Pending CN111858451A (en) | 2020-07-31 | 2020-07-31 | Intelligent computing method, terminal and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111858451A (en) |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105378599A (en) * | 2013-07-12 | 2016-03-02 | Microsoft Technology Licensing, LLC | Interactive digital displays
CN106547434A (en) * | 2016-11-30 | 2017-03-29 | Nubia Technology Co., Ltd. | Method and terminal for implementing a terminal calculator
CN110383244A (en) * | 2017-12-29 | 2019-10-25 | Huawei Technologies Co., Ltd. | Calculator operation method and terminal
CN111352892A (en) * | 2020-03-03 | 2020-06-30 | Vivo Mobile Communication Co., Ltd. | Operation processing method and electronic device
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication | Application publication date: 20201030 |