WO2014032239A1 - Method for acquiring an instruction by a terminal device, and terminal device - Google Patents

Method for acquiring an instruction by a terminal device, and terminal device

Info

Publication number
WO2014032239A1
Authority: WO (WIPO/PCT)
Prior art keywords: touch point, coordinate, predicted, coordinate parameter, actual
Application number: PCT/CN2012/080719
Other languages: English (en), French (fr)
Inventor: 陈磊 (Chen Lei)
Original Assignee: Huawei Device Co., Ltd. (华为终端有限公司)
Application filed by Huawei Device Co., Ltd. (华为终端有限公司)
Priority to PCT/CN2012/080719 (WO2014032239A1)
Priority to CN201280003611.0A (CN103403665B)
Publication of WO2014032239A1

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03: Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041: Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0416: Control or interface arrangements specially adapted for digitisers
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser, using a touch-screen or digitiser, e.g. input of commands through traced gestures

Definitions

  • The present invention relates to the field of information processing technologies, and in particular, to a method for a terminal device to acquire an instruction and to a terminal device. Background art.
  • When the terminal device screen is touched, the capacitive touch screen operates through an X-Y electrode grid overlaid on the screen, to which a voltage is applied.
  • When a finger touches the screen, the capacitance changes, and this change can be measured.
  • the position of the finger can be accurately located, that is, the coordinate parameters of the touched point are determined.
  • The technical solution provides a method for acquiring an instruction by a terminal device, and a terminal device, which are used to acquire the corresponding instruction according to the coordinate parameter of a predicted touch point. Without improving the hardware of the terminal device, this reduces the time for the terminal device to acquire an instruction and improves the ability of the terminal device to interact with the user.
  • A method for a terminal device to acquire an instruction is provided, including: acquiring gesture information, where the gesture information includes a coordinate parameter of an actual touch point; acquiring a coordinate parameter of a predicted touch point according to the coordinate parameter of the actual touch point; and acquiring, according to the coordinate parameter of the predicted touch point, the instruction corresponding to the coordinate parameter of the predicted touch point.
  • The instruction specifically includes a program background execution instruction, where the program background execution instruction is an instruction that causes the terminal device to perform a specific operation in the background.
  • The method further includes: if, within a predetermined time period, the coordinate parameter of the predicted touch point is equal to the coordinate parameter of the actual touch point, displaying the execution result of executing the program background execution instruction.
  • the instruction specifically includes: displaying a touch location instruction.
  • the method further includes: displaying an execution result of executing the display touch location instruction .
  • A method for obtaining the coordinate parameter of the predicted touch point according to the coordinate parameter of the actual touch point includes: the sum of the coordinate parameter of the Nth actual touch point and a displacement difference is equal to the coordinate parameter of the Nth predicted touch point, where N is a natural number greater than zero and the displacement difference is the difference between the coordinate parameter of the Nth predicted touch point and the coordinate parameter of the Nth actual touch point.
  • Another method for obtaining the coordinate parameter of the predicted touch point according to the coordinate parameter of the actual touch point includes: the sum of the coordinate parameter of the first actual touch point and the displacement difference is equal to the coordinate parameter of the first predicted touch point; the sum of the coordinate parameter of the Nth predicted touch point and the displacement difference is equal to the coordinate parameter of the (N+1)th predicted touch point; N is a natural number greater than zero; and the displacement difference is the difference between the coordinate parameter of the Nth predicted touch point and the coordinate parameter of the Nth actual touch point.
  • A further method for acquiring the coordinate parameter of the predicted touch point according to the coordinate parameter of the actual touch point includes: when the gesture information indicates an accelerating motion, the first coordinate parameter of the (N+1)th predicted touch point is equal to the sum of the coordinate parameter of the Nth predicted touch point and the displacement difference, the second coordinate parameter of the (N+1)th predicted touch point is equal to the sum of the coordinate parameter of the (N+1)th actual touch point and the pixels actually moved in the last touch event reporting period, and the coordinate parameter of the (N+1)th predicted touch point is the larger of the first coordinate parameter and the second coordinate parameter of the (N+1)th predicted touch point; when the gesture information indicates a decelerating motion, the first and second coordinate parameters of the (N+1)th predicted touch point are obtained in the same way, and the coordinate parameter of the (N+1)th predicted touch point is the smaller of the two; when the gesture information indicates that the touch point is stationary, the coordinate parameter of the (N+1)th predicted touch point is equal to the coordinate parameter of the (N+1)th actual touch point.
  • a terminal device is provided:
  • An acquiring unit configured to acquire gesture information, where the gesture information includes: a coordinate parameter of an actual touch point;
  • a processing unit configured to acquire a coordinate parameter of the predicted touch point according to the coordinate parameter of the actual touch point acquired by the acquiring unit, and acquire a coordinate parameter corresponding to the predicted touch point according to the coordinate parameter of the predicted touch point Instructions.
  • The terminal device further includes a first execution unit, configured to receive the instruction of the processing unit and execute the instruction, where the instruction includes a program background execution instruction; the program background execution instruction is an instruction that causes the terminal device to perform a specific operation in the background.
  • The terminal device further includes: a determining unit, configured to receive the coordinate parameter of the predicted touch point and the coordinate parameter of the actual touch point from the processing unit, and determine whether the coordinate parameter of the predicted touch point is equal to the coordinate parameter of the actual touch point; and a first display unit, configured to receive the execution result of the program background execution instruction from the first execution unit and the determination result of the determining unit, and, if the determination result is that within a predetermined time the coordinate parameter of the predicted touch point is equal to the coordinate parameter of the actual touch point, display the execution result of the program background execution instruction.
  • The terminal device further includes: a second execution unit, configured to receive the instruction of the processing unit and execute the instruction, where the instruction includes a display touch position instruction; and a second display unit, configured to receive the execution result of the display touch position instruction from the second execution unit and display that execution result.
  • The processing unit is configured to acquire, according to the coordinate parameter of the actual touch point acquired by the acquiring unit, the coordinate parameter of the predicted touch point, specifically by acquiring the coordinate parameter of the Nth predicted touch point according to the sum of the coordinate parameter of the Nth actual touch point and the displacement difference, where N is a natural number greater than zero and the displacement difference is the difference between the coordinate parameter of the Nth predicted touch point and the coordinate parameter of the Nth actual touch point.
  • The processing unit is configured to acquire, according to the coordinate parameter of the actual touch point acquired by the acquiring unit, the coordinate parameter of the predicted touch point, specifically by acquiring the coordinate parameter of the first predicted touch point according to the sum of the coordinate parameter of the first actual touch point and the displacement difference, and acquiring the coordinate parameter of the (N+1)th predicted touch point according to the sum of the coordinate parameter of the Nth predicted touch point and the displacement difference, where N is a natural number greater than zero and the displacement difference is the difference between the coordinate parameter of the Nth predicted touch point and the coordinate parameter of the Nth actual touch point.
  • The technical solution provided by the present application, by obtaining the corresponding instruction according to the coordinate parameter of the predicted touch point and without improving the hardware of the terminal device, reduces the time for the terminal device to acquire an instruction, improves the interaction between the terminal device and the user, and enhances the user experience.
  • the response speed of the terminal device is accelerated, the tracking rate of the touch position is improved, and the intelligent interaction capability between the terminal device and the user is improved.
  • the response speed of the terminal device is accelerated, the waiting time of the user is reduced, and the intelligent interaction capability between the terminal device and the user is improved.
  • FIG. 2 is a flowchart of a possible implementation manner of Embodiment 2 of the method according to the embodiment of the present invention
  • FIG. 3 is a flowchart of another possible implementation manner of Embodiment 2 of the method according to the embodiment of the present invention
  • FIG. 5 is a schematic structural diagram of a terminal device according to Embodiment 5 of the present invention
  • FIG. 6 is a schematic structural diagram of a possible implementation manner of a terminal device according to Embodiment 5 of the present invention
  • FIG. 7 is a schematic structural diagram of another possible implementation manner of a terminal device according to Embodiment 5 of the present invention.
  • FIG. 8 is a schematic structural diagram of another possible implementation manner of a terminal device according to Embodiment 5 of the present invention.
  • FIG. 9 is a schematic structural diagram of a mobile phone according to an embodiment of the present invention.
  • The technical solutions in the embodiments of the present invention are described clearly and completely below with reference to the accompanying drawings in the embodiments of the present invention. The described embodiments are only some of the embodiments of the present invention, rather than all of them. All other embodiments obtained by persons of ordinary skill in the art based on the embodiments of the present invention without creative effort fall within the protection scope of the embodiments of the present invention.
  • the terms used in the embodiments of the present invention are for the purpose of describing particular embodiments only and are not intended to limit the invention.
  • The terminal device includes, but is not limited to, a mobile phone, a personal digital assistant (PDA), a tablet computer, a portable device (for example, a portable computer), a desktop computer, an automatic teller machine (ATM), and the like.
  • In this embodiment, a method for acquiring an instruction by a terminal device may specifically include: S101. Acquire gesture information, where the gesture information includes a coordinate parameter of an actual touch point.
  • When the terminal device screen is touched, the capacitive touch screen operates through the X-Y electrode grid overlaid on the screen, to which a voltage is applied.
  • When a finger touches the screen, the capacitance changes, and the capacitance change can be measured.
  • The gesture information includes the coordinate parameter, pressure parameter, and area parameter of the touch point, as well as the moving speed and movement trend of the touch.
  • The coordinate parameter here is the coordinate parameter of the actual touch point.
  • the coordinate parameter of the predicted touch point refers to a parameter obtained by a certain method according to the coordinate parameter of the actual touch point.
  • the coordinate parameter of the predicted touch point may be the same as the coordinate parameter of the actual touch point, or may be different from the coordinate parameter of the actual touch point, which is not limited by the embodiment of the present invention.
  • the coordinate parameter of the touch point in the terminal device has a corresponding instruction.
  • an instruction corresponding to the coordinate parameter of the predicted touch point can be obtained, that is, the instruction can be obtained before the actual touch point arrives.
  • In this embodiment of the present invention, by obtaining the corresponding instruction according to the coordinate parameter of the predicted touch point and without improving the hardware of the terminal device, the time for the terminal device to acquire an instruction is reduced, the interaction capability between the terminal device and the user is improved, and the user experience is improved.
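  • For illustration only (not part of the patent text), the overall flow of this embodiment can be sketched as follows in Python; the coordinate-to-instruction map, the one-step linear predictor, and all names are assumptions made for the example.

```python
# Illustrative sketch of the flow in this embodiment: acquire gesture
# information, predict the next touch point, and look up the instruction
# mapped to the predicted coordinate. The one-step linear predictor, the
# coordinate-to-instruction map, and all names are assumptions for the example.

from typing import Callable, Dict, Optional, Tuple

Point = Tuple[int, int]
Instruction = Callable[[], None]

def predict_touch_point(actual: Point, displacement_diff: Point) -> Point:
    """Predicted point = actual point + displacement difference (per axis)."""
    return (actual[0] + displacement_diff[0], actual[1] + displacement_diff[1])

def acquire_instruction(actual: Point,
                        displacement_diff: Point,
                        instruction_map: Dict[Point, Instruction]) -> Optional[Instruction]:
    """Return the instruction corresponding to the predicted touch point, if any."""
    predicted = predict_touch_point(actual, displacement_diff)
    return instruction_map.get(predicted)

if __name__ == "__main__":
    # Hypothetical mapping: coordinate (116, 200) corresponds to a background operation.
    instructions = {(116, 200): lambda: print("execute program in background")}
    instruction = acquire_instruction((100, 200), (16, 0), instructions)
    if instruction:
        instruction()  # obtained before the finger actually reaches (116, 200)
```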
  • FIG. 2 is a flowchart of a possible implementation manner of Embodiment 2 of a method according to an embodiment of the present invention.
  • the method for the terminal device to acquire an instruction may include:
  • Acquire gesture information where the gesture information includes: a coordinate parameter of an actual touch point.
  • When the terminal device screen is touched, the capacitive touch screen operates through the X-Y electrode grid overlaid on the screen, to which a voltage is applied.
  • When a finger touches the screen, the capacitance changes, and the capacitance change can be measured.
  • By comparing the measured values of all the electrodes, the position of the object can be accurately located, including the coordinate parameter, pressure, and area of the touch point.
  • The gesture information includes the coordinate parameter, pressure parameter, and area parameter of the touch point, as well as the moving speed and movement trend of the touch.
  • The coordinate parameter here is the coordinate parameter of the actual touch point.
  • the position of the object can be accurately located, that is, the coordinate parameters of the touched point are determined.
  • the coordinate parameter of the predicted touch point refers to a parameter obtained by a certain method according to the coordinate parameter of the actual touch point.
  • the coordinate parameter of the predicted touch point may be the same as the coordinate parameter of the actual touch point, or may be different from the coordinate parameter of the actual touch point, which is not limited by the embodiment of the present invention.
  • A first method for obtaining the coordinate parameter of the predicted touch point according to the coordinate parameter of the actual touch point may be: the sum of the coordinate parameter of the Nth actual touch point and the displacement difference is equal to the coordinate parameter of the Nth predicted touch point; N is a natural number greater than zero; the displacement difference is the difference between the coordinate parameter of the Nth predicted touch point and the coordinate parameter of the Nth actual touch point.
  • A second method for obtaining the coordinate parameter of the predicted touch point according to the coordinate parameter of the actual touch point may be: the sum of the coordinate parameter of the first actual touch point and the displacement difference is equal to the coordinate parameter of the first predicted touch point; the coordinate parameter of the (N+1)th predicted touch point is equal to the sum of the coordinate parameter of the Nth predicted touch point and the displacement difference; N is a natural number greater than zero; the displacement difference is the difference between the coordinate parameter of the Nth predicted touch point and the coordinate parameter of the Nth actual touch point.
  • Method 2 may be further refined as follows: when the gesture information indicates an accelerating motion, the first coordinate parameter of the (N+1)th predicted touch point is equal to the sum of the coordinate parameter of the Nth predicted touch point and the displacement difference, the second coordinate parameter of the (N+1)th predicted touch point is equal to the sum of the coordinate parameter of the (N+1)th actual touch point and the pixels actually moved in the last touch event reporting period, and the coordinate parameter of the (N+1)th predicted touch point is the larger of the first coordinate parameter and the second coordinate parameter of the (N+1)th predicted touch point; when the gesture information indicates a decelerating motion, the first and second coordinate parameters are computed in the same way, and the coordinate parameter of the (N+1)th predicted touch point is the smaller of the two; when the gesture information indicates that the touch point is stationary, the coordinate parameter of the (N+1)th predicted touch point is equal to the coordinate parameter of the (N+1)th actual touch point.
  • Here, S is the displacement difference, L is the number of pixels actually moved during one touch event reporting period, and M is a natural number greater than zero; the displacement difference is the difference between the coordinate parameter of the Nth predicted touch point and the coordinate parameter of the Nth actual touch point.
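  • As a sketch only, the two basic prediction rules above can be written for a single coordinate axis as follows; how the displacement difference S is derived from L and M is left to the caller, since the exact formula is not reproduced here.

```python
# Sketch of the two basic prediction rules for a single coordinate axis.
# s is the displacement difference; how it is derived from L (pixels moved per
# reporting period) and the empirical constant M is assumed to happen elsewhere.

from typing import List

def method1_predict(actual: List[int], s: int) -> List[int]:
    """Method 1: the Nth predicted coordinate equals the Nth actual coordinate plus S."""
    return [a + s for a in actual]

def method2_predict(actual: List[int], s: int) -> List[int]:
    """Method 2: first predicted = first actual + S; afterwards each predicted
    coordinate equals the previous predicted coordinate plus S."""
    predicted = [actual[0] + s]
    for _ in actual[1:]:
        predicted.append(predicted[-1] + s)
    return predicted

# Uniform horizontal motion, 16 pixels per reporting period, with S assumed to be 16:
xs = [100, 116, 132, 148]
assert method1_predict(xs, 16) == [116, 132, 148, 164]
assert method2_predict(xs, 16) == [116, 132, 148, 164]
```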
  • After the touch screen of the terminal device detects a touch event, the event is reported to the framework layer through the I2C (Inter-Integrated Circuit) bus and the event reading module, and the gesture information of the touch event is acquired by the gesture conversion module.
  • The gesture information includes information such as accelerating motion, decelerating motion, uniform (constant-speed) motion, and rest. It should be understood that the gesture information of the touch event obtained by the gesture conversion module includes not only accelerating motion, decelerating motion, uniform motion, and rest, but may also include other gesture information, which is not limited in this embodiment of the present invention.
  • Each coordinate parameter on the touch screen has an instruction corresponding to the coordinate parameter, and the coordinate parameter of the predicted touch point also corresponds to a specific instruction.
  • the corresponding relationship between the coordinate parameter and the associated instruction is stored on the terminal device, and the content of the corresponding relationship may be fixed, or may be automatically updated according to the change of the display content of the touch screen, which is not limited by the embodiment of the present invention. .
  • The program background execution instruction is an instruction that causes the terminal device to perform a specific operation in the background.
  • Background operation means that the execution process of the instruction is not displayed to the user and does not require the user's attention; it is executed automatically by the terminal device.
  • S2051. Determine, within a predetermined time period, whether the coordinate parameter of the predicted touch point is equal to the coordinate parameter of the actual touch point. If the coordinate parameter of the predicted touch point is equal to the coordinate parameter of the actual touch point, execute S2061; otherwise, execute S2071.
  • The predetermined time period refers to a predetermined number of touch event reporting periods. If the coordinate parameter of the predicted touch point is equal to the coordinate parameter of the actual touch point, it means that the user has actually touched the icon, control, or the like, that is, the user has actually performed the operation.
  • If the coordinate parameter of the predicted touch point is equal to the coordinate parameter of the actual touch point, it is considered that the user has actually performed the operation, and the terminal device immediately displays the execution result of executing the program background execution instruction.
  • If the coordinate parameter of the predicted touch point is not equal to the coordinate parameter of the actual touch point, it is considered that the user merely passed over that point and did not actually intend the operation, so the terminal device should not acquire and execute the corresponding instruction. At this time, if the instruction has not yet been executed, its execution is cancelled; if the instruction has already been executed, the execution result of the instruction is deleted.
  • In this embodiment of the present invention, by obtaining the corresponding instruction according to the coordinate parameter of the predicted touch point and without improving the hardware of the terminal device, the time for the terminal device to acquire an instruction is reduced, the interaction capability between the terminal device and the user is improved, and the user experience is enhanced. At the same time, by executing the program in the background in advance, the response speed of the terminal device is accelerated, the waiting time of the user is reduced, and the intelligent interaction capability between the terminal device and the user is improved.
  • FIG. 3 is a flowchart of another possible implementation manner of Embodiment 2 of the method according to an embodiment of the present invention. As shown in FIG. 3, after step S203, the method may further include: S2042. Execute the instruction, where the instruction is a display touch position instruction.
  • Executing the display touch position instruction displays an icon, a trace, or the like at the touch position on the touch screen, so as to make the location of the touch point clear to the user.
  • the specific display form is not limited in the embodiment of the present invention.
  • In this embodiment of the present invention, by obtaining the corresponding instruction according to the coordinate parameter of the predicted touch point and displaying the execution result of the display touch position instruction, without improving the hardware of the terminal device, the time for the terminal device to acquire an instruction is reduced, the interaction between the terminal device and the user is improved, and the user experience is improved.
  • the response speed of the terminal device is accelerated, the tracking rate of the touch position is improved, and the intelligent interaction capability between the terminal device and the user is improved.
  • FIG. 4 is a flowchart of another possible implementation manner of Embodiment 2 of a method according to an embodiment of the present disclosure. As shown in FIG. 4, after step S203, the method may further include:
  • The program background execution instruction is an instruction that causes the terminal device to perform a specific operation in the background.
  • Background operation means that the execution process of the instruction is not displayed to the user and does not require the user's attention; it is executed automatically by the terminal device.
  • Executing the display touch position instruction displays an icon, a trace, or the like at the touch position on the touch screen, so as to make the location of the touch point clear to the user.
  • the specific display form is not limited in this embodiment of the present invention.
  • The predetermined time period refers to a predetermined number of touch event reporting periods. If the coordinate parameter of the predicted touch point is equal to the coordinate parameter of the actual touch point, it means that the user has actually touched the icon, control, or the like, that is, the user has actually performed the operation.
  • There is no required order between step S2053 and step S2063: step S2053 may be performed before step S2063, step S2063 may be performed before step S2053, or the two steps may be performed at the same time.
  • the embodiments of the present invention do not limit this.
  • If the coordinate parameter of the predicted touch point is equal to the coordinate parameter of the actual touch point, it is considered that the user has actually performed the operation, and the terminal device immediately displays the execution result of executing the program background execution instruction.
  • If the coordinate parameter of the predicted touch point is not equal to the coordinate parameter of the actual touch point, it is considered that the user merely passed over that point and did not actually intend the operation, so the terminal device should not acquire and execute the corresponding instruction. At this time, if the instruction has not yet been executed, its execution is cancelled; if the instruction has already been executed, the execution result of the instruction is deleted.
  • In this embodiment of the present invention, by obtaining the corresponding instruction according to the coordinate parameter of the predicted touch point and without improving the hardware of the terminal device, the time for the terminal device to acquire an instruction is reduced and the interaction between the terminal device and the user is improved.
  • Embodiment 3 of the present invention is implemented on the basis of Embodiment 2. Specifically, the specific implementation method for obtaining the coordinate parameter of the predicted touch point according to the coordinate parameter of the actual touch point described in Embodiment 2 can be implemented by the third embodiment.
  • the method 1 of Embodiment 2 may be that the sum of the coordinate parameter and the displacement difference of the Nth actual touch point is equal to the coordinate parameter of the Nth predicted touch point; the N is a natural number greater than zero; the displacement difference is The difference between the coordinate parameter of the Nth predicted touch point and the coordinate parameter of the Nth actual touch point.
  • the specific method of the method 1 is as follows:
  • the sum of the coordinate parameter and the displacement difference of the Nth actual touch point is equal to the coordinate parameter of the Nth predicted touch point; the N is a natural number greater than zero; the displacement difference is a coordinate parameter of the Nth predicted touch point and The difference between the coordinate parameters of the Nth actual touch point.
  • Here, S is the displacement difference, L is the number of pixels actually moved within one touch event reporting period, and M is a natural number greater than zero.
  • The touch event reporting period is the time interval at which the terminal device reports actual touch points; that is, the time interval between two consecutive actual touch points reported by the terminal device is one touch event reporting period.
  • The change in position of the actual touch point on the touch screen during one reporting period is the number of pixels actually moved during that touch event reporting period.
  • The frequency at which the capacitive touch screen of most mobile phones reports actual touch points is set to 80 Hz; that is, as long as the hand touches the touch screen, the touch screen reports the coordinate parameter of an actual touch point every 12.5 ms, so 12.5 ms is one touch event reporting period.
  • Typical movement modes of a human finger on the screen, and the corresponding actual touch points, are as follows. Dragging action: a dragging action is a process in which the finger moves from rest, to acceleration, to uniform speed, to deceleration, and finally back to rest.
  • Throwing action: a throwing action is one in which the finger keeps moving on the touch screen at the fastest rate until the touch point disappears; the finger goes from rest, to acceleration, to uniform high-speed movement, and then leaves the screen while still moving at high speed.
  • The displacement difference (the difference between the coordinate parameter of the Nth predicted touch point and the coordinate parameter of the Nth actual touch point) is calculated as follows. The value M = 5 is a result obtained by actual measurement; the value of M may differ for different terminal devices, which is not limited in this embodiment of the present invention.
  • Suppose a finger moves on the touch screen at a moderately fast rate, moving 16 pixels every 12.5 ms in a horizontal straight line (this value is a rate obtained by actual testing and is used only to illustrate this embodiment of the present invention; it may be any value, which is not limited here).
  • Table 1 below lists the coordinate parameters of the actual touch points and of the predicted touch points reported during the 12 touch event reporting periods before the finger comes to a complete stop after sliding a certain distance. The values in the table are for illustration only; the actual situation may differ from the data below by two or three pixels. Here (100,200) indicates that the reported coordinate parameter has abscissa 100 and ordinate 200.
  • Table 1, predicted touch point, reporting periods 1 to 6: (116,200), (132,200), (148,200), (164,200), (172,200), (175,200).
  • Table 1, predicted touch point, reporting periods 7 to 12: (178,200), (178,200), (178,200), (178,200), (178,200), (178,200).
  • The ordinate remains 200 throughout because the movement is horizontal; once the finger stops, the predicted abscissa settles at 178.
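  • The uniform-motion part of Table 1 can be reproduced with the first method as in the short sketch below; the actual abscissas (100, 116, 132, 148) and the displacement difference S = 16 are assumptions consistent with the 16 pixels per 12.5 ms example, not values quoted from the table itself.

```python
# Reproduce the first four predicted entries of Table 1 with the first method.
# The actual abscissas (100, 116, 132, 148) and the displacement difference
# S = 16 are assumptions consistent with the 16 px per 12.5 ms example above.

S = 16  # assumed displacement difference, in pixels

actual_points = [(100 + 16 * n, 200) for n in range(4)]      # reporting periods 1 to 4
predicted_points = [(x + S, y) for (x, y) in actual_points]  # method 1

print(predicted_points)
# [(116, 200), (132, 200), (148, 200), (164, 200)] matches Table 1, periods 1 to 4.
```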
  • A finger dragging action refers to dragging an icon or a menu (for example, moving an icon on the home screen from one position to another), so that as the finger moves, the icon moves with the finger; or the finger moves on the touch screen and, during the movement, leaves a movement trace on the touch screen.
  • The icon or the interaction trace should follow the finger, not lag behind at the actual touch point of the finger. According to this embodiment of the present invention, since the reported point is the coordinate parameter of the predicted touch point, the touch point is in effect reported 12.5 ms in advance, so that as the finger moves, the icon or the movement trace follows the finger more closely.
  • In addition, the program background execution instruction corresponding to the coordinate parameter of the predicted touch point can be executed before the finger reaches a certain control on the screen, so that when the finger arrives, the terminal device can respond quickly to the user's operation.
  • A finger throwing action is one in which the finger keeps moving on the touch screen at the fastest rate until the touch point disappears; the finger goes from rest, to acceleration, to uniform high-speed movement, and then leaves the screen while still moving at high speed. According to this embodiment of the present invention, since the reported point is the coordinate parameter of the predicted touch point, when the finger leaves the touch screen while moving at high speed, for example at point P, the reported point is the coordinate parameter of point P plus the displacement difference, even though the finger never actually touched that point; in effect, the distance the finger would move within the next 12.5 ms is reported.
  • In this embodiment of the present invention, by obtaining the corresponding instruction according to the coordinate parameter of the predicted touch point and without improving the hardware of the terminal device, the time for the terminal device to acquire an instruction is reduced and the interaction between the terminal device and the user is improved.
  • Embodiment 4 of the present invention is implemented on the basis of Embodiment 2. Specifically, the method for obtaining the coordinate parameter of the predicted touch point according to the coordinate parameter of the actual touch point described in Embodiment 2 can be implemented as described in Embodiment 4.
  • the method 2 of Embodiment 2 may be that the sum of the coordinate parameter and the displacement difference of the first actual touch point is equal to the coordinate parameter of the first predicted touch point; the coordinate parameter of the N+1th predicted touch point is equal to the Nth prediction a sum of a coordinate parameter of the touch point and a displacement difference; the N is a natural number greater than zero; the displacement difference is a difference between a coordinate parameter of the Nth predicted touch point and a coordinate parameter of the Nth actual touch point.
  • the specific method of the method 2 is as follows:
  • the sum of the coordinate parameter and the displacement difference of the first actual touch point is equal to the coordinate parameter of the first predicted touch point; the coordinate parameter of the N+1th predicted touch point is equal to the coordinate parameter and the displacement difference of the Nth predicted touch point And N is a natural number greater than zero; the displacement difference is a difference between a coordinate parameter of the Nth predicted touch point and a coordinate parameter of the Nth actual touch point.
  • Method 2 may be further refined as follows. When the gesture information indicates an accelerating motion, the first coordinate parameter of the (N+1)th predicted touch point is equal to the sum of the coordinate parameter of the Nth predicted touch point and the displacement difference, the second coordinate parameter of the (N+1)th predicted touch point is equal to the sum of the coordinate parameter of the (N+1)th actual touch point and the pixels actually moved in the last touch event reporting period, and the coordinate parameter of the (N+1)th predicted touch point is the larger of the first coordinate parameter and the second coordinate parameter. When the gesture information indicates a decelerating motion, the first coordinate parameter of the (N+1)th predicted touch point is equal to the sum of the coordinate parameter of the Nth predicted touch point and the displacement difference, the second coordinate parameter of the (N+1)th predicted touch point is equal to the sum of the coordinate parameter of the (N+1)th actual touch point and the pixels actually moved in the last touch event reporting period, and the coordinate parameter of the (N+1)th predicted touch point is the smaller of the first coordinate parameter and the second coordinate parameter. When the gesture information indicates that the touch point is stationary, the coordinate parameter of the (N+1)th predicted touch point is equal to the coordinate parameter of the (N+1)th actual touch point.
  • Here, S is the displacement difference, L is the number of pixels actually moved within one touch event reporting period, and M is a natural number greater than zero.
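  • The refined selection rule just described can be sketched for a single coordinate axis as follows; the gesture labels, function name, and example numbers are assumptions for illustration.

```python
# Sketch of the refined selection rule for a single coordinate axis.
# The gesture labels, function name, and example numbers are assumptions.

def predict_next(prev_predicted: int, actual_next: int, s: int, l_last: int, gesture: str) -> int:
    """prev_predicted: Nth predicted coordinate; actual_next: (N+1)th actual coordinate;
    s: displacement difference; l_last: pixels actually moved in the last reporting period."""
    candidate_from_prediction = prev_predicted + s  # first coordinate parameter
    candidate_from_actual = actual_next + l_last    # second coordinate parameter
    if gesture == "accelerating":
        return max(candidate_from_prediction, candidate_from_actual)
    if gesture == "decelerating":
        return min(candidate_from_prediction, candidate_from_actual)
    # stationary: fall back to the actual coordinate
    return actual_next

# During deceleration the smaller candidate is kept:
print(predict_next(prev_predicted=164, actual_next=166, s=16, l_last=8, gesture="decelerating"))
# prints 174, i.e. min(164 + 16, 166 + 8)
```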
  • The touch event reporting period is the time interval at which the terminal device reports actual touch points; that is, the time interval between two consecutive actual touch points reported by the terminal device is one touch event reporting period.
  • The change in position of the actual touch point on the touch screen during one reporting period is the number of pixels actually moved during that touch event reporting period.
  • The frequency at which the capacitive touch screen of most mobile phones reports actual touch points is set to 80 Hz; that is, as long as the hand touches the touch screen, the touch screen reports the coordinate parameter of an actual touch point every 12.5 ms, so 12.5 ms is one touch event reporting period.
  • Typical movement modes of a human finger on the screen, and the corresponding actual touch points, are as follows. Dragging action: a dragging action is a process in which the finger moves from rest, to acceleration, to uniform speed, to deceleration, and finally back to rest.
  • Throwing action: a throwing action is one in which the finger keeps moving on the touch screen at the fastest rate until the touch point disappears; the finger goes from rest, to acceleration, to uniform high-speed movement, and then leaves the screen while still moving at high speed.
  • The displacement difference (the difference between the coordinate parameter of the Nth predicted touch point and the coordinate parameter of the Nth actual touch point) is calculated in the same way as in Embodiment 3; the value of M may differ according to the characteristics of different terminal devices, which is not limited in this embodiment of the present invention.
  • Suppose a finger moves on the touch screen at a moderately fast rate, moving 16 pixels every 12.5 ms in a horizontal straight line (this value is a rate obtained by actual testing and is used only to illustrate this embodiment of the present invention; it may be any value, which is not limited here).
  • Table 2 below lists the coordinate parameters of the actual touch points and of the predicted touch points reported during the 12 touch event reporting periods before the finger comes to a complete stop after sliding a certain distance.
  • The values in the table are for illustration only; the actual situation may differ from the data below by two or three pixels.
  • Here (100,200) indicates that the reported coordinate parameter has abscissa 100 and ordinate 200.
  • Table 2, predicted touch point, reporting periods 1 to 6: (116,200), (132,200), (148,200), (164,200), (174,200), (177,200).
  • Table 2, predicted touch point, reporting periods 7 to 12: (179,200), (178,200), (178,200), (178,200), (178,200), (178,200).
  • The ordinate remains 200 throughout because the movement is horizontal. During the deceleration phase, the coordinate parameter of each predicted touch point is the smaller of its first and second coordinate parameters, which gives abscissas such as 174 and 177 in periods 5 and 6; once the finger stops, the predicted abscissa settles at 178.
  • A finger dragging action refers to dragging an icon or a menu (for example, moving an icon on the home screen from one position to another), so that as the finger moves, the icon moves with the finger; or the finger moves on the touch screen and, during the movement, leaves a movement trace on the touch screen.
  • The icon or the interaction trace should follow the finger, not lag behind at the actual touch point of the finger. According to this embodiment of the present invention, since the reported point is the coordinate parameter of the predicted touch point, the touch point is in effect reported 12.5 ms in advance, so that as the finger moves, the icon or the movement trace follows the finger more closely.
  • In addition, the program background execution instruction corresponding to the coordinate parameter of the predicted touch point can be executed before the finger reaches a certain control on the screen, so that when the finger arrives, the terminal device can respond quickly to the user's operation.
  • A finger throwing action is one in which the finger keeps moving on the touch screen at the fastest rate until the touch point disappears; the finger goes from rest, to acceleration, to uniform high-speed movement, and then leaves the screen while still moving at high speed. According to this embodiment of the present invention, since the reported point is the coordinate parameter of the predicted touch point, when the finger leaves the touch screen while moving at high speed, for example at point P, the reported point is the coordinate parameter of point P plus the displacement difference, even though the finger never actually touched that point; in effect, the distance the finger would move within the next 12.5 ms is reported.
  • In this embodiment of the present invention, by obtaining the corresponding instruction according to the coordinate parameter of the predicted touch point and without improving the hardware of the terminal device, the time for the terminal device to acquire an instruction is reduced and the interaction between the terminal device and the user is improved.
  • FIG. 5 is a schematic structural diagram of a terminal device according to Embodiment 5 of the present invention.
  • the terminal device may include: an acquiring unit 501, configured to acquire gesture information, where the gesture information includes: a coordinate parameter of an actual touch point;
  • the processing unit 502 is configured to acquire a coordinate parameter of the predicted touch point according to the coordinate parameter of the actual touch point acquired by the acquiring unit 501, and obtain a coordinate parameter corresponding to the predicted touch point according to the coordinate parameter of the predicted touch point. Instructions.
  • the acquiring unit 501 is configured to acquire gesture information, where the gesture information includes: a coordinate parameter of an actual touch point.
  • The processing unit 502 acquires, according to the coordinate parameter of the actual touch point acquired by the acquiring unit 501, the coordinate parameter of the predicted touch point, and acquires, according to the coordinate parameter of the predicted touch point, the instruction corresponding to the coordinate parameter of the predicted touch point.
  • The acquiring unit 501 and the processing unit 502 can be used to perform the method in Embodiment 1.
  • For details, refer to the description of the method in Embodiment 1; details are not described herein again.
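  • Purely as an illustration, the split between the acquiring unit 501 and the processing unit 502 can be sketched as two cooperating classes; the class names, method names, and the fixed displacement difference are assumptions, not definitions from the patent.

```python
# Illustrative sketch of the acquiring unit 501 / processing unit 502 split.
# Class names, method names, and the fixed displacement difference are assumptions.

from typing import Callable, Dict, Optional, Tuple

Point = Tuple[int, int]

class AcquiringUnit:
    """Acquires gesture information, including the coordinate of the actual touch point."""
    def acquire_gesture_info(self, raw_event: dict) -> Point:
        return raw_event["actual_point"]

class ProcessingUnit:
    """Predicts the touch point and looks up the instruction that corresponds to it."""
    def __init__(self, instruction_map: Dict[Point, Callable[[], None]], s: Point = (16, 0)):
        self.instruction_map = instruction_map
        self.s = s  # assumed displacement difference

    def predict(self, actual: Point) -> Point:
        return (actual[0] + self.s[0], actual[1] + self.s[1])

    def acquire_instruction(self, actual: Point) -> Optional[Callable[[], None]]:
        return self.instruction_map.get(self.predict(actual))

# Usage: the terminal device wires the two units together.
acquiring_unit = AcquiringUnit()
processing_unit = ProcessingUnit({(116, 200): lambda: print("program background execution instruction")})
actual = acquiring_unit.acquire_gesture_info({"actual_point": (100, 200)})
instruction = processing_unit.acquire_instruction(actual)
if instruction:
    instruction()
```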
  • In addition to the functional modules described above, the terminal device of this embodiment also has modules necessary for a terminal device, such as a power supply module and an interface module connected to peripheral devices; these modules are not shown in the drawings.
  • In this embodiment of the present invention, a terminal device that obtains the corresponding instruction according to the coordinate parameter of the predicted touch point reduces the time for the terminal device to acquire an instruction, improves the interaction between the terminal device and the user, and improves the user experience.
  • FIG. 6 is a schematic structural diagram of a possible implementation manner of a terminal device according to Embodiment 5 of the present invention. As shown in Figure 6: The terminal device further includes:
  • a first execution unit 5031 configured to receive the instruction of the processing unit 502, and execute the instruction, where the instruction includes a program background execution instruction;
  • The program background execution instruction is an instruction that causes the terminal device to perform a specific operation in the background.
  • the terminal device further includes:
  • The determining unit 504 is configured to receive the coordinate parameter of the predicted touch point and the coordinate parameter of the actual touch point from the processing unit 502, and determine whether the coordinate parameter of the predicted touch point is equal to the coordinate parameter of the actual touch point.
  • the terminal device further includes:
  • The first display unit 5051 is configured to receive the execution result of the program background execution instruction from the first execution unit 5031 and the determination result of the determining unit 504, and, if the determination result is that within a predetermined time the coordinate parameter of the predicted touch point is equal to the coordinate parameter of the actual touch point, display that execution result.
  • The first execution unit 5031 receives the instruction of the processing unit 502 and executes the instruction, where the instruction includes a program background execution instruction. The determining unit 504 receives the coordinate parameter of the predicted touch point and the coordinate parameter of the actual touch point from the processing unit 502, and determines whether the coordinate parameter of the predicted touch point is equal to the coordinate parameter of the actual touch point. If the determination result is that within a predetermined time the coordinate parameter of the predicted touch point is equal to the coordinate parameter of the actual touch point, the first display unit 5051, which receives the execution result of the program background execution instruction from the first execution unit 5031 and the determination result of the determining unit 504, displays that execution result.
  • The first execution unit 5031 may be configured to perform the method of step S2041 in Embodiment 2, the determining unit 504 may be configured to perform the method of step S2051 in Embodiment 2, and the first display unit 5051 may be configured to perform the method of step S2061 in Embodiment 2. For details, refer to the description of the method in Embodiment 2; details are not described herein again.
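  • The cooperation of the first execution unit 5031, the determining unit 504, and the first display unit 5051 can be sketched, under assumed names and sample data, as an execute-then-confirm flow like the one below.

```python
# Sketch of the execute-in-background-then-confirm flow formed by the first
# execution unit 5031, the determining unit 504, and the first display unit 5051.
# Function names, the sample data, and the discard behaviour are assumptions.

from typing import Callable, List, Optional, Tuple

Point = Tuple[int, int]

def run_in_background(instruction: Callable[[], str]) -> str:
    """First execution unit: execute the program background execution instruction."""
    return instruction()

def prediction_confirmed(predicted: Point, actual_points: List[Point], periods: int) -> bool:
    """Determining unit: within the predetermined number of reporting periods,
    did an actual touch point equal the predicted touch point?"""
    return predicted in actual_points[:periods]

def display_or_discard(result: str, confirmed: bool) -> Optional[str]:
    """First display unit: display the result only when the prediction was confirmed."""
    return result if confirmed else None  # otherwise the execution result is discarded

predicted = (116, 200)
result = run_in_background(lambda: "photo list loaded")
confirmed = prediction_confirmed(predicted, [(108, 200), (116, 200)], periods=3)
print(display_or_discard(result, confirmed))  # "photo list loaded" is displayed
```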
  • In this embodiment of the present invention, a terminal device that obtains the corresponding instruction according to the coordinate parameter of the predicted touch point reduces the time for the terminal device to acquire an instruction, improves the interaction between the terminal device and the user, and improves the user experience. At the same time, by executing the program in the background in advance, the response speed of the terminal device is accelerated, the waiting time of the user is reduced, and the intelligent interaction capability between the terminal device and the user is improved.
  • FIG. 7 is a schematic structural diagram of another possible implementation manner of a terminal device according to Embodiment 5 of the present invention. As shown in Figure 7:
  • the terminal device further includes:
  • the second execution unit 5032 is configured to receive the instruction of the processing unit 502, and execute the instruction, where the instruction includes displaying a touch location instruction.
  • the terminal device further includes:
  • The second display unit 5052 is configured to receive the execution result of the display touch position instruction from the second execution unit 5032 and display that execution result.
  • In operation, the second execution unit 5032 receives the instruction of the processing unit 502 and executes the instruction, where the instruction includes a display touch position instruction; the second display unit 5052 receives the execution result of the display touch position instruction from the second execution unit 5032 and displays that execution result.
  • the second execution unit 5032 can be used to perform the method of step S2042 in the second embodiment, and the second display unit 5052 can be used to perform the method of step S2052 in the second embodiment.
  • the description is not repeated here.
  • In this embodiment of the present invention, a terminal device that obtains the corresponding instruction according to the coordinate parameter of the predicted touch point reduces the time for the terminal device to acquire an instruction, improves the interaction between the terminal device and the user, and improves the user experience. At the same time, by acquiring the touch point in advance, the response speed of the terminal device is accelerated, the tracking rate of the touch position is improved, and the intelligent interaction capability between the terminal device and the user is improved.
  • FIG. 8 is a schematic structural diagram of another possible implementation manner of a terminal device according to Embodiment 5 of the present invention. As shown in Figure 8:
  • the terminal device further includes:
  • A first execution unit 5031 is configured to receive the instruction of the processing unit 502 and execute the instruction, where the instruction includes a program background execution instruction; the program background execution instruction is an instruction that causes the terminal device to perform a specific operation in the background.
  • the second execution unit 5032 is configured to receive the instruction of the processing unit 502, and execute the instruction, where the instruction includes displaying a touch location instruction.
  • the terminal device further includes:
  • The determining unit 504 is configured to receive the coordinate parameter of the predicted touch point and the coordinate parameter of the actual touch point from the processing unit 502, and determine whether the coordinate parameter of the predicted touch point is equal to the coordinate parameter of the actual touch point.
  • the terminal device further includes:
  • The first display unit 5051 is configured to receive the execution result of the program background execution instruction from the first execution unit 5031 and the determination result of the determining unit 504, and, if the determination result is that within a predetermined time the coordinate parameter of the predicted touch point is equal to the coordinate parameter of the actual touch point, display that execution result.
  • The second display unit 5052 is configured to receive the execution result of the display touch position instruction from the second execution unit 5032 and display that execution result.
  • The first execution unit 5031 and the second execution unit 5032 may be configured to perform the method of step S2043 in Embodiment 2, the determining unit 504 may be configured to perform the method of step S2063 in Embodiment 2, the first display unit 5051 may be configured to perform the method of step S2073 in Embodiment 2, and the second display unit 5052 may be configured to perform the method of step S2053 in Embodiment 2. For details, refer to the description of the method in Embodiment 2; details are not described herein again.
  • In this embodiment of the present invention, a terminal device that obtains the corresponding instruction according to the coordinate parameter of the predicted touch point reduces the time for the terminal device to acquire an instruction, improves the interaction between the terminal device and the user, and improves the user experience.
  • the response speed of the terminal device is accelerated, the tracking rate of the touch position is improved, and the intelligent interaction capability between the terminal device and the user is improved.
  • the response speed of the terminal device is accelerated, the waiting time of the user is reduced, and the intelligent interaction capability between the terminal device and the user is improved.
  • The processing unit 502 is configured to acquire the coordinate parameter of the predicted touch point according to the coordinate parameter of the actual touch point acquired by the acquiring unit, specifically by acquiring the coordinate parameter of the Nth predicted touch point according to the sum of the coordinate parameter of the Nth actual touch point and the displacement difference, where N is a natural number greater than zero and the displacement difference is the difference between the coordinate parameter of the Nth predicted touch point and the coordinate parameter of the Nth actual touch point.
  • In this embodiment of the present invention, a terminal device that obtains the corresponding instruction according to the coordinate parameter of the predicted touch point reduces the time for the terminal device to acquire an instruction, improves the interaction between the terminal device and the user, and improves the user experience.
  • optionally, the processing unit 502 is configured to obtain the coordinate parameter of the predicted touch point according to the coordinate parameter of the actual touch point acquired by the acquiring unit, specifically by obtaining the coordinate parameter of the first predicted touch point from the sum of the coordinate parameter of the first actual touch point and the displacement difference, and obtaining the coordinate parameter of the (N+1)th predicted touch point from the sum of the coordinate parameter of the Nth predicted touch point and the displacement difference, where N is a natural number greater than zero and the displacement difference is the difference between the coordinate parameter of the Nth predicted touch point and the coordinate parameter of the Nth actual touch point.
  • the embodiment of the present invention reduces the time for the terminal device to acquire an instruction and improves the interaction between the terminal device and the user by providing a terminal device that obtains the corresponding instruction according to the coordinate parameter of the predicted touch point; a sketch of this second variant follows.
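  • purely as a hedged illustration (not the authoritative implementation), the second variant can be sketched as follows; only a single x coordinate of a rightward movement is modelled, the displacement difference S is assumed to be computed from the gesture state as described in the embodiments, and the function name and parameters are invented for the example.

```python
def predict_next_x(pred_n_x, actual_n1_x, moved_last_period, gesture, s):
    """Return the x coordinate of the (N+1)th predicted touch point (second variant)."""
    if gesture == "still":
        return actual_n1_x                       # when still, the predicted point equals the actual point
    first = pred_n_x + s                         # Nth predicted coordinate plus the displacement difference
    second = actual_n1_x + moved_last_period     # (N+1)th actual coordinate plus pixels moved in the last report period
    if gesture == "accelerating":
        return max(first, second)                # accelerating: take the larger of the two
    if gesture == "decelerating":
        return min(first, second)                # decelerating: take the smaller of the two
    return first                                 # uniform motion: previous predicted coordinate plus S
```

  • with the decelerating example from Table 2 of the description (164 + 11 = 175 and 161 + 13 = 174), predict_next_x(164, 161, 13, "decelerating", 11) returns 174, matching the fifth reported predicted point.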
  • FIG. 9 is a schematic structural diagram of a mobile phone according to an embodiment of the present invention.
  • the handset 600 shown in FIG. 9 includes: a touch screen 61, a CPU 62, a memory 63, an RF circuit 65, a peripheral interface 66, an audio circuit 67, a speaker 68, and an I/O subsystem 69.
  • the touch screen 61 is configured to receive gesture information of a user, where the gesture information includes: a coordinate parameter of an actual touch point.
  • the gesture information includes the coordinate parameter, pressure parameter, area parameter, and the like of the touch point, as well as the moving speed and trend information of the touch.
  • this coordinate parameter is the coordinate parameter of the actual touch point.
  • the touch screen 61 can also be used to display the execution results of various programs. The touch screen 61 is an input interface and an output interface between the mobile phone and the user; in addition to receiving user instructions, it also presents visual output to the user, and the visual output can include graphics, text, icons, videos, and more.
  • the I/O subsystem 69 can control the input and output peripherals on the device, and can include a display controller 691 and one or more input controllers 692 for controlling other input/control devices.
  • optionally, the one or more input controllers 692 receive electrical signals from, or transmit electrical signals to, other input/control devices, and the other input/control devices may include physical buttons (push buttons, rocker buttons, etc.), a dial pad, a slide switch, a joystick, and a click wheel.
  • the input controller 692 can be connected to any of the following: a keyboard, an infrared port, a USB interface, and a pointing device such as a mouse.
  • the display controller 691 in the I/O subsystem 69 receives an electrical signal from the touch screen 61 or transmits an electrical signal to the touch screen 61.
  • the touch screen 61 detects contact on the touch screen, and the display controller 691 converts the detected contact into interaction with a user interface object displayed on the touch screen 61, that is, realizes human-computer interaction; the user interface object displayed on the touch screen 61 may be an icon for running a game, an icon for connecting to a corresponding network, a filtering mode, or the like.
  • the device can also include an optical mouse, which is a touch-sensitive surface that does not display visual output, or an extension of the touch-sensitive surface formed by the touch screen.
  • the CPU 62 is configured to process the received data. Specifically, the coordinate parameter of the predicted touch point is obtained according to the coordinate parameter of the actual touch point acquired by the touch screen 61.
  • the instruction includes a program background execution instruction and/or a display touch position instruction; the specific method by which the CPU 62 obtains the coordinate parameter of the predicted touch point according to the coordinate parameter of the actual touch point acquired by the acquiring unit may follow the methods described in Embodiment 3 and Embodiment 4, and details are not described herein again; a brief sketch of this prediction is given below.
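  • purely as an illustration of the displacement-difference rule given in Embodiment 3 (Nth predicted coordinate = Nth actual coordinate + S, with S = 0 when still, S = L for uniform motion, S = L×(1+1/M) when accelerating, and S = L×(1-1/M) when decelerating, M = 5 being the empirically measured value quoted in the description), the following sketch models a single x coordinate of a rightward drag; the function names and the rounding choice are assumptions.

```python
M = 5  # empirically measured constant quoted in the description


def displacement(gesture, pixels_moved_last_period):
    """Return the displacement difference S for the current gesture state."""
    l = pixels_moved_last_period
    if gesture == "still":
        return 0
    if gesture == "uniform":
        return l
    if gesture == "accelerating":
        return l * (1 + 1 / M)
    if gesture == "decelerating":
        return l * (1 - 1 / M)
    raise ValueError("unknown gesture state")


def predict_x(actual_x, prev_actual_x, gesture):
    """Embodiment 3: Nth predicted coordinate = Nth actual coordinate + S."""
    l = abs(actual_x - prev_actual_x)   # pixels actually moved in the last touch-event report period
    return round(actual_x + displacement(gesture, l))
```

  • for the uniform 16-pixel-per-period drag used in the worked example of the description, predict_x(100, 84, "uniform") returns 116, which is the first predicted point reported in Table 1.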
  • the CPU 62 is the control center of the handset 600; it connects the various parts of the entire handset using various interfaces and lines, and performs the various functions of the handset 600 and processes data by running or executing software programs and/or modules stored in the memory 63 and invoking data stored in the memory 63, thereby performing overall monitoring of the handset.
  • the CPU 62 may include one or more processing units; optionally, the CPU 62 may integrate an application processor and a modem processor.
  • the application processor mainly processes an operating system, a user interface, and an application.
  • the modem processor primarily handles wireless communications. It can be understood that the above modem processor may not be integrated into the CPU 62.
  • the memory 63 may be configured to store the execution result of the CPU 62, the algorithm for acquiring the coordinate parameter of the predicted touch point according to the coordinate parameter of the actual touch point, and the instruction corresponding to the coordinate parameter of the predicted touch point.
  • the memory 63 may be accessed by the CPU 62, the peripheral interface 66, and the like; the memory 63 may include high-speed random access memory, and may also include non-volatile memory, such as one or more magnetic disk storage devices, flash memory devices, or other volatile solid-state storage devices. A minimal sketch of the coordinate-to-instruction mapping it holds is given below.
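  • a possible (assumed) layout of the mapping between touch-point coordinate parameters and their corresponding instructions held in the memory 63 is sketched here; the screen regions and instruction names are invented purely for illustration.

```python
# hypothetical coordinate-to-instruction table; each entry is a rectangular screen
# region (x_min, y_min, x_max, y_max) bound to the name of an instruction
INSTRUCTION_MAP = [
    (0,   0, 200, 200, "launch_game_in_background"),
    (200, 0, 400, 200, "connect_to_network_in_background"),
]


def instruction_for(predicted_x, predicted_y):
    """Look up the instruction corresponding to a predicted touch point, if any."""
    for x0, y0, x1, y1, name in INSTRUCTION_MAP:
        if x0 <= predicted_x < x1 and y0 <= predicted_y < y1:
            return name
    return None  # no instruction is bound to this coordinate parameter
```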
  • the RF circuit 65 can be used to transmit and receive information.
  • the RF circuit 65 is mainly used for establishing communication between the mobile phone and the wireless network (ie, the network side), and realizing data reception and transmission between the mobile phone and the wireless network. For example, sending and receiving short messages, emails, etc.
  • specifically, the RF circuit 65 receives and transmits RF signals, which are also referred to as electromagnetic signals; the RF circuit 65 converts an electrical signal into an electromagnetic signal or converts an electromagnetic signal into an electrical signal, and communicates with the communication network and other devices via the electromagnetic signal.
  • the RF circuit 65 may include known circuitry for performing these functions, including but not limited to an antenna system, an RF transceiver, one or more amplifiers, a tuner, one or more oscillators, a digital signal processor, a Subscriber Identity Module (SIM), and so on.
  • the peripheral interface 66 can connect the input and output peripherals of the device to the CPU 62 and the memory 63.
  • the audio circuit 67 is primarily operable to receive audio data from the peripheral interface 66, convert the audio data to an electrical signal, and transmit the electrical signal to the speaker 68.
  • the speaker 68 can be used to restore the voice signal received by the mobile phone from the wireless network through the RF circuit 65 to sound and play the sound to the user.
  • the power management chip 64 can be used for power supply and power management of the hardware connected to the CPU 62, the I/O subsystem 69, and the peripheral interface 66.
  • the touch screen 61 receives gesture information of the user, and the gesture information includes: coordinate parameters of the actual touch point.
  • the CPU 62 receives the coordinate parameter of the actual touch point acquired by the touch screen 61, acquires the coordinate parameter of the predicted touch point according to the algorithm stored in the memory 63, and acquires, according to the coordinate parameter of the predicted touch point, the instruction corresponding to the coordinate parameter of the predicted touch point, where the instruction includes a program background execution instruction and/or a display touch position instruction.
  • if the instruction corresponding to the coordinate parameter of the predicted touch point is a program background execution instruction, and within a predetermined time period the coordinate parameter of the predicted touch point is equal to the coordinate parameter of the actual touch point, the touch screen 61 displays the execution result of the program background execution instruction;
  • if the instruction corresponding to the coordinate parameter of the predicted touch point is a display touch position instruction, the touch screen 61 displays the execution result of the display touch position instruction.
  • the CPU 62 executes the instruction, and the touch screen 61 displays the execution result according to the result of the CPU 62 executing the instruction. It should be understood that the execution result may also be output through a component other than the touch screen 61, for example, through the speaker 68 or the like, to complete the interaction process with the user. The sketch below summarizes this loop.
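  • the overall loop can be summarized by the following hedged sketch, which assumes the 80 Hz (12.5 ms) touch-event report period quoted in the description; the touch_screen, cpu, and memory objects and all of their methods are placeholders standing in for the roles of the touch screen 61, the CPU 62, and the memory 63, not real APIs.

```python
REPORT_PERIOD_MS = 12.5  # 80 Hz reporting rate quoted in the description


def interaction_loop(touch_screen, cpu, memory):
    """Read actual points, predict the next point, and act on the predicted instruction."""
    prev_point = None
    while touch_screen.is_touched():
        actual = touch_screen.read_actual_point()             # touch screen 61 reports every 12.5 ms
        predicted = cpu.predict_point(actual, prev_point)     # algorithm stored in the memory 63
        instruction = memory.instruction_for(predicted)       # coordinate-to-instruction mapping
        if instruction is not None:
            result = cpu.execute(instruction)                 # background execute and/or display touch position
            touch_screen.display(result)                      # result shown on the touch screen (or output elsewhere)
        prev_point = actual
```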
  • by providing a mobile phone that obtains the corresponding instruction according to the coordinate parameter of the predicted touch point, the embodiment of the invention reduces the time for the mobile phone to acquire the instruction, improves the interaction capability between the mobile phone and the user, and improves the user experience.
  • by acquiring the touch point in advance, the response speed of the mobile phone is accelerated, the tracking of the touch position to the finger is improved, and the intelligent interaction capability between the mobile phone and the user is improved.
  • at the same time, by executing the program in the background in advance, the response speed of the mobile phone is accelerated, the waiting time of the user is reduced, and the intelligent interaction capability between the mobile phone and the user is improved.
  • the embodiments of the present invention can be implemented by hardware, by firmware, or by a combination thereof.
  • when implemented in software, the functions described above may be stored in a terminal device readable medium or transmitted as one or more instructions or code on a terminal device readable medium.
  • the terminal device readable medium includes a terminal device storage medium and a communication medium, where the communication medium includes any medium that facilitates transfer of a terminal device program from one place to another.
  • the storage medium can be any available medium that the terminal device can access.
  • by way of example and not limitation, the terminal device readable medium may include RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage media or other magnetic storage devices, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by the terminal device.
  • in addition, any connection may appropriately be regarded as a terminal device readable medium.
  • for example, if the software is transmitted from a website, server, or other remote source using coaxial cable, fiber optic cable, twisted pair, digital subscriber line (DSL), or wireless technologies such as infrared, radio, and microwave, then the coaxial cable, fiber optic cable, twisted pair, DSL, or wireless technologies such as infrared, radio, and microwave are included in the definition of the medium.
  • disk and disc, as used herein, include compact disc (CD), laser disc, optical disc, digital versatile disc (DVD), floppy disk, and Blu-ray disc, where disks usually reproduce data magnetically, while discs reproduce data optically with lasers.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

An embodiment of the present invention provides a method for a terminal device to acquire an instruction, including: acquiring gesture information, where the gesture information includes a coordinate parameter of an actual touch point; acquiring a coordinate parameter of a predicted touch point according to the coordinate parameter of the actual touch point; and acquiring, according to the coordinate parameter of the predicted touch point, an instruction corresponding to the coordinate parameter of the predicted touch point. A terminal device is also provided, including: an acquiring unit, configured to acquire gesture information, where the gesture information includes a coordinate parameter of an actual touch point; and a processing unit, configured to acquire a coordinate parameter of a predicted touch point according to the coordinate parameter of the actual touch point acquired by the acquiring unit, and acquire, according to the coordinate parameter of the predicted touch point, an instruction corresponding to the coordinate parameter of the predicted touch point.

Description

一种终端设备获取指令的方法及终端设备
技术领域 本发明实施例涉及信息处理技术领域,尤其涉及一种终端设备获取指令的 方法及终端设备。 背景技术
终端设备屏幕时发生的事件被称为触碰事件。电容式触摸屏通过覆盖在终 端设备上的 X-Y电极网格工作, 运用上面的电压。 当有手指靠近电极时, 电 容会改变, 而且可以被测量。 通过比较所有电极的测量值, 就可以准确定位手 指的位置点, 即确定触摸点的坐标参数。
从用户触摸屏幕, 到应用响应事件, 到终端设备获取指令, 需要经过多个 模块处理, 时间比较长, 从具体效果上看, 就是当用户操作屏幕后, 实时响应 能力不佳。 如果要增加响应实时性, 最常用的处理方法是增加 CPU ( central processing unit, 中央处理器)处理能力、 增加显示刷新速率等方法。 在现有技术中, 这些终端设备获取指令的时间都比较长, 不能及时响应, 与用户的交互能力比较低。 发明内容
本技术方案提供一种终端设备获取指令的方法、终端设备, 用以实现根据 预测触摸点的坐标参数获取其对应的指令的方法,在不对终端设备硬件改进的 情况下,减少了终端设备获取指令的时间,提升了终端设备与用户的交互能力。 第一方面, 提供一种终端设备获取指令的方法: 获取手势信息, 所述手势信息包括: 实际触摸点的坐标参数; 根据所述实际触摸点的坐标参数, 获取预测触摸点的坐标参数; 根据所述预测触摸点的坐标参数,获取所述预测触摸点的坐标参数对应的 指令。
在第一方面的第一种可能的实现方式中, 所述指令具体包括: 程序后台执 行指令, 所述程序后台执行指令是终端设备在后台执行具体操作的指令。
结合第一方面的第一种可能的实现方式,在第二种可能的实现方式中,在 执行所述程序后台执行指令之后, 所述方法还包括: 如果在预定时间段内, 所 述预测触摸点的坐标参数等于所述实际触摸点的坐标参数,则显示执行所述程 序后台执行指令的执行结果。
结合第一方面或第一方面的第二种可能的实现方式,在第三种可能的实现 方式中, 所述指令具体还包括: 显示触摸位置指令。
结合第一方面的第三种可能的实现方式,在第四种可能的实现方式中,在 执行所述显示触摸位置指令之后, 所述方法还包括: 显示执行所述显示触摸位 置指令的执行结果。
结合第一方面或第一方面的第一种可能的实现方式或第一方面的第二种 可能的实现方式或第一方面的第三种可能的实现方式或第一方面的第四种可 能的实现方式,在第五种可能的实现方式中, 所述根据所述实际触摸点的坐标 参数, 获取预测触摸点的坐标参数的方法具体包括, 第 N个实际触摸点的坐标 参数与位移差的和等于第 N个预测触摸点的坐标参数; 所述所述 N是大于零的 自然数; 所述位移差是, 第 N个预测触摸点的坐标参数与第 N个实际触摸点的 坐标参数之间的差。 结合第一方面或第一方面的第一种可能的实现方式或第一方面的第二种 可能的实现方式或第一方面的第三种可能的实现方式或第一方面的第四种可 能的实现方式,在第六种可能的实现方式中, 所述根据所述实际触摸点的坐标 参数, 获取预测触摸点的坐标参数的方法具体包括的方法是, 第一个实际触摸 点的坐标参数与位移差之和等于第一个预测触摸点的坐标参数; 第 N个预测触 摸点的坐标参数与位移差之和等于第 N+1个预测触摸点的坐标参数; 所述所述 N是大于零的自然数; 所述位移差是, 第 N个预测触摸点的坐标参数与第 N个 实际触摸点的坐标参数之间的差。
结合第一方面的第六种可能的实现方式,在第七种可能的实现方式中, 所 述根据所述实际触摸点的坐标参数,获取预测触摸点的坐标参数的方法具体包 括, 当手势信息为加速运动时, 第 N+1个预测触摸点的第一坐标参数等于所述 第 N个预测触摸点的坐标参数与所述位移差之和, 第 N+1个预测触摸点的第二 坐标参数等于第 N+1个实际触摸点的坐标参数与上一个触摸事件上报周期内 实际移动的像素之和, 所述第 N+1个预测触摸点的坐标参数为所述第 N+1个预 测触摸点的第一坐标参数和所述第 N+1个预测触摸点的第二坐标参数之中的 较大值; 当手势信息为减速运动时, 第 N+1个预测触摸点的第一坐标参数等于 所述第 N个预测触摸点的坐标参数与所述位移差之和, 第 N+1个预测触摸点的 第二坐标参数等于第 N+1个实际触摸点的坐标参数与上一个触摸事件上报周 期内实际移动的像素之和, 所述第 N+1个预测触摸点的坐标参数为所述第 N+1 个预测触摸点的第一坐标参数和所述第 N+1个预测触摸点的第二坐标参数之 中的较小值; 当手势信息为静止时, 所述第 N+1个预测触摸点的坐标参数等于 第 N+1个实际触摸点的坐标参数。 结合第一方面的第五种可能的实现方式或第一方面的第六种可能的实现 方式或第一方面的第七种可能的实现方式,在第八种可能的实现方式中, 获取 所述位移差的方法具体包括, 当手势信息为静止时, s=o; 当手势信息为匀速 运动时, S=L; 当手势信息为加速运动时, S=L x ( 1+1/M ) ; 当手势信息为 减速运动时, S=L x ( 1-1/M ) ; 其中, 所述 S为所述位移差, 所述 L为一个触 摸事件上报周期内实际移动的像素, 所述 M是大于零的自然数。
第二方面, 提供一种终端设备:
获取单元, 用于获取手势信息, 所述手势信息包括: 实际触摸点的坐标参 数;
处理单元, 用于根据所述获取单元获取的所述实际触摸点的坐标参数, 获 取预测触摸点的坐标参数, 并根据所述预测触摸点的坐标参数, 获取所述预测 触摸点的坐标参数对应的指令。
在第二方面的第一种可能的实现方式中, 所述终端设备还包括: 第一执行 单元, 用于接收所述处理单元的所述指令, 并执行所述指令, 所述指令包括程 序后台执行指令; 所述程序后台执行指令,是终端设备在后台执行具体操作的 指令。
结合第一方面的第二种可能的实现方式,在第二种可能的实现方式中, 所 述终端设备还包括: 判断单元, 用于接收所述处理单元的所述预测触摸点的坐 标参数和所述实际触摸点的坐标参,并判断所述预测触摸点的坐标参数是否等 于所述实际触摸点的坐标参数;第一显示单元, 用于接收所述第一执行单元执 行所述程序后台执行指令的执行结果和所述判断单元的判断结果,如果所述判 断结果是在预定时间断内,所述预测触摸点的坐标参数等于所述实际触摸点的 坐标参数, 则显示所述后台执行指令的执行结果。
结合第一方面或第一方面的第二种可能的实现方式,在第三种可能的实现 方式中, 所述终端设备还包括: 第二执行单元, 用于接收所述处理单元的所述 指令, 并执行所述指令, 所述指令包括显示触摸位置指令;第二显示单元, 用 于接收所述第二执行单元执行所述显示触摸位置指令的执行结果,并显示所述 显示触摸位置指令的执行结果。
结合第一方面的第一种可能的实现方式或第一方面的第二种可能的实现 方式或第一方面的第三种可能的实现方式,在第四种可能的实现方式中, 所述 处理单元用于根据所述获取单元获取的所述实际触摸点的坐标参数,获取预测 触摸点的坐标参数, 具体包括, 根据第 N个实际触摸点的坐标参数与位移差的 和获取第 N个预测触摸点的坐标参数; 所述 N是大于零的自然数; 所述位移差 是, 第 N个预测触摸点的坐标参数与第 N个实际触摸点的坐标参数之间的差。
结合第一方面的第一种可能的实现方式或第一方面的第二种可能的实现 方式或第一方面的第三种可能的实现方式,在第五种可能的实现方式中, 所述 处理单元用于根据所述获取单元获取的所述实际触摸点的坐标参数,获取预测 触摸点的坐标参数, 具体包括,根据第一个实际触摸点的坐标参数与位移差之 和, 获取第一个预测触摸点的坐标参数; 根据第 N个预测触摸点的坐标参数与 位移差之和,获取第 N+1个预测触摸点的坐标参数; 所述 N是大于零的自然数; 所述位移差是, 第 N个预测触摸点的坐标参数与第 N个实际触摸点的坐标参数 之间的差。 本申请提供的技术方案通过一种根据预测触摸点的在坐标参数获取其对 应的指令的方法, 在不对终端设备硬件改进的情况下, 减少了终端设备获取指 令的时间, 提升了终端设备与用户的交互能力, 提升用户感受。 通过预先获取 触摸点, 加快了终端设备的响应速度, 提高了触摸位置的跟手率, 提升了终端 设备与用户的智能交互能力。 同时, 通过预先后台执行程序, 加快了终端设备 的响应速度, 减少用户等待时间, 提升了终端设备与用户的智能交互能力。 附图说明
为了更清楚地说明本发明实施例或现有技术中的技术方案,下面将对实施 例中所需要使用的附图作一简单地介绍,显而易见地, 下面描述中的附图是本 发明的一些实施例,对于本领域普通技术人员来讲,在不付出创造性劳动性的 前提下, 还可以根据这些附图获取其他的附图。 图 2为本发明实施例提供的方法实施例 2—种可能实现方式的流程图; 图 3为本发明实施例提供的方法实施例 2另一种可能实现方式的流程图; 图 4为本发明实施例提供的方法实施例 2另一种可能实现方式的流程图; 图 5为本发明实施例 5提供的终端设备的结构示意图
图 6为本发明实施例 5提供的终端设备一种可能的实现方式的结构示意图; 图 7为本发明实施例 5提供的终端设备另一种可能的实现方式的结构示意 图;
图 8为本发明实施例 5提供的终端设备另一种可能的实现方式的结构示意 图;
图 9为本发明实施例提供的手机的结构示意图。 具体实施方式 为使本发明实施例的目的、技术方案和优点更加清楚, 下面将结合本发明 实施例中的附图,对本发明实施例中的技术方案进行清楚、完整地描述,显然, 所描述的实施例是本发明一部分实施例, 而不是全部的实施例。基于本发明中 的实施例,本领域普通技术人员在没有作出创造性劳动前提下所获取的所有其 他实施例, 都属于本发明实施例保护的范围。 在本发明实施例中使用的术语是仅仅出于描述特定实施例的目的,而非旨 在限制本发明。在本发明实施例和所附权利要求书中所使用的单数形式的 "一 种" 、 "所述" 和 "该" 也旨在包括多数形式, 除非上下文清楚地表示其他含 义。 还应当理解, 本文中使用的术语 "和 /或" 是指并包含一个或多个相关联 的列出项目的任何或所有可能组合。 进一步应当理解, 本文中釆用的术语 "包 括" 规定了所述的特征、 整体、 步骤、 操作、 元件和 /或部件的存在, 而不排 除一个或多个其他特征、 整体、 步骤、 操作、 元件、 部件和 /或它们的组的存 在或附加。 在本发明实施例中,终端设备包括但不限于手机、个人数字助理(Personal Digital Assistant, PDA ) 、 平板电脑、 便携设备(例如, 便携式计算机) 、 台 式机、 ATM ( Automatic Teller Machine, 自动取款机 )机等设备, 本发明实施 例并不限定。 实施例 1 如图 1所示, 本发明实施例 1提供的终端设备获取指令的方法具体可以包 括: 5101 , 获取手势信息, 所述手势信息包括: 实际触摸点的坐标参数。 触摸终端设备屏幕时, 电容式触摸屏通过覆盖在终端设备上的 X-Y电极网 格工作, 运用上面的电压。 当有物体接触电极时, 电容会改变, 该电容改变可 以被测量。 通过比较所有电极的测量值, 就可以准确定位物体的位置点, 包括 该触摸点的坐标参数、 压力、 面积等。 所述手势信息就是触摸点的坐标参数、 压力参数、 面积参数等, 以及可该触摸的移动速度和趋势信息。 该坐标参数也 就是该实际触摸点的坐标参数。
5102, 根据所述实际触摸点的坐标参数, 获取预测触摸点的坐标参数。 该预测触摸点的坐标参数是指根据实际触摸点的坐标参数,通过一定方法 计算,得到的参数。 该预测触摸点的坐标参数可以和实际触摸点的坐标参数相 同, 也可以和实际触摸点的坐标参数不同, 本发明实施例对此不做限制。
5103 ,根据所述预测触摸点的坐标参数,获取所述预测触摸点的坐标参数 对应的指令。
在终端设备中触摸点的坐标参数都有对应的指令,通过预测触摸点的坐标 参数, 可以获得所述预测触摸点的坐标参数对应的指令, 也就是可以在实际触 摸点到达之前获得该指令。 本发明实施例通过一种根据预测触摸点的坐标参数获取其对应的指令的 方法, 在不对终端设备硬件改进的情况下, 减少了终端设备获取指令的时间, 提升了终端设备与用户的交互能力, 提升用户感受。 实施例 2
图 2为本发明实施例提供的方法实施例 2—种可能实现方式的流程图。 如图 2所示, 终端设备获取指令的方法可以包括:
5201、 获取手势信息, 所述手势信息包括: 实际触摸点的坐标参数。
触摸终端设备屏幕时, 电容式触摸屏通过覆盖在终端设备上的 X-Y电极网 格工作, 运用上面的电压。 当有物体接触电极时, 电容会改变, 该电容改变可 以被测量。 通过比较所有电极的测量值, 就可以准确定位物体的位置点, 包括 该触摸点的坐标参数、 压力、 面积等。 所述手势信息就是触摸点的坐标参数、 压力参数、 面积参数等, 以及可该触摸的移动速度和趋势信息。 该坐标参数也 就是该实际触摸点的坐标参数。
通过比较所有电极的测量值, 就可以准确定位物体的位置点, 即确定触摸 点的坐标参数。 该坐标参数也就是该实际触摸点的坐标参数。
5202、 根据所述实际触摸点的坐标参数, 获取预测触摸点的坐标参数。 该预测触摸点的坐标参数是指根据实际触摸点的坐标参数,通过一定方法 计算,得到的参数。 该预测触摸点的坐标参数可以和实际触摸点的坐标参数相 同, 也可以和实际触摸点的坐标参数不同, 本发明实施例对此不做限制。
所述根据所述实际触摸点的坐标参数,获取预测触摸点的坐标参数的一种 方法, 方法一可以是, 第 N个实际触摸点的坐标参数与位移差的和等于第 N个 预测触摸点的坐标参数; 所述所述 N是大于零的自然数; 所述位移差是, 第 N 个预测触摸点的坐标参数与第 N个实际触摸点的坐标参数之间的差。
所述根据所述实际触摸点的坐标参数,获取预测触摸点的坐标参数的另一 种方法, 方法二可以是, 第一个实际触摸点的坐标参数与位移差之和等于第一 个预测触摸点的坐标参数; 第 N+1个预测触摸点的坐标参数等于第 N个预测触 摸点的坐标参数与位移差之和;所述所述 N是大于零的自然数;所述位移差是, 第 N个预测触摸点的坐标参数与第 N个实际触摸点的坐标参数之间的差。
可选的,
所述方法二还可以是, 当手势信息为加速运动时, 第 N+1个预测触摸点的 第一坐标参数等于所述第 N个预测触摸点的坐标参数与所述位移差之和, 第 N+1个预测触摸点的第二坐标参数等于第 N+1个实际触摸点的坐标参数与上一 个触摸事件上报周期内实际移动的像素之和, 所述第 N+1个预测触摸点的坐标 参数为所述第 N+1个预测触摸点的第一坐标参数和所述第 N+1个预测触摸点的 第二坐标参数之中的较大值;
当手势信息为减速运动时, 第 N+ 1个预测触摸点的第一坐标参数等于所述 第 N个预测触摸点的坐标参数与所述位移差之和, 第 N+1个预测触摸点的第二 坐标参数等于第 N+1个实际触摸点的坐标参数与上一个触摸事件上报周期内 实际移动的像素之和, 所述第 N+1个预测触摸点的坐标参数为所述第 N+1个预 测触摸点的第一坐标参数和所述第 N+1个预测触摸点的第二坐标参数之中的 较小值;
当手势信息为静止时, 所述第 N+1个预测触摸点的坐标参数等于第 N+1个 实际触摸点的坐标参数。
对于方法一和方法二中的所述位移差可以通过如下的方法获得, 当手势信息为静止时, s=o;
当手势信息为匀速运动时, S=L;
当手势信息为加速运动时, S=L x ( 1+1/M ) ;
当手势信息为减速运动时, S=L x ( 1-1/M ) ;
其中, 所述 S为位移差, 所述 L为一个触摸事件上报周期内实际移动的像 素, 所述 M是大于零的自然数。
所述位移差是, 第 N个预测触摸点的坐标参数与第 N个实际触摸点的坐标 参数之间的差。
在终端设备的触摸屏检测到触摸事件后, 将该事件通过通过 I2C ( Inter-Integrated Circuit, I2C协议)总线和事件读取模块上报给框架层的手势 转换模块, 通过手势转换模块获取该触摸事件的手势信息。 所述手势信息, 包 括手势为加速运动、 减速运动、 勾速运动、 静止等信息。 应该理解的是, 所述 通过手势转换模块获取该触摸事件的手势信息, 不仅包括手势为加速运动、减 速运动、 勾速运动、 静止等信息, 还包括其他手势信息, 本发明实施例在此不 做限制。
S203、根据所述预测触摸点的坐标参数,获取所述预测触摸点的坐标参数 对应的指令。
在触摸屏上每一个坐标参数都有该坐标参数对应的指令,所述预测触摸点 的坐标参数, 同样也对应具体的指令。 所述坐标参数与所属指令的对应关系存 储在终端设备上, 该对应关系的内容可以是固定不变的, 也可以是根据触摸屏 显示内容的变化自动更新的, 本发明实施例对此不做限制。
S2041、 执行所述指令, 所述指令是程序后台执行指令。
所述程序后台执行指令, 是终端设备在后台执行具体操作的指令。 所述后 台操作是指该指令的执行过程不对用户显示, 用户也不用关心, 完全由终端设 备自动执行。
S2051、判断在预定时间段内, 所述预测触摸点的坐标参数是否等于所述 实际触摸点的坐标参数。如果所述预测触摸点的坐标参数等于所述实际触摸点 的坐标参数, 则执行 S2061 ; 否则执行 S2071。
所述预定时间段是指预定个数的触摸事件上报周期。所述预测触摸点的坐 标参数是否等于实际触摸点的坐标参数就相当于用户实际触摸该图标、 控件 等, 相当于用户执行某种操作。
S2061、 显示执行所述程序后台执行指令的执行结果。
即在指预定个数的触摸事件上报周期内,所述预测触摸点的坐标参数等于 所述实际触摸点的坐标参数, 就认为用户实际上执行了某种操作, 并立即显示 终端设备执行所述后台执行指令的执行结果。
S2071、 取消执行所述程序后台执行指令。
即在指预定个数的触摸事件上报周期内,所述预测触摸点的坐标参数不等 于所述实际触摸点的坐标参数, 就认为用户实际上只是从该点经过, 终端设备 实际上没有获取执行具体操作的指令, 此时如果所述指令还没有执行完毕, 就 取消执行所述指令,如果所述指令已经执行完毕, 就删除执行所述指令的执行 结果。 本发明实施例通过一种根据预测触摸点的坐标参数获取其对应的指令的 方法本,在不对终端设备硬件改进的情况下,减少了终端设备获取指令的时间, 提升了终端设备与用户的交互能力, 提升用户感受。 同时, 通过预先后台执行 程序, 加快了终端设备的响应速度, 减少用户等待时间, 提升了终端设备与用 户的智能交互能力。
可选的,
图 3为本发明实施例提供的方法实施例 2另一种可能实现方式的流程图。 如图 3所示, 在步骤 S203之后还可以包括: 52042、 执行所述指令, 所述指令是显示触摸位置指令。
52052、 显示执行所述显示触摸位置指令的执行结果。
所述显示执行所述显示触摸位置指令的执行结果,就是在触摸屏上显示触 摸位置的图标、 痕迹等, 以向用户明确该触摸点所在位置, 具体显示形式本发 明实施例对此不做限制。 本发明实施例通过一种根据预测触摸点的在坐标参数获取其对应的指令 的方法显示执行所述显示触摸位置指令的执行结果,在不对终端设备硬件改进 的情况下, 减少了终端设备获取指令的时间,提升了终端设备与用户的交互能 力,提升用户感受。 同时,通过预先获取触摸点,加快了终端设备的响应速度, 提高了触摸位置的跟手率, 提升了终端设备与用户的智能交互能力。
可选的,
图 4为本发明实施例提供的方法实施例 2另一种可能实现方式的流程图。 如图 4所示, 在步骤 S203之后还可以包括:
52043 , 执行所述指令, 所述指令是程序后台执行指令和显示触摸位置指 令。
所述程序后台执行指令, 是终端设备在后台执行具体操作的指令。 所述后 台操作是指该指令的执行过程不对用户显示, 用户也不用关心, 完全由终端设 备自动执行。
52053 , 显示触摸位置指令的执行结果。
所述显示执行所述显示触摸位置指令的执行结果,就是在触摸屏上显示触 摸位置的图标、 痕迹等, 以向用户明确该触摸点所在位置, 具体显示形式本发 明实施例对此不做限制。 S2063判断在预定时间段内, 所述预测触摸点的坐标参数是否等于所述实 际触摸点的坐标参数。如果所述预测触摸点的坐标参数等于所述实际触摸点的 坐标参数, 则执行 S2073 ; 否则执行 S2083。
所述预定时间段是指预定个数的触摸事件上报周期。所述预测触摸点的坐 标参数是否等于实际触摸点的坐标参数就相当于用户实际触摸该图标、 控件 等, 相当于用户执行某种操作。
应当理解, 步骤 S 2053和步骤 S2063之间执行没有先后顺序, 可以先执行 步骤 S 2053再执行步骤 S2063 , 也可以先执行步骤 S2063在执行步骤 S 2053 , 还可以同时执行步骤 S 2053和步骤 S2063 , 本发明实施例对此不做限制。
S2073 , 显示执行所述程序后台执行指令的执行结果。
即在指预定个数的触摸事件上报周期内,所述预测触摸点的坐标参数等于 所述实际触摸点的坐标参数, 就认为用户实际上执行了某种操作, 并立即显示 终端设备执行所述后台执行指令的执行结果。
S2083 , 取消执行所述程序后台执行指令。
即在指预定个数的触摸事件上报周期内,所述预测触摸点的坐标参数不等 于所述实际触摸点的坐标参数, 就认为用户实际上只是从该点经过, 终端设备 实际上没有获取执行某种操作的指令, 此时如果所述指令还没有执行完毕, 就 取消执行所述指令,如果所述指令已经执行完毕, 就删除执行所述指令的执行 结果。 本发明实施例通过一种根据预测触摸点的在坐标参数获取其对应的指令 的方法,在不对终端设备硬件改进的情况下,减少了终端设备获取指令的时间, 提升了终端设备与用户的交互能力, 提升用户感受。 通过预先获取触摸点, 加 快了终端设备的响应速度,提高了触摸位置的跟手率,提升了终端设备与用户 的智能交互能力。 同时,通过预先后台执行程序,加快了终端设备的响应速度, 减少用户等待时间, 提升了终端设备与用户的智能交互能力。 实施例 3
本发明实施例 3是在实施例 2的基础上实现的。 具体的, 实施例 2中所述的 根据所述实际触摸点的坐标参数,获取预测触摸点的坐标参数方法一的具体实 现方法可以通过本实施例 3来实现。
实施例 2的方法一可以是, 第 N个实际触摸点的坐标参数与位移差的和等 于第 N个预测触摸点的坐标参数; 所述所述 N是大于零的自然数; 所述位移差 是, 第 N个预测触摸点的坐标参数与第 N个实际触摸点的坐标参数之间的差。
在实施例 2的方法一的基础上, 方法一的具体方法如下所述:
第 N个实际触摸点的坐标参数与位移差的和等于第 N个预测触摸点的坐标 参数; 所述 N是大于零的自然数; 所述位移差是, 第 N个预测触摸点的坐标参 数与第 N个实际触摸点的坐标参数之间的差。
所述位移差的计算方法如下所示:
当手势信息为匀速运动时, S=L;
当手势信息为加速运动时, S=L x ( 1+1/M ) ;
当手势信息为减速运动时, S=L x ( 1-1/M ) ;
其中, 所述 S为位移差, 所述 L为一个触摸事件上报周期内实际移动的像 素, 所述 M是大于零的自然数。
所述触摸事件上报周期是, 终端设备上报实际触摸点的时间间隔, 终端设 备上报两个实际触摸点之间的时间间隔就是一个触摸时间上报周期。在这个周 期的时间段内,实际触摸点在触摸屏上改变的位置就是一个触摸事件上报周期 内实际移动的像素。
下面以人手在触摸屏上移动为例说明上述计算方法。
目前大部分手机的电容式触摸屏上报实际触摸点的频率设置为 80Hz, 即 只要手接触触摸屏, 触摸屏就保持每 12.5ms上报一个实际触摸点的坐标参数。 在这种情况下 12.5ms就是一个触摸事件上报周期。
经过分析 ,人手指在屏幕上的移动方式以及对应的实际触摸点的坐标参数 情况为, 拖拽动作, 一个拖拽动作是, 手指从静止到加速到勾速再到减速最后 到静止的过程。抛掷动作, 一个抛掷动作是手指一直是以最快速率在触摸屏上 移动, 直到触摸点消失。 是手指从静止, 到加速, 到匀速高速运动, 然后再高 速运动中手指离开屏幕。 经过上述分析可以得出, 手指移动过程中主要有匀速 运动、加速运动、减速运动这三种移动形式。针对不同的移动形式,位移差(第 N个预测触摸点的坐标参数,与第 N个实际触摸点的坐标参数之间的位置差距 ) 的计算方法如下所示:
当手势信息为匀速运动时, S=L;
当手势信息为加速运动时, S=L x ( 1+1/M ) ;
当手势信息为减速运动时, S=L x ( 1-1/M ) ;
其中, 所述 S为位移差, 单位是像素; 所述 L为一个触摸事件上报周期内 实际移动的像素, 单位是像素; 所述 M是大于零的自然数, 经过实际测算得出 所述 M等于 5。
应当理解的是, 所述 M等于 5通过实际测量得出的结果, 根据不同终端设 备的不同特性, 所述 M的值可以有所不同, 本发明实施例对此不做限制。
下面, 举个手指在触摸屏上以中等偏快速率, 每 12.5ms移动 16个像素进行 水平直线移动的操作(此值是通过实际测试得出的速率,为说明本发明实施例。 合实际情况的任何值, 本发明实施例对此不做限制。 )。 下表一, 是滑动一定 距离后在手指完全停止前 12个触摸事件上报周期内上报的实际触摸点的坐标 参数和预测触摸点的坐标参数。 下表中的数值只是举例说明, 实际情况可能与 下表中的数据有两三个像素的误差。 其中 ( 100,200 )表示上报的坐标参数为 横坐标 100 , 纵坐标 200。
实际触摸点的坐标参数和预测触摸点的坐标参数如下表一所示:
第1个触摸事件上报周期: 实际触摸点 (100,200), 预测触摸点 (116,200);
第2个触摸事件上报周期: 实际触摸点 (116,200), 预测触摸点 (132,200);
第3个触摸事件上报周期: 实际触摸点 (132,200), 预测触摸点 (148,200);
第4个触摸事件上报周期: 实际触摸点 (148,200), 预测触摸点 (164,200);
第5个触摸事件上报周期: 实际触摸点 (161,200), 预测触摸点 (172,200);
第6个触摸事件上报周期: 实际触摸点 (169,200), 预测触摸点 (175,200);
第7个触摸事件上报周期: 实际触摸点 (174,200), 预测触摸点 (178,200);
第8个触摸事件上报周期: 实际触摸点 (176,200), 预测触摸点 (178,200);
第9个触摸事件上报周期: 实际触摸点 (177,200), 预测触摸点 (178,200);
第10个触摸事件上报周期: 实际触摸点 (178,200), 预测触摸点 (178,200);
第11个触摸事件上报周期: 实际触摸点 (178,200), 预测触摸点 (178,200);
第12个触摸事件上报周期: 实际触摸点 (178,200), 预测触摸点 (178,200)。
以第1个周期为例, 计算过程为: 纵坐标为静止, 根据公式 S=0, 最终上报纵坐标参数为 200+0=200; 横坐标为匀速运动, 根据公式 S=L, 即 S=16, 最终上报横坐标参数为 100+16=116。以第5个周期为例: 纵坐标为静止, 上报纵坐标参数为 200; 横坐标为减速运动, 根据公式 S=L×(1-1/M), L=13, 即 S=13×(1-1/5)≈11, 最终上报横坐标参数为 161+11=172。其余各周期的计算过程同理。
表一
手指拖拽动作, 是指拖着某个图标或者菜单移动 (例如将 home界面上的 某个图标移动从一个位置移动到另外一个位置), 这样当手指移动的时候, 图 标就会跟着手指一起移动; 或者是指手指在触摸屏上移动, 在移动过程中, 手 指会在触摸屏上留下移动的痕迹。但由于上报实际触摸点的坐标参数以及显示 实际触摸点总存在的滞后性, 就会发现图标或者互动痕迹跟在手指的后面, 而 不是手指实际触摸点的位置。 因此根据本发明实施例, 由于实际上报点事预测 触摸点的坐标参数, 所以实际上可以提前 12.5ms上报触摸点, 这样当手指移动 的时候, 图标或者移动痕迹就会更紧密的跟着手指一起移动; 同时由于实际上 报点事预测触摸点的坐标参数,可以在手指到达屏幕上某个控件之前就执行预 测预测触摸点的坐标参数对应的后台程序执行指令, 这样当手指移动时, 就能 快速反应用户的操作。
手指抛掷动作,是手指一直是以最快速率在触摸屏上移动, 直到触摸点消 失。 是手指从静止, 到加速, 到勾速高速运动, 然后再高速运动中手指离开屏 幕。 因此根据本发明实施例, 由于实际上报点是预测触摸点的坐标参数, 因此 可以想象, 当手指高速移动的状态下离开触摸屏的时候,例如是从 P点离开的, 其实实际上报点是 P点的实际坐标参数加上位移差, 但实际上手指根本没有接 触到那个点, 即最后多上报了手指在 12.5ms内的移动距离。 但由于 12.5ms对于 用户是一个极短的时间,同样在 12.5ms内手指移动的位移也是用户不容易感受 到的, 因此不会降低与用户的智能交互能力, 也不会降低用户的使用感受。 本发明实施例通过一种根据预测触摸点的在坐标参数获取其对应的指令 的方法,在不对终端设备硬件改进的情况下,减少了终端设备获取指令的时间, 提升了终端设备与用户的交互能力, 提升用户感受。 实施例 4
本发明实施例 3是在实施例 2的基础上实现的。 具体的, 实施例 2中所述的 根据所述实际触摸点的坐标参数,获取预测触摸点的坐标参数方法二的具体实 现方法可以通过本实施例 4来实现。
实施例 2的方法二可以是, 第一个实际触摸点的坐标参数与位移差之和等 于第一个预测触摸点的坐标参数; 第 N+1个预测触摸点的坐标参数等于第 N个 预测触摸点的坐标参数与位移差之和; 所述 N是大于零的自然数; 所述位移 差是,第 N个预测触摸点的坐标参数与第 N个实际触摸点的坐标参数之间的差。
在实施例 2的方法一的基础上, 方法二的具体方法如下所述:
第一个实际触摸点的坐标参数与位移差之和等于第一个预测触摸点的坐 标参数; 第 N+1个预测触摸点的坐标参数等于第 N个预测触摸点的坐标参数与 位移差之和; 所述 N是大于零的自然数; 所述位移差是, 第 N个预测触摸点的 坐标参数与第 N个实际触摸点的坐标参数之间的差。 所述方法二可进一步细化为如下所述的方法, 当手势信息为加速运动时, 第 N+1个预测触摸点的第一坐标参数等于所述第 N个预测触摸点的坐标参数与 所述位移差之和, 第 N+1个预测触摸点的第二坐标参数等于第 N+1个实际触摸 点的坐标参数与上一个触摸事件上报周期内实际移动的像素之和, 所述第 N+1 个预测触摸点的坐标参数为所述第 N+1个预测触摸点的第一坐标参数和所述 第 N+1个预测触摸点的第二坐标参数之中的较大值; 当手势信息为减速运动 时, 第 N+1个预测触摸点的第一坐标参数等于所述第 N个预测触摸点的坐标参 数与所述位移差之和, 第 N+1个预测触摸点的第二坐标参数等于第 N+1个实际 触摸点的坐标参数与上一个触摸事件上报周期内实际移动的像素之和,所述第 N+1个预测触摸点的坐标参数为所述第 N+1个预测触摸点的第一坐标参数和所 述第 N+1个预测触摸点的第二坐标参数之中的较小值; 当手势信息为静止时, 所述第 N+1个预测触摸点的坐标参数等于第 N+1个实际触摸点的坐标参数。
所述位移差的计算方法如下所示:
当手势信息为匀速运动时, S=L;
当手势信息为加速运动时, S=L x ( 1+1/M ) ;
当手势信息为减速运动时, S=L x ( 1-1/M ) ;
其中, 所述 S为位移差, 所述 L为一个触摸事件上报周期内实际移动的像 素, 所述 M是大于零的自然数。
所述触摸事件上报周期是, 终端设备上报实际触摸点的时间间隔, 终端设 备上报两个实际触摸点之间的时间间隔就是一个触摸时间上报周期。在这个周 期的时间段内,实际触摸点在触摸屏上改变的位置就是一个触摸事件上报周期 内实际移动的像素。 下面以人手在触摸屏上移动为例说明上述计算方法。
目前大部分手机的电容式触摸屏上报实际触摸点的频率设置为 80Hz, 即 只要手接触触摸屏, 触摸屏就保持每 12.5ms上报一个实际触摸点的坐标参数。 在这种情况下 12.5ms就是一个触摸事件上报周期。
经过分析,人手指在屏幕上的移动方式以及对应的实际触摸点的坐标参数 情况为, 拖拽动作, 一个拖拽动作是, 手指从静止到加速到勾速再到减速最后 到静止的过程。抛掷动作, 一个抛掷动作是手指一直是以最快速率在触摸屏上 移动, 直到触摸点消失。 是手指从静止, 到加速, 到匀速高速运动, 然后再高 速运动中手指离开屏幕。 经过上述分析可以得出, 手指移动过程中主要有匀速 运动、加速运动、减速运动这三种移动形式。针对不同的移动形式,位移差(第 N个预测触摸点的坐标参数,与第 N个实际触摸点的坐标参数之间的位置差距 ) 的计算方法如下所示:
当手势信息为匀速运动时, S=L;
当手势信息为加速运动时, S=L x ( 1+1/M ) ;
当手势信息为减速运动时, S=L x ( 1-1/M ) ;
其中, 所述 S为位移差, 单位是像素; 所述 L为一个触摸事件上报周期内 实际移动的像素, 单位是像素; 所述 M是大于零的自然数, 经过实际测算得出 所述 M等于 5。
应当理解的是, 所述 M等于 5通过实际测量得出的结果, 根据不同终端设 备的不同特性, 所述 M的值可以有所不同, 本发明实施例对此不做限制。
下面, 举个手指在触摸屏上以中等偏快速率, 每 12.5ms移动 16个像素进行 水平直线移动的操作(此值是通过实际测试得出的速率,为说明本发明实施例。 合实际情况的任何值, 本发明实施例对此不做限制。 )。 下表二, 是滑动一定 距离后在手指完全停止前 12个触摸事件上报周期内上报的实际触摸点的坐标 参数和预测触摸点的坐标参数。 下表中的数值只是举例说明, 实际情况可能与 下表中的数据有两三个像素的误差。 其中 ( 100,200 )表示上报的坐标参数为 横坐标 100 , 纵坐标 200。
实际触摸点的坐标参数和预测触摸点的坐标参数如下表二所示:
第1个触摸事件上报周期: 实际触摸点 (100,200), 预测触摸点 (116,200);
第2个触摸事件上报周期: 实际触摸点 (116,200), 预测触摸点 (132,200);
第3个触摸事件上报周期: 实际触摸点 (132,200), 预测触摸点 (148,200);
第4个触摸事件上报周期: 实际触摸点 (148,200), 预测触摸点 (164,200);
第5个触摸事件上报周期: 实际触摸点 (161,200), 预测触摸点 (174,200);
第6个触摸事件上报周期: 实际触摸点 (169,200), 预测触摸点 (177,200);
第7个触摸事件上报周期: 实际触摸点 (174,200), 预测触摸点 (179,200);
第8个触摸事件上报周期: 实际触摸点 (176,200), 预测触摸点 (178,200);
第9个触摸事件上报周期: 实际触摸点 (177,200), 预测触摸点 (178,200);
第10个触摸事件上报周期: 实际触摸点 (178,200), 预测触摸点 (178,200);
第11个触摸事件上报周期: 实际触摸点 (178,200), 预测触摸点 (178,200);
第12个触摸事件上报周期: 实际触摸点 (178,200), 预测触摸点 (178,200)。
以第5个周期为例, 计算过程为: 纵坐标为静止, 第N个预测触摸点的纵坐标等于第N个实际触摸点的纵坐标, 即 200; 横坐标为减速运动, 第一坐标参数等于第4个预测触摸点的横坐标与位移差之和, 即 164+11=175, 第二坐标参数等于第5个实际触摸点的横坐标与上一个触摸事件上报周期内实际移动的像素之和, 即 161+13=174, 减速运动时取两者中的较小值, 因此最终上报横坐标参数为 174。第6个周期同理, 第一坐标参数为 174+6=180, 第二坐标参数为 169+8=177, 取较小值 177。静止时(第10至12个周期), 预测触摸点的坐标参数等于实际触摸点的坐标参数。其余各周期的计算过程同理。
表二
手指拖拽动作, 是指拖着某个图标或者菜单移动 (例如将 home界面上的 某个图标移动从一个位置移动到另外一个位置), 这样当手指移动的时候, 图 标就会跟着手指一起移动; 或者是指手指在触摸屏上移动, 在移动过程中, 手 指会在触摸屏上留下移动的痕迹。但由于上报实际触摸点的坐标参数以及显示 实际触摸点总存在的滞后性, 就会发现图标或者互动痕迹跟在手指的后面, 而 不是手指实际触摸点的位置。 因此根据本发明实施例, 由于实际上报点事预测 触摸点的坐标参数, 所以实际上可以提前 12.5ms上报触摸点, 这样当手指移动 的时候, 图标或者移动痕迹就会更紧密的跟着手指一起移动; 同时由于实际上 报点事预测触摸点的坐标参数,可以在手指到达屏幕上某个控件之前就执行预 测预测触摸点的坐标参数对应的后台程序执行指令, 这样当手指移动时, 就能 快速反应用户的操作。
手指抛掷动作,是手指一直是以最快速率在触摸屏上移动, 直到触摸点消 失。 是手指从静止, 到加速, 到勾速高速运动, 然后再高速运动中手指离开屏 幕。 因此根据本发明实施例, 由于实际上报点是预测触摸点的坐标参数, 因此 可以想象, 当手指高速移动的状态下离开触摸屏的时候,例如是从 P点离开的, 其实实际上报点是 P点的实际坐标参数加上位移差, 但实际上手指根本没有接 触到那个点, 即最后多上报了手指在 12.5ms内的移动距离。 但由于 12.5ms对于 用户是一个极短的时间,同样在 12.5ms内手指移动的位移也是用户不容易感受 到的, 因此不会降低与用户的智能交互能力, 也不会降低用户的使用感受。 本发明实施例通过一种根据预测触摸点的在坐标参数获取其对应的指令 的方法,在不对终端设备硬件改进的情况下,减少了终端设备获取指令的时间, 提升了终端设备与用户的交互能力, 提升用户感受。
实施例 5
图 5为本发明实施例 5提供的终端设备的结构示意图。 如图 5所示, 终端设备可以包括: 获取单元 501 , 用于获取手势信息, 所述手势信息包括: 实际触摸点的坐 标参数;
处理单元 502, 用于根据所述获取单元 501获取的实际触摸点的坐标参数, 获取预测触摸点的坐标参数, 并根据所述预测触摸点的坐标参数, 获取所述预 测触摸点的坐标参数对应的指令。
所述获取单元 501 , 获取手势信息, 所述手势信息包括: 实际触摸点的坐 标参数。 所述处理单元 502 ,根据所述获取单元 501获取的实际触摸点的坐标参 数, 获取预测触摸点的坐标参数, 并根据所述预测触摸点的坐标参数, 获取所 述预测触摸点的坐标参数对应的指令。
获取单元 501和处理单元 502可以用于执行实施例 1中的方法, 具体方法详 见实施例 1对所述方法的描述, 在此不再赘述。
在此说明, 本实施例的终端设备除了具有上述各功能模块之外, 还具有 电源模块、 与外设连接的接口模块等终端设备必备的模块, 这些模块未在附图 中示出。 本发明实施例通过一种根据预测触摸点的在坐标参数获取其对应的指令 的终端设备, 减少了终端设备获取指令的时间,提升了终端设备与用户的交互 能力, 提升用户感受。 可选的,
图 6为本发明实施例 5提供的终端设备一种可能的实现方式的结构示意图。 如图 6所示: 所述终端设备还包括:
第一执行单元 5031 , 用于接收所述处理单元 502的所述指令, 并执行所述 指令, 所述指令包括程序后台执行指令;
所述程序后台执行指令, 是终端设备在后台执行具体操作的指令。
所述终端设备还包括:
判断单元 504,用于接收所述处理单元 502的所述预测触摸点的坐标参数和 所述实际触摸点的坐标参,并判断所述预测触摸点的坐标参数是否等于所述实 际触摸点的坐标参数。
所述终端设备还包括:
第一显示单元 5051 ,用于接收所述第一执行单元 5031执行所述程序后台执 行指令的执行结果和所述判断单元 504的判断结果, 如果所述判断结果是在预 定时间断内, 所述预测触摸点的坐标参数等于所述实际触摸点的坐标参数, 则 根据所述执行结果, 显示所述执行结果。
所述第一执行单元 5031接收所述处理单元的所述指令, 并执行所述指令, 所述指令包括程序后台执行指令;所述判断单元 504接收所述处理单元 502的所 述预测触摸点的坐标参数和所述实际触摸点的坐标参,并判断所述预测触摸点 的坐标参数是否等于所述实际触摸点的坐标参数;如果所述判断结果是在预定 时间断内所述预测触摸点的坐标参数等于所述实际触摸点的坐标参数,所述第 一显示单元 5051根据接收所述第一执行单元 5031执行所述程序后台执行指令 的执行结果和所述判断单元 504的判断结果, 显示所述执行结果。
所述第一执行单元 5031可以用于执行实施例 2中步骤 S2041的方法,所述判 断单元 504可以用于执行实施例 2中步骤 S2051的方法, 所述第一显示单元 5051 可以用于执行实施例 2中步骤 S2061的方法, 具体方法详见实施例 2对所述方法 的描述, 在此不再赘述。 本发明实施例通过一种根据预测触摸点的在坐标参数获取其对应的指令 的终端设备, 减少了终端设备获取指令的时间,提升了终端设备与用户的交互 能力, 提升用户感受。 同时, 通过预先后台执行程序, 加快了终端设备的响应 速度, 减少用户等待时间, 提升了终端设备与用户的智能交互能力。 可选的,
图 7为本发明实施例 5提供的终端设备另一种可能的实现方式的结构示意 图。 如图 7所示:
所述终端设备还包括:
第二执行单元 5032 , 用于接收所述处理单元 502的所述指令, 并执行所述 指令, 所述指令包括显示触摸位置指令。
所述终端设备还包括:
第二显示单元 5052 ,用于接收所述第二执行单元 5032执行所述显示触摸位 置指令的执行结果, 并根据所述执行结果, 显示所述执行结果。
所述第二执行单元 5032 , 接收所述处理单元 502的所述指令, 并执行所述 指令, 所述指令包括显示触摸位置指令; 所述第二显示单元 5052接收所述第二 执行单元 5032执行所述显示触摸位置指令的执行结果, 并根据所述执行结果, 显示所述执行结果。
所述第二执行单元 5032可以用于执行实施例 2中步骤 S2042的方法,所述第 二显示单元 5052可用于执行实施例 2中步骤 S2052的方法,具体方法详见实施例 2对所述方法的描述, 在此不再赘述。 本发明实施例通过一种根据预测触摸点的在坐标参数获取其对应的指令 的终端设备, 减少了终端设备获取指令的时间,提升了终端设备与用户的交互 能力, 提升用户感受。 同时, 通过预先获取触摸点, 加快了终端设备的响应速 度, 提高了触摸位置的跟手率, 提升了终端设备与用户的智能交互能力。
可选的,
图 8为本发明实施例 5提供的终端设备另一种可能的实现方式的结构示意 图。 如图 8所示:
所述终端设备还包括:
第一执行单元 5031 , 用于接收所述处理单元 502的所述指令, 并执行所述 指令, 所述指令包括程序后台执行指令; 所述程序后台执行指令, 是终端设备 在后台执行具体操作的指令。
第二执行单元 5032 , 用于接收所述处理单元 502的所述指令, 并执行所述 指令, 所述指令包括显示触摸位置指令。
所述终端设备还包括:
判断单元 504 ,用于接收所述处理单元 502的所述预测触摸点的坐标参数和 所述实际触摸点的坐标参,并判断所述预测触摸点的坐标参数是否等于所述实 际触摸点的坐标参数。
所述终端设备还包括:
第一显示单元 5051 ,用于接收所述第一执行单元 5031执行所述程序后台执 行指令的执行结果和所述判断单元 504的判断结果, 如果所述判断结果是在预 定时间断内, 所述预测触摸点的坐标参数等于所述实际触摸点的坐标参数, 则 根据所述执行结果, 显示所述执行结果。 第二显示单元 5052,用于接收所述第二执行单元 5032执行所述显示触摸位 置指令的执行结果, 并根据所述执行结果, 显示所述执行结果。
所述第一执行单元 5031和所述第二执行单元 5032可以用于执行实施例 2中 步骤 S2043的方法, 所述判断单元 504可以用于执行实施例 2中步骤 S2063的方 法,所述第一显示单元 5051可以用于执行实施例 2中步骤 S2073的方法,所述第 二显示单元 5052可用于执行实施例 2中步骤 S2053的方法,具体方法详见实施例 2对所述方法的描述, 在此不再赘述。 本发明实施例通过一种根据预测触摸点的在坐标参数获取其对应的指令 的终端设备, 减少了终端设备获取指令的时间,提升了终端设备与用户的交互 能力, 提升用户感受。 通过预先获取触摸点, 加快了终端设备的响应速度, 提 高了触摸位置的跟手率, 提升了终端设备与用户的智能交互能力。 同时, 通过 预先后台执行程序, 加快了终端设备的响应速度, 减少用户等待时间, 提升了 终端设备与用户的智能交互能力。 可选的,
所述处理单元 502用于根据所述获取单元获取的所述实际触摸点的坐标参 数, 获取预测触摸点的坐标参数, 具体包括,
根据第 N个实际触摸点的坐标参数与位移差的和获取第 N个预测触摸点的 坐标参数; 所述 N是大于零的自然数; 所述位移差是, 第 N个预测触摸点的坐 标参数与第 N个实际触摸点的坐标参数之间的差。 对所述方法的描述, 在此不再赘述。 本发明实施例通过一种根据预测触摸点的在坐标参数获取其对应的指令 的终端设备, 减少了终端设备获取指令的时间,提升了终端设备与用户的交互 能力, 提升用户感受。
可选的,
所述处理单元 502用于根据所述获取单元获取的所述实际触摸点的坐标参 数, 获取预测触摸点的坐标参数, 具体包括,
根据第一个实际触摸点的坐标参数与位移差之和 ,获取第一个预测触摸点 的坐标参数; 根据第 N个预测触摸点的坐标参数与位移差之和, 获取第 N+1个 预测触摸点的坐标参数; 所述 N是大于零的自然数; 所述位移差是, 第 N个预 测触摸点的坐标参数与第 N个实际触摸点的坐标参数之间的差。 对所述方法的描述, 在此不再赘述。 本发明实施例通过一种根据预测触摸点的在坐标参数获取其对应的指令 的终端设备, 减少了终端设备获取指令的时间,提升了终端设备与用户的交互
实施例 6
如图 9所示, 本实施例以手机为例对本发明实施例进行具体说明。 应该理 解的是, 图示手机 600仅仅是手机的一个范例, 并且手机 600可以具有比图中所 示出的更过的或者更少的部件, 可以组合两个或更多的部件, 或者可以具有不 同的部件配置。 图中所示出的各种部件可以在包括一个或多个信号处理和 /或 专用集成电路在内的硬件、 软件、 或硬件和软件的组合中实现。 图 9为本发明实施例提供的手机的结构示意图。如图 6所示手机 600包括: 触 摸屏 61 , CPU62, 存储器 63 , RF电路 65, 外设接口 66, 音频电路 67 , 扬声 器 68, I/O子系统 69。
所述触摸屏 61 , 用于接收用户的手势信息, 所述手势信息包括: 实际触摸 点的坐标参数。所述手势信息就是触摸点的坐标参数、压力参数、面积参数等, 以及可该触摸的移动速度和趋势信息。该坐标参数也就是该实际触摸点的坐标 参数。 该触摸屏 61 , 还可用于显示各种程序的执行结果。 执行结果执行结果执 行结果执行结果执行结果执行结果执行结果触摸屏 61是手机与用户之间的输 入接口和输出接口,除具有接收用户指令的功能外, 还将可视输出显示给用户, 可视输出可以包括图形、 文本、 图标、 视频等。
所述 I/O子系统 69:所述 I/O子系统 69可以控制设备上的输入输出外设, I/O 子系统 69可以包括显示控制器 691和用于控制其他输入 /控制设备的一个或多 个输入控制器 692。 可选的, 一个或多个输入控制器 692从其他输入 /控制设备 接收电信号或者向其他输入 /控制设备发送电信号, 其他输入 /控制设备可以包 括物理按钮(按压按钮、 摇臂按钮等) 、 拨号盘、 滑动开关、 操纵杆、 点击滚 轮。 值得说明的是, 输入控制器 692可以与以下任一个连接: 键盘、 红外端口、 USB接口以及诸如鼠标的指示设备。 所述 I/O子系统 69中的显示控制器 691从触 摸屏 61接收电信号或者向触摸屏 61发送电信号。 触摸屏 61检测触摸屏上的接 触, 显示控制器 691将检测到的接触转换为与显示在触摸屏 61上的用户界面对 象的交互, 即实现人机交互,显示在触摸屏 61上的用户界面对象可以是运行游 戏的图标、 联网到相应网络的图标、 筛选模式等。 值得说明的是, 设备还可以 包括光鼠, 光鼠是不显示可视输出的触摸敏感表面, 或者是由触摸屏形成的触 摸敏感表面的延伸。
所述 CPU62, 用于对接收到的数据进行处理。 具体的, 用于根据触摸屏 61 获取的实际触摸点的坐标参数, 获取预测触摸点的坐标参数。 所述指令包括程 序后台执行指令和 /或显示触摸位置指令; 所述 CPU62根据所述获取单元获取 的所述实际触摸点的坐标参数获取预测触摸点的坐标参数的具体方法,可以根 据实施例 3和实施例 4所述的方法获得, 在此不再赘述。
所述 CPU62是手机 600的控制中心, 利用各种接口和线路连接整个手机的 各个部分, 通过运行或执行存储在存储器 63内的软件程序和 /或模块, 以及调 用存储在存储器 63内的数据, 执行手机 600的各种功能和处理数据, 从而对手 机进行整体监控。可选的, CPU62可包括一个或多个处理单元;可选的, CPU62 可集成应用处理器和调制解调处理器,可选的,应用处理器主要处理操作系统、 用户界面和应用程序等, 调制解调处理器主要处理无线通信。 可以理解的是, 上述调制解调处理器也可以不集成到 CPU62中。
所述存储器 63 , 可以用于存储所述 CPU62的执行结果; 用于存储根据实际 触摸点的坐标参数获取预测触摸点坐标参数的算法;还用于存储所述预测触摸 点坐标参数对应的指令。 存储器 63可以被 CPU62、 外设接口 66等访问, 所述存 储器 63可以包括高速随机存取存储器,还可以包括非易失性存储器, 例如一个 或多个磁盘存储器件、 闪存器件、 或其他易失性固态存储器件。
所述 RF电路 65 , 可以用于发送和接收信息。 所述 RF电路 65 , 主要用于建 立手机与无线网络(即网络侧)的通信, 实现手机与无线网络的数据接收和发 送。 例如收发短信息、 电子邮件等。 具体地, RF电路 65接收并发送 RF信号, RF信号也称为电磁信号, RF电路 65将电信号转换为电磁信号或将电磁信号转 换为电信号, 并且通过该电磁信号与通信网络以及其他设备进行通信。 RF电 路 65可以包括用于执行这些功能的已知电路, 其包括但不限于天线系统、 RF 收发机、 一个或多个放大器、 调谐器、 一个或多个振荡器、 数字信号处理器、 用户标识模块 (Subscriber Identity Module, SIM)等等。
所述外设接口 66 , 所述外设接口可以将设备的输入和输出外设连接到
CPU62和存储器 63。
所述音频电路 67, 主要可用于从外设接口 66接收音频数据,将该音频数据 转换为电信号, 并且将该电信号发送给扬声器 68。
所述扬声器 68, 可用于将手机通过 RF电路 65从无线网络接收的语音信号, 还原为声音并向用户播放该声音。
所述电源管理芯片 64 , 可用于为 CPU62、 I/O子系统 69及外设接口 66所连 接的硬件进行供电及电源管理。
所述触摸屏 61接收用户的手势信息, 所述手势信息包括: 实际触摸点的坐 标参数。 所述 CPU62, 接收所述触摸屏 61获取的实际触摸点的坐标参数, 根据 存储器 63上存储的算法, 获取预测触摸点的坐标参数, 并根据预测触摸点的坐 标参数获取所述预测触摸点的坐标参数对应的指令,该指令包括程序后台执行 指令和 /或显示触摸位置指令。
如果预测触摸点的坐标参数对应的指令是程序后台执行指令,并且在预定 时间段内, 所述预测触摸点的坐标参数等于所述实际触摸点的坐标参数, 则所 述触摸屏 61显示执行所述程序后台执行指令的执行结果;
如果预测触摸点的坐标参数对应的指令是显示触摸位置指令,则所述触摸 屏 61显示触摸位置指令的执行结果。 所述 CPU62执行所述指令, 由所述触摸屏 61根据所述 CPU62执行所述指令 的执行结果, 显示所述执行结果。 应当理解, 所述执行结果也可能通过触摸屏 61以外的部分输出, 如, 可以由扬声器 68等输出, 完成与用户的交互过程。
上述结构可用于执行实施例 2中的方法, 具体方法详见实施例 2,在此不再 赘述。 本发明实施例通过一种根据预测触摸点的在坐标参数获取其对应的指令 的手机, 减少了手机获取指令的时间, 提升了手机与用户的交互能力, 提升用 户感受。 通过预先获取触摸点, 加快了手机的响应速度, 提高了触摸位置的跟 手率, 提升了手机与用户的智能交互能力。 同时, 通过预先后台执行程序, 加 快了手机的响应速度,减少用户等待时间,提升了手机与用户的智能交互能力。 通过以上的实施方式的描述,所属领域的技术人员可以清楚地了解到本发 明实施例可以用硬件实现, 或固件实现, 或它们的组合方式来实现。 当使用软 件实现时,可以将上述功能存储在终端设备可读介质中或作为终端设备可读介 质上的一个或多个指令或代码进行传输。终端设备可读介质包括终端设备存储 介质和通信介质 ,可选的通信介质包括便于从一个地方向另一个地方传送终端 设备程序的任何介质。存储介质可以是终端设备能够存取的任何可用介质。 以 此为例但不限于: 终端设备可读介质可以包括 RAM、 ROM, EEPROM、 CD-ROM或其他光盘存储、 磁盘存储介质或者其他磁存储设备、 或者能够用 于携带或存储具有指令或数据结构形式的期望的程序代码并能够由终端设备 存取的任何其他介质。 此外。 任何连接可以适当的成为终端设备可读介质。 例 如, 如果软件是使用同轴电缆、 光纤光缆、 双绞线、 数字用户线(DSL )或者 诸如红外线、无线电和微波之类的无线技术从网站、服务器或者其他远程源传 输的, 那么同轴电缆、 光纤光缆、 双绞线、 DSL 或者诸如红外线、 无线和微 波之类的无线技术包括在所属介质的定影中。 如本发明实施例所使用的, 盘 ( Disk )和碟( disc )包括压缩光碟( CD )、激光碟、光碟、数字通用光碟( DVD )、 软盘和蓝光光碟, 可选的盘通常磁性的复制数据, 而碟则用激光来光学的复制
总之, 以上所述仅为本发明技术方案的实施例而已, 并非用于限定本发明 的保护范围。 凡在本发明的精神和原则之内, 所作的任何修改、 等同替换、 改 进等, 均应包含在本发明的保护范围之内。

Claims

权 利 要 求
1、 一种终端设备获取指令的方法, 其特征在于, 所述方法包括: 获取手势信息, 所述手势信息包括: 实际触摸点的坐标参数;
根据所述实际触摸点的坐标参数, 获取预测触摸点的坐标参数; 根据所述预测触摸点的坐标参数,获取所述预测触摸点的坐标参数对应的 指令。
2、 根据权利要求 1所述的方法, 其特征在于, 所述指令具体包括: 程序后台执行指令,所述程序后台执行指令是终端设备在后台执行具体操 作的指令。
3、 根据权利要求 2所述的方法, 其特征在于, 在执行所述程序后台执行指 令之后, 所述方法还包括:
如果在预定时间段内,所述预测触摸点的坐标参数等于所述实际触摸点的 坐标参数, 则显示执行所述程序后台执行指令的执行结果。
4、 根据权利要求 1或 3所述的方法, 其特征在于, 所述指令具体还包括: 显示触摸位置指令。
5、 根据权利要求 4所述的方法, 其特征在于, 在执行所述显示触摸位置指 令之后, 所述方法还包括:
显示执行所述显示触摸位置指令的执行结果。
6、 根据权利要求 1至 5任一项所述的方法, 其特征在于, 所述根据所述实 际触摸点的坐标参数, 获取预测触摸点的坐标参数的方法具体包括:
第 N个实际触摸点的坐标参数与位移差的和等于第 N个预测触摸点的坐标 参数;
所述所述 N是大于零的自然数;
所述位移差是, 第 N个预测触摸点的坐标参数与第 N个实际触摸点的坐标 参数之间的差。
7、 根据权利要求 1至 5任一项所述的方法, 其特征在于, 所述根据所述实 际触摸点的坐标参数, 获取预测触摸点的坐标参数的方法具体包括的方法是, 第一个实际触摸点的坐标参数与位移差之和等于第一个预测触摸点的坐 标参数;
第 N个预测触摸点的坐标参数与位移差之和等于第 N+1个预测触摸点的坐 标参数;
所述所述 N是大于零的自然数;
所述位移差是, 第 N个预测触摸点的坐标参数与第 N个实际触摸点的坐标 参数之间的差。
8、 根据权利要求 7所述的方法, 其特征在于, 所述根据所述实际触摸点的 坐标参数, 获取预测触摸点的坐标参数的方法具体包括,
当手势信息为加速运动时, 第 N+ 1个预测触摸点的第一坐标参数等于所述 第 N个预测触摸点的坐标参数与所述位移差之和, 第 N+1个预测触摸点的第二 坐标参数等于第 N+1个实际触摸点的坐标参数与上一个触摸事件上报周期内 实际移动的像素之和, 所述第 N+1个预测触摸点的坐标参数为所述第 N+1个预 测触摸点的第一坐标参数和所述第 N+1个预测触摸点的第二坐标参数之中的 较大值;
当手势信息为减速运动时, 第 N+ 1个预测触摸点的第一坐标参数等于所述 第 N个预测触摸点的坐标参数与所述位移差之和, 第 N+1个预测触摸点的第二 坐标参数等于第 N+1个实际触摸点的坐标参数与上一个触摸事件上报周期内 实际移动的像素之和, 所述第 N+1个预测触摸点的坐标参数为所述第 N+1个预 测触摸点的第一坐标参数和所述第 N+1个预测触摸点的第二坐标参数之中的 较小值;
当手势信息为静止时, 所述第 N+ 1个预测触摸点的坐标参数等于第 N+ 1个 实际触摸点的坐标参数。
9、 根据权利要求 6至 8任一项所述的方法, 其特征在于, 获取所述位移差 的方法具体包括,
当手势信息为静止时, s=o;
当手势信息为匀速运动时, S=L;
当手势信息为加速运动时, S=L x ( 1+1/M ) ;
当手势信息为减速运动时, S=L x ( 1-1/M ) ;
其中,
所述 S为所述位移差,
所述 L为一个触摸事件上报周期内实际移动的像素,
所述 M是大于零的自然数。
10、 一种终端设备, 其特征在于, 所述终端设备包括: 获取单元, 用于获取手势信息, 所述手势信息包括: 实际触摸点的坐标参 数;
处理单元, 用于根据所述获取单元获取的所述实际触摸点的坐标参数, 获 取预测触摸点的坐标参数, 并根据所述预测触摸点的坐标参数, 获取所述预测 触摸点的坐标参数对应的指令。
11 ,根据权利要求 10所述的终端设备,其特征在于,所述终端设备还包括: 第一执行单元, 用于接收所述处理单元的所述指令, 并执行所述指令, 所 述指令包括程序后台执行指令;
所述程序后台执行指令, 是终端设备在后台执行具体操作的指令。
12,根据权利要求 11所述的终端设备,其特征在于,所述终端设备还包括: 判断单元,用于接收所述处理单元的所述预测触摸点的坐标参数和所述实 际触摸点的坐标参,并判断所述预测触摸点的坐标参数是否等于所述实际触摸 点的坐标参数;
第一显示单元,用于接收所述第一执行单元执行所述程序后台执行指令的 执行结果和所述判断单元的判断结果, 如果所述判断结果是在预定时间断内, 所述预测触摸点的坐标参数等于所述实际触摸点的坐标参数,则显示所述后台 执行指令的执行结果。
13 , 根据权利要求 10或 12任一项所述的终端设备, 其特征在于, 所述终端 设备还包括:
第二执行单元, 用于接收所述处理单元的所述指令, 并执行所述指令, 所 述指令包括显示触摸位置指令;
第二显示单元,用于接收所述第二执行单元执行所述显示触摸位置指令的 执行结果, 并显示所述显示触摸位置指令的执行结果。
14、 根据权利要求 11至 13任一项所述的终端设备, 其特征在于, 所述处理 单元用于根据所述获取单元获取的所述实际触摸点的坐标参数,获取预测触摸 点的坐标参数, 具体包括, 根据第 N个实际触摸点的坐标参数与位移差的和获取第 N个预测触摸点的 坐标参数;
所述 N是大于零的自然数;
所述位移差是, 第 N个预测触摸点的坐标参数与第 N个实际触摸点的坐标 参数之间的差。
15、 根据权利要求 11至 13任一项所述的终端设备, 其特征在于, 所述处理 单元用于根据所述获取单元获取的所述实际触摸点的坐标参数,获取预测触摸 点的坐标参数, 具体包括,
根据第一个实际触摸点的坐标参数与位移差之和,获取第一个预测触摸点 的坐标参数;
根据第 N个预测触摸点的坐标参数与位移差之和, 获取第 N+ 1个预测触摸 点的坐标参数;
所述 N是大于零的自然数;
所述位移差是, 第 N个预测触摸点的坐标参数与第 N个实际触摸点的坐标
PCT/CN2012/080719 2012-08-29 2012-08-29 一种终端设备获取指令的方法及终端设备 WO2014032239A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/CN2012/080719 WO2014032239A1 (zh) 2012-08-29 2012-08-29 一种终端设备获取指令的方法及终端设备
CN201280003611.0A CN103403665B (zh) 2012-08-29 2012-08-29 一种终端设备获取指令的方法及终端设备

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2012/080719 WO2014032239A1 (zh) 2012-08-29 2012-08-29 一种终端设备获取指令的方法及终端设备

Publications (1)

Publication Number Publication Date
WO2014032239A1 true WO2014032239A1 (zh) 2014-03-06

Family

ID=49565841

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2012/080719 WO2014032239A1 (zh) 2012-08-29 2012-08-29 一种终端设备获取指令的方法及终端设备

Country Status (2)

Country Link
CN (1) CN103403665B (zh)
WO (1) WO2014032239A1 (zh)

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104677376B (zh) * 2013-11-29 2017-09-12 广东瑞图万方科技股份有限公司 导航系统手势指令输入方法及装置
CN103699271B (zh) * 2013-12-31 2017-08-11 广州视睿电子科技有限公司 触摸屏参数的设置方法和系统
CN104035714B (zh) * 2014-06-24 2017-05-03 中科创达软件股份有限公司 一种基于安卓系统的触摸事件处理方法、装置和设备
CN105487685A (zh) * 2015-11-20 2016-04-13 小米科技有限责任公司 空鼠遥控器的优化方法、装置和终端设备
CN107967058B (zh) * 2017-12-07 2021-04-13 联想(北京)有限公司 信息处理方法、电子设备和计算机可读存储介质
CN108055405B (zh) * 2017-12-26 2020-12-15 重庆传音通讯技术有限公司 唤醒终端的方法及终端
CN108874200B (zh) * 2018-04-16 2019-12-03 广州视源电子科技股份有限公司 书写速度的控制方法和装置
CN111666019B (zh) 2019-03-05 2022-01-25 台达电子工业股份有限公司 电子装置及选取目标物件的预测方法
TWI708167B (zh) * 2019-03-05 2020-10-21 台達電子工業股份有限公司 電子裝置及選取目標物件之預測方法
CN112506413B (zh) * 2020-12-16 2022-06-07 Oppo广东移动通信有限公司 触控点预测方法、装置、终端设备及计算机可读存储介质
CN113552966A (zh) * 2021-06-20 2021-10-26 海南雷影信息技术有限公司 一种雷达触摸点主动预测方法及系统

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101539838A (zh) * 2009-05-04 2009-09-23 深圳华为通信技术有限公司 一种触摸屏用户输入的方法和装置
CN101568898A (zh) * 2006-12-26 2009-10-28 索尼爱立信移动通讯股份有限公司 检测并且定位输入表面上的触摸或触碰
CN101739208A (zh) * 2008-11-25 2010-06-16 三星电子株式会社 提供用户界面的设备和方法
CN102622127A (zh) * 2011-02-12 2012-08-01 微软公司 基于预测的触摸接触跟踪

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9354804B2 (en) * 2010-12-29 2016-05-31 Microsoft Technology Licensing, Llc Touch event anticipation in a computing device

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101568898A (zh) * 2006-12-26 2009-10-28 索尼爱立信移动通讯股份有限公司 检测并且定位输入表面上的触摸或触碰
CN101739208A (zh) * 2008-11-25 2010-06-16 三星电子株式会社 提供用户界面的设备和方法
CN101539838A (zh) * 2009-05-04 2009-09-23 深圳华为通信技术有限公司 一种触摸屏用户输入的方法和装置
CN102622127A (zh) * 2011-02-12 2012-08-01 微软公司 基于预测的触摸接触跟踪

Also Published As

Publication number Publication date
CN103403665A (zh) 2013-11-20
CN103403665B (zh) 2016-08-03

Similar Documents

Publication Publication Date Title
WO2014032239A1 (zh) 一种终端设备获取指令的方法及终端设备
CN106708407B (zh) 防止触摸按键误触发的方法、装置及移动终端
US10996834B2 (en) Touchscreen apparatus user interface processing method and touchscreen apparatus
WO2019214522A1 (en) Method for establishing application prediction model, storage medium, and terminal
US10466849B2 (en) Method and terminal for preventing unintentional triggering of a touch key and storage medium
CN104156171B (zh) 防止移动终端横屏状态下触摸按键误操作的方法及装置
WO2018082269A1 (zh) 菜单显示方法及终端
WO2019014859A1 (zh) 一种多任务操作方法及电子设备
WO2015043194A1 (zh) 虚拟键盘显示方法、装置及终端
WO2018027551A1 (zh) 一种消息显示方法、用户终端及图形用户接口
EP2752754A2 (en) Remote mouse function method and terminals
WO2014206101A1 (zh) 一种基于手势的会话处理方法、装置及终端设备
CN105302452B (zh) 一种基于手势交互的操作方法及装置
WO2015067045A1 (en) Method, device and computer system for performing operations on objects in object list
TW201516844A (zh) 一種物件選擇的方法和裝置
WO2013135169A1 (zh) 一种输入法键盘的调整方法及其移动终端
KR20140019530A (ko) 멀티 터치 핑거 제스처를 이용하는 사용자 인터렉션 제공 방법 및 장치
WO2018209555A1 (zh) 连接蓝牙设备的方法及终端设备
WO2015100573A1 (zh) 一种显示刷新方法和终端
US20150128095A1 (en) Method, device and computer system for performing operations on objects in an object list
CN109284041A (zh) 一种应用界面切换方法及移动终端
CN109165033B (zh) 一种应用更新方法及移动终端
CN111078108A (zh) 一种屏幕显示方法、装置、存储介质及移动终端
WO2018112803A1 (zh) 触摸屏手势识别的方法及装置
US20150089370A1 (en) Method and device for playing media data on a terminal

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 12883571

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 12883571

Country of ref document: EP

Kind code of ref document: A1