CN113805770A - Cursor moving method and electronic equipment - Google Patents

Cursor moving method and electronic equipment

Info

Publication number
CN113805770A
CN113805770A
Authority
CN
China
Prior art keywords
event
sliding
position information
electronic device
touch
Prior art date
Legal status
Granted
Application number
CN202110927091.3A
Other languages
Chinese (zh)
Other versions
CN113805770B (en)
Inventor
高杨 (Gao Yang)
Current Assignee
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date
Filing date
Publication date
Application filed by Honor Device Co Ltd filed Critical Honor Device Co Ltd
Priority to CN202110927091.3A
Publication of CN113805770A
Application granted
Publication of CN113805770B
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04847 Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/24 Classification techniques
    • G06F 18/241 Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, using a touch-screen or digitiser, e.g. input of commands through traced gestures

Abstract

The present application provides a cursor moving method and an electronic device, and relates to the field of communications technologies. The method is as follows: the electronic device detects a plurality of sliding events in response to a sliding operation in a touch area; in response to an operation of leaving the touch area, it determines target position information according to at least two reference events, and then draws a movement trajectory of a cursor according to the target position information. The at least two reference events include an end event, which is the last detected sliding event among the plurality of sliding events.

Description

Cursor moving method and electronic equipment
Technical Field
The present application relates to the field of communications technologies, and in particular, to a cursor moving method and an electronic device.
Background
Currently, in a scenario in which an electronic device is connected to a peripheral having a touch area, when the user's finger performs a sliding operation in the touch area, the electronic device controls a cursor to move on its display screen in response to the sliding operation. When the user's finger suddenly stops moving in the touch area, the electronic device ends the movement of the cursor in response to the stop operation.
Disclosure of Invention
The application provides a cursor moving method and electronic equipment.
To achieve this, the following technical solutions are provided:
In a first aspect, the present application provides a cursor moving method: an electronic device detects a plurality of sliding events in response to a sliding operation in a touch area; in response to an operation of leaving the touch area, it determines target position information according to at least two reference events, and then draws a movement trajectory of a cursor according to the target position information. The at least two reference events include an end event, which is the last detected sliding event among the plurality of sliding events.
With this technical solution, when the user's finger stops moving in the touch area, the electronic device can control the cursor to continue moving a certain distance by inertia.
In one possible implementation, each sliding event may include a touch time, which is the time at which the electronic device detects the sliding event. The at least two reference events further include a proximity event, which is an event among the plurality of sliding events that meets a preset rule: the time difference between the touch time in the proximity event and the touch time in the end event is smaller than a preset value.
By predicting the target position information using the end event together with sliding events close to it in time, the prediction can be made more accurate.
In one possible implementation, each sliding event includes the position information of the touch point on the touch area when the electronic device detects the sliding event. The at least two reference events include an end event and a first event, the first event being the sliding event immediately before the end event. In this case, determining the target position information according to the at least two reference events may include: the electronic device determines a velocity according to the touch time and position information in the end event and the touch time and position information in the first event, where the velocity indicates the speed at which the finger or stylus leaves the touch area; the electronic device then determines the target position information based on the velocity, a preset motion deceleration, and a first time interval. The first time interval is the interval between the touch time of the end event and the time corresponding to the target position information, where the latter is preset.
Here the electronic device predicts the cursor trajectory from the position and time information of the last two sliding events, and the predicted trajectory is a straight line.
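For illustration only, the straight-line prediction can be sketched in code as follows. The MotionSample type, the class names, and the unit conventions are assumptions of this sketch, not classes defined by the present application; the underlying physics is the uniform-deceleration relation s = v·t − a·t²/2.

```java
// Illustrative position/time sample of one sliding event.
final class MotionSample {
    final float x, y;   // touch-point position on the touch area
    final long timeMs;  // touch time in milliseconds
    MotionSample(float x, float y, long timeMs) {
        this.x = x; this.y = y; this.timeMs = timeMs;
    }
}

final class LinearPredictor {
    /**
     * Predicts the target position reached "firstIntervalMs" after the end
     * event, starting at the lift-off speed and decelerating at "decel"
     * (position units per ms^2) along the last direction of motion.
     */
    static MotionSample predict(MotionSample first, MotionSample end,
                                float decel, long firstIntervalMs) {
        long dt = end.timeMs - first.timeMs;
        if (dt <= 0) return end;                        // degenerate input
        float vx = (end.x - first.x) / dt;              // lift-off velocity
        float vy = (end.y - first.y) / dt;
        float speed = (float) Math.hypot(vx, vy);
        if (speed == 0f) return end;                    // no residual motion
        // Coast no longer than the time needed to decelerate to rest.
        long t = Math.min(firstIntervalMs, (long) (speed / decel));
        float dist = speed * t - 0.5f * decel * t * t;  // s = v*t - a*t^2/2
        return new MotionSample(end.x + (vx / speed) * dist,
                                end.y + (vy / speed) * dist,
                                end.timeMs + t);
    }
}
```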
In one possible implementation, each sliding event includes the position information of the touch point on the touch area when the electronic device detects the sliding event. The at least two reference events include an end event, a first event, and a second event; the first event is the sliding event immediately before the end event, and the second event is the sliding event two before the end event. Determining the target position information according to the at least two reference events may include: the electronic device determines the center position and radius of a circle from the three pieces of position information in the end event, the first event, and the second event, and determines an arc distance from the touch time and position information in the end event, the touch time and position information in the first event, the preset motion deceleration, and the first time interval, where the arc distance represents the length of the arc between the position in the end event and the target position. The electronic device then determines the target position information from the center position, the radius, and the arc distance.
Here the electronic device predicts the cursor trajectory from the position and time information of the last three sliding events, and the predicted trajectory is an arc.
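A corresponding sketch for the arc variant follows, reusing the illustrative MotionSample and LinearPredictor types above. The circumcircle of the last three touch points gives the center and radius; the coasting distance along the arc reuses the decelerated-motion distance of the straight-line case. The collinear fallback, the orientation test, and the treatment of touch coordinates as a standard mathematical plane are all assumptions of this sketch.

```java
final class ArcPredictor {
    /**
     * Predicts the target position on the circumcircle of the last three
     * touch points (second -> first -> end), coasting along the arc while
     * decelerating at "decel" for at most "firstIntervalMs".
     */
    static MotionSample predict(MotionSample second, MotionSample first,
                                MotionSample end, float decel, long firstIntervalMs) {
        float ax = second.x, ay = second.y, bx = first.x, by = first.y,
              cx = end.x, cy = end.y;
        // Circumcenter of the triangle (second, first, end).
        float d = 2 * (ax * (by - cy) + bx * (cy - ay) + cx * (ay - by));
        if (Math.abs(d) < 1e-6f) {                  // collinear points: use a line
            return LinearPredictor.predict(first, end, decel, firstIntervalMs);
        }
        float a2 = ax * ax + ay * ay, b2 = bx * bx + by * by, c2 = cx * cx + cy * cy;
        float ux = (a2 * (by - cy) + b2 * (cy - ay) + c2 * (ay - by)) / d; // center x
        float uy = (a2 * (cx - bx) + b2 * (ax - cx) + c2 * (bx - ax)) / d; // center y
        float r = (float) Math.hypot(cx - ux, cy - uy);                    // radius

        // Arc length covered while decelerating, as in the straight-line case.
        long dt = end.timeMs - first.timeMs;
        if (dt <= 0) return end;
        float speed = (float) Math.hypot(cx - bx, cy - by) / dt;
        if (speed == 0f || r == 0f) return end;
        long t = Math.min(firstIntervalMs, (long) (speed / decel));
        float arc = speed * t - 0.5f * decel * t * t;

        // Rotate the end point about the center by arc/r, keeping the sense
        // of travel given by the turn direction of the three points.
        float turn = (bx - ax) * (cy - by) - (by - ay) * (cx - bx);
        float theta = Math.signum(turn) * arc / r;
        float sin = (float) Math.sin(theta), cos = (float) Math.cos(theta);
        float rx = cx - ux, ry = cy - uy;
        return new MotionSample(ux + rx * cos - ry * sin,
                                uy + rx * sin + ry * cos,
                                end.timeMs + t);
    }
}
```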
In one possible implementation, each sliding event includes the position information of the touch point on the touch area when the electronic device detects the sliding event. The at least two reference events include the plurality of sliding events. Determining the target position information according to the at least two reference events may include: the electronic device inputs the position information and touch time of each of the plurality of sliding events, together with the first time interval, into a preset trajectory prediction model to obtain the target position information.
Here the electronic device predicts the cursor trajectory from the position and time information of the entire sliding operation; the predicted trajectory can have any shape and depends on the user's track during the sliding operation.
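At inference time, the model-based variant reduces to a single call into a pretrained model. The interface below is purely hypothetical, since the application does not prescribe a model architecture; it returns a list because the number of pieces of target position information may be preset or calculated.

```java
import java.util.List;

// Hypothetical interface for the preset trajectory prediction model: the
// buffered sliding events plus the first time interval go in, the predicted
// target position(s) come out.
interface TrajectoryModel {
    List<MotionSample> predict(List<MotionSample> slideEvents, long firstIntervalMs);
}
```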
In one possible implementation, the cursor moving method may further include: the electronic device stores the touch time and position information included in each of the plurality of sliding events.
The information included in each sliding event is thus stored by the electronic device for use in the trajectory prediction.
In one possible implementation, the cursor moving method may further include: the electronic device acquires sample data comprising position information and touch times of finger or stylus slides on a touch area, and trains on the sample data by machine learning to obtain the trajectory prediction model.
In one possible implementation, the sample data is historical sliding events collected by the electronic device for its own user; or historical sliding events of all network users, received by the electronic device from the cloud; or historical sliding events of a preset category of users; or all historical sliding events within a preset time period.
Sample data that meets the requirements of a given scenario can thus be selected.
In a possible implementation, training on the sample data by machine learning to obtain the trajectory prediction model may include: the electronic device determines all sliding trajectories from the sample data and classifies them by shape to obtain a plurality of classification sets. The electronic device then either integrates the classification sets to determine a single trajectory prediction model, or determines a trajectory prediction model for each classification set.
The trajectory prediction model can thus be trained in various ways, chosen in advance according to the actual application scenario.
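One way to read the per-shape option is sketched below: bucket each sampled trajectory by a simple geometric shape test and keep one model per bucket. The chord-deviation heuristic and its 2% threshold are assumptions of this sketch, not criteria stated by the present application.

```java
import java.util.List;

enum Shape { LINE, ARC_OR_OTHER }

final class ShapeClassifier {
    /**
     * Classifies a trajectory as a line if no point deviates from the
     * first-to-last chord by more than 2% of the chord length.
     */
    static Shape classify(List<MotionSample> track) {
        MotionSample a = track.get(0), b = track.get(track.size() - 1);
        double len = Math.hypot(b.x - a.x, b.y - a.y);
        if (len == 0) return Shape.ARC_OR_OTHER;
        double maxDev = 0;
        for (MotionSample p : track) {
            // Perpendicular distance of p from the chord a-b.
            double dev = Math.abs((b.x - a.x) * (a.y - p.y)
                                - (a.x - p.x) * (b.y - a.y)) / len;
            maxDev = Math.max(maxDev, dev);
        }
        return maxDev < 0.02 * len ? Shape.LINE : Shape.ARC_OR_OTHER;
    }
}
```

At prediction time, classify(...) would then select which per-bucket model to apply.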
In one possible implementation, the cursor moving method may further include: the electronic device receives a trajectory prediction model sent by the cloud, where the trajectory prediction model is trained in the cloud.
Training the trajectory prediction model in the cloud reduces the processing load on the electronic device.
In a second aspect, the present application provides an electronic device comprising one or more processors and one or more memories for storing computer program code. The computer program code comprises computer instructions that, when read by the one or more processors from the one or more memories, cause the electronic device to perform the following operations: detecting a plurality of sliding events in response to a sliding operation in a touch area; in response to an operation of leaving the touch area, determining target position information according to at least two reference events, where the at least two reference events include an end event, which is the last detected sliding event among the plurality of sliding events; and drawing a movement trajectory of a cursor according to the target position information.
In one possible implementation, each sliding event may include a touch time, which is the time at which the electronic device detects the sliding event. The at least two reference events further include a proximity event, which is an event among the plurality of sliding events that meets a preset rule: the time difference between the touch time in the proximity event and the touch time in the end event is smaller than a preset value.
In one possible implementation, each sliding event includes the position information of the touch point on the touch area when the electronic device detects the sliding event. The at least two reference events include an end event and a first event, the first event being the sliding event immediately before the end event. Determining the target position information according to the at least two reference events specifically includes: the electronic device determines a velocity according to the touch time and position information in the end event and in the first event, where the velocity indicates the speed at which the finger or stylus leaves the touch area; the electronic device then determines the target position information based on the velocity, a preset motion deceleration, and a first time interval. The first time interval is the interval between the touch time of the end event and the time corresponding to the target position information, the latter being preset.
In one possible implementation, each sliding event includes the position information of the touch point on the touch area when the electronic device detects the sliding event. The at least two reference events include an end event, a first event, and a second event; the first event is the sliding event immediately before the end event, and the second event is the sliding event two before the end event. Determining the target position information according to the at least two reference events specifically includes: the electronic device determines the center position and radius of a circle from the three pieces of position information in the end event, the first event, and the second event, and determines an arc distance from the touch time and position information in the end event and in the first event, the preset motion deceleration, and the first time interval, where the arc distance represents the length of the arc between the position in the end event and the target position. The electronic device then determines the target position information from the center position, the radius, and the arc distance.
In one possible implementation, each sliding event includes the position information of the touch point on the touch area when the electronic device detects the sliding event. The at least two reference events include the plurality of sliding events. Determining the target position information according to the at least two reference events specifically includes: the electronic device inputs the position information and touch time of each of the plurality of sliding events, together with the first time interval, into a preset trajectory prediction model to obtain the target position information.
In one possible implementation, the computer instructions, when read by the one or more processors from the one or more memories, further cause the electronic device to store the touch time and position information included in each of the plurality of sliding events.
In one possible implementation, the computer instructions, when read by the one or more processors from the one or more memories, further cause the electronic device to acquire sample data comprising position information and touch times of finger or stylus slides on a touch area, and to train on the sample data by machine learning to obtain the trajectory prediction model.
In one possible implementation, the sample data is historical sliding events collected by the electronic device for its own user; or historical sliding events of all network users, received by the electronic device from the cloud; or historical sliding events of a preset category of users; or all historical sliding events within a preset time period.
In one possible implementation, the computer instructions, when read by the one or more processors from the one or more memories, further cause the electronic device to determine all sliding trajectories from the sample data and classify them by shape to obtain a plurality of classification sets, and then either to integrate the classification sets to determine a single trajectory prediction model or to determine a trajectory prediction model for each classification set.
In one possible implementation, the computer instructions, when read by the one or more processors from the one or more memories, further cause the electronic device to receive a trajectory prediction model sent by the cloud, where the trajectory prediction model is trained in the cloud.
In a third aspect, a computer storage medium comprises computer instructions which, when run on an electronic device, cause the electronic device to perform the method as described in the first aspect and any one of its possible implementations.
In a fourth aspect, a computer program product is provided, which, when run on a computer, causes the computer to perform the method as set forth in the first aspect and any one of its possible implementations.
In a fifth aspect, a chip comprises a processor, which when executing instructions performs the method as described in the first aspect and any one of its possible implementations.
In a sixth aspect, an apparatus is provided that is included in an electronic device and has the functionality to implement the behavior of the electronic device in any of the methods of the foregoing aspects and possible implementations. The functionality can be implemented in hardware, or in hardware executing corresponding software; the hardware or software includes at least one module or unit corresponding to the above functionality, for example a receiving module or unit, a determining module or unit, and a transmitting module or unit.
In a seventh aspect, a communication system includes the electronic device of the second aspect and any of its possible implementations, and the cloud.
Drawings
Fig. 1 is a schematic view of a scene to which a cursor moving method provided in an embodiment of the present application is applied;
fig. 2 is a schematic structural diagram of a wireless keyboard according to an embodiment of the present application;
fig. 3 is a schematic hardware structure diagram of an electronic device according to an embodiment of the present disclosure;
fig. 4 is a schematic hardware structure diagram of a wireless keyboard according to an embodiment of the present application;
fig. 5 is a schematic diagram of a software structure of an electronic device according to an embodiment of the present application;
fig. 6 is a flowchart illustrating a method for moving a cursor according to an embodiment of the present application;
fig. 7A is a first schematic diagram of coordinate points on a touch area according to an embodiment of the present application;
fig. 7B is a second schematic diagram of coordinate points on a touch area according to an embodiment of the present application;
fig. 7C is a third schematic diagram of coordinate points on a touch area according to an embodiment of the present application;
fig. 8 is a second flowchart illustrating a cursor moving method according to an embodiment of the present application;
fig. 9 is a schematic diagram of a chip system according to an embodiment of the present application.
Detailed Description
The cursor moving method provided by the embodiments of the present application can be applied to a scenario in which an electronic device is connected to a peripheral. Alternatively, the cursor moving method may also be applied within an electronic device itself.
It is understood that the electronic device in the embodiment of the present application may be referred to as a User Equipment (UE), a terminal (terminal), and the like. For example, the electronic device may be a mobile terminal or a fixed terminal having a touch screen, such as a tablet, a Personal Digital Assistant (PDA), a handheld device having a wireless communication function, a computing device, a vehicle-mounted device, or a wearable device, a Virtual Reality (VR) terminal device, an Augmented Reality (AR) terminal device, a wireless terminal in industrial control (industrial control), a wireless terminal in self driving (self driving), a wireless terminal in remote medical (remote medical), a wireless terminal in smart grid (smart grid), a wireless terminal in transportation safety (transportation safety), a wireless terminal in smart city (smart city), a wireless terminal in smart home (smart home), and the like. The form of the terminal device is not particularly limited in the embodiment of the present application.
In addition, the peripheral in the embodiment of the present application may be a mouse, a wireless keyboard, a touch pen, or the like.
In the scenario where the electronic device is connected to a peripheral, the connection mode differs for different peripherals. For example, when the peripheral is a wireless keyboard, the electronic device and the wireless keyboard are interconnected through a communication network to exchange wireless signals, or are interconnected magnetically. When the peripheral is a mouse, the electronic device and the mouse are interconnected through a universal serial bus (USB) interface, or through a communication network to exchange wireless signals. The communication network may be, but is not limited to: a wireless fidelity (Wi-Fi) hotspot network, a Wi-Fi point-to-point (P2P) network, a Bluetooth network, a ZigBee network, or a near field communication (NFC) network.
The embodiments of the present application do not limit the scenarios to which the cursor moving method applies; the description below takes as an example the scenario in which the electronic device is connected to a wireless keyboard.
Fig. 1 is a schematic view of a scenario to which the embodiments of the present application apply. Referring to fig. 1, the scenario includes an electronic device 100 and a wireless keyboard 200; fig. 1 takes a tablet computer as an example of the electronic device 100. The wireless keyboard 200 may provide input to the electronic device 100, and the electronic device 100 performs corresponding operations in response to that input. For example, a touch area may be provided on the wireless keyboard 200, and the user may operate the touch area of the wireless keyboard 200 to provide input to the electronic device 100.
In one embodiment, the scenario may also include a stylus 300. The stylus 300 may likewise provide input to the electronic device 100, and the electronic device 100 performs corresponding operations in response. For example, the user may operate the screen of the electronic device 100, or the touch area of the wireless keyboard 200, using the stylus 300. In one embodiment, the stylus 300 and the electronic device 100, and the stylus 300 and the wireless keyboard 200, may be interconnected via a communication network to exchange wireless signals. The description of the communication network is the same as that for the interconnection between the electronic device 100 and the wireless keyboard 200 and is not repeated here.
In one embodiment, referring to fig. 2, the wireless keyboard 200 may include a first portion 201 and a second portion 202. Illustratively, the wireless keyboard 200 may include a keyboard body and a keyboard cover: the first portion 201 may be the keyboard cover, and the second portion 202 the keyboard body. The first portion 201 is used to hold the electronic device 100, and the second portion 202 may be provided with keys, a touch pad, and the like for user operation.
When the wireless keyboard 200 is used, the first portion 201 and the second portion 202 need to be opened; when it is not used, they can be closed. In one embodiment, the first portion 201 and the second portion 202 are rotatably connected. For example, they may be connected by a hinge or a rotating shaft, or, in some examples, rotatably connected by a flexible material (e.g., leather or cloth). Or, in some examples, the first portion 201 and the second portion 202 may be integrally formed, with the connection between them thinned so that it can be bent. The connection between the first portion 201 and the second portion 202 may include, but is not limited to, the above rotation connection manners.
The first portion 201 may include at least two pivotally connected brackets. For example, referring to fig. 2, the first portion 201 includes a first bracket 201a and a second bracket 201b that are rotatably connected; in use, the first bracket 201a and the second bracket 201b can jointly support the electronic device 100 (refer to fig. 1). Alternatively, the first bracket 201a provides support for the second bracket 201b, and the second bracket 201b supports the electronic device 100. Referring to fig. 2, the second bracket 201b is rotatably connected to the first portion 201.
As shown in fig. 2, to facilitate storage of the stylus 300, the wireless keyboard 200 may be provided with a receiving portion 203 for receiving the stylus 300. Referring to fig. 2, the receiving portion 203 is a cylindrical cavity, and the stylus 300 is inserted into the cavity along the arrow in fig. 2 when stored. In this embodiment, referring to fig. 2, the second portion 202 and the second bracket 201b are rotatably connected by a connecting portion 204, and the receiving portion 203 is provided in the connecting portion 204. The connecting portion 204 may be a rotating shaft.
Fig. 3 is a schematic diagram of a hardware structure of an electronic device 100 according to an embodiment of the present disclosure. Referring to FIG. 3, electronic device 100 may include multiple subsystems that cooperate to perform, coordinate, or monitor one or more operations or functions of electronic device 100. Electronic device 100 includes processor 110, input surface 120, coordination engine 130, power subsystem 140, power connector 150, wireless interface 160, and display 170.
For example, the coordination engine 130 may be used to: communicate with other subsystems of the electronic device 100 and/or process data; measure and/or obtain the output of one or more analog or digital sensors (such as touch sensors); measure and/or obtain the output of one or more sensor nodes of an array of sensor nodes (such as an array of capacitive sensing nodes); receive and locate touch signals from the touch area of the wireless keyboard 200; position the cursor of the electronic device 100 based on the touch signal; receive and locate the tip signal and the ring signal from the stylus 300; and locate the stylus 300 based on the positions of the tip-signal and ring-signal crossing regions.
The coordination engine 130 of the electronic device 100 includes or is otherwise communicatively coupled to a sensor layer located below or integrated with the input surface 120. The coordination engine 130 utilizes the sensor layer to locate touch operations on the input surface 120. In one embodiment, the input surface 120 may be referred to as a touch screen 101.
For example, the sensor layer of coordination engine 130 of electronic device 100 is a grid of capacitive sensing nodes arranged in columns and rows. More specifically, the array of column traces is disposed perpendicular to the array of row traces. The sensor layer may be separate from other layers of the electronic device 100, or the sensor layer may be disposed directly on another layer, such as, but not limited to: display stack layers, force sensor layers, digitizer layers, polarizer layers, battery layers, structural or decorative outer shell layers, and the like.
The sensor layer can operate in multiple modes. If operating in mutual capacitance mode, the column and row traces form a single capacitive sensing node at each overlap point (e.g., a "vertical" mutual capacitance). If operating in self-capacitance mode, the column and row traces form two (vertically aligned) capacitive sensing nodes at each overlap point. In another embodiment, adjacent column traces and/or adjacent row traces may each form a single capacitive sensing node (e.g., a "horizontal" mutual capacitance) if operating in a mutual capacitance mode. As described above, the sensor layer may detect the presence of the tip of stylus 300 and/or the touch of a user's finger by monitoring changes in capacitance (e.g., mutual or self capacitance) present at each capacitive sensing node.
In general, processor 110 may be configured to perform, coordinate, and/or manage the functions of electronic device 100. Such functions may include, but are not limited to: communicate and/or transact data with other subsystems of electronic device 100, communicate and/or transact data with stylus 300, communicate and/or transact data over a wireless interface, communicate and/or transact data over a wired interface, facilitate power exchange over a wireless (e.g., inductive, resonant, etc.) or wired interface, receive position and angular position of one or more styluses, etc.
Processor 110 may be implemented as any electronic device capable of processing, receiving, or transmitting data or instructions. For example, the processor 110 may be a microprocessor, central processing unit, application specific integrated circuit, field programmable gate array, digital signal processor, analog circuit, digital circuit, or a combination of these devices. The processor may be a single threaded or a multi-threaded processor. The processor may be a single core or a multi-core processor.
During use, the processor 110 may be configured to access a memory having stored instructions. The instructions may be configured to cause the processor to perform, coordinate, or monitor one or more operations or functions of the electronic device 100.
The instructions stored in the memory may be configured to control or coordinate the operation of other components of the electronic device 100, such as, but not limited to: another processor, analog or digital circuitry, a volatile or non-volatile memory module, a display, a speaker, a microphone, a rotary input device, a button or other physical input device, a biometric authentication sensor and/or system, a force or touch input/output component, a communication module (such as a wireless interface and/or a power connector), and/or a haptic feedback device.
The memory may also store electronic data that may be used by the stylus or the processor. For example, the memory may store electronic data or content (such as media files, documents, and applications), device settings and preferences, timing signals and control signals or data for various modules, data structures or databases, files or configurations related to detecting tip signals and/or ring signals, and so forth. The memory may be configured as any type of memory. For example, the memory may be implemented as random access memory, read only memory, flash memory, removable memory, other types of storage elements, or a combination of such devices.
Electronic device 100 also includes a power subsystem 140. Power subsystem 140 may include a battery or other power source. Power subsystem 140 may be configured to provide power to electronic device 100. Power subsystem 140 may also be coupled to power connector 150. Power connector 150 may be any suitable connector or port that may be configured to receive power from an external power source and/or configured to provide power to an external load. For example, in some embodiments, power connector 150 may be used to recharge a battery within power subsystem 140. In another embodiment, power connector 150 can be used to transmit power stored in (or available to) power subsystem 140 to stylus 300.
The electronic device 100 also includes a wireless interface 160 to facilitate electronic communication between the electronic device 100 and the wireless keyboard 200 or stylus 300. In one embodiment, wireless interface 160 facilitates electronic communication between electronic device 100 and an external communication network, device, or platform.
The wireless interface 160 may be implemented as one or more of a wireless interface, a bluetooth interface, a near field communication interface, a magnetic interface, a universal serial bus interface, an inductive interface, a resonant interface, a capacitive coupling interface, a Wi-Fi interface, a Transmission Control Protocol (TCP)/Internet Protocol (IP) interface, a network communication interface, an optical interface, an acoustic interface, or any conventional communication interface.
The electronic device 100 also includes a display 170. The display 170 may be located behind the input surface 120 or may be integral therewith. The display 170 may be communicatively coupled to the processor 110. The processor 110 may present information to a user using the display 170. In many cases, the processor 110 uses the display 170 to present an interface with which a user may interact. In many cases, a user manipulates stylus 300 to interact with the interface.
It will be apparent to one skilled in the art that some of the specific details presented above with respect to electronic device 100 may not be required to practice particular described embodiments or their equivalents. Similarly, other electronic devices may include a greater number of subsystems, modules, components, etc. Some sub-modules may be implemented as software or hardware, where appropriate. Accordingly, it should be understood that the above description is not intended to be exhaustive or to limit the disclosure to the precise form disclosed herein. On the contrary, many modifications and variations are possible in light of the above teaching, as would be apparent to those of ordinary skill in the art.
Fig. 4 is a schematic hardware structure diagram of a wireless keyboard 200 according to an embodiment of the present disclosure. Referring to fig. 4, the wireless keyboard 200 may include a processor 210, a memory 220, a charging interface 230, a charging management module 240, a wireless charging coil 250, a battery 260, a wireless communication module 270, a touch pad 280, and a keyboard 290.
The processor 210, the memory 220, the charging interface 230, the charging management module 240, the battery 260, the wireless communication module 270, the touch pad 280, the keyboard 290, and the like may be disposed on the keyboard body (i.e., the second portion 202 shown in fig. 2) of the wireless keyboard 200. The wireless charging coil 250 may be provided in the connecting portion 204 (shown in fig. 2) that movably connects the keyboard body and the bracket. It is to be understood that the illustrated structure of this embodiment does not constitute a specific limitation on the wireless keyboard 200. In other embodiments, the wireless keyboard 200 may include more or fewer components than shown, combine certain components, split certain components, or arrange the components differently. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The memory 220 may be used to store program code. The memory 220 may also store a Bluetooth address that uniquely identifies the wireless keyboard 200, and may store connection data of electronic devices 100 that have previously been successfully paired with the wireless keyboard 200, for example the Bluetooth address of such an electronic device 100. Based on this connection data, the wireless keyboard 200 can be automatically paired with the electronic device 100 again without reconfiguring the connection, for example without performing another validity check. The Bluetooth address may be a media access control (MAC) address.
The processor 210 may be configured to execute the application program code and call the relevant modules to implement the functions of the wireless keyboard 200 in the embodiments of the present application, for example the wired charging, reverse wireless charging, and wireless communication functions of the wireless keyboard 200. The processor 210 may include one or more processing units; different processing units may be separate devices or may be integrated into one or more processors. The processor 210 may specifically be an integrated control chip, or may consist of circuitry comprising various active and/or passive components configured to perform the functions described in the embodiments of the present application as pertaining to the processor 210. The processor 210 of the wireless keyboard 200 may be a microprocessor.
The wireless communication module 270 may be configured to support data exchange between the wireless keyboard 200 and other electronic devices, including Bluetooth (BT), Global Navigation Satellite System (GNSS), Wireless Local Area Network (WLAN) (e.g., Wi-Fi network), Frequency Modulation (FM), Near Field Communication (NFC), infrared (infrared, IR) and other wireless communications.
In some embodiments, the wireless communication module 270 may be a bluetooth chip. The wireless keyboard 200 may be a bluetooth keyboard. The wireless keyboard 200 can be paired with bluetooth chips of other electronic devices through the bluetooth chip and establish a wireless connection, so as to realize wireless communication between the wireless keyboard 200 and other electronic devices through the wireless connection.
In addition, the wireless communication module 270 may further include an antenna, and the wireless communication module 270 receives an electromagnetic wave via the antenna, frequency-modulates and filters a signal of the electromagnetic wave, and transmits the processed signal to the processor 210. The wireless communication module 270 may also receive a signal to be transmitted from the processor 210, frequency modulate it, amplify it, and convert it into electromagnetic waves via the antenna to radiate it out.
In some embodiments, wireless keyboard 200 may support wired charging. Specifically, the charging management module 240 may receive a charging input of the wired charger through the charging interface 230.
In other embodiments, the wireless keyboard 200 may support forward wireless charging. The charging management module 240 may receive a wireless charging input through the wireless charging coil 250 of the wireless keyboard 200. Specifically, the charging management module 240 is connected to the wireless charging coil 250 through a matching circuit. The wireless charging coil 250 may couple to the wireless charging coil of a wireless charger and induce the alternating electromagnetic field emitted by the charger's coil to produce an alternating electrical signal. The alternating electrical signal generated by the wireless charging coil 250 is transmitted through the matching circuit to the charging management module 240 so as to wirelessly charge the battery 260.
The charging management module 240 may also supply power to the wireless keyboard 200 while charging the battery 260. The charging management module 240 receives input from the battery 260 and supplies power to the processor 210, the memory 220, the external memory, the wireless communication module 270, and the like. The charging management module 240 may also be used to monitor parameters of the battery 260 such as capacity, cycle count, and state of health (leakage, impedance). In some other embodiments, the charging management module 240 may also be disposed in the processor 210.
In other embodiments, the wireless keyboard 200 may support reverse wireless charging. Specifically, the charging management module 240 may receive input from the charging interface 230 or the battery 260 and convert the direct-current signal into an alternating-current signal, which is transmitted to the wireless charging coil 250 through the matching circuit. When the wireless charging coil 250 receives the alternating-current signal, it generates an alternating electromagnetic field; the wireless charging coils of other mobile terminals can induce this field and thereby be charged wirelessly. That is, the wireless keyboard 200 may also be used to wirelessly charge other mobile terminals. In one embodiment, the wireless charging coil 250 may be disposed in the receiving portion 203 of the wireless keyboard 200.
It should be noted that the matching circuit may be integrated in the charging management module 240, or the matching circuit may be independent from the charging management module 240, which is not limited in this embodiment of the application. Fig. 4 is a schematic diagram illustrating a hardware structure of the wireless keyboard 200, taking as an example that the matching circuit may be integrated in the charging management module 240.
The charging interface 230 may be used to provide a wired connection for charging or communication between the wireless keyboard 200 and other electronic devices (e.g., a wired charger of the wireless keyboard 200).
The touch pad 280 integrates a touch sensor. The electronic device 100 may receive the user's control commands for the electronic device 100 through the touch pad 280 and the keyboard 290.
It is to be understood that the illustrated structure of the embodiment of the present application does not constitute a specific limitation to the wireless keyboard 200. It may have more or fewer components than shown in fig. 4, may combine two or more components, or may have a different configuration of components. For example, the external surface of the wireless keyboard 200 may further include keys, indicator lights (which may indicate the status of power, incoming/outgoing calls, pairing mode, etc.), a display screen (which may prompt the user for relevant information), and the like. The key may be a physical key or a touch key (used in cooperation with the touch sensor), and is used for triggering operations such as power on, power off, starting charging, stopping charging, and the like.
The software system of the electronic device 100 may adopt a layered architecture, an event-driven architecture, a micro-core architecture, a micro-service architecture, or a cloud architecture. In the embodiment of the present application, an Android (Android) system with a layered architecture is taken as an example to exemplarily illustrate a software structure of the electronic device 100.
The layered architecture divides the software into several layers, each with a clear role and division of labor, and the layers communicate with each other through interfaces. In some embodiments, the Android system is divided, from top to bottom, into four layers: the application layer, the application framework layer, the Android runtime and system services, and the kernel layer.
The application layer may include a series of application packages.
As shown in fig. 5, the application package may include camera, gallery, calendar, phone call, map, navigation, WLAN, bluetooth, music, video, short message, etc. applications.
The application framework layer provides an Application Programming Interface (API) and a programming framework for the application program of the application layer. The application framework layer includes a number of predefined functions.
As shown in fig. 5, the application framework layer may include an activity manager, a window manager, a content provider, a view system, a resource manager, a notification manager, etc., which is not limited in this embodiment.
Activity manager (ActivityManager): used for managing the lifecycle of each application. Applications typically run in the operating system in the form of an Activity. For each Activity there is a corresponding application record (ActivityRecord) in the activity manager that records the state of that Activity; the activity manager can use this record as an identifier to schedule the application's Activity processes.
Window manager (WindowManagerService): used for managing the graphical user interface (GUI) resources used on the screen, specifically: obtaining the display screen size, creating and destroying windows, displaying and hiding windows, laying out windows, managing focus, managing the input method, managing wallpaper, and the like.
The content provider is used to store and retrieve data and make it accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phone books, etc.
The view system includes visual controls such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views.
The resource manager provides various resources for the application, such as localized strings, icons, pictures, layout files, video files, and the like.
The notification manager enables an application to display notification information in the status bar; it conveys notification-type messages that automatically disappear after a short stay without requiring user interaction, for example notifications of download completion and message alerts.
The system services, kernel layer, and other layers below the application framework layer may be referred to as the underlying system. The underlying system includes an underlying display system for providing display services; for example, the underlying display system includes the display driver in the kernel layer and the surface manager in the system services. In addition, the underlying system in the present application further includes an identification module for identifying physical form changes of a flexible screen; the identification module may be provided independently in the underlying display system, or in the system services and/or the kernel layer.
As shown in fig. 5, the Android runtime includes a core library and a virtual machine, and is responsible for scheduling and managing the Android system.
The core library comprises two parts: one part contains the functions that the Java language needs to call, and the other part is the core library of Android.
The application layer and the application framework layer run in the virtual machine. The virtual machine executes the Java files of the application layer and the application framework layer as binary files, and performs functions such as object lifecycle management, stack management, thread management, security and exception management, and garbage collection.
As shown in fig. 5, the system services may include a plurality of functional modules, for example: a surface manager, media libraries, and three-dimensional graphics processing libraries (e.g., OpenGL ES).
The surface manager is used to manage the display subsystem and provide fusion of 2D and 3D layers for multiple applications.
The media library supports a variety of commonly used audio, video format playback and recording, and still image files, among others. The media library may support a variety of audio-video encoding formats, such as: MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, etc.
The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like.
As shown in fig. 5, the kernel layer is the layer between hardware and software. The kernel layer at least comprises a display driver, a camera driver, an audio driver, a sensor driver, the identification module, and the like.
In one embodiment, an electronic device may include an input collection module, an input caching module, a path prediction module, an input distribution module, a system application, and the like.
Illustratively, the input collection module may be an input reader module included by the system service in fig. 5. The input distribution module may be an input scheduling module included by the system service in fig. 5. The input caching module and the path prediction module may also be modules included in the system service. The system applications may be modules in the application framework layer in fig. 5.
Illustratively, the input collection module is configured to receive the user's sliding operation and collect sliding events from it, where each sliding event includes the position and time at which the finger touches the touch area. It is further configured to send the collected sliding events to the input distribution module and the input caching module, and to send the last sliding event, i.e., the end event, to the path prediction module.
Illustratively, the input caching module is configured to store the position and time information of the received sliding events, so that the path prediction module can later predict the display position of the cursor after the user's finger leaves the touch area.
Illustratively, the path prediction module is configured to, in response to the end event, predict the target position information after the user stops the sliding operation based on the sliding events stored in the input caching module, and to send the target position information to the input distribution module.
Illustratively, the input distribution module is configured to convert the position information in a sliding event into first position information of the cursor on the display, and to call the cursor drawing interface to convert the target position information into second position information of the cursor on the display. Alternatively, the input distribution module may send the sliding event or the target position information to other modules, which call the cursor drawing interface to perform the position conversion.
Illustratively, the input distribution module is further configured to draw the cursor track according to the first or second position information and send the drawn track to the application layer for real-time display. Alternatively, the input distribution module sends the converted first or second position information of the cursor to the application framework layer, which draws the cursor; or the position information may be sent to the application layer, which draws the cursor. The embodiments of the present application do not limit which layer or module of the Android system draws the cursor.
Illustratively, the input distribution module is further configured to send the converted position information of the cursor to the system application.
Illustratively, the system application is configured to determine the current position of the cursor from this position information, determine whether that position triggers certain functions of the electronic device, and if so, perform the corresponding operation. For example, the system application may determine whether the current position of the cursor falls within a control or a pop-up box, and if so, perform the corresponding operation.
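Assembling the modules named above, a minimal end-to-end sketch could look as follows. It reuses the illustrative MotionSample and LinearPredictor types from earlier; the class, the method names, and the coordinate-mapping stub are assumptions of this sketch, not the actual Android system-service code.

```java
import java.util.ArrayList;
import java.util.List;

final class CursorPipeline {
    // Input caching module: buffered sliding events of the current gesture.
    private final List<MotionSample> buffer = new ArrayList<>();

    // Input collection module: called once per detected sliding event.
    void onSlideEvent(MotionSample event) {
        buffer.add(event);
        dispatch(event.x, event.y);                    // real-time cursor movement
    }

    // Called on the leave-touch-area operation: predict, then keep coasting.
    void onTouchEnd(float decel, long firstIntervalMs) {
        if (buffer.size() >= 2) {
            MotionSample end = buffer.get(buffer.size() - 1);
            MotionSample first = buffer.get(buffer.size() - 2);
            // Path prediction module (straight-line variant shown).
            MotionSample target = LinearPredictor.predict(first, end, decel, firstIntervalMs);
            dispatch(target.x, target.y);
        }
        buffer.clear();
    }

    // Input distribution module: map touch-area coordinates to display
    // coordinates and hand them to the cursor drawing interface (stubbed).
    private void dispatch(float touchX, float touchY) {
        // e.g. cursorX = touchX * displayWidth / touchAreaWidth; cursorY = ...
    }
}
```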
The technical solutions involved in the following embodiments can be applied to the scenario as shown in fig. 1. The electronic device 100 may have a hardware architecture as shown in fig. 3 and a software architecture as shown in fig. 5. The wireless keyboard 200 may have a hardware architecture as described in fig. 4.
The technical solution in the embodiment of the present application will be described below with reference to fig. 6 to 8. In the description of the present application, unless otherwise specified, "at least one" means one or more, "a plurality" means two or more. In addition, in order to facilitate clear description of technical solutions of the embodiments of the present application, in the embodiments of the present application, terms such as "first" and "second" are used to distinguish the same items or similar items having substantially the same functions and actions. Those skilled in the art will appreciate that the terms "first," "second," etc. do not denote any order or quantity, nor do the terms "first," "second," etc. denote any order or importance.
According to the cursor moving method provided by the embodiment of the application, the electronic device responds to the sliding operation in the touch area, detects a plurality of sliding events, responds to the operation that a finger or a touch pen leaves the touch area, determines the target position information according to at least two reference events, and then draws the moving track of the cursor according to the target position information. The at least two reference events include an end event, and the end event is a last detected sliding event in the plurality of sliding events. By executing the technical scheme, the electronic equipment can control the cursor to continuously move for a certain distance according to inertia when the finger of the user stops moving in the touch area.
Optionally, the plurality of sliding events may include an input event, a sliding event after the input event, and a last sliding event (i.e., an end event).
Optionally, each sliding event may include a touch time and position information of a touch point on the touch area when the electronic device detects the sliding event, where the touch time refers to a time when the electronic device detects the sliding event.
Optionally, the electronic device may determine the target location information according to at least two reference events, and may also determine the target location information according to a sliding event, for example, an end event, which is not limited herein.
Alternatively, the target position information refers to the position information of the finger or the stylus on the touch area predicted by the electronic device after the user finishes sliding. The number of pieces of target position information may be preset or calculated, and is not limited herein.
Optionally, when the electronic device draws the movement track of the cursor according to the target position information, the target position information may be converted into position information of the cursor on the display, and then the track of the cursor may be drawn.
It should be noted that, in the embodiment of the present application, when the finger of the user stops sliding in the touch area, the electronic device may implement that the cursor continuously moves for a distance according to inertia in a plurality of implementation manners. As a possible implementation, the track prediction function may be started by default when the electronic device is turned on, and the electronic device may predict the track of the cursor when detecting that the user stops sliding. As another possible implementation, whether to start the trajectory prediction function may be set by the user in a system setting of the electronic device, considering that the user's requirements are different in different application scenarios. When the function is started, the electronic device can predict the track of the cursor when detecting that the user stops sliding. Of course, the prediction of the cursor trajectory may also be implemented by other implementation manners, and the embodiment of the present application is not particularly limited herein.
In an embodiment, as shown in fig. 6, a flow of a cursor moving method provided in an embodiment of the present application may include:
1. An input acquisition module of the electronic equipment receives a sliding operation of a user and acquires a sliding event according to the sliding operation.
When a user performs a sliding operation through a finger or a peripheral device connected to the electronic device, such as a stylus or a mouse, the input acquisition module of the electronic device may receive the sliding operation of the user and obtain a sliding event according to the sliding operation.
In one embodiment, the sliding operation of the user is described as follows: in the scenario of fig. 1 where the electronic device is connected to a wireless keyboard, the sliding operation of the user may be a sliding operation performed by the user on a touch area through a finger or a stylus. The sliding operation refers to an operation that a finger or a stylus keeps in contact with the touch area after contacting the touch area, and the electronic device can detect the sliding operation of the user in real time through a sensor layer coupled or integrated with the touch area. In a scenario where the electronic device is connected to a mouse, the sliding operation of the user may be an operation of the user sliding the mouse. Of course, the sliding operation of the user may also be an operation of the user in other scenarios, and the sliding operation of the user is not limited in this embodiment of the application.
The touch area may be a touch area of an input surface of the electronic device in fig. 3 (for example, when the electronic device is a tablet computer, the input surface may be a touch screen, and when the electronic device is a notebook computer, the input surface may be a touch pad). Alternatively, the touch area may be a peripheral device connected to the electronic device, for example, a touch area on a touch pad of the wireless keyboard in fig. 4.
In one embodiment, taking the sliding operation as an example of the sliding operation performed by a user through a finger on a touch area on a wireless keyboard connected to an electronic device, a processing procedure of receiving the sliding operation of the user by an input acquisition module and acquiring a sliding event according to the sliding operation is as follows: when a finger of a user touches a touch area of the wireless keyboard, the input acquisition module may detect an input event, which may specifically include position information of the finger on the touch area and a time when the input event is detected. When the finger of the user touches the touch area and does not leave the touch area to start the sliding operation, the input acquisition module may periodically detect a touch event of the user, where the touch event may include position information of the finger on the touch area at each detection and a time of each detection of the touch event. When the finger of the user leaves the touch area, that is, after the sliding operation is finished, the input acquisition module may use the last received touch event as an end event, where the end event may specifically include position information of the finger on the touch area, which is detected by the input acquisition module last time, and a time when the end event is detected.
As can be seen, the sliding events obtained by the input acquisition module may include an input event of the user's initial touch and a plurality of touch events detected by the input acquisition module during the user's sliding process, where the plurality of touch events includes the last detected end event.
It is understood that, in the embodiment of the present application, the input acquisition module may determine the position information, included in the sliding event, of the finger on the touch area according to the contact area between the user's finger and the touch area. In an embodiment, when the contact area between the finger and the touch area is smaller than an area threshold, the position information may be the position information of the touch point between the finger and the touch area. When the contact area between the finger and the touch area is larger than the area threshold, the position information may be the position information of the contact region between the finger and the touch area; in this case, the position information may be the position information of the center point of the contact region, or of another point of the contact region. Of course, the input acquisition module may also determine the position information included in the sliding event in other manners, and the embodiment of the present application does not limit the position information included in the sliding event. The position information may be the two-dimensional coordinates of a certain point.
It should be noted that, in the embodiment of the present application, the input acquisition module periodically detects a plurality of touch events, and the last of these touch events is taken as the end event. Because the detection is periodic (and the duration of each period may or may not be the same), the touch event detected in the last period may be obtained at the moment the user's finger leaves the touch area, or it may be obtained before the finger leaves, with the finger leaving the touch area before the next detection would occur. That is, the position information of the finger on the touch area included in the end event may be the position of the finger when it leaves the touch area, or its position shortly before leaving. Equivalently, the detection time included in the end event may be the moment the finger leaves the touch area, or a moment slightly earlier. The relationship between the position information in the end event and the moment the finger leaves the touch area may be determined according to the preset period and the actual scene.
In addition, the period at which the input acquisition module detects touch events may be obtained in advance through experiments and stored in the electronic device. Alternatively, the period may be derived from the screen refresh frequency of the electronic device. For example, assuming a screen refresh rate of N times per second, the period may be (1/N) second. Of course, the period may also be determined in other ways. The embodiment of the present application does not limit the detection period or whether the duration of each period is the same.
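Illustratively, a minimal Python sketch of how such periodic acquisition could be organized is given below; the `poll_contact` callback, the `SlideEvent` structure, and the 120 Hz default period are illustrative assumptions rather than details from this embodiment.

```python
import time
from dataclasses import dataclass

@dataclass
class SlideEvent:
    x: float      # abscissa of the touch point on the touch area
    y: float      # ordinate of the touch point on the touch area
    t: float      # touch time: the moment this event was detected
    kind: str     # "input" (first contact), "touch" (periodic), or "end" (last)

def collect_slide_events(poll_contact, period=1 / 120):
    """Poll the touch sensor once per period and classify the samples.

    `poll_contact` is an assumed callback that returns (x, y) while the
    finger is on the touch area and None once it has left.
    """
    events = []
    while True:
        sample = poll_contact()
        if sample is None:
            if events:
                events[-1].kind = "end"   # last detected touch event becomes the end event
            return events
        x, y = sample
        events.append(SlideEvent(x, y, time.monotonic(),
                                 "input" if not events else "touch"))
        time.sleep(period)                # periodic detection, e.g. one sample per screen refresh
```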
2. The input acquisition module sends the acquired sliding event. In a specific implementation: in 2201, the input acquisition module sends the acquired sliding event to the input distribution module of the electronic device; in 2202, the input acquisition module sends the sliding event to the input cache module of the electronic device; and in 2203, the input acquisition module sends the end event to the path prediction module of the electronic device.
After each sliding event is acquired by the input acquisition module, the acquired sliding event can be respectively sent to the input distribution module and the input cache module. And the input acquisition module can send the end event to the path prediction module after acquiring the last sliding event, namely the end event.
3. The input cache module receives the sliding event sent by the input acquisition module and stores the received sliding event.
After the input acquisition module sends the sliding event to the input cache module, the input cache module may receive and store, one by one, the position information and the time information included in each received sliding event, which the path prediction module later uses to predict the display position of the cursor after the user's finger leaves the touch area.
4. And the input distribution module receives the sliding event sent by the input acquisition module and executes operation according to the sliding event.
After the input acquisition module sends the sliding event to the input distribution module, the input distribution module may perform an operation based on the received sliding event.
For example, the input distribution module may determine a first position of the cursor in the display of the electronic device according to the position information included in the sliding event. That is, the input distribution module may convert the position information of the user's finger on the touch area into the first position information of the corresponding cursor in the display. Alternatively, the input distribution module may send the sliding event to other modules, so that the other modules perform the position conversion. The other modules may be modules of the system service layer, the application framework layer, or the application layer, which is not limited in the embodiments of the present application. Alternatively, the conversion of the position information of the finger on the touch area into the first position information of the cursor in the display may be performed entirely by another module (that is, the sliding event need not be sent to the input distribution module and may be sent by the input acquisition module directly to that other module); which module specifically performs the conversion into the first position information is not limited in the embodiment of the present application.
For example, the input distribution module may further draw a track of the cursor according to the first position information, and send the drawn track to the application layer for real-time display. Or, the input distribution module may send the converted first position information of the cursor to the application framework layer, and the application framework layer performs drawing of the cursor. Or, the input distribution module can also send the first position information to the application program layer, and the application program layer draws a cursor. The embodiment of the application does not limit the layer of the android system where the cursor is drawn and the module where the cursor is drawn.
It should be noted that, in the embodiment of the present application, the correspondence between the position in the touch area and the position of the cursor in the display may be stored in the electronic device in advance. The correspondence between the two can be obtained by a function or algorithm.
For example, in the embodiment of the present application, the drawing process of the cursor may be: the input acquisition module acquires the touch point coordinates included in the sliding event and reports them to an input mapping (input map) module. The input mapping module converts the touch point coordinates in the sliding event into the coordinates of the cursor on the screen and reports the cursor coordinates to a pointer controller. The pointer controller reports the coordinates of the cursor to a mouse controller (sprite controller), and the mouse controller calls a screen drawing interface to draw the cursor on the screen.
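Illustratively, the conversion and reporting chain above might be sketched as follows; the proportional mapping and the example touchpad and screen dimensions are assumptions, since the embodiment only requires that some pre-stored correspondence exist.

```python
def touch_to_screen(x, y, touch_w=120.0, touch_h=80.0,
                    screen_w=2560, screen_h=1600):
    """Map a touch-area coordinate to a cursor coordinate on the display.

    A proportional mapping is only one possible correspondence; the
    embodiment leaves the actual function or algorithm open.
    """
    return x * screen_w / touch_w, y * screen_h / touch_h

def report_cursor(x, y, draw_cursor):
    """Mirror the reporting chain: input acquisition -> input mapping ->
    pointer controller -> mouse (sprite) controller -> drawing interface."""
    cx, cy = touch_to_screen(x, y)
    draw_cursor(cx, cy)   # stands in for the screen drawing interface call
```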
5. The path prediction module receives the end event sent by the input acquisition module, and determines at least one piece of target position information based on the sliding event stored in the input cache module.
After the input acquisition module sends the end event to the path prediction module, the path prediction module may trigger execution of path prediction in response to the received end event, that is, determine the sliding trajectory that inertia would produce after the user's finger leaves the touch area. For example, the path prediction module may determine at least one piece of target position information based on all the sliding events corresponding to the user's sliding operation stored in the input cache module. The target position information is the information of the position where the user's finger might be located, according to inertia, after leaving the touch area. Generally, the faster the user's finger slides on the touch area, the greater the number of pieces of target position information, and thus the longer the inertial sliding trajectory of the cursor in the display after the finger leaves the touch area.
For example, the path prediction module may determine the at least one target location information in several ways.
Mode 1, the path prediction module determines at least one target position information based on two slip events and a preset motion deceleration.
It is understood that, in the embodiment of the present application, there are many ways to select the two sliding events. For example, the two sliding events may be any two of all the sliding events corresponding to the sliding operation stored in the input cache module. Alternatively, the two sliding events may include the end event, with the other being any sliding event other than the end event; for example, the two sliding events may be the end event and the sliding event before it. Alternatively, the two sliding events may be two events near the end event. The embodiment of the present application does not limit which two sliding events are specifically selected.
In addition, in the embodiment of the present application, the preset movement deceleration may be previously obtained by experiment and stored in the electronic device.
In one embodiment, the path prediction module may determine at least one target position information according to two position information and two time information of the two sliding events, and a preset motion deceleration and a period (a period for the input acquisition module to detect the touch event according to the sliding operation). For example, the target position information may be coordinates of a point predicted in the touch area. In this case, the path prediction module may determine the origin coordinates before determining the at least one target location information. For example, the position of the input event corresponding to the slide operation here may be defaulted as the origin.
For example, for ease of understanding, the target position information is assumed to be two-dimensional coordinates of a certain point on the touch area. And assuming that the two slip events include an end event and a previous slip event of the end event, the above target position information may be obtained according to the following equations (1) and (2):
$$\Delta X_i = \frac{\Delta x_n - x_n}{\Delta T_n - T_n}\,(\Delta T_i - \Delta T_n) - \frac{1}{2}\,\alpha\,(\Delta T_i - \Delta T_n)^2 \quad (1)$$

$$\Delta Y_i = \frac{\Delta y_n - y_n}{\Delta T_n - T_n}\,(\Delta T_i - \Delta T_n) - \frac{1}{2}\,\alpha\,(\Delta T_i - \Delta T_n)^2 \quad (2)$$
where ΔXi is the difference between the abscissa of the ith target position and the abscissa of the position included in the end event, and ΔYi is the difference between the ordinate of the ith target position and the ordinate of the position in the end event. (Δxn, Δyn) are the coordinates of the position included in the end event, and ΔTn is the time information included in the end event, i.e., the time at which the input acquisition module detected the end event. (xn, yn) are the coordinates of the position included in the sliding event before the end event, and Tn is the time information included in that sliding event, i.e., the time at which the input acquisition module detected it. α is the preset motion deceleration. ΔTi is the time corresponding to the ith target position and may be obtained, for example, from ΔTn and the period T at which the input acquisition module detects touch events during the sliding operation: when the duration of each period is the same, ΔTi = ΔTn + i × T. As another example, ΔTi may be obtained from a preconfigured duration and ΔTn.
After the path prediction module obtains Δ Xi and Δ Yi by using the formula (1) and the formula (2), the coordinates of the ith target position can be determined, and the coordinates of the ith target position are (Δ xn + Δ Xi, Δ yn + Δ Yi).
It should be noted that, in the embodiment of the present application, the coordinate point of the target position is located on the straight line defined by the two positions of the two sliding events. That is, the inertial sliding trajectory of the cursor in the display of the electronic device is a straight line, and it is the same straight line as the one determined by the two cursor positions obtained after converting the two positions in the two sliding events. Moreover, the cursor only slides a finite distance according to inertia. The path prediction module may calculate the initial velocity in the horizontal direction as

$$V_{0x} = \frac{\Delta x_n - x_n}{\Delta T_n - T_n}$$

and, using the formula

$$V = V_{0x} - \alpha\,(\Delta T_i - \Delta T_n),$$

let V = 0 and solve for ΔTi, which gives the longest time the cursor slides in the horizontal direction. Likewise, the path prediction module may calculate the initial velocity in the vertical direction as

$$V_{0y} = \frac{\Delta y_n - y_n}{\Delta T_n - T_n}$$

and, using the same formula with V_{0y}, let V = 0 and solve for ΔTi, which gives the longest time the cursor slides in the vertical direction. The smaller ΔTi of these two longest times determines the number of target positions. For example, assuming that the duration of each period is the same, dividing (ΔTi − ΔTn) by the period and rounding down yields the number of target positions.
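Putting equations (1) and (2) and the stopping-time check together, mode 1 might be sketched as follows in Python; the function shape, the (x, y, t) tuples, and the treatment of the preset deceleration as acting along the direction of motion are assumptions of this sketch.

```python
import math

def predict_linear(end, prev, alpha, period):
    """Mode 1 sketch: extrapolate along the straight line through the end
    event and the slide event before it, decelerating at the preset rate
    `alpha` until the speed reaches zero.

    `end` and `prev` are (x, y, t) tuples; returns predicted target positions.
    """
    dt = end[2] - prev[2]
    vx = (end[0] - prev[0]) / dt           # speed components at lift-off
    vy = (end[1] - prev[1]) / dt
    speed = math.hypot(vx, vy)
    if speed == 0.0:
        return []
    ux, uy = vx / speed, vy / speed        # unit direction of motion
    t_stop = speed / alpha                 # longest inertial sliding time
    targets = []
    i = 1
    while i * period <= t_stop:
        ti = i * period                    # elapsed time, i.e. Delta_Ti - Delta_Tn
        s = speed * ti - 0.5 * alpha * ti * ti   # distance travelled so far
        targets.append((end[0] + s * ux, end[1] + s * uy))
        i += 1
    return targets
```

Under this reading, all targets lie on the straight line through the two events, and the number of targets falls out of the stopping time divided by the detection period, matching the description above.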
For example, assume that when a finger of a user performs a sliding operation on the touch area, the electronic device acquires 4 sliding events, as shown in fig. 7A, where the 4 sliding events include: input event 1, slide event 2, slide event 3, and end event 4. Input event 1 includes position information (x1, y1) and time information T1. Slide event 2 includes position information (x2, y2) and time information T2. Slide event 3 includes position information (x3, y3) and time information T3. End event 4 includes position information (Δxn, Δyn) and time information ΔTn. Assume that the position of input event 1 is the origin, i.e., x1 = 0 and y1 = 0, that the two slide events are the end event and the slide event before it, that the number of target positions calculated by the path prediction module is 3, and that the duration of each period is T. Then, as shown in fig. 7A, the coordinates of the 1st target position are (Δxn + ΔX1, Δyn + ΔY1), obtained from equations (1) and (2) with ΔT1 = ΔTn + T; the coordinates of the 2nd target position are (Δxn + ΔX2, Δyn + ΔY2) with ΔT2 = ΔTn + 2T; and the coordinates of the 3rd target position are (Δxn + ΔX3, Δyn + ΔY3) with ΔT3 = ΔTn + 3T.
Mode 2, the path prediction module determines at least one target location information based on the three sliding events.
It is understood that, in the embodiment of the present application, the three sliding events may be any three of all the sliding events corresponding to the sliding operation stored in the input cache module, or may be three sliding events near the end event. For example, in order to ensure the accuracy of the cursor sliding trajectory predicted by the path prediction module, the last three detected sliding events, namely the end event and the two sliding events before it, may be selected as the three sliding events. The embodiment of the present application does not limit which three sliding events are specifically selected.
In one embodiment, the path prediction module may determine at least one target location information based on the three sliding events. For example, the path prediction module may determine the circle center coordinates and the circle radius according to the three position information, and determine the arc distance of the finger according to the inertia after the finger leaves the touch area based on the information included in the three sliding events. Thereafter, the path prediction module determines at least one target location information based on the circle center coordinates, the circle radius, and the arc distance. The target position information may be two-dimensional coordinates of a point predicted in the touch area.
For example, for ease of understanding, the target position information is assumed to be the two-dimensional coordinates of a certain point on the touch area. And assume that the three slide events include the end event and the two slide events before it, where the end event includes position coordinates (x_n, y_n) and time information T_n, the slide event before the end event includes position coordinates (x_{n-1}, y_{n-1}) and time information T_{n-1}, and the second slide event before the end event includes position coordinates (x_{n-2}, y_{n-2}) and time information T_{n-2}. The path prediction module may determine the abscissa and ordinate of the circle center from these three coordinates, and determine the radius of the circle from the coordinates of the circle center. The abscissa of the circle center satisfies the following formula (3), the ordinate of the circle center satisfies the following formula (4), and the radius of the circle satisfies the following formula (5).
$$X = \frac{DE - BF}{AD - BC} \quad (3)$$

$$Y = \frac{AF - CE}{AD - BC} \quad (4)$$

$$R = \sqrt{(x_n - X)^2 + (y_n - Y)^2} \quad (5)$$
Wherein X is the abscissa of the center of the circle, Y is the ordinate of the center of the circle, and R is the radius of the circle. And A, B, C, D, E, F satisfy the following formulas, respectively.
$$A = x_n - x_{n-1}, \qquad B = y_n - y_{n-1},$$

$$C = x_n - x_{n-2}, \qquad D = y_n - y_{n-2},$$

$$E = \tfrac{1}{2}\left[(x_n^2 - x_{n-1}^2) + (y_n^2 - y_{n-1}^2)\right],$$

$$F = \tfrac{1}{2}\left[(x_n^2 - x_{n-2}^2) + (y_n^2 - y_{n-2}^2)\right].$$
The path prediction module may determine an arc distance of each target position relative to the position of the end event based on the information included in the three slide events. The arc distance satisfies the following formula (6):

$$S_i = \frac{\sqrt{(x_n - x_{n-1})^2 + (y_n - y_{n-1})^2}}{T_n - T_{n-1}}\,(\Delta T_i - T_n) - \frac{1}{2}\,\alpha\,(\Delta T_i - T_n)^2 \quad (6)$$

where α is the preset motion deceleration, and ΔTi is the time corresponding to the ith target position, which may be obtained from T_n and the period T at which the input acquisition module detects touch events during the sliding operation. For example, when the duration of each period is the same, ΔTi = T_n + i × T.
The path prediction module, after determining the coordinates of the center of the circle, the radius of the circle, and the arc distance of each target location relative to the location of the end event, may determine each target location information based on the coordinates of the center of the circle, the radius of the circle, and the arc distance. The target position information satisfies the following formula (7) and formula (8).
$$\Delta X_i = X + (x_n - X)\cos\frac{S_i}{R} - (y_n - Y)\sin\frac{S_i}{R} \quad (7)$$

$$\Delta Y_i = Y + (x_n - X)\sin\frac{S_i}{R} + (y_n - Y)\cos\frac{S_i}{R} \quad (8)$$
Where Δ Xi is the abscissa of the ith target position, and Δ Yi is the ordinate of the ith target position.
In the embodiment of the present application, the inertial sliding trajectory of the cursor in the display obtained in mode 2 is an arc of a circle. Also, the number of target positions may be configured in the electronic device in advance. For example, the number of target positions may be obtained from the screen refresh frequency of the electronic device and a preset time duration: assuming the screen refresh frequency is N times per second and the preset duration is M seconds, the number of target positions may be (M × N). Alternatively, the number of target positions may be a pre-configured fixed value. Of course, the number of target positions may also be obtained in other ways, and the embodiment of the present application is not limited herein.
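A sketch of mode 2 under the same conventions (Python; all names illustrative): fit the circle with formulas (3)-(5), take the lift-off speed from the last two events, and step along the arc by the distance of formula (6). The choice of sweep direction, continuing the rotational sense of the last two events, is an assumption of this sketch.

```python
import math

def predict_circular(end, prev1, prev2, alpha, period, n_targets):
    """Mode 2 sketch: fit a circle through three slide events and continue
    along its arc while decelerating at the preset rate `alpha`.

    `end` is the end event, `prev1` the slide event before it, `prev2` the
    second slide event before it; each is an (x, y, t) tuple.
    """
    (x0, y0, t0), (x1, y1, t1), (x2, y2, _) = end, prev1, prev2
    # formulas (3)-(5): circle centre (X, Y) and radius R
    A, B = x0 - x1, y0 - y1
    C, D = x0 - x2, y0 - y2
    E = ((x0**2 - x1**2) + (y0**2 - y1**2)) / 2
    F = ((x0**2 - x2**2) + (y0**2 - y2**2)) / 2
    den = A * D - B * C
    if abs(den) < 1e-9:            # collinear events: no circle, mode 1 applies instead
        return []
    X = (D * E - B * F) / den
    Y = (A * F - C * E) / den
    R = math.hypot(x0 - X, y0 - Y)
    speed = math.hypot(x0 - x1, y0 - y1) / (t0 - t1)   # speed at lift-off
    # keep sweeping in the same rotational sense as prev1 -> end
    sweep = 1.0 if (x1 - X) * (y0 - Y) - (y1 - Y) * (x0 - X) >= 0 else -1.0
    phi0 = math.atan2(y0 - Y, x0 - X)                  # angle of the end event
    targets = []
    for i in range(1, n_targets + 1):
        ti = i * period                                # Delta_Ti - T_n
        s = max(speed * ti - 0.5 * alpha * ti**2, 0.0) # arc distance, formula (6)
        phi = phi0 + sweep * s / R
        targets.append((X + R * math.cos(phi), Y + R * math.sin(phi)))
    return targets
```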
For example, assume that when the finger of the user performs a sliding operation on the touch area, the electronic device acquires 4 sliding events, as shown in fig. 7B, where the 4 sliding events include: input event 1, slide event 2, slide event 3, and end event 4. Input event 1 includes position information (x1, y1) and time information T1. Slide event 2 includes position information (x2, y2) and time information T2. Slide event 3 includes position information (x3, y3) and time information T3. End event 4 includes position information (x4, y4) and time information T4. Assume that the position of input event 1 is the origin, i.e., x1 = 0 and y1 = 0, that the three slide events are the end event and the two slide events before it, that the number of preset target positions is 3, and that the duration of each period is T. Then, as shown in fig. 7B, the path prediction module determines the circle center coordinates (X, Y) and the circle radius R from the three pieces of position information of slide event 2, slide event 3, and end event 4, determines the arc distance Si corresponding to each of the 3 target positions, and then determines the coordinates of the 3 target positions from the circle center coordinates, the circle radius, and the arc distances, with ΔT1 = T4 + T, ΔT2 = T4 + 2T, and ΔT3 = T4 + 3T.
Mode 3, the path prediction module determines at least one piece of target position information based on all the sliding events corresponding to the sliding operation.
In one embodiment, the path prediction module may determine at least one target location information based on the location information and the time information of all the sliding events corresponding to the sliding operation, and using a pre-stored trajectory prediction algorithm.
It is understood that, in the embodiment of the present application, the number of target locations may be a fixed value configured in advance, or may be obtained according to a screen refresh frequency of the electronic device and a preset time length. The embodiments of the present application are not limited herein.
In addition, the pre-stored trajectory prediction algorithm may be a trajectory prediction model or a trajectory prediction formula. The trajectory prediction model or formula may be obtained by collecting sample data in advance and training on it in a machine learning manner. In order to ensure the accuracy of the trajectory prediction algorithm, the sample data may be a large amount of position data and time data of fingers sliding on the touch area, together with manually annotated predicted coordinate points.
It is to be noted that, in the embodiment of the present application, the trajectory prediction model may be obtained by the electronic device training on its own data, may be trained in the cloud and then sent to the electronic device, or may be obtained through manual calculation and stored in the electronic device. The embodiment of the present application does not limit the manner of obtaining the trajectory prediction model in the electronic device.
In some embodiments, the sampling data for training the trajectory prediction model may be acquired by the electronic device, or may be acquired by the cloud. For example, the sample data may be historical swipe data acquired by the electronic device for a user using the electronic device. For another example, the sampling data may be historical sliding data of users of the whole network acquired by the cloud, or may also be historical sliding data of certain types of people. The embodiment of the present application does not limit the source of the sampling data and the type of the sampling data.
Further optionally, the above-mentioned sampling data for training the trajectory prediction model may also be all historical sliding data within a preset time period. For example, the sample data may be historical slip data in the previous day, the previous week, the previous month, or the previous year in which the cursor movement method in the embodiment of the present application was performed. The time period for acquiring the sampling data is not particularly limited in the embodiments of the present application.
In some embodiments, when training the trajectory prediction model using the sample data, the sliding trajectories may first be classified according to shape; then one trajectory prediction model may be trained combining the different types of sliding trajectories, or a separate trajectory prediction model may be trained for each type of sliding trajectory (see the sketch below). The types of sliding trajectory may include a straight line, an arc, an irregular curve, and the like. The embodiment of the present application does not specifically limit the training process of the trajectory prediction model.
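Illustratively, the per-type option might be sketched as below, where `classify_shape` and `fit_model` are assumed helpers standing in for the shape classifier and the per-class training routine:

```python
def train_per_shape(gestures, classify_shape, fit_model):
    """Bucket sampled trajectories by shape (straight line, arc, irregular
    curve, ...) and fit one trajectory prediction model per bucket."""
    buckets = {}
    for gesture in gestures:
        buckets.setdefault(classify_shape(gesture), []).append(gesture)
    return {shape: fit_model(samples) for shape, samples in buckets.items()}
```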
For example, the path prediction module may determine a time corresponding to each target position according to the time information included in the end event and the period for detecting the slip event. Then, the path prediction module may input the position information and the time information in all the sliding events and the time corresponding to a certain target position into a trajectory prediction model or a trajectory prediction formula to obtain the coordinates of the target position.
For example, the trajectory prediction formula may be the following formula (9) and formula (10).
$$\Delta X_i = f\!\left(x_1, \ldots, x_n;\ T_1, \ldots, T_n;\ \Delta T_i;\ a_1, \ldots, a_n\right) \quad (9)$$

$$\Delta Y_i = f\!\left(y_1, \ldots, y_n;\ T_1, \ldots, T_n;\ \Delta T_i;\ b_1, \ldots, b_n\right) \quad (10)$$
where ΔXi is the abscissa of the ith target position, and ΔYi is the ordinate of the ith target position. a1, a2, …, an and b1, b2, …, bn are preset parameters, trained in advance. (x1, y1) are the coordinates of the position in the input event, and T1 is the time information in the input event. (x2, y2) are the coordinates of the position in the first slide event after the input event, and T2 is the time information in that slide event. (xn, yn) are the coordinates of the position in the end event, and Tn is the time information in the end event. ΔTi is the time corresponding to the ith target position; for example, ΔTi may be calculated from Tn and the period T at which the input acquisition module detects touch events during the sliding operation. For example, when the duration of each period is the same, ΔTi = Tn + i × T. As another example, ΔTi may be obtained from a preconfigured duration and Tn.
The path prediction module substitutes the data included in all the sliding events corresponding to the touch operation into the coordinate formula of the ith target position to obtain the coordinates of the ith target position. It is to be understood that, after the at least one piece of target position information determined in mode 3 is converted into the position information of the cursor in the display, the trajectory of the cursor in the display may take any shape. For example, the supplemented sliding track of the cursor may be a straight line, an arc, or a line of no particular shape, as determined by the actual scene.
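Illustratively, a minimal sketch of mode 3 with an off-the-shelf linear regressor standing in for the pre-trained parameters; the feature encoding, the fixed number of events per gesture, and the use of scikit-learn are assumptions of the sketch, not the formulas (9) and (10) themselves.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

def make_features(events, ti):
    """Flatten the positions and times of all slide events of one gesture,
    plus the target time ti, into one feature vector (an assumed encoding
    that requires every gesture to contain the same number of events)."""
    return np.array([v for (x, y, t) in events for v in (x, y, t)] + [ti])

def train_trajectory_model(samples):
    """`samples` pairs a recorded gesture and a target time with a manually
    annotated ground-truth point -- the sample data described above."""
    X = np.stack([make_features(events, ti) for events, ti, _ in samples])
    Y = np.stack([xy for _, _, xy in samples])      # ground-truth (x, y)
    return LinearRegression().fit(X, Y)

def predict_targets(model, events, t_end, period, n_targets):
    """Query the model once per detection period after the end event."""
    return [model.predict(make_features(events, t_end + i * period)[None, :])[0]
            for i in range(1, n_targets + 1)]
```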
Mode 4, the path prediction module determines at least one target location information based on five sliding events.
It is understood that, in the embodiment of the present application, the five sliding events may be any five of all the sliding events corresponding to the sliding operation stored in the input cache module, or may be five sliding events near the end event. Illustratively, to ensure the accuracy of the cursor sliding trajectory predicted by the path prediction module, the last five detected sliding events may be selected. The embodiment of the present application does not limit which five sliding events are specifically selected.
In one embodiment, the path prediction module may determine at least one target location information based on the five sliding events.
For example, the path prediction module may determine an ellipse equation from the five pieces of position information, i.e., the five coordinates, and fit the ellipse according to the ellipse equation, thereby obtaining the coordinates of the center of the ellipse, the semi-major axis a, and the semi-minor axis b. The electronic device may then determine at least one target position based on the coordinates of the center of the ellipse, the semi-major axis a, the semi-minor axis b, and the arc distance (determined according to formula (6) above). The target position information may be the two-dimensional coordinates of a predicted point in the touch area.
In the embodiment of the present application, the inertial sliding trajectory of the cursor in the display obtained in mode 4 is an elliptical arc.
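Illustratively, one way to recover the ellipse from five points is to solve for the general conic through them; the SVD null-space fit below is an assumption (the embodiment does not specify the fitting method), and it presumes the five positions genuinely lie on an ellipse.

```python
import numpy as np

def fit_ellipse(points):
    """Fit the conic a*x^2 + b*x*y + c*y^2 + d*x + e*y + f = 0 through five
    (x, y) points and recover the centre and the semi-axes."""
    M = np.array([[x * x, x * y, y * y, x, y, 1.0] for x, y in points])
    _, _, Vt = np.linalg.svd(M)            # conic coefficients span the null space of M
    a, b, c, d, e, f = Vt[-1]
    A33 = np.array([[a, b / 2], [b / 2, c]])
    centre = np.linalg.solve(2 * A33, [-d, -e])    # gradient of the conic vanishes at the centre
    Mfull = np.array([[a, b / 2, d / 2],
                      [b / 2, c, e / 2],
                      [d / 2, e / 2, f]])
    k = -np.linalg.det(Mfull) / np.linalg.det(A33)
    axes = np.sqrt(k / np.linalg.eigvalsh(A33))    # real only for a genuine ellipse
    return centre, max(axes), min(axes)            # centre, semi-major a, semi-minor b
```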
For example, assuming that when the finger of the user performs a sliding operation on the touch area, the electronic device acquires 5 sliding events, as shown in fig. 7C, where the 5 sliding events include: input event 1, slide event 2, slide event 3, slide event 4, end event 5. The input event 1 includes location information (x1, y1) and time information T1. The slide event 2 includes position information of (x2, y2) and time information of T2. The slide event 3 includes position information of (x3, y3) and time information of T3. The slide event 4 includes position information of (x4, y4) and time information of T4. The end event 5 includes location information of (x5, y5) and time information of T5.
Assuming that the position of the input event 1 is the origin, i.e., x1 = 0 and y1 = 0, the five sliding events include all the above events. And assuming that the number of preconfigured target positions is 3, the duration of each period is T. Then, as shown in fig. 7C, the path prediction module determines the coordinates (X, Y) of the center of the ellipse from the five pieces of position information of input event 1, slide event 2, slide event 3, slide event 4, and end event 5, determines the semi-major axis a and the semi-minor axis b, and also determines the arc distance Si corresponding to each of the 3 target positions. It then determines the coordinates of the 3 target positions from the center coordinates of the ellipse, the semi-major axis a, the semi-minor axis b, and the arc distances.
It should be noted that, in the embodiment of the present application, the path prediction module may determine at least one piece of target location information by using other manners besides the above four manners.
In addition, in the embodiment of the present application, any one of the above four modes may be pre-stored in the electronic device for use by the path prediction module when determining the target position coordinates. Alternatively, all four modes may be pre-stored in the electronic device. In that case, when the target position information needs to be determined, the path prediction module may select one mode from all the pre-stored modes. For example, the path prediction module may fix one of them as the mode it always uses. Or, the path prediction module may decide which mode to use with a preset trajectory selection algorithm, according to the trajectory of the actual cursor movement. For example, the path prediction module may determine the curvature of the real sliding trajectory by analyzing the position information in all the sliding events corresponding to the sliding operation; if the curvature is less than a preset range, the path prediction module may select mode 1 or mode 3 to determine the target position information, and if the curvature is greater than the preset range, it may select mode 2 or mode 3, as sketched below. The embodiment of the present application does not limit which mode the path prediction module specifically uses to determine the at least one piece of target position information.
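Illustratively, one plausible reading of such a selection rule measures how far the recorded points deviate from the chord between the first and last positions and treats a small deviation as a nearly straight slide; the threshold and the rule itself are assumptions of this sketch.

```python
import math

def pick_mode(events, flatness=0.05):
    """Choose a prediction mode from the flatness of the recorded trajectory.

    `events` is a list of (x, y, t) tuples for one sliding operation.
    """
    if len(events) < 3:
        return "mode 1 or 3"
    (x0, y0, _), (x1, y1, _) = events[0], events[-1]
    chord = math.hypot(x1 - x0, y1 - y0) or 1.0
    # largest perpendicular distance of an intermediate point to the chord
    dev = max(abs((x1 - x0) * (y0 - y) - (y1 - y0) * (x0 - x)) / chord
              for x, y, _ in events[1:-1])
    return "mode 1 or 3" if dev / chord < flatness else "mode 2 or 3"
```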
As can be seen from the above, in the above mode 3, compared with the modes 1, 2 and 4, the path prediction module refers to the position information and the time information in all the sliding events corresponding to the sliding operation when predicting the information of the target position, and since there are more data comprehensively considered in the mode 3, the information of the target position predicted by the mode 3 is more reasonable, so that the trajectory of the cursor in the display moving according to the inertia is smoother.
6. The path prediction module sends at least one target location information to the input distribution module.
For example, the path prediction module may send all the target location information to the input distribution module once after determining all the target location information. Or, each time the path prediction module determines one piece of target location information, the determined target location information may be sent to the input distribution module. The method for the path prediction module to send the at least one piece of target location information is not limited herein in the embodiments of the present application.
For another example, the path prediction module may not perform any operation after determining the at least one target location information.
7. The input distribution module receives at least one piece of target position information sent by the path prediction module and executes operation according to the target position information.
In this embodiment of the application, the specific description of the operation executed by the input distribution module according to the target location information is similar to the description of the operation executed by the input distribution module according to the location information in the sliding event in step 4, and is not described herein again.
In another embodiment, as shown in fig. 8, a flow of a cursor moving method provided in an embodiment of the present application may include:
801. The electronic equipment receives the sliding operation of the user and acquires the sliding event according to the sliding operation.
The sliding event comprises position information of the finger of the user in the touch area and the time when the electronic equipment detects the sliding event.
The detailed description of step 801 may refer to the related description in step 1 in fig. 6, and is not repeated herein.
802. The electronic equipment converts the position information included by the sliding event into first position information of the cursor in the display, and draws and displays the cursor according to the first position information.
The detailed description of step 802 may refer to the related description in step 4 in fig. 6, and is not repeated herein.
803. And the electronic equipment responds to the sliding ending operation and determines at least one piece of target position information according to the acquired sliding event.
In one sliding operation, when the electronic device detects the last touch event, namely an end event is detected, the electronic device determines that the sliding end operation is received.
And the target position information is used for representing the predicted position information of the finger on the touch area.
The specific description of step 803 may refer to the related description in step 5 in fig. 6, and is not repeated here.
804. The electronic equipment converts each target position information in the at least one target position information into second position information of the cursor in the display, and draws and displays the cursor according to each second position information.
The detailed description of step 804 may refer to the related description in step 7 in fig. 6, and is not repeated herein.
From the above, when the finger of the user stops moving in the touch area, the electronic device can control the cursor to continue moving a distance according to inertia.
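Illustratively, steps 801 to 804 can be tied together in one schematic flow; `predict`, `to_screen`, and `draw` stand for the prediction, conversion, and drawing steps above and are assumptions of the sketch.

```python
def move_cursor(events, predict, to_screen, draw):
    """Schematic flow for steps 801-804: draw the cursor for every real
    slide event, then keep drawing along the predicted target positions
    once the sliding operation has ended."""
    for x, y, _ in events:              # steps 801-802: first position information
        draw(*to_screen(x, y))
    for tx, ty in predict(events):      # steps 803-804: second position information
        draw(*to_screen(tx, ty))
```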
The embodiment of the present application further provides a chip system, as shown in fig. 9, the chip system includes at least one processor 1101 and at least one interface circuit 1102. The processor 1101 and the interface circuit 1102 may be interconnected by wires. For example, interface circuit 1102 may be used to receive signals from other devices (e.g., a memory of electronic apparatus 100). As another example, the interface circuit 1102 may be used to send signals to other devices (e.g., the processor 1101). Illustratively, the interface circuit 1102 may read instructions stored in the memory and send the instructions to the processor 1101. The instructions, when executed by the processor 1101, may cause the electronic device to perform the various steps in the embodiments described above. Of course, the chip system may further include other discrete devices, which is not specifically limited in this embodiment of the present application.
It is to be understood that the electronic devices and the like described above include hardware structures and/or software modules for performing the respective functions in order to realize the functions described above. Those of skill in the art will readily appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as hardware or combinations of hardware and computer software. Whether a function is performed as hardware or computer software drives hardware depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present embodiments.
In the embodiment of the present application, the electronic device and the like may be divided into functional modules according to the method example, for example, each functional module may be divided according to each function, or two or more functions may be integrated into one processing module. The integrated module can be realized in a hardware mode, and can also be realized in a software functional module mode. It should be noted that, the division of the modules in the embodiment of the present invention is schematic, and is only a logic function division, and there may be another division manner in actual implementation.
Through the above description of the embodiments, it is clear to those skilled in the art that, for convenience and simplicity of description, the foregoing division of the functional modules is merely used as an example, and in practical applications, the above function distribution may be completed by different functional modules according to needs, that is, the internal structure of the device may be divided into different functional modules to complete all or part of the above described functions. For the specific working processes of the system, the apparatus and the unit described above, reference may be made to the corresponding processes in the foregoing method embodiments, and details are not described here again.
Each functional unit in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solutions of the embodiments of the present application may be essentially implemented or make a contribution to the prior art, or all or part of the technical solutions may be implemented in the form of a software product stored in a storage medium and including several instructions for causing a computer device (which may be a personal computer, a server, or a network device) or a processor to execute all or part of the steps of the methods described in the embodiments of the present application. And the aforementioned storage medium includes: flash memory, removable hard drive, read only memory, random access memory, magnetic or optical disk, and the like.
The above description is only an embodiment of the present application, but the scope of the present application is not limited thereto, and any changes or substitutions within the technical scope of the present disclosure should be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (12)

1. A method for moving a cursor, comprising:
the electronic equipment responds to the sliding operation in the touch area and detects a plurality of sliding events;
the electronic equipment responds to the operation of leaving the touch area, and determines target position information according to at least two reference events, wherein the at least two reference events comprise an end event, and the end event is the last detected sliding event in the multiple sliding events;
and the electronic equipment draws the moving track of the cursor according to the target position information.
2. The cursor moving method according to claim 1, wherein each sliding event includes a touch time, and the touch time refers to a time when the electronic device detects the sliding event;
the at least two reference events further include a proximity event, the proximity event is an event meeting a preset rule in the plurality of sliding events, and the preset rule means that a time difference between a touch time in the proximity event and a touch time in the end event is smaller than a preset value.
3. The method according to claim 1 or 2, wherein each sliding event includes position information of a touch point on the touch area when the electronic device detects the sliding event, the at least two reference events include the end event and a first event, and the first event is a sliding event previous to the end event;
the electronic device determines target location information according to at least two reference events, including:
the electronic equipment determines a speed according to the touch time and the position information in the end event and the touch time and the position information in the first event, wherein the speed is used for indicating the speed when the electronic equipment leaves the touch area;
the electronic device determines the target position information according to the speed, a preset motion deceleration and a first time interval, wherein the first time interval is a time interval between a touch moment included in the end event and a moment corresponding to the target position information, and the moment corresponding to the target position information is preset.
4. The method according to claim 1 or 2, wherein each sliding event includes position information of a touch point on the touch area when the electronic device detects the sliding event, the at least two reference events include the end event, a first event and a second event, the first event is a sliding event before the end event, and the second event is a second sliding event before the end event;
the electronic device determines target location information according to at least two reference events, including:
the electronic equipment determines circle center position information and circle radius of a perfect circle according to three position information in the end event, the first event and the second event;
the electronic device determines an arc distance according to the touch time and position information in the end event, the touch time and position information in the first event, a preset movement deceleration and a first time interval, wherein the arc distance is used for representing the distance of an arc between the position information in the end event and the target position information;
and the electronic equipment determines the target position information according to the circle center position information, the circle radius and the arc distance.
5. The method of claim 1 or 2, wherein each sliding event comprises position information of a touch point on the touch area when the electronic device detects the sliding event, and the at least two reference events comprise the plurality of sliding events;
the electronic device determines target location information according to at least two reference events, including:
and the electronic equipment inputs the position information and the touch time included by each sliding event in the plurality of sliding events and a first time interval into a preset track prediction model to obtain the target position information.
6. The method of moving a cursor of any one of claims 1-5, further comprising:
the electronic device stores touch time and position information included in each of the plurality of slide events.
7. The method of moving a cursor of claim 5, further comprising:
the electronic equipment acquires sampling data, wherein the sampling data comprises position information and touch time of a finger or a touch pen sliding on the touch area;
and the electronic equipment trains the sampling data in a machine learning mode to obtain the track prediction model.
8. The cursor movement method according to claim 7, wherein the sampling data is a historical sliding event of a user using the electronic device acquired by the electronic device; or the sampling data is historical sliding events of the whole network users from the cloud end received by the electronic equipment; or the sampling data is historical sliding events of the crowd of a preset type; alternatively, the sampled data is all historical slip events within a preset time period.
9. The method for moving a cursor according to claim 7 or 8, wherein the training of the sampled data by the electronic device in a machine learning manner to obtain the trajectory prediction model comprises:
the electronic equipment determines all sliding tracks according to the sampling data;
the electronic equipment is classified according to the shape of the sliding track to obtain a plurality of classification sets;
and the electronic equipment integrates the classification sets to determine a track prediction model, or the electronic equipment determines the track prediction model corresponding to each classification set.
10. The method of moving a cursor of claim 5, further comprising:
the electronic equipment receives the track prediction model sent by a cloud, and the track prediction model is obtained by the cloud training.
11. An electronic device, characterized in that the electronic device comprises: a memory and a processor; the memory is coupled with the processor; wherein the memory has stored therein computer program code comprising computer instructions which, when executed by the processor, cause the electronic device to perform a method of movement of a cursor as claimed in any one of claims 1-10.
12. A computer-readable storage medium comprising computer instructions which, when run on an electronic device, cause the electronic device to perform a method of moving a cursor as claimed in any one of claims 1-10.
CN202110927091.3A 2021-08-12 2021-08-12 Cursor moving method and electronic equipment Active CN113805770B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110927091.3A CN113805770B (en) 2021-08-12 2021-08-12 Cursor moving method and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110927091.3A CN113805770B (en) 2021-08-12 2021-08-12 Cursor moving method and electronic equipment

Publications (2)

Publication Number Publication Date
CN113805770A true CN113805770A (en) 2021-12-17
CN113805770B CN113805770B (en) 2022-08-12

Family

ID=78893557

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110927091.3A Active CN113805770B (en) 2021-08-12 2021-08-12 Cursor moving method and electronic equipment

Country Status (1)

Country Link
CN (1) CN113805770B (en)


Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103930854A (en) * 2011-09-09 2014-07-16 莫韦公司 Method of controlling a cursor by measurements of the attitude of a pointer and pointer implementing said method
CN103513811A (en) * 2012-06-29 2014-01-15 北京汇冠新技术股份有限公司 Touch trajectory tracking method
CN103631419A (en) * 2012-08-27 2014-03-12 腾讯科技(深圳)有限公司 Cursor positioning method and system based on remote control touch pad
CN105094411A (en) * 2014-05-09 2015-11-25 宏达国际电子股份有限公司 Electronic apparatus, drawing method using the same, and computer program product
CN106325713A (en) * 2016-08-17 2017-01-11 厦门印天电子科技有限公司 Inertia movement method of sliding object in whiteboard software
CN108958569A (en) * 2017-05-19 2018-12-07 腾讯科技(深圳)有限公司 Control method, device, system, terminal and the smart television of smart television
CN112506413A (en) * 2020-12-16 2021-03-16 Oppo广东移动通信有限公司 Touch point prediction method and device, terminal equipment and computer readable storage medium
CN112486399A (en) * 2020-12-22 2021-03-12 安徽鸿程光电有限公司 Touch track display method, device, equipment and storage medium

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114371878A (en) * 2022-01-10 2022-04-19 深圳中微电科技有限公司 Display IP module hardware mouse implementation method based on cursor-free layer
CN114371878B (en) * 2022-01-10 2024-05-14 深圳中微电科技有限公司 Display IP module hardware mouse implementation method based on cursor-free layer
CN116048313A (en) * 2022-08-25 2023-05-02 荣耀终端有限公司 Cursor control method, cursor control device and storage medium
CN116048313B (en) * 2022-08-25 2024-04-16 荣耀终端有限公司 Cursor control method, cursor control device and storage medium

Also Published As

Publication number Publication date
CN113805770B (en) 2022-08-12

Similar Documents

Publication Publication Date Title
US10401964B2 (en) Mobile terminal and method for controlling haptic feedback
CN107810470B (en) Portable device and method for changing screen thereof
CN108139778B (en) Portable device and screen display method of portable device
TWI475468B (en) Portable devices, data transmission systems and display sharing methods thereof
WO2021244443A1 (en) Split-screen display method, electronic device, and computer readable storage medium
CN109284001B (en) Method for performing a function of a device and device for performing the method
EP3287884B1 (en) Display device and method of controlling the same
JP6306307B2 (en) Information transmission method and system, and apparatus thereof
US20150012881A1 (en) Method for controlling chat window and electronic device implementing the same
AU2013356799B2 (en) Display device and method of controlling the same
EP2741207B1 (en) Method and system for providing information based on context, and computer-readable recording medium thereof
EP2752754B1 (en) Remote mouse function method and terminals
KR20150014553A (en) Apparatus and method for constructing multi vision screen
CN104049745A (en) Input control method and electronic device supporting the same
CN104969163A (en) Display method and device of application interface and electronic device
CN105229585A (en) Display device and user interface screen supplying method thereof
US20180329598A1 (en) Method and apparatus for dynamic display box management
KR102204141B1 (en) Electro device for reminding task and method for controlling thereof
KR20160046622A (en) Wearable device and method for transmitting contents
EP3657311A1 (en) Apparatus including a touch screen and screen change method thereof
US20140164186A1 (en) Method for providing application information and mobile terminal thereof
CN107111446A (en) The method and system of control device
EP3051410A1 (en) An apparatus and associated methods for provision of wireless power
KR20140090112A (en) Method and apparatus for pairing electronic devices
KR102057196B1 (en) Method and system for transmitting information, device and computer readable recording medium thereof

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant