CN110262747B - Method and device for controlling terminal, terminal and storage medium - Google Patents


Info

Publication number
CN110262747B
CN110262747B CN201910541290.3A
Authority
CN
China
Prior art keywords
terminal
area
cursor
display screen
touch sensor
Prior art date
Legal status
Active
Application number
CN201910541290.3A
Other languages
Chinese (zh)
Other versions
CN110262747A (en)
Inventor
林进全
Current Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN201910541290.3A
Publication of CN110262747A
Application granted
Publication of CN110262747B
Legal status: Active

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures


Abstract

The embodiments of the present application disclose a method, an apparatus, a terminal, and a storage medium for controlling a terminal, belonging to the field of computer technology. Because the terminal can obtain the corresponding moving speed from the sliding distance, the method provided by the embodiments can keep controlling the movement of the cursor after the touch sensor receives a single stretch of sliding, thereby simplifying the operation of controlling cursor movement and improving the efficiency of controlling the terminal.

Description

Method and device for controlling terminal, terminal and storage medium
Technical Field
The embodiment of the application relates to the technical field of computers, in particular to a method and a device for controlling a terminal, the terminal and a storage medium.
Background
With the development of electronic technology, the functions of the terminal are increasingly rich. The terminal can provide various functions for the user through interaction with the user.
In some application scenarios, the terminal can receive a touch signal of a user through the touch screen, and execute an operation corresponding to the touch signal, so as to achieve an effect that the user directly controls the terminal through the touch operation.
Disclosure of Invention
The embodiment of the application provides a method and a device for controlling a terminal, the terminal and a storage medium. The technical scheme is as follows:
according to an aspect of the present application, there is provided a method of controlling a terminal, the method including:
acquiring a sliding distance recognized on a touch sensor, wherein the sliding distance is a distance corresponding to a touch operation acted on the touch sensor;
acquiring a moving speed corresponding to the sliding distance according to a preset mapping relation;
controlling a cursor to move according to the moving speed, wherein the cursor is used for controlling an object displayed in a display screen;
and when receiving the stop signal, controlling the cursor to stop moving.
According to another aspect of the present application, there is provided an apparatus for controlling a terminal, the apparatus including:
the distance acquisition module is used for acquiring a sliding distance identified on the touch sensor, wherein the sliding distance is a distance corresponding to a touch operation acted on the touch sensor;
the speed acquisition module is used for acquiring the moving speed corresponding to the sliding distance according to a preset mapping relation;
the cursor moving module is used for controlling a cursor to move according to the moving speed, and the cursor is used for controlling an object displayed in the display screen;
and the movement stopping module is used for controlling the cursor to stop moving when the stop signal is received.
According to another aspect of the present application, there is provided a terminal comprising a processor and a memory, the memory having stored therein at least one instruction, the instruction being loaded and executed by the processor to implement a method of controlling a terminal as provided in the implementations of the present application.
According to another aspect of the present application, there is provided a computer-readable storage medium having at least one instruction stored therein, the instruction being loaded and executed by a processor to implement a method of controlling a terminal as provided in the implementations of the present application.
The beneficial effects brought by the technical scheme provided by the embodiment of the application can include:
The sliding distance identified on the touch sensor can be acquired, where the sliding distance is the distance corresponding to a touch operation applied to the touch sensor; the moving speed corresponding to the sliding distance is acquired according to a preset mapping relationship; the cursor, which is used for controlling an object displayed in the display screen, is controlled to move according to the moving speed; and when a stop signal is received, the cursor is controlled to stop moving. Because the terminal can obtain the corresponding moving speed from the sliding distance, the method provided by the embodiments can keep controlling the movement of the cursor after the touch sensor receives a single stretch of sliding, thereby simplifying the operation of controlling cursor movement and improving the efficiency of controlling the terminal.
Drawings
In order to describe the technical solutions in the embodiments of the present application more clearly, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the present application, and those skilled in the art can derive other drawings from them without creative effort.
Fig. 1 is a block diagram of a terminal according to an exemplary embodiment of the present application;
fig. 2 is a flowchart of a method for controlling a terminal according to an exemplary embodiment of the present application;
FIG. 3 is a schematic diagram of a user interface provided based on the embodiment shown in FIG. 2;
fig. 4 is a flowchart of a method for controlling a terminal according to another exemplary embodiment of the present application;
FIG. 5 is a schematic diagram of an interaction provided based on the embodiment shown in FIG. 4;
fig. 6 is a flowchart of a method for controlling a terminal provided based on the embodiment shown in fig. 4;
FIG. 7 is a schematic diagram of a cursor display provided based on the embodiment shown in FIG. 6;
fig. 8 is a block diagram of an apparatus for controlling a terminal according to an exemplary embodiment of the present application.
Detailed Description
To make the objects, technical solutions, and advantages of the present application clearer, embodiments of the present application will be described in further detail below with reference to the accompanying drawings.
When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The implementations described in the following exemplary embodiments do not represent all embodiments consistent with the present application. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present application, as detailed in the appended claims.
In the description of the present application, it is to be understood that the terms "first," "second," and the like are used for descriptive purposes only and are not to be construed as indicating or implying relative importance. It should also be noted that, unless otherwise explicitly specified or limited, the term "connected" is to be interpreted broadly: for example, as a fixed connection, a detachable connection, or an integral connection; as a mechanical or electrical connection; and as a direct connection or an indirect connection through an intermediary. The specific meaning of the above terms in the present application can be understood by those of ordinary skill in the art on a case-by-case basis. Further, in the description of the present application, "a plurality" means two or more unless otherwise specified. "And/or" describes an association relationship between associated objects and indicates that three relationships are possible; for example, "A and/or B" may mean: A exists alone, A and B exist simultaneously, or B exists alone. The character "/" generally indicates an "or" relationship between the former and latter associated objects.
In order to make the solution shown in the embodiments of the present application easy to understand, several terms appearing in the embodiments of the present application will be described below.
Sliding distance: the distance corresponding to a touch operation applied to the touch sensor. In one possible implementation, the user performs a touch operation on the surface of the touch sensor with a finger. When the touch operation is a slide operation, the terminal recognizes the sliding distance through the touch sensor. For example, when the touch operation slides 32 mm across the surface of the touch sensor, the terminal's touch sensor determines, from the sensed data, the detected 32 mm as the sliding distance.
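As an illustrative sketch (not taken from the patent), the sliding distance of a recognized trajectory could be computed as the total path length over the sampled touch points:

```python
import math

def sliding_distance(trajectory):
    """Total path length of a slide: the sum of straight-line lengths of
    consecutive segments (an assumed model of what the sensor reports)."""
    return sum(math.hypot(x1 - x0, y1 - y0)
               for (x0, y0), (x1, y1) in zip(trajectory, trajectory[1:]))
```

A slide from (0, 0) to (3, 4), for instance, yields a sliding distance of 5 units regardless of how many intermediate points were sampled along the straight line.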
Preset mapping relation: used for indicating the correspondence between the sliding distance and the moving speed. The preset mapping relation may be a table storing the correspondence, or a functional relational expression. In one possible approach, the sliding distance and the moving speed are in one-to-one correspondence. In the present application, the direction of the moving speed can be determined from the direction corresponding to the sliding distance, namely the direction in which the touch start point points to the touch end point.
Interactable object: a control that can respond to a touch operation; the control may be displayed in the display screen in graphic form. Optionally, the interactable object can be at least one of a launch icon, a slider, a check box, a button, or an input box.
Cursor: an icon displayed in the display screen of the terminal that indicates the current operation position. In one possible approach, when the currently displayed user interface is an input interface, the cursor may be a blinking short line in the input interface. In another possible approach, when the currently displayed user interface is a desktop or an interface that displays a plurality of operation objects simultaneously, the cursor may be a background color block that highlights an operation object, for example by deepening that object's background color. In yet another possible implementation, the cursor may also be a small icon, such as an arrow, displayed at the uppermost layer of the current user interface.
Optionally, the cursor is used to control objects displayed in the display screen, and the control operations include, but are not limited to, copying, moving, deleting, or opening.
For example, the method for controlling a terminal according to the embodiments of the present application may be applied to a terminal that has a display screen and supports this control function. The terminal may be a mobile phone, a tablet computer, a laptop computer, an intelligent digital camera, an MP4 player, an MP5 player, a learning machine, a point-reading machine, an e-book reader, an electronic dictionary, or a vehicle-mounted terminal, which is not limited by the embodiments of the present application.
Referring to fig. 1, fig. 1 is a block diagram of a terminal according to an exemplary embodiment of the present application. As shown in fig. 1, the terminal includes a processor 120, a memory 140, a display screen 160, and a touch sensor 180. The memory 140 stores at least one instruction, which is loaded and executed by the processor 120 to implement the method for controlling a terminal described in the method embodiments of the present application. The display screen 160 is used to display the cursor and the displayed objects. The touch sensor 180 is used to recognize a user's touch operation.
In the present application, the terminal 100 is an electronic device that can be controlled by a cursor. When the terminal 100 acquires the sliding distance identified by the touch sensor 180, the terminal 100 can acquire the moving speed corresponding to the sliding distance according to a preset mapping relationship and control the cursor, which is used for controlling the object displayed in the display screen 160, to move according to the moving speed; when the terminal 100 receives a stop signal, it controls the cursor to stop moving.
Processor 120 may include one or more processing cores. The processor 120 connects the various parts of the terminal 100 using various interfaces and lines, and performs the various functions of the terminal 100 and processes data by running or executing instructions, programs, code sets, or instruction sets stored in the memory 140 and invoking data stored in the memory 140. Optionally, the processor 120 may be implemented in at least one of the hardware forms of Digital Signal Processing (DSP), Field-Programmable Gate Array (FPGA), and Programmable Logic Array (PLA). The processor 120 may integrate one or more of a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), a modem, and the like. The CPU mainly handles the operating system, the user interface, application programs, and the like; the GPU is responsible for rendering and drawing the content to be displayed on the display screen; the modem handles wireless communications. It is understood that the modem may also not be integrated into the processor 120 and instead be implemented as a separate chip.
The memory 140 may include a Random Access Memory (RAM) or a Read-Only Memory (ROM). Optionally, the memory 140 includes a non-transitory computer-readable medium. The memory 140 may be used to store instructions, programs, code sets, or instruction sets. The memory 140 may include a program storage area and a data storage area, where the program storage area may store instructions for implementing an operating system, instructions for at least one function (such as a touch function, a sound playing function, or an image playing function), instructions for implementing the various method embodiments described below, and the like; the data storage area may store the data referred to in the respective method embodiments below.
The display screen 160 may be provided on a front panel of the terminal. In one possible implementation, the image provided by the processor 120 is displayed in the display screen 160.
The touch sensor 180 may be disposed on a rear panel or a side bezel of the terminal. In one possible implementation, the touch sensor 180 provides a touch sensing area on the rear panel or side bezel of the terminal for acquiring touch operations. In one possible approach, the touch sensor 180 may be integrated with a fingerprint sensor, the two sensors being stacked so as to share the touch sensing area; in one such arrangement, the touch sensor is superimposed on the upper layer of the fingerprint sensor.
Alternatively, the touch sensor 180 can recognize a position of a touch operation, a touch pressure, or a sliding trace.
Referring to fig. 2, fig. 2 is a flowchart of a method for controlling a terminal according to an exemplary embodiment of the present application. The method for controlling the terminal can be applied to the terminal shown in fig. 1. In fig. 2, a method of controlling a terminal includes:
step 210, obtaining a sliding distance recognized on the touch sensor, where the sliding distance is a distance corresponding to a touch operation applied to the touch sensor.
In the embodiment of the application, the terminal can acquire the sliding distance identified on the touch sensor. Alternatively, the user can perform a click operation, a double click operation, a long press operation, a slide operation, or a drag operation on a touch sensor of the terminal. In one possible implementation, when the touch operation applied to the touch sensor is an operation that generates a sliding distance, such as a sliding operation or a dragging operation, the terminal can acquire the sliding distance recognized on the touch sensor.
And step 220, acquiring the moving speed corresponding to the sliding distance according to a preset mapping relation.
In the embodiment of the application, after the terminal acquires the sliding distance, the terminal can acquire the moving speed corresponding to the sliding distance according to the preset mapping relation.
In a possible implementation manner, when the preset mapping relationship is a table, the terminal can search for a sliding distance in the table, and when the sliding distance is found, the moving speed corresponding to the sliding distance is obtained.
Referring to Table 1, which indicates the correspondence between the sliding distance and the moving speed:
Table 1
Sliding distance | (0, 1] mm | (1, 2] mm | (2, 3] mm | (3, 4] mm | (4, 5] mm
Moving speed     | 2 mm/s    | 4 mm/s    | 6 mm/s    | 8 mm/s    | 10 mm/s
Table 1 shows a correspondence between the sliding distance and the moving speed. The terminal can store this table in advance, and when a sliding distance is acquired on the touch sensor, obtain the moving speed by table lookup. For example, when the terminal acquires a sliding distance of 3.2 mm, it determines that 3.2 mm falls within the (3, 4] mm interval and takes the corresponding 8 mm/s as the moving speed.
Note that the sliding distance in Table 1 is given as intervals. In other implementations, the sliding distance may also be given as specific values.
In another possible implementation manner, when the preset mapping relation is a functional relation, the terminal can input the sliding distance as an independent variable into the functional relation, and the dependent variable, that is, the moving speed, is obtained through calculation of the functional relation.
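Both forms of the preset mapping can be sketched as follows; the interval bounds and speeds mirror Table 1, while the linear function `speed = 2 * distance` is an assumed example of a functional relational expression, not a relation given in the patent:

```python
import bisect

# Interval upper bounds and speeds mirror Table 1.
TABLE_BOUNDS = [1, 2, 3, 4, 5]      # upper bounds of (0,1], (1,2], ... in mm
TABLE_SPEEDS = [2, 4, 6, 8, 10]     # moving speed in mm/s for each interval

def speed_from_table(distance_mm):
    """Table form of the preset mapping: find the interval covering the
    sliding distance and return that interval's moving speed."""
    if distance_mm <= 0 or distance_mm > TABLE_BOUNDS[-1]:
        raise ValueError("sliding distance outside table range")
    # bisect_left picks the first interval whose upper bound covers the distance
    return TABLE_SPEEDS[bisect.bisect_left(TABLE_BOUNDS, distance_mm)]

def speed_from_function(distance_mm):
    """Functional form: the sliding distance is the independent variable and
    the moving speed the dependent variable (linear law assumed)."""
    return 2.0 * distance_mm
```

With the table form, a sliding distance of 3.2 mm falls in the (3, 4] mm interval and yields 8 mm/s, matching the worked example in the text.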
Alternatively, the ratio between the sliding distance and the moving speed may be adjusted by the user. In one adjustment approach, the user sets the sensitivity of the touch sensor on the terminal's settings interface. For the same sliding distance, when the sensitivity of the touch sensor is higher, the terminal obtains a moving speed V1; when the sensitivity is lower, the terminal obtains a moving speed V2, where V1 is greater than V2.
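A minimal sketch of the sensitivity adjustment, assuming a linear base mapping and a multiplicative sensitivity factor (neither scaling law is specified by the patent):

```python
def speed_with_sensitivity(sliding_distance_mm, sensitivity):
    """Scale an assumed base mapping (speed = 2 * distance, in mm/s) by a
    user-set sensitivity factor; higher sensitivity yields a higher speed."""
    base_speed = 2.0 * sliding_distance_mm
    return base_speed * sensitivity

# Same sliding distance, two sensitivity settings: V1 > V2, as the text requires.
v1 = speed_with_sensitivity(3.0, sensitivity=1.5)   # higher sensitivity
v2 = speed_with_sensitivity(3.0, sensitivity=0.5)   # lower sensitivity
```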
And step 230, controlling a cursor to move according to the moving speed, wherein the cursor is used for controlling an object displayed in the display screen.
In the embodiment of the application, the terminal can control the cursor to move according to the moving speed determined in the previous step. It should be noted that, when controlling the movement of the cursor, the direction of the movement speed also needs to be determined. In one possible approach, the terminal will determine the direction of the moving speed according to the touch operation. For example, the terminal determines a direction in which the start point of the touch operation points to the end point of the touch operation as the direction of the movement speed.
In a possible implementation manner, the terminal determines the magnitude of the moving speed according to the sliding distance, and determines the direction of the moving speed according to the touch operation. After the magnitude and the direction of the moving speed are determined, the terminal controls the cursor to move according to the magnitude and the direction of the moving speed.
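The combination of magnitude and direction described above can be sketched as follows, assuming touch positions are represented as 2D coordinates:

```python
import math

def velocity_vector(start, end, speed):
    """Combine the speed magnitude (from the preset mapping) with the
    direction from the touch start point to the end point into a 2D
    velocity vector for moving the cursor."""
    dx, dy = end[0] - start[0], end[1] - start[1]
    length = math.hypot(dx, dy)
    if length == 0:
        return (0.0, 0.0)   # start and end coincide: no movement direction
    return (speed * dx / length, speed * dy / length)
```

For example, a slide from (0, 0) to (3, 4) with a mapped speed of 10 mm/s produces the velocity vector (6.0, 8.0).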
And step 240, controlling the cursor to stop moving when the stop signal is received.
In the embodiment of the application, the terminal can control the cursor to stop moving when receiving the stop signal. The stop signal may be triggered by the terminal according to a touch operation received by the touch sensor, or may be triggered by the cursor moving to a designated area.
In one possible triggering manner, the stop signal is triggered when the cursor moves to a target edge, the target edge being an edge of the user interface displayed in the display screen.
It should be noted that the target edge depends on the shape of the user interface displayed in the display screen. For example, when the user interface is a rectangular or rounded-rectangular interface, the target edge may be the four sides of the rectangle or rounded rectangle. When the user interface is circular, the target edge may be the circumference of the circular interface. When the user interface is an irregular graphic, the target edge may be the boundary on the periphery of the irregular graphic. When the cursor moves to the edge of the user interface, the stop signal is triggered.
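For the rectangular-interface case, the target-edge check can be sketched as follows (the pixel-coordinate convention, with the origin at a corner, is an assumption):

```python
def hits_target_edge(x, y, width, height):
    """Stop-signal check for a rectangular user interface: the cursor has
    reached the target edge when it touches any of the four sides."""
    return x <= 0 or y <= 0 or x >= width - 1 or y >= height - 1
```

When this check returns true, the terminal would trigger the stop signal and stop moving the cursor, as in the left-edge example of fig. 3.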
Referring to fig. 3, fig. 3 is a schematic diagram of a user interface provided based on the embodiment shown in fig. 2. In the user interface 300 shown in fig. 3, the target edge includes a left side edge 310, an upper side edge, a lower side edge, and a right side edge. When the cursor 320 moves to the left edge 310, a stop signal is triggered, at which time the cursor 320 is controlled by the terminal to stop moving.
Note that a touch sensor is provided in the rear panel 330 of the terminal shown in fig. 3.
In another possible stop signal triggering manner, the stop signal is a signal that is triggered when a touch operation stops acting on the touch sensor. For example, the user's finger is continuously in contact with the touch sensor, and when the user's finger is separated from the touch sensor, the touch operation stops acting on the touch sensor. At this point, a stop signal is triggered.
In yet another possible stop-signal triggering manner, the stop signal is triggered when the touch point of the touch operation on the touch sensor returns to the start point of the touch operation.
In the embodiment of the present application, the touch operation may be a persistent touch operation, which may include at least one of a long-press operation, a slide operation, or a drag operation. Because a persistent touch operation acts continuously on the touch sensor, it has a start point where it begins acting on the touch sensor and an end point where it stops acting on the touch sensor. It will be appreciated that the start point may indicate a position or an area on the touch sensor, and the end point may likewise indicate a position or an area on the touch sensor. In this triggering manner, the stop signal is triggered when the touch point of the touch operation returns from another position to the start point of the touch operation.
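A sketch of this return-to-start check, modeling the start "area" mentioned above as a circle of assumed radius around the start point:

```python
def returned_to_start(start, current, radius=2.0):
    """Stop-signal check: true when the current touch point has moved back
    into the start area (a circle of assumed radius around the start point)."""
    dx, dy = current[0] - start[0], current[1] - start[1]
    return dx * dx + dy * dy <= radius * radius
```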
When the terminal receives the stop signal, the terminal controls the cursor to stop moving. Alternatively, in one possible approach, the terminal continuously displays the cursor at the position where the movement is stopped. In another possible manner, the terminal may hide the cursor after continuously displaying the cursor at the position where the movement is stopped for a predetermined time.
In summary, in the method for controlling a terminal provided in this embodiment, the terminal acquires the sliding distance identified on the touch sensor, where the sliding distance is the distance corresponding to a touch operation applied to the touch sensor; acquires the moving speed corresponding to the sliding distance according to a preset mapping relationship; controls the cursor, which is used for controlling an object displayed in the display screen, to move according to the moving speed; and controls the cursor to stop moving when a stop signal is received. Because the terminal can obtain the corresponding moving speed from the sliding distance, the method provided by this embodiment can keep the cursor moving after the touch sensor receives a single stretch of sliding, thereby simplifying the operation of controlling cursor movement and improving the efficiency of controlling the terminal.
Referring to fig. 4, fig. 4 is a flowchart of a method for controlling a terminal according to another exemplary embodiment of the present application. The method for controlling the terminal can be applied to the terminal shown in fig. 1. In fig. 4, the method of controlling a terminal includes:
step 411, when receiving a touch operation, detecting whether an interactive object is displayed in the display screen.
In the embodiment of the present application, an interactable object is one of the objects in the display screen, namely an object that performs a corresponding operation according to an input signal. When the terminal receives a touch operation on the touch sensor, the terminal detects whether an interactable object is displayed in the display screen.
In a possible implementation manner, the terminal can detect whether an interactable object is displayed in the display screen by acquiring the attributes of the currently displayed user interface. If the terminal currently displays one and only one user interface, it can judge directly from that interface's attributes whether an interactable object is displayed. If the terminal currently displays multiple user interfaces, it can judge from the attributes of each of those user interfaces.
For example, when only the user interface ui1 is displayed in the display screen of the terminal, if the attribute of the user interface ui1 indicates that the interface contains the interactable object, the terminal determines that the interactable object is displayed in the display screen; if the attribute of the user interface ui1 indicates that no interactable object is contained in the interface, the terminal determines that no interactable object exists in the display screen. Optionally, in an implementation manner, the terminal can further determine whether an interactable object exists in the objects in the display state from the attributes of the user interface ui1, which is not limited in this application.
When the user interface ui1 and the user interface ui2 are displayed in the display screen of the terminal, the terminal determines that an interactable object does not exist in the display screen when the attribute of the user interface ui1 indicates that the interactable object is not included in the interface, and the attribute of the user interface ui2 indicates that the interactable object is not included in the interface. The terminal determines that an interactable object is displayed in the display screen when the attribute of the user interface ui1 indicates that the interactable object is included in the interface, or the attribute of the user interface ui2 indicates that the interactable object is included in the interface.
In another possible implementation manner, the terminal can also directly acquire the attribute of the object displayed in the display screen, and further determine whether the interactive object is displayed in the currently displayed user interface by traversing the attribute of the object displayed in the display screen.
For example, an object t1, an object t2, and an object t3 are displayed in the display screen. The terminal can acquire the respective attributes of the above-described object t1, object t2, and object t 3. When the interactable object exists in the object t1, the object t2 and the object t3, the terminal determines that the interactable object is displayed in the display screen. When none of the object t1, the object t2, and the object t3 is an interactable object, the terminal will determine that no interactable object exists in the display screen.
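The attribute-traversal check can be sketched as follows; the dictionary representation and the `interactable` attribute name are assumptions for illustration:

```python
def has_interactable(displayed_objects):
    """Traverse the attributes of the displayed objects and report whether
    any of them is an interactable object."""
    return any(obj.get("interactable", False) for obj in displayed_objects)
```

Mirroring the example above: if any of t1, t2, t3 carries the interactable attribute, the check succeeds; if none does, the terminal determines that no interactable object exists in the display screen.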
It should be noted that the terminal applied in this embodiment includes a touch sensor, an area where the touch sensor recognizes a touch operation is a recognition area, the recognition area is located on a rear panel or a side frame of the terminal, and the display screen is located in a front panel of the terminal.
Step 412, when the interactive object is displayed on the display screen, displaying a cursor at a preset position.
In the embodiment of the present application, the preset position may be the center of the screen, or may be a position where the interactive object is located. In a possible implementation manner, if at least two interactable objects are displayed in the display screen, the preset position may also be the center of a polygon with the center of the at least two interactable objects as a vertex.
For example, an interactable object t4, an interactable object t5, and an interactable object t6 are displayed in the user interface of the terminal. The center of the interactable object t4 is point A1, the center of the interactable object t5 is point A2, the center of the interactable object t6 is point A3, and point B is the center of the triangle A1A2A3. According to the embodiment of the application, the point B is used as the preset position so as to shorten the time for moving the cursor to the interactive object.
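The preset-position rule described above (the center of a polygon whose vertices are the centers of the interactable objects, falling back to the screen center) amounts to a centroid computation. The sketch below is illustrative only and is not part of the patent; the function and parameter names are assumptions:

```python
# Sketch (not from the patent): place the cursor at the centroid of the
# centers of the interactable objects currently on screen, falling back
# to the screen center when no interactable object is displayed.
def preset_cursor_position(object_centers, screen_center):
    """Return the cursor's preset position.

    object_centers: list of (x, y) centers of interactable objects.
    screen_center:  (x, y) fallback position.
    """
    if not object_centers:
        return screen_center
    n = len(object_centers)
    cx = sum(x for x, _ in object_centers) / n
    cy = sum(y for _, y in object_centers) / n
    return (cx, cy)
```

With the three objects t4, t5, t6 centered at A1 = (0, 0), A2 = (6, 0), A3 = (0, 6), this returns (2.0, 2.0), the center point B of triangle A1A2A3.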
In step 420, the sliding distance identified on the touch sensor is obtained.
The execution procedure of step 420 is the same as that of step 210, and is not described herein again.
And 431, acquiring sliding displacement corresponding to the sliding distance, wherein the sliding displacement is displacement corresponding to the touch operation.
In the embodiment of the application, the terminal can obtain the sliding displacement corresponding to the sliding distance. Note that the terminal can acquire a track of a touch operation through the touch sensor. The trajectory includes a start point of the touch operation and an end point of the touch operation. The terminal can determine the sliding displacement according to the starting point of the touch operation and the end point of the touch operation.
For example, if the coordinates of the start point of the touch operation are (1,1) and the coordinates of the end point of the touch operation are (4,5), the sliding displacement is the displacement in which the point (1,1) points to the point (4,5), and the corresponding distance is 5.
And step 432, acquiring the moving speed according to a preset proportionality coefficient k and the sliding displacement.
In a possible implementation manner of the present application, a corresponding relationship between the sliding displacement and the moving speed, v = k·s, may be set in the terminal, where v denotes the moving speed, k denotes the proportionality coefficient, and s denotes the sliding displacement. In this embodiment, v is proportional to s. In other possible implementations, the moving speed may simply be positively correlated with the sliding displacement.
In one possible implementation manner, the terminal can split the displacement s into a horizontal displacement s1 and a vertical displacement s2, and calculate the horizontal moving speed v1 and the vertical moving speed v2 through a proportionality coefficient k respectively.
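Steps 431 and 432 — deriving the displacement from the start and end points of the touch trajectory, splitting it into horizontal and vertical components, and scaling each by the coefficient k — can be sketched as follows. This is an illustrative sketch under the v = k·s assumption; the names are not from the patent:

```python
import math

def sliding_velocity(start, end, k):
    """Map a swipe on the rear touch sensor to a cursor velocity.

    start, end: (x, y) touch coordinates of the swipe's start and end points.
    k:          preset proportionality coefficient.
    Returns ((v1, v2), speed): per-axis velocity components and the
    overall speed magnitude.
    """
    s1 = end[0] - start[0]   # horizontal displacement s1
    s2 = end[1] - start[1]   # vertical displacement s2
    v1, v2 = k * s1, k * s2  # velocity proportional to displacement
    return (v1, v2), math.hypot(v1, v2)
```

For the example above, a swipe from (1,1) to (4,5) has displacement components (3, 4) and distance 5; with k = 2 the components become (6, 8) and the speed magnitude 10.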
And step 440, controlling the cursor to move according to the moving speed.
The execution process of step 440 is the same as the execution process of step 230, and is not described herein again.
And step 450, controlling the cursor to stop moving when the stop signal is received.
The execution of step 450 is the same as the execution of step 240, and will not be described herein.
In step 461, when the touch pressure value identified by the touch sensor is not less than the first threshold, a pressing operation is applied to the first position where the cursor is located.
In the embodiment of the application, the touch sensor can also acquire a touch pressure value and correspondingly control the cursor according to the touch pressure value. When the touch pressure value identified by the touch sensor is not less than the first threshold value, the terminal applies a pressing operation to the first position where the cursor is located.
In one possible approach, if an interactable object is present in the first location, the terminal will apply a press operation against the interactable object.
And 462, when the touch pressure value identified by the touch sensor is not greater than the second threshold value, applying a lifting operation to the second position where the cursor is located.
In the embodiment of the application, the terminal can continue to monitor the touch pressure value on the touch sensor after responding to the pressing operation acting on the first position, and when the touch pressure value is not greater than the second threshold, the terminal applies the lifting operation to the second position where the cursor is located. In combination with the pressing action already applied to the first position in the previous steps, if there is an interactable object in the first position, the interactable object will also receive the lifting operation applied by the terminal at the second position.
It should be noted that the second threshold is smaller than the first threshold.
In one possible approach, if the distance between the first location and the second location is not greater than the target threshold, the terminal completes a one-click operation on the interactable object.
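Steps 461–462 and the click/drag distinction above can be sketched as a simple classifier: a press is registered when the pressure reaches the first threshold, a lift when it falls to the second (smaller) threshold, and the press-to-lift travel decides between a click and a drag. This is an illustrative sketch, not the patent's implementation; all names and the return convention are assumptions:

```python
import math

def classify_gesture(press_pos, lift_pos, press_value, lift_value,
                     first_threshold, second_threshold, target_threshold):
    """Classify a rear-sensor pressure gesture as a click or a drag.

    A pressing operation is applied when press_value is not less than
    first_threshold; a lifting operation when lift_value is not greater
    than second_threshold (second_threshold < first_threshold).
    If the cursor moved no farther than target_threshold between the
    first position and the second position, the gesture is a click;
    otherwise it is a drag.
    """
    assert second_threshold < first_threshold
    if press_value < first_threshold or lift_value > second_threshold:
        return None  # no complete press/lift cycle occurred
    moved = math.hypot(lift_pos[0] - press_pos[0],
                       lift_pos[1] - press_pos[1])
    return "click" if moved <= target_threshold else "drag"
```

When the first and second positions coincide (travel 0), the result is a click; once the travel exceeds the target threshold, the same pressure cycle is interpreted as a drag.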
Referring to fig. 5, fig. 5 is an interaction diagram provided based on the embodiment shown in fig. 4. When the finger of the user contacts a point 511 in the touch sensor of the rear panel of the terminal, the terminal displays a cursor 521 in the display screen, the cursor 521 being located at a center 512 among three objects, an interactable object 531, an interactable object 532, and an interactable object 533. Center 512 is the center of a triangle having vertices at center 513 of interactable object 531, center 514 of interactable object 532, and center 515 of interactable object 533. When the terminal detects that the touch operation acting on the touch sensor slides in the direction pointing from the point 512 to the point 513, for example to the point 516 on the surface of the touch sensor, the terminal obtains the moving speed according to the preset mapping relation and controls the cursor 521 to move from the point 512 to the point 513. When the cursor 521 moves to point 513, the user's finger returns to point 511 in the touch sensor, triggering a stop signal. The terminal stops the cursor 521 at the point 513 according to the stop signal.
At this time, when the terminal detects that the value of the touch pressure acting on the touch sensor is not less than the first threshold, and the first position where the cursor 521 stays is the point 513, where the interactable object 531 is located, the terminal applies a pressing operation to the interactable object 531. Subsequently, when the terminal detects that the value of the touch pressure acting on the touch sensor is not greater than the second threshold, and the second position where the cursor 521 stays is still the point 513, the terminal applies a lift-off operation to the interactable object 531 at the point 513, thereby implementing a click operation on the interactable object 531. It is noted that in this implementation, the distance between the first position and the second position is less than the target threshold. In some implementations, the first position may coincide with the second position.
In another possible manner, if the distance between the first location and the second location is greater than the target threshold, the terminal completes one drag operation on the interactable object.
Similar to the click operation, the terminal can also move the interactive object according to the sliding on the touch sensor after performing the press operation on the interactive object, stop moving when receiving the stop signal again, and drag the interactive object to the position where the interactive object stops when receiving the lift operation again.
In another possible mode, if there is no interactable object at the first position, the terminal may further use the first position as one vertex of a rectangular frame that expands along with the movement of the cursor, and any interactable object covered by the rectangular frame enters the selected state. It should be noted that, when the terminal applies a lift-off operation, the terminal may display a candidate operation list for the interactable objects in the rectangular frame. In one possible approach, the list of candidate operations includes at least one of delete, copy, cut, uninstall, kill, or share.
And 471, when the interactive object does not exist in the display screen and the user interface comprises a display sub-area and an undisplayed sub-area, controlling the slider to move in the preset area according to the first speed.
In the embodiment of the application, the terminal can identify whether an interactable object exists in the display screen. When no interactable object exists in the display screen, the currently displayed content cannot be interacted with and is used only to present related information to the user. When the user interface includes a displayed sub-area and an undisplayed sub-area, the user interface is not completely displayed in the display screen. It should be noted that the displayed sub-area and the undisplayed sub-area belong to the same user interface, and either sub-area can be brought onto the display screen through operations such as moving the slider.
In one possible implementation, the displayed sub-region and the undisplayed sub-region are defined with respect to a given moment in time. At a given moment, the portion of the user interface displayed in the display screen is the displayed sub-region, and the portion of the user interface not displayed in the display screen is the undisplayed sub-region. As the slider slides, an undisplayed sub-region may move into the display screen to be displayed, thereby becoming a displayed sub-region. Accordingly, a displayed sub-region may move out of the display screen, thereby becoming an undisplayed sub-region.
For example, the user interface may be any user interface requiring scrolling viewing, such as a web page, a chat interface, an electronic book interface, or a file manager interface.
It should be noted that the predetermined area may be an edge area of the user interface, and the area is a display area of the slider, and the slider may slide in the predetermined area. In one possible implementation, the slider can be displayed in a highlighted, color-changing, flashing or colored manner to achieve an obvious display effect.
In one possible approach, the direction of movement of the slider is the same as the direction of the touch operation acting on the touch sensor.
In another possible approach, the direction of movement of the slider is opposite to the direction of the touch operation acting on the touch sensor.
Step 472, according to the second speed, controlling the display sub-area to move out of the display screen and the non-display sub-area to move into the display screen.
In the embodiment of the application, the second speed is opposite in direction to the first speed, and the magnitude of the second speed is greater than that of the first speed. The terminal can control the displayed sub-area to move out of the display screen at the second speed while controlling the undisplayed sub-area to move into the display screen, so that a long or wide page can be viewed through a simple operation on the touch sensor, which simplifies the user's operation for viewing a multi-content page and improves the viewing efficiency of such pages.
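Steps 471–472 amount to a clamped scroll update: the page content moves at the second speed while the slider moves (more slowly, in the opposite direction) in its edge area. The sketch below illustrates only the page-offset update; the clamping behavior and all names are assumptions, not details given in the patent:

```python
def scroll_page(scroll_offset, page_height, view_height, v2, dt):
    """Advance the page so the undisplayed sub-area moves into view.

    scroll_offset: current offset of the displayed sub-area within the page.
    page_height:   total height of the user interface.
    view_height:   height of the display screen.
    v2, dt:        second speed and elapsed time.
    The offset is clamped so the view never scrolls past either end.
    """
    new_offset = scroll_offset + v2 * dt
    max_offset = max(0, page_height - view_height)
    return min(max(0.0, new_offset), max_offset)
```

For a 1000-unit page in a 400-unit view, scrolling at speed 100 for 2 time units moves the offset from 0 to 200; a further large step saturates at the 600-unit maximum offset instead of overshooting the end of the page.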
In summary, in this embodiment, when the touch sensor on the back panel of the terminal detects a touch operation, it is detected whether an interactive object is displayed in the display screen, and when the interactive object is displayed in the display screen, a cursor is displayed at a preset position, where the preset position may be a position closer to the interactive object. Because the terminal can display the cursor on the preset position, the terminal can shorten the distance from the cursor to the interactive object, and the operation efficiency of the terminal for the interactive object is improved.
The method for controlling the terminal provided by this embodiment can also identify a touch pressure value through the touch sensor, apply a pressing operation to a first position where the cursor is located when the touch pressure value is not less than a first threshold, and apply a lifting operation to a second position where the cursor is located when the touch pressure value identified by the touch sensor is not greater than a second threshold. Wherein the second threshold is less than the first threshold. The terminal can apply clicking, dragging, frame selecting or other operations to the interactive object by recognizing the change of the touch pressure value, and the screen on the front panel is not shielded when the interactive object is operated, which improves the operation efficiency, expands the visible area of the interactive object, and enhances the operability of the terminal.
The method for controlling the terminal provided in this embodiment can also control the slider to move in the predetermined area according to the first speed when the interactive object does not exist in the display screen and the user interface includes the display sub-area and the non-display sub-area, and control the display sub-area to move out of the display screen and the non-display sub-area to move into the display screen according to the second speed. And the second speed is opposite to the first speed in direction, and the speed of the second speed is greater than that of the first speed. Because the touch sensor is positioned behind the terminal, the design can view the multi-content page in the terminal without shielding when a user holds the terminal, and the effect of displaying the multi-content page by the terminal is improved.
Referring to fig. 6, fig. 6 is a flowchart of a method for controlling a terminal according to the embodiment shown in fig. 4. The method for controlling the terminal can be applied to the terminal shown in fig. 1. Step 412 in fig. 4 can be replaced by step 412a and step 412b. Alternatively, step 412 can be replaced by step 412c and step 412d. Alternatively, step 412 can be replaced by step 412e. Alternatively, step 412 can be replaced by step 412f. The above steps are stated as follows:
step 412a, when an interactable object is displayed in the display screen, detecting whether an interactable object with an area smaller than an area threshold exists.
In the embodiment of the application, the terminal can traverse the area of each interactive object when the interactive object is displayed in the display screen. Optionally, the area refers to an area where the interactable object can be touch controlled.
And step 412b, when an interactable object with the area smaller than the area threshold value exists, displaying a cursor on the object with the smallest area of the interactable object.
In the embodiment of the application, when the area of the interactable object displayed in the display screen is smaller than the area threshold, the terminal displays a cursor on the interactable object with the smallest area.
For example, when the area threshold is 0.5 square centimeters, the area of the interactable object O1 displayed in the display screen is 0.8 square centimeters, the area of the interactable object O2 is 0.4 square centimeters, and the area of the interactable object O3 is 0.3 square centimeters. At this time, the terminal will display the cursor on the interactable object O3, so that the user can control, through the cursor, the smallest-area interactable object O3, which would otherwise be difficult to operate accurately.
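Steps 412a–412b can be sketched as filtering the interactable objects by their touchable area and snapping the cursor to the smallest one below the threshold. This is an illustrative sketch with assumed names, not the patent's implementation:

```python
def cursor_target_by_area(objects, area_threshold):
    """Pick the interactable object the cursor should be displayed on.

    objects: list of (name, touchable_area) pairs for the
             interactable objects on screen.
    Returns the name of the smallest-area object when at least one
    object's area is below area_threshold; returns None otherwise,
    in which case the default preset position would be used instead.
    """
    small = [obj for obj in objects if obj[1] < area_threshold]
    if not small:
        return None
    return min(small, key=lambda obj: obj[1])[0]
```

With the example above (threshold 0.5 cm²; O1 = 0.8, O2 = 0.4, O3 = 0.3), the cursor lands on O3.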
Step 412c, when the interactive objects are displayed on the display screen, detecting whether the object distance is smaller than a distance threshold.
In the embodiment of the application, when the interactive objects are displayed in the display screen, the terminal can detect the object distance between every two interactive objects. Wherein the object distance is the distance between two interactable objects. Optionally, the object spacing is the distance between the touchable regions of two interactable objects.
In step 412d, when the object distance is smaller than the distance threshold, a cursor is displayed in an area between two interactable objects with the smallest object distance.
In the embodiment of the application, when the object distance is smaller than the distance threshold, the terminal acquires two interactable objects with the minimum object distance, and displays a cursor in an area between the two interactable objects, so that a user can accurately perform distinguished control on the two interactable objects with too close distance through the cursor.
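Steps 412c–412d can be sketched as a closest-pair search over the object centers, with the cursor placed at the midpoint of the closest pair when that pair is too close together. The sketch is illustrative; the brute-force pairwise search and all names are assumptions:

```python
import math
from itertools import combinations

def cursor_between_closest_pair(centers, distance_threshold):
    """Place the cursor between the two closest interactable objects.

    centers: list of (x, y) centers of interactable objects.
    Returns the midpoint of the closest pair when their distance is
    below distance_threshold, else None (cursor placement falls back
    to the default preset position).
    """
    best = None
    for a, b in combinations(centers, 2):
        d = math.dist(a, b)
        if best is None or d < best[0]:
            best = (d, a, b)
    if best is None or best[0] >= distance_threshold:
        return None
    _, a, b = best
    return ((a[0] + b[0]) / 2, (a[1] + b[1]) / 2)
```

Placing the cursor at this midpoint keeps it equally close to both crowded objects, so either one can be reached with a short, deliberate movement instead of a mis-touch.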
And step 412e, when the interactive object is displayed in the display screen, displaying a cursor in the area outside the one-hand holding area.
In the embodiment of the application, the single-hand holding area refers to an area which can be touched by fingers when the display screen is held by a single hand. It should be noted that, when an interactive object is displayed on the display screen, the terminal can determine the current one-handed holding area of the terminal. Firstly, the terminal can determine whether the user holds the terminal currently by the left hand or the right hand, and after determining the holding hand, the single-hand holding area corresponding to the holding hand is determined as the current single-hand holding area of the terminal. Subsequently, a cursor is displayed in an area outside the one-handed holding area.
For example, if the terminal is held by the left hand, the terminal determines that the one-handed holding area corresponding to the left hand is the current one-handed holding area, which is the area that can be touched by the fingers of the left hand in the one-handed holding state.
Referring to fig. 7, fig. 7 is a schematic display diagram of a cursor provided based on the embodiment shown in fig. 6. In fig. 7, when the terminal determines that an interactable object is displayed in the display screen, the terminal displays a cursor 740 at a point 710 in the user interface. Where region 720 is a one-handed holding region, region 730 is a region of the display screen other than the one-handed holding region, and point 710 is the geometric center of region 730.
In one possible implementation, the terminal displays the cursor at the geometric center in the area outside the one-handed holding area.
And step 412f, when the interactive object is displayed in the display screen, displaying a cursor in the center of the area of the interactive object.
In the embodiment of the present application, the center of the region is equidistant from the center of each interactable object. Referring to fig. 5, the interactable objects in fig. 5 include interactable object 531, interactable object 532, and interactable object 533. The distance of region center 512 from center 513 of interactable object 531, and the distance of region center 512 from center 514 of interactable object 532, and the distance of region center 512 from center 515 of interactable object 533 are equal. The terminal displays a cursor at the region center 512.
In summary, the embodiments disclosed in the present application can display the cursor on the smallest-area interactable object when the areas of the interactable objects are small, thereby shortening the distance from the cursor to the small interactable object and enabling the user to quickly control an interactable object that would otherwise be difficult to operate.
Optionally, the cursor can be displayed in the area between the two interactive objects when the distance between the two interactive objects is too small, so that the distance between the cursor and the two interactive objects is shortened, a user can quickly control the interactive objects which are easy to touch mistakenly, and the control efficiency of the interactive objects is improved.
Optionally, the method and the device can also display the cursor in the area outside the one-hand holding area when the user holds the terminal with one hand, so that the user can control the interactive object in the area which is not easy to touch with one hand, and the control range of the user can be improved under the conditions of not changing the resolution of the user interface and no shielding.
The following are embodiments of the apparatus of the present application that may be used to perform embodiments of the method of the present application. For details which are not disclosed in the embodiments of the apparatus of the present application, reference is made to the embodiments of the method of the present application.
Referring to fig. 8, fig. 8 is a block diagram of a device for controlling a terminal according to an exemplary embodiment of the present disclosure. The means for controlling the terminal may be implemented as all or part of the terminal in software, hardware or a combination of both. The device includes:
a distance obtaining module 810, configured to obtain a sliding distance identified on a touch sensor, where the sliding distance is a distance corresponding to a touch operation performed on the touch sensor;
a speed obtaining module 820, configured to obtain a moving speed corresponding to the sliding distance according to a preset mapping relationship;
a cursor moving module 830, configured to control a cursor to move according to the moving speed, where the cursor is used to control an object displayed in the display screen;
a stop moving module 840, configured to control the cursor to stop moving when the stop signal is received.
In an alternative embodiment, the stop signal involved in the apparatus is a signal triggered by the cursor moving to a target edge, the target edge being an edge of a user interface displayed in the display screen; or, the stop signal is a signal triggered when the touch operation stops acting on the touch sensor; or, the stop signal is a signal triggered when the touch operation is restored to the starting point of the touch operation at the touch point of the touch sensor.
In an optional embodiment, the apparatus is disposed in a terminal including a touch sensor, an area where the touch sensor recognizes a touch operation is a recognition area, the recognition area is located on a rear panel or a side frame of the terminal, the display screen is located on a front panel of the terminal, and the apparatus further includes an object detection module and a cursor display module; the object detection module is used for detecting whether an interactive object is displayed in the display screen when the touch operation is received, wherein the interactive object is used for executing corresponding operation according to an input signal; and the cursor display module is used for displaying the cursor at a preset position when the interactive object is displayed in the display screen.
In an optional embodiment, the speed obtaining module 820 is configured to obtain a sliding displacement corresponding to the sliding distance, where the sliding displacement is a displacement corresponding to the touch operation; and acquiring the moving speed according to a preset proportionality coefficient k and the sliding displacement.
In an optional embodiment, the device comprises a pressing module and a lifting module, wherein the pressing module is used for applying a pressing operation to a first position where the cursor is located when a touch pressure value identified by the touch sensor is not less than a first threshold value; the lifting module is used for applying a lifting operation to a second position where the cursor is located when the touch pressure value identified by the touch sensor is not larger than a second threshold value, wherein the second threshold value is smaller than the first threshold value.
In an optional embodiment, the apparatus further comprises a slider moving module and a page moving module, wherein the slider moving module is configured to control the slider to move according to a first speed in a predetermined area when the interactable object is not present in the display screen and the user interface comprises a displayed sub-area and an undisplayed sub-area; the page moving module is configured to control the display sub-area to move out of the display screen and the non-display sub-area to move into the display screen according to a second speed, where the second speed is opposite to the first speed in direction, and a speed of the second speed is greater than a speed of the first speed.
The embodiment of the present application further provides a computer-readable medium, where at least one instruction is stored, and the at least one instruction is loaded and executed by the processor to implement the method for controlling a terminal according to the above embodiments.
It should be noted that: in the method for controlling a terminal according to the above embodiment, only the division of the functional modules is illustrated, and in practical applications, the above function distribution may be completed by different functional modules according to needs, that is, the internal structure of the device is divided into different functional modules to complete all or part of the above described functions. In addition, the apparatus for controlling a terminal and the method embodiment for controlling a terminal provided in the above embodiments belong to the same concept, and specific implementation processes thereof are detailed in the method embodiment and are not described herein again.
The above-mentioned serial numbers of the embodiments of the present application are merely for description and do not represent the merits of the embodiments.
It will be understood by those skilled in the art that all or part of the steps for implementing the above embodiments may be implemented by hardware, or may be implemented by a program instructing relevant hardware, where the program may be stored in a computer-readable storage medium, and the above-mentioned storage medium may be a read-only memory, a magnetic disk or an optical disk, etc.
The above description is only exemplary of the implementation of the present application and is not intended to limit the present application, and any modifications, equivalents, improvements, etc. made within the spirit and principle of the present application should be included in the protection scope of the present application.

Claims (12)

1. A method for controlling a terminal, applied to a terminal including a touch sensor, wherein an area where the touch sensor recognizes a touch operation is a recognition area, the recognition area is located on a rear panel or a side frame of the terminal, and a display screen is located on a front panel of the terminal, the method comprising:
determining a holding hand of the terminal, wherein the holding hand comprises a left hand holding the terminal or a right hand holding the terminal;
after the holding hand is determined, determining a holding area corresponding to the holding hand as a current single-hand holding area of the terminal, and displaying a cursor outside the single-hand holding area, wherein the single-hand holding area is an area which can be touched by the holding hand in a single-hand holding state;
when the touch operation is received, detecting whether an interactive object is displayed in the display screen, wherein the interactive object is used for executing corresponding operation according to an input signal;
when the interactive object is displayed in the display screen, displaying the cursor at a preset position; the preset position is the center of a polygon with the centers of at least two interactive objects as vertexes, or the center of the display screen;
acquiring a sliding distance identified on the touch sensor, wherein the sliding distance is a distance corresponding to a touch operation acted on the touch sensor;
acquiring a moving speed corresponding to the sliding distance according to a preset mapping relation;
controlling the cursor to move according to the moving speed, wherein the cursor is used for controlling an object displayed in the display screen;
and when receiving the stop signal, controlling the cursor to stop moving.
2. The method of claim 1, wherein the stop signal is a signal triggered by movement of the cursor to a target edge, the target edge being an edge of a user interface displayed in the display screen;
or,
the stop signal is a signal triggered when the touch operation stops acting on the touch sensor;
or,
the stop signal is a signal triggered when the touch operation is restored to the starting point of the touch operation at the touch point of the touch sensor.
3. The method of claim 1, wherein the preset position comprises:
the area outside the single-hand holding area refers to an area which can be touched by fingers when the display screen is held by a single hand;
and/or,
a center of a region of the interactable objects, the center of the region being equidistant from a center of each of the interactable objects.
4. The method of claim 1, wherein displaying the cursor at a preset position when the interactive object is displayed in the display screen comprises:
when the interactive objects are displayed in the display screen, detecting whether the interactive objects with the areas smaller than an area threshold exist or not;
when the interactable object having an area smaller than the area threshold exists, displaying the cursor on the object having the smallest area of the interactable object.
5. The method according to any one of claims 1 to 4, wherein displaying the cursor at a preset position when the interactive object is displayed in the display screen comprises:
when the interactive objects are displayed in the display screen, detecting whether an object distance is smaller than a distance threshold value, wherein the object distance is a distance between the two interactive objects;
when the object distance is smaller than a distance threshold value, displaying the cursor in an area between two interactive objects with the smallest object distance.
6. The method according to claim 1, wherein the obtaining of the moving speed corresponding to the sliding distance according to a preset mapping relationship comprises:
acquiring sliding displacement corresponding to the sliding distance, wherein the sliding displacement is displacement corresponding to the touch operation;
and acquiring the moving speed according to a preset proportionality coefficient k and the sliding displacement.
7. The method of claim 1, further comprising:
when the touch pressure value identified by the touch sensor is not less than a first threshold value, applying a pressing operation to a first position where the cursor is located;
when the touch pressure value identified by the touch sensor is not larger than a second threshold value, a lifting operation is applied to a second position where the cursor is located;
wherein the second threshold is less than the first threshold.
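For illustration only (not part of the claim language): claim 7's pair of thresholds, with the lift threshold below the press threshold, forms a hysteresis band, so pressure readings fluctuating between the two thresholds do not repeatedly toggle press/lift events. A minimal sketch, with the threshold values assumed:

```python
class PressureClick:
    """Dispatch press/lift events from touch-sensor pressure readings.
    A press fires when pressure >= press_threshold; a lift fires when
    pressure <= lift_threshold, with lift_threshold < press_threshold
    (hysteresis) so small fluctuations in between change nothing."""

    def __init__(self, press_threshold=0.8, lift_threshold=0.3):
        assert lift_threshold < press_threshold
        self.press_threshold = press_threshold
        self.lift_threshold = lift_threshold
        self.pressed = False

    def update(self, pressure):
        if not self.pressed and pressure >= self.press_threshold:
            self.pressed = True
            return "press"   # applied at the cursor's current position
        if self.pressed and pressure <= self.lift_threshold:
            self.pressed = False
            return "lift"    # applied at the cursor's current position
        return None
```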
8. The method of claim 2, further comprising:
when no interactable object is displayed in the display screen and the user interface comprises a displayed sub-area and an undisplayed sub-area, controlling the slider to move within a preset area at a first speed; and
controlling the displayed sub-area to move out of the display screen and the undisplayed sub-area to move into the display screen at a second speed;
wherein the second speed is opposite in direction to the first speed, and the magnitude of the second speed is greater than the magnitude of the first speed.
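For illustration only (not part of the claim language): one animation step of claim 8's scrolling model, where the slider and the content move in opposite directions and the content moves faster. The fixed speed ratio is an assumption; the claim only requires the second speed's magnitude to exceed the first's:

```python
def scroll_step(slider_pos, content_offset, first_speed, speed_ratio=3.0):
    """Advance one step: the slider moves at first_speed inside its preset
    area while the user interface content moves the opposite way at a
    second speed of greater magnitude (here second = -speed_ratio * first)."""
    second_speed = -speed_ratio * first_speed
    return slider_pos + first_speed, content_offset + second_speed
```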
9. An apparatus for controlling a terminal, applied to a terminal including a touch sensor, wherein an area where the touch sensor recognizes a touch operation is a recognition area, the recognition area is located on a rear panel or a side frame of the terminal, and a display screen is located on a front panel of the terminal, the apparatus comprising:
a holding-hand module, configured to determine a holding hand of the terminal, wherein the holding hand is the left hand holding the terminal or the right hand holding the terminal; and, after the holding hand is determined, to determine a holding area corresponding to the holding hand as the current one-hand holding area of the terminal and display the cursor outside the one-hand holding area, wherein the one-hand holding area is the area reachable by the holding hand in a one-hand holding state;
a distance acquisition module, configured to acquire a sliding distance recognized on the touch sensor, wherein the sliding distance is the distance corresponding to a touch operation acting on the touch sensor;
a module configured to detect, when the touch operation is received, whether an interactable object is displayed in the display screen, wherein the interactable object is configured to execute a corresponding operation according to an input signal;
a module configured to display the cursor at a preset position when the interactable object is displayed in the display screen, wherein the preset position is the center of a polygon whose vertices are the centers of at least two interactable objects, or the center of the display screen;
a speed acquisition module, configured to acquire a moving speed corresponding to the sliding distance according to a preset mapping relationship;
a cursor moving module, configured to control the cursor to move according to the moving speed, wherein the cursor is used to control an object displayed in the display screen; and
a movement stopping module, configured to control the cursor to stop moving when a stop signal is received.
10. A terminal, characterized in that the terminal comprises a processor, a memory connected to the processor, and program instructions stored on the memory, wherein the program instructions, when executed by the processor, implement the method of controlling a terminal according to any one of claims 1 to 8.
11. The terminal according to claim 10, characterized in that the terminal comprises a touch sensor located on a rear panel and/or a side frame of the terminal, and a display screen located on a front panel of the terminal.
12. A computer-readable storage medium having program instructions stored thereon, characterized in that the program instructions, when executed by a processor, implement the method of controlling a terminal according to any one of claims 1 to 8.
CN201910541290.3A 2019-06-21 2019-06-21 Method and device for controlling terminal, terminal and storage medium Active CN110262747B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910541290.3A CN110262747B (en) 2019-06-21 2019-06-21 Method and device for controlling terminal, terminal and storage medium

Publications (2)

Publication Number Publication Date
CN110262747A CN110262747A (en) 2019-09-20
CN110262747B true CN110262747B (en) 2021-05-07

Family

ID=67920233

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910541290.3A Active CN110262747B (en) 2019-06-21 2019-06-21 Method and device for controlling terminal, terminal and storage medium

Country Status (1)

Country Link
CN (1) CN110262747B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111273827B (en) * 2020-01-17 2021-10-22 维沃移动通信有限公司 Text processing method and electronic equipment

Citations (3)

Publication number Priority date Publication date Assignee Title
CN103513908A (en) * 2012-06-29 2014-01-15 国际商业机器公司 Method and device used for controlling cursor on touch screen
CN106155529A (en) * 2015-04-10 2016-11-23 中兴通讯股份有限公司 Method for controlling mobile terminal and mobile terminal
CN109189285A (en) * 2018-08-16 2019-01-11 恒生电子股份有限公司 Operation interface control method and device, storage medium, electronic equipment

Family Cites Families (5)

Publication number Priority date Publication date Assignee Title
CN101727230B (en) * 2008-10-17 2012-06-27 中国移动通信集团公司 Method and device for controlling cursor of touch screen, and mobile communication terminal
US8643616B1 (en) * 2011-07-29 2014-02-04 Adobe Systems Incorporated Cursor positioning on a touch-sensitive display screen
EP2770413A3 (en) * 2013-02-22 2017-01-04 Samsung Electronics Co., Ltd. An apparatus for providing a cursor in electronic devices and a method thereof
CN104714841A (en) * 2013-12-13 2015-06-17 乐视网信息技术(北京)股份有限公司 Method and device for switching page
CN104090721B (en) * 2014-06-13 2017-03-15 小米科技有限责任公司 terminal control method and device

Similar Documents

Publication Publication Date Title
US9146676B2 (en) Method and apparatus for notifying a user about a manipulation region on the back surface of the apparatus
US9063647B2 (en) Multi-touch uses, gestures, and implementation
US9524097B2 (en) Touchscreen gestures for selecting a graphical object
KR101572307B1 (en) Information processing apparatus, control method thereof, and storage medium
US20100229090A1 (en) Systems and Methods for Interacting With Touch Displays Using Single-Touch and Multi-Touch Gestures
US9639265B2 (en) Distance-time based hit-testing for displayed target graphical elements
US9623329B2 (en) Operations for selecting and changing a number of selected objects
CN111475097B (en) Handwriting selection method and device, computer equipment and storage medium
US20160179289A1 (en) Object operation system, non-transitory computer-readable storage medium storing object operation control program, and object operation control method
US20140359538A1 (en) Systems and methods for moving display objects based on user gestures
US9430089B2 (en) Information processing apparatus and method for controlling the same
US10282087B2 (en) Multi-touch based drawing input method and apparatus
US9477398B2 (en) Terminal and method for processing multi-point input
CN104951213A (en) Method for preventing false triggering of edge sliding gesture and gesture triggering method
JP2014182814A (en) Drawing device, drawing method and drawing program
US20150205483A1 (en) Object operation system, recording medium recorded with object operation control program, and object operation control method
US9395838B2 (en) Input device, input control method, and input control program
US10042445B1 (en) Adaptive display of user interface elements based on proximity sensing
EP3008556B1 (en) Disambiguation of indirect input
US9823890B1 (en) Modifiable bezel for media device
US10318047B2 (en) User interface for electronic device, input processing method, and electronic device
CN110262747B (en) Method and device for controlling terminal, terminal and storage medium
CN107807785B (en) Method and system for selecting object on touch screen
WO2022199540A1 (en) Unread message identifier clearing method and apparatus, and electronic device
KR20130038785A (en) Touch screen control method using bezel area

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant