CN107450824B - Object deleting method and terminal


Info

Publication number
CN107450824B
Authority
CN
China
Prior art keywords
gesture
terminal
preset
deleted
track
Legal status
Active
Application number
CN201610388214.XA
Other languages
Chinese (zh)
Other versions
CN107450824A (en)
Inventor
张强 (Zhang Qiang)
陈建 (Chen Jian)
张景瑜 (Zhang Jingyu)
黄雪妍 (Huang Xueyan)
黄康敏 (Huang Kangmin)
刘波 (Liu Bo)
Current Assignee
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Application filed by Huawei Technologies Co Ltd
Priority to CN201610388214.XA
Publication of CN107450824A
Application granted
Publication of CN107450824B

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488: Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883: Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The embodiments of the invention provide an object deletion method and a terminal, relate to the technical field of electronic terminals, and can solve the prior-art problem that deleting a picture takes cumbersome operation steps. The method includes the following steps: the terminal receives a first gesture operation instruction; the terminal determines whether the gesture indicated by the first gesture operation instruction is a preset deletion gesture, where the deletion gesture indicates deleting an object in the terminal, the track of the preset deletion gesture comprises a contact point generating a continuous track, and the track must satisfy at least one of the following preset conditions: the included angle between the track and the screen boundary of the terminal is greater than or equal to a first preset threshold; the operation speed corresponding to the track reaches a second preset threshold; the touch force corresponding to the track reaches a third preset threshold. If so, the terminal determines the object to be deleted, cuts it into at least two parts, and displays the cut parts in a user interface of the terminal.

Description

Object deleting method and terminal
Technical Field
The invention relates to the technical field of electronic terminals, in particular to an object deleting method and a terminal.
Background
With the popularization of mobile terminals such as mobile phones, tablet computers, and smart glasses, functions such as taking and browsing pictures are frequently used, and terminals store large numbers of media files such as pictures and videos. To save the terminal's storage space or to make browsing easier, users usually clean up the media files in the terminal periodically to free storage and keep the terminal running in an optimal state.
On current smartphones, taking the iOS system as an example, as shown in fig. 1, the way a user deletes a single photo is mainly: tap the 'Photos' icon, browse to the single photo, and tap the trash-can icon at the lower right to delete it. As shown in fig. 2, when a user needs to delete photos in batches, the user enters the multi-photo browsing mode by tapping the 'Photos' icon, then taps 'Select' at the upper right, selects the pictures to be deleted, and finally taps the trash-can icon at the lower right of the screen to delete them.
Disclosure of Invention
The embodiment of the invention provides an object deleting method and a terminal, which can solve the problem that the picture deleting operation steps are complicated in the prior art.
In one aspect, an object deleting method is provided, including: the terminal receives a first gesture operation instruction; the terminal determines whether the gesture indicated by the first gesture operation instruction is a preset deletion gesture, where the deletion gesture indicates deleting an object in the terminal, the track of the preset deletion gesture comprises a contact point generating a continuous track, and the track must satisfy at least one of the following preset conditions: the included angle between the track and the screen boundary of the terminal is greater than or equal to a first preset threshold; the operation speed corresponding to the track reaches a second preset threshold; the touch force corresponding to the track reaches a third preset threshold. If so, the terminal determines the object to be deleted, cuts it into at least two parts, and displays the cut parts in a user interface of the terminal.
In this way, an object can be cut directly in the user interface currently displayed on the terminal by means of the preset deletion gesture, so that the cut object enters a deleted state; this improves deletion efficiency and the user experience. Adding the preset conditions improves the accuracy with which the terminal recognizes the deletion gesture and avoids confusion with the gestures a user makes to view the previous or next picture.
In one possible design, when the track of the preset deletion gesture includes a contact point generating a continuous track, the terminal determining whether the gesture indicated by the first gesture operation instruction is the preset deletion gesture includes: the terminal determines whether the track of the gesture corresponding to the first gesture operation instruction is a straight line and whether a portion of the track greater than or equal to 1/N of the straight line's length lies within the display area of the object, where N is a positive integer; if so, the terminal determines that the gesture indicated by the first gesture operation instruction is the preset deletion gesture.
In this way, the user can quickly instruct the terminal to delete an object by drawing an approximately straight line across the object with a gesture.
In one possible design, the determining, by the terminal, an object to be deleted, cutting the object to be deleted into at least two parts, and displaying the at least two cut parts in a user interface of the terminal includes: the terminal determines the object where the track is located as an object to be deleted; the terminal determines the position of the object to be deleted to be cut according to the position coordinates contained in the track, or the terminal determines the preset cutting position as the position to be cut; the terminal cuts the object to be deleted into at least two parts according to the position to be cut, rotates and/or moves the at least two parts after cutting in opposite directions, and displays the at least two parts after rotating and/or moving in a user interface.
Therefore, the straight line drawn by the gesture track of the user can be determined as the deletion position of the object in the deletion state, and the terminal can cut the object to be deleted according to the deletion position.
In one possible design, when the track of the preset deletion gesture includes a contact point generating a continuous track, the terminal determining whether the gesture indicated by the first gesture operation instruction is the preset deletion gesture includes: the terminal determines whether the gesture indicated by the first gesture operation instruction comprises a third contact and a fourth contact, whether the touch force of the third contact exceeds a fourth preset threshold, and whether the fourth contact moves continuously toward the edge of the terminal screen by more than a first preset distance; if so, the terminal determines that the gesture indicated by the first gesture operation instruction is the preset deletion gesture.
Thus, the deletion gesture is determined by detecting the touch force and the stroke distance of the gesture's two contacts on the object, so the terminal can quickly decide whether the user's gesture is the preset deletion gesture.
In one possible design, when the track of the preset delete gesture includes a contact point that generates a continuous track, the terminal determines an object to be deleted, cuts the object to be deleted into at least two parts, and displays the cut at least two parts in a user interface of the terminal includes: the terminal determines an object contacted by the third contact and the fourth contact together as an object to be deleted; and the terminal cuts the object to be deleted into at least two parts in the vertical direction of the connecting line of the third contact and the fourth contact by taking the middle position of the connecting line of the third contact and the fourth contact as a reference point, and the cutting positions of the at least two parts are displayed in a zigzag manner in a user interface of the terminal.
Thus, by calculating the trajectory of the gesture, the deletion position in the object can be determined, and the jagged display can be computed at that position, so that the object shown with jagged edges is the object to be deleted.
In one possible design, when the track of the preset deletion gesture includes a contact point generating a continuous track and the track intersects at least one edge of the object, the terminal determining whether the gesture indicated by the first gesture operation instruction is the preset deletion gesture includes: the terminal determines whether a fifth contact corresponding to the first gesture operation instruction generates a continuous track in any object, and whether the track extends outside the display area of that object or intersects two different edges of it; if so, the terminal determines that the gesture indicated by the first gesture operation instruction is the preset deletion gesture.
Therefore, when the terminal displays the preview effect images of the plurality of objects, the object to be deleted can be identified from the plurality of objects through a preset deletion gesture.
In one possible design, when at least two objects are displayed simultaneously in the user interface, before cutting the object to be deleted, the method further includes: when the terminal detects that the touch force of a contact on any object in the user interface reaches a sixth preset threshold, or that the time a contact stays on any object exceeds a seventh preset threshold, the terminal shrinks the at least two objects by a preset proportion.
In this way, when preview images of a plurality of objects appear on the terminal screen, the plurality of objects can be shrunk by a preset proportion to trigger a deletion mode, so that misoperation is avoided, the user can delete objects in the deletion mode, and more operating space is left for the user.
In one possible design, the determining, by the terminal, an object to be deleted, cutting the object to be deleted into at least two parts, and displaying the at least two cut parts in a user interface of the terminal includes: the terminal determines any object as an object to be deleted; the terminal fixes the center point of the object to be deleted, and the area, closest to the fifth contact, in the object to be deleted generates viscous deformation along with the movement of the fifth contact; and the terminal cuts the object to be deleted into at least two parts in the vertical direction of the track of the fifth contact, and displays the cut at least two parts in the user interface of the terminal.
Therefore, when the object generates viscous deformation, the cutting position of the object to be deleted can be determined according to the track of the gesture, for example, the object is cut by the vertical line of the track or a sawtooth-shaped cutting effect is generated.
In one possible design, the method further includes: the terminal determines whether a second gesture operation instruction is received within a preset time period, the gesture indicated by the second gesture operation instruction is a preset recovery gesture, and the recovery gesture is used for indicating recovery of the deleted object; if so, the terminal displays the state of the deleted object before cutting in a user interface; and if not, the terminal deletes the object to be deleted.
In this way, when a user has processed an object into the deleted state with the preset deletion gesture, the object is not yet completely deleted, and the user may regret deleting it. A preset time period can therefore be set: if a restore gesture is received within this period, the object in the deleted state can be restored to its normal display state; if no restore gesture is received, the terminal automatically deletes the object completely, saving the terminal's storage space.
In one possible design, the terminal determining whether a second gesture operation instruction is received within a preset time period, where the gesture indicated by the second gesture operation instruction is a preset recovery gesture used to indicate restoring the deleted object, includes: the terminal determines whether two contacts are detected simultaneously located in the at least two parts within the preset time period, with both contacts moving toward the cutting position by more than a second preset distance; if so, the terminal determines that the second gesture operation instruction is received. Alternatively, the terminal determines whether a first contact is detected in the at least two parts within the preset time period with a touch force reaching a fifth preset threshold, while a second contact moves within the at least two parts toward the cutting position by more than a third preset distance; if so, the terminal determines that the second gesture operation instruction is received.
Thus, an object in the deleted state can be quickly restored by a recovery gesture distinct from the deletion gesture.
In one possible design, the method further includes: before displaying at least one object in the user interface, the terminal queries the display state of the at least one object, where the display state includes a normal state and a deleted state; if the display state of a first object among the at least one object is the normal state, the terminal displays the first object in the user interface; and if the display state of a second object among the at least one object is the deleted state, the terminal displays the second object in the user interface as at least two cut parts.
In this way, when the terminal displays a plurality of objects in the user interface and the object deletion method of the present invention is used, some objects may be in the deleted state, i.e., cut into at least two parts; before displaying the plurality of objects, the terminal can look up which objects are in the normal state and which are in the deleted state, so as to display them differently.
In another aspect, a terminal is provided, including: an acquisition module configured to receive a first gesture operation instruction; a recognition module configured to determine whether the gesture indicated by the first gesture operation instruction is a preset deletion gesture, where the deletion gesture indicates deleting an object in the terminal, the track of the preset deletion gesture comprises a contact point generating a continuous track, and the track must satisfy at least one of the following preset conditions: the included angle between the track and the screen boundary of the terminal is greater than or equal to a first preset threshold; the operation speed corresponding to the track reaches a second preset threshold; the touch force corresponding to the track reaches a third preset threshold; a processing module configured to determine, if so, the object to be deleted and cut it into at least two parts; and a display module configured to display the at least two cut parts in a user interface of the terminal.
In one possible design, when the trace of the preset delete gesture includes a contact point that generates a continuous trace, the recognition module is configured to: determine whether the track of the gesture corresponding to the first gesture operation instruction is a straight line, where a portion of the track greater than or equal to 1/N of the straight line's length lies within the display area of the object, N being a positive integer; if so, determine that the gesture indicated by the first gesture operation instruction is the preset deletion gesture.
In one possible design, the processing module is to: determining an object where the track is located as an object to be deleted; determining the position of the object to be deleted to be cut according to the position coordinates contained in the track, or determining the preset cutting position as the position to be cut; cutting the object to be deleted into at least two parts according to the position to be cut, and rotating and/or moving the at least two parts after cutting in opposite directions; the display module is used for: displaying the rotated and/or moved at least two portions in the user interface.
In one possible design, when the trace of the preset delete gesture includes a contact point that generates a continuous trace, the recognition module is configured to: determining whether the gesture indicated by the first gesture operation instruction comprises a third contact and a fourth contact, and whether the touch force of the third contact exceeds a fourth preset threshold, wherein the track of the fourth contact continuously moves towards the edge direction of the terminal screen by more than a first preset distance; and if so, determining that the gesture indicated by the first gesture operation instruction is a preset deleting gesture.
In one possible design, the processing module is to: determining an object contacted by the third contact and the fourth contact together as an object to be deleted; cutting the object to be deleted into at least two parts in the vertical direction of the connecting line of the third contact and the fourth contact by taking the middle position of the connecting line of the third contact and the fourth contact as a reference point; the display module is used for: and displaying the cutting positions of the at least two parts in a sawtooth shape in a user interface of the terminal.
In one possible design, when the trace of the preset delete gesture includes a contact point that generates a continuous trace, the recognition module is configured to: determining whether a fifth contact point corresponding to the first gesture operation instruction generates a continuous track in any object, wherein the track extends to the outside of a display area of any object, or the track intersects two different edges of any object; if yes, determining that the gesture indicated by the first gesture operation instruction is a preset deleting gesture.
In one possible design, the processing module is configured to: when the recognition module detects that the touch force of a contact on any object in the user interface reaches a sixth preset threshold, or that the time a contact stays on any object exceeds a seventh preset threshold, shrink the at least two objects by a preset proportion.
In one possible design, the processing module is to: determining any object as an object to be deleted; fixing the central point of the object to be deleted, and generating viscous deformation by following the movement of the fifth contact in the area, closest to the fifth contact, of the object to be deleted; cutting the object to be deleted into at least two parts in the vertical direction of the track of the fifth contact; the display module is used for: and displaying the at least two cut parts in a user interface of the terminal.
In one possible design, the identification module is configured to: determining whether a second gesture operation instruction is received within a preset time period, wherein a gesture indicated by the second gesture operation instruction is a preset recovery gesture, and the recovery gesture is used for indicating recovery of the deleted object; the display module is used for: if yes, displaying the state of the deleted object before cutting in a user interface; the processing module is used for: and if not, deleting the object to be deleted.
In one possible design, the recognition module is configured to: determine whether two contacts are detected simultaneously located in the at least two parts within a preset time period, with both contacts moving toward the cutting position by more than a second preset distance; if so, determine that a second gesture operation instruction is received. Alternatively, the recognition module is configured to: determine whether a first contact is detected in the at least two parts within a preset time period with a touch force reaching a fifth preset threshold, while a second contact moves within the at least two parts toward the cutting position by more than a third preset distance; if so, determine that a second gesture operation instruction is received.
In one possible design, the processing module is further to: querying a display state of the at least one object before displaying the at least one object in the user interface, the display state including a normal state and a deleted state; the display module is further configured to: if the display state of a first object in the at least one object is a normal state, displaying the first object in the user interface; and if the display state of a second object in the at least one object is a deletion state, displaying the second object in the user interface in at least two cut parts.
Therefore, the object deletion method and terminal provided by the present invention can process an object to be deleted into the deleted state at its current display position, i.e., cut it into at least two parts through the preset gesture; this improves deletion efficiency, makes deletion more engaging, and optimizes the user experience.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings needed to be used in the description of the embodiments or the prior art will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and it is obvious for those skilled in the art that other drawings can be obtained according to these drawings without creative efforts.
FIG. 1 is a schematic diagram of a display for deleting a photo according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of a display for deleting a photo according to an embodiment of the present invention;
fig. 3 is a schematic diagram of an internal structure of a terminal according to an embodiment of the present invention;
fig. 4 is a schematic flowchart of an object deleting method according to an embodiment of the present invention;
fig. 5 is a schematic flowchart of an object deleting method according to an embodiment of the present invention;
FIG. 6 is a schematic diagram illustrating a calculation of a trajectory angle of a gesture according to an embodiment of the present invention;
FIG. 7 is a schematic diagram illustrating a gesture trajectory stroked in a terminal screen according to an embodiment of the present invention;
FIG. 8 is a diagram illustrating a delete gesture cut picture according to an embodiment of the present invention;
FIG. 9 is a schematic diagram illustrating a delete gesture for deleting a picture in a terminal screen according to an embodiment of the present invention;
FIG. 10 is a schematic diagram of a stroking trajectory when a photo is deleted by various delete gestures according to an embodiment of the present invention;
FIG. 11 is a diagram illustrating recovery of an object being deleted according to an embodiment of the present invention;
fig. 12 is a flowchart illustrating an object deleting method according to an embodiment of the present invention;
FIG. 13 is a schematic diagram of a delete gesture and delete state for deleting a photo according to an embodiment of the present invention;
FIG. 14 is a schematic diagram illustrating a jagged erasure state according to an embodiment of the present invention;
FIG. 15 is a schematic diagram illustrating a generation process of a jagged erasure status according to an embodiment of the present invention;
fig. 16 is a flowchart illustrating an object deleting method according to an embodiment of the present invention;
fig. 17 is a schematic diagram of a trigger deletion mode according to an embodiment of the present invention;
fig. 18 is a schematic view of a picture according to an embodiment of the present invention undergoing viscous deformation;
fig. 19 is a schematic view illustrating the effect of viscous deformation of a picture according to an embodiment of the present invention;
fig. 20 is a schematic diagram illustrating the effect of viscous deformation of a picture within a preset range according to an embodiment of the present invention;
fig. 21 is a schematic structural diagram of a terminal according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
In the present invention, the terminal is used to shoot and store media files, and includes all kinds of terminals with a touch screen or that can be controlled by motion sensing; for example, the terminal may be a smartphone or tablet computer, a wearable device such as a smart watch or smart glasses, or a television with a touch screen or motion-sensing control. A terminal that can be networked may connect through a network to servers providing various services, such as cloud-service servers or social servers, and the media file objects shot or stored on the terminal may be uploaded to such a server for storage.
Fig. 3 is a schematic diagram of the internal structure of the terminal according to the present invention. In the present invention, the terminal may include an acquisition module 301, a processing module 302, a communication module 303, a storage module 304, a display module 305, and a recognition module 306. The acquisition module 301 is configured to acquire data such as audio and video through sensor devices such as a camera or a microphone, and to capture the user's gesture through the camera, an infrared sensor, the touch screen, and the like. The processing module 302 is configured to control the hardware devices and application software of the various parts of the terminal; specifically, when receiving an object-deletion instruction sent by the recognition module 306, it deletes the object; when a preset condition is met, it caches the deleted object; and if no restore gesture is received from the user within the preset time, it deletes the cached object. The processing module 302 is further configured to render the cached deleted object in a recoverable display state before it is completely deleted, and to render an object that meets the restore condition in a recovering display state. The communication module 303 is configured to receive instructions sent by other devices over communication channels such as cellular, Ethernet, Wi-Fi, Bluetooth, and infrared, and to send the terminal's data to the cloud or other devices. The storage module 304 is used to store the terminal's software programs and data and to run software; it may be one or more of a Random Access Memory (RAM), an Erasable Programmable Read-Only Memory (EPROM), a Solid State Drive (SSD), an SD card (Secure Digital Memory Card), and the like. This specifically includes storing the deletable objects, caching objects before they are completely deleted, storing the preset gestures for deleting or restoring objects, and maintaining the correspondence between the preset gestures and the delete or restore instructions. The display module 305 is used to display the user's operation interface and operation results and can receive and display data from the processing module 302; it also displays deleted objects in a deleted or recoverable display state, and restored objects in a recovering display state. The recognition module 306 is configured to recognize the user's gesture operation instruction, determine whether it is a preset gesture for deleting or restoring a deleted object, and, when the determination is yes, send a delete or restore message to the processing module 302.
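For illustration only, the module split above can be sketched as a set of interfaces. Every name below (AcquisitionModule, RecognitionModule, ProcessingModule, TouchPoint, GestureKind) is an assumption made for this sketch, not an identifier from the patent:

```kotlin
// Illustrative sketch only: all interface and type names are assumptions.
data class TouchPoint(val x: Float, val y: Float, val pressure: Float, val timeMs: Long)

enum class GestureKind { NONE, DELETE, RESTORE }

interface AcquisitionModule {
    // Collects raw touch/camera/sensor input and forwards gesture samples.
    fun onTouchSample(point: TouchPoint)
}

interface RecognitionModule {
    // Decides whether a finished trajectory matches a preset delete/restore gesture.
    fun classify(trajectory: List<TouchPoint>): GestureKind
}

interface ProcessingModule {
    fun cut(objectId: Long, along: List<TouchPoint>)   // render the object "deleted"
    fun restore(objectId: Long)                        // undo the cut state
    fun purge(objectId: Long)                          // remove for good
}
```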
In order to solve the prior-art problem of cumbersome operation steps for deleting pictures in a terminal, an embodiment of the present invention provides an object deleting method, as shown in fig. 4, including:
401. the terminal receives a first gesture operation instruction.
When a user's finger contacts the screen of the terminal, the terminal can collect and identify the track of the finger's contact with the screen; this track constitutes the first gesture operation instruction received by the terminal.
402. The terminal determines whether a gesture indicated by a first gesture operation instruction is a preset deleting gesture, the deleting gesture is used for indicating to delete an object in the terminal, a track of the preset deleting gesture comprises a contact point generating a continuous track, the track needs to meet at least one of preset conditions, and the preset conditions comprise: an included angle between the track and the screen boundary of the terminal is greater than or equal to a first preset threshold value; the operation speed corresponding to the track reaches a second preset threshold; and the touch force corresponding to the track reaches a third preset threshold value.
Optionally, the terminal may determine whether the first gesture operation instruction matches a preset deletion gesture stored in the terminal, for example: whether the trajectory corresponding to the first gesture operation instruction is a straight line, whether a portion of the trajectory greater than or equal to 1/N of the straight line's length lies within the display area of any object (N being a positive integer), and whether the trajectory satisfies one of the preset conditions; if so, the terminal determines that the gesture indicated by the first gesture operation instruction is the preset deletion gesture.
403. If the determination is yes, the terminal determines the object to be deleted, cuts the object to be deleted into at least two parts, and displays the cut at least two parts in a user interface of the terminal.
If the terminal determines that the first gesture operation instruction matches a preset deletion gesture stored in the terminal, the terminal determines the object to be deleted. Specifically, the object can be determined from the trajectory position corresponding to the first gesture operation instruction: if a single object is displayed in the current user interface, the terminal takes that object as the object to be deleted; if multiple objects are displayed, the terminal takes the object touched by the gesture's trajectory as the object to be deleted. The object to be deleted can then be cut into at least two parts according to the trajectory; for example, when the trajectory is a straight line, the object may be cut in two along the determined line, or the cut may be displayed in a zigzag shape, and the resulting parts are displayed in the user interface. In this way, when a user needs to delete an object, the user no longer has to select the object and tap the 'trash can' icon as in the prior art, but can cut the selected object directly with the trajectory of a gesture, so that the object's deleted state is shown in the current user interface before the object is completely deleted.
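As a hedged sketch of how steps 401-403 might be wired together (every name below is invented for illustration; TouchPoint is the helper class from the module sketch above):

```kotlin
// Hedged sketch of steps 401-403; all names are assumptions for illustration.
interface UserInterface {
    fun objectUnder(trajectory: List<TouchPoint>): Long?            // object id, if any
    fun cutAlong(objectId: Long, trajectory: List<TouchPoint>): List<Long>
    fun showCutState(parts: List<Long>)
}

fun onGestureInstruction(
    trajectory: List<TouchPoint>,
    ui: UserInterface,
    isPresetDeleteGesture: (List<TouchPoint>) -> Boolean
) {
    // 402: does the gesture match the preset deletion gesture and its conditions?
    if (!isPresetDeleteGesture(trajectory)) return
    // 403: the object under the trajectory becomes the object to be deleted
    val target = ui.objectUnder(trajectory) ?: return
    val parts = ui.cutAlong(target, trajectory)   // cut into at least two parts
    ui.showCutState(parts)                        // display the "deleted" state
}
```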
Thus, an embodiment of the present invention provides an object deleting method: a terminal receives a first gesture operation instruction and determines whether the gesture it indicates is a preset deletion gesture, where the deletion gesture indicates deleting an object in the terminal; if so, the terminal determines the object to be deleted, cuts it into at least two parts, and displays the cut parts in a user interface of the terminal.
In the present invention, the cutting effect displayed in the user interface after cutting varies with the deletion gesture used. For convenience of description, since the technical solution of the present invention is mainly used to delete an object, the object deleted by the technical solution is referred to as the 'object to be deleted', the 'deleted object', and so on in the description of the embodiments. The deletable objects may include various types of files such as photos, videos, and applications; for convenience of description, the embodiments of the present invention take photos as the example, and the user interfaces involved are mainly the interface for browsing a single photo and the interface for browsing multiple photos. In the interface for browsing multiple photos, the object the user operates on is actually a preview of a photo; for convenience, these previews are simply called photos in the embodiments. The center point and edge of a photo described in the embodiments are actually the center point and edge of the photo's preview image, and a contact said to be in a photo means that the gesture's contact points fall within the photo's display area on the screen.
A method for executing the technical solution of the present invention on a terminal with a touch screen is described below by taking a single photo as an example, and fig. 5 is a schematic flow chart of an object deleting method provided by an embodiment of the present invention, where the method includes:
501. the terminal receives a first gesture operation instruction.
When the user finger contacts the screen of the terminal, the acquisition module of the terminal can acquire the gesture of the user and send the gesture to the recognition module, so that the recognition module can recognize the first gesture operation instruction according to the track of the finger contacting the screen.
502. The terminal determines whether the track of the gesture corresponding to the first gesture operation instruction is a straight line and whether a portion of the track greater than or equal to 1/N of the straight line's length lies within the display area of the object, where N is a positive integer.
For example, the recognition module may recognize the position coordinates of the gesture's touches on the screen. If the user's finger leaves the screen within a very short time after touching, or no displacement occurs, the process ends. If continuous displacement occurs after the finger touches the screen, the module further judges whether the touch track approximates a straight line. One judging method is: determine whether the track points touched by the user's gesture meet the straightness criterion by calculating the maximum distance from each touch point in the track to the line connecting the start and end points of the touch track; if that maximum distance is within the straightness threshold, the track is judged to approximate a straight line. The specific calculation may proceed as follows:
a. The terminal records the position coordinates of all touch points contained in the track generated by the gesture.
The coordinate system may take the center point of the screen as the origin, or another point on the screen; this application does not limit the choice. The touch starting point is recorded as start_point, the touch end point as end_point, and the set of all track points as [start_point, p1, p2, …, pn, end_point].
b. The terminal obtains the equation of the straight line F connecting the start point and the end point from their position coordinates.
c. The terminal calculates the distance from each track point other than the start and end points to the straight line F.
That is, it calculates the set of distances [d1, d2, …, dn] from each point in [p1, p2, …, pn] to the line F.
d. The terminal finds the maximum distance dmax among these distances; if dmax is less than or equal to the set straightness threshold, the track is determined to approximate a straight line; otherwise, the track is determined not to be a straight line.
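Steps a-d translate almost directly into code. The following is a minimal sketch, assuming a plain point list and a caller-supplied straightness threshold, and reusing the TouchPoint class from the module sketch above:

```kotlin
import kotlin.math.abs
import kotlin.math.hypot

// Sketch of steps a-d: the maximum perpendicular distance from the interior
// points to the chord between start_point and end_point decides straightness.
fun isApproximatelyStraight(points: List<TouchPoint>, threshold: Float): Boolean {
    if (points.size < 3) return true                // nothing between start and end
    val s = points.first()
    val e = points.last()
    val dx = e.x - s.x
    val dy = e.y - s.y
    val chord = hypot(dx, dy)
    if (chord == 0f) return false                   // no displacement: not a stroke
    // distance from p to the line F through s and e = |cross product| / chord
    val dmax = points.subList(1, points.size - 1).maxOf { p ->
        abs(dx * (p.y - s.y) - dy * (p.x - s.x)) / chord
    }
    return dmax <= threshold
}
```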
To avoid confusion between the gestures a user makes to switch between photos (for example, sliding right to switch to the previous photo and sliding left to switch to the next one) and the deletion gesture of the present invention, the terminal may further determine whether the trajectory meets at least one of the preset conditions before determining that the gesture indicated by the first gesture operation instruction is the preset deletion gesture. The preset conditions may include: the included angle between the track and the screen boundary of the terminal is greater than or equal to a first preset threshold; the operation speed corresponding to the track reaches a second preset threshold; the touch force corresponding to the track reaches a third preset threshold. For example, the preset condition may be: the included angle between the track and the screen boundary is greater than or equal to the first preset threshold and, at the same time, the operation speed corresponding to the track reaches the second preset threshold, which can be implemented when the terminal's screen is a capacitive screen. Alternatively, the preset condition may be: the included angle is greater than or equal to the first preset threshold and, at the same time, the touch force corresponding to the track reaches the third preset threshold, which can be implemented when the terminal's screen is a pressure-sensitive screen.
If the included angle between the track and the screen boundary of the terminal is greater than or equal to the first preset threshold, the process continues. For example, fig. 6 is a schematic diagram of calculating the trajectory angle of a gesture. The calculation may proceed as follows: record the position coordinates of the gesture start point start_point (x1, y1) and end point end_point (x2, y2) to obtain the gesture direction vector V = (x2 - x1, y2 - y1); determine the horizontal direction vector H = (x2 - x1, 0) from the start and end points; the angle between the two vectors then follows from the standard two-vector angle formula

theta = arccos( (V · H) / (|V| · |H|) )

and the terminal determines from theta whether the included angle is greater than or equal to the first preset threshold. When the terminal is displayed in portrait orientation, the screen boundary used may be the shorter screen edge; when displayed in landscape orientation, it may be the longer screen edge. Optionally, the first preset threshold of the included angle may range from greater than or equal to 30 degrees to less than or equal to 90 degrees.
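In code, the angle test reduces to one arccos call. The sketch below is an assumption-laden illustration, with the threshold left as a parameter in the 30 to 90 degree band the text suggests:

```kotlin
import kotlin.math.PI
import kotlin.math.abs
import kotlin.math.acos
import kotlin.math.hypot

// Sketch: angle between the stroke direction V = (x2-x1, y2-y1) and the
// horizontal H = (x2-x1, 0), per the formula above.
fun strokeAngleDegrees(x1: Float, y1: Float, x2: Float, y2: Float): Double {
    val vx = (x2 - x1).toDouble()
    val vy = (y2 - y1).toDouble()
    if (vx == 0.0 && vy == 0.0) return 0.0           // no displacement
    if (vx == 0.0) return 90.0                        // vertical stroke
    val cos = (vx * vx) / (hypot(vx, vy) * abs(vx))   // (V·H)/(|V||H|)
    return acos(cos.coerceIn(-1.0, 1.0)) * 180.0 / PI
}

// Usage: thresholdDeg is the first preset threshold, assumed in 30..90 degrees.
fun isDeleteAngle(x1: Float, y1: Float, x2: Float, y2: Float, thresholdDeg: Double) =
    strokeAngleDegrees(x1, y1, x2, y2) >= thresholdDeg
```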
Alternatively, the terminal judges whether the operation speed corresponding to the track reaches the second preset threshold; if not, the process ends, and if so, it continues. Or the terminal judges the touch force corresponding to the track, i.e., the force with which the user presses the screen during the operation; if the force does not reach the third preset threshold, the process ends, and if it does, the process continues.
When the trajectory is determined to be a straight line, the terminal may further check the trajectory's extent: if a portion of the trajectory of at least 1/N of the straight line's length lies within the display area of the current object, the gesture is determined to be the preset deletion gesture. For example, N may be 3. In the diagram shown in fig. 7, the dot represents the starting point of the gesture and the arrow represents its moving trajectory; since less than 1/3 of the trajectory falls within the current photo, the flow ends.
503. If so, the terminal determines that the gesture indicated by the first gesture operation instruction is the preset deletion gesture.
504. The terminal determines the object where the track is located as the object to be deleted.
For example, the current screen displays a single photo, and according to the trajectory of the gesture in the photo, the object where the trajectory is located is determined as the object to be deleted.
505. The terminal determines the position where the object to be deleted is to be cut according to the position coordinates contained in the track, or determines a preset cutting position as the position to be cut.
After determining the trajectory of the deletion gesture and the photo to be deleted, the terminal may take the straight line determined from the trajectory's position coordinates as the position to be cut in the photo to be deleted. Alternatively, once the deletion gesture and the object to be deleted are determined, the terminal may use a preset cutting position as the position to be cut, regardless of where the gesture trajectory lies; that is, the photo is cut at the same position no matter where the trajectory falls.
506. The terminal cuts the object to be deleted into at least two parts according to the position to be cut, rotates the cut parts in opposite directions, and displays the rotated parts in the user interface.
After the position to be cut is determined, the processing module can process the photo to be deleted into a cut effect, which is displayed on the screen through the display module; the cut photo may comprise at least two parts.
Illustratively, fig. 8 is a schematic diagram of a photo going from uncut to cut. The ABCD diagram in fig. 8 (1) shows the picture to be cut. The line segment SE in fig. 8 (2) shows the segment generated by the user's gesture operation trajectory, i.e., the straight line determined above, where point S is the starting point of the gesture operation and point E is its end point. Diagram (3) in fig. 8 shows the intersection segment MN produced by extending the segment SE in both directions until it meets the boundary of the picture area ABCD; the segment MN divides the picture into two parts. Diagram (4) in fig. 8 shows the picture after it is cut along the segment MN, with the MON and M'O'N' regions filled with transparent color. The two parts produced by the cut are then rotated in opposite directions by a certain angle, producing the effect shown in fig. 8 (5), and the two rotated parts are displayed on the screen.
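One plausible way to compute the intersection segment MN of the extended stroke SE with the picture rectangle ABCD is to intersect the infinite line through S and E with each rectangle edge and keep the two hits. The sketch below, including its Pt and Rect helpers, is an assumption for illustration:

```kotlin
// Sketch: intersect the infinite line through S and E with an axis-aligned
// picture rectangle (screen coordinates, so top < bottom); the two hits
// M and N define the cut, as in fig. 8 (3).
data class Pt(val x: Float, val y: Float)
data class Rect(val left: Float, val top: Float, val right: Float, val bottom: Float)

fun cutSegment(s: Pt, e: Pt, r: Rect): Pair<Pt, Pt>? {
    val dx = e.x - s.x
    val dy = e.y - s.y
    val hits = mutableListOf<Pt>()
    if (dx != 0f) {                       // intersections with the vertical edges
        for (x in listOf(r.left, r.right)) {
            val y = s.y + dy * (x - s.x) / dx
            if (y in r.top..r.bottom) hits += Pt(x, y)
        }
    }
    if (dy != 0f) {                       // intersections with the horizontal edges
        for (y in listOf(r.top, r.bottom)) {
            val x = s.x + dx * (y - s.y) / dy
            if (x in r.left..r.right) hits += Pt(x, y)
        }
    }
    val distinct = hits.distinct()
    return if (distinct.size >= 2) distinct[0] to distinct[1] else null
}
```

Cutting then amounts to splitting the picture's geometry along the returned segment and rotating the two halves in opposite directions, as fig. 8 (4)-(5) describes.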
Fig. 9 is an exemplary diagram of deleting a photo. When a user wants to delete the photo 901 currently displayed on the screen (fig. a), the user touches the screen at the start point 902 and quickly draws the trace 903 (the arrow on trace 903 indicates the drawing direction), after which the photo 901 is displayed in the deleted state (the cut state, fig. b). In other words, the fracture of the cut in fig. b represents the deleted state. The fracture is preferably displayed where the user's gesture trace was drawn, which makes the experience feel more real; alternatively, it can be displayed at the same position no matter where the trace is drawn, which reduces implementation difficulty.
Optionally, the cut state may be displayed for a predetermined period of time, such as 10 seconds, to guard against accidental deletion; in general, a user quickly notices an accidental deletion and immediately takes remedial action. During this period, even if the user turns to the previous photo and comes back to view the deleted photo, or enters the multi-photo view mode to view other photos (fig. c), the deleted photo still shows the cut state (901 in fig. c).
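The grace period can be implemented as a deferred purge that a restore gesture cancels. A minimal sketch, assuming kotlinx.coroutines and invented helper names:

```kotlin
import kotlinx.coroutines.CoroutineScope
import kotlinx.coroutines.Job
import kotlinx.coroutines.delay
import kotlinx.coroutines.launch

// Sketch: keep the cut state for graceMs, then purge unless restored first.
fun schedulePurge(scope: CoroutineScope, purge: () -> Unit, graceMs: Long = 10_000): Job =
    scope.launch {
        delay(graceMs)   // e.g. 10 seconds, per the text
        purge()          // cancel this Job if a restore gesture arrives in time
    }
```

A restore-gesture handler would simply call cancel() on the returned Job and then redisplay the object's pre-cut state.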
FIG. 10 shows various delete gestures that may exist, including swipes at different angles in the left or right direction so that the swipe trace spans different locations of the photograph; the delete gestures are not limited to those shown in FIG. 10.
Optionally, the method may further include:
507. the terminal determines whether a second gesture operation instruction is received within a preset time period, the gesture indicated by the second gesture operation instruction is a preset recovery gesture, and the recovery gesture is used for indicating recovery of the deleted object.
If the user wants to restore an object in the deleted state within the preset time period, the object can be restored with a preset recovery gesture. Fig. 11 is a schematic diagram of recovering an object being deleted according to an embodiment of the present invention. The terminal may determine whether, within the preset time period, two contacts are detected simultaneously located in the at least two parts of the object and both contacts move toward the cutting position by more than a second preset distance; if so, the terminal determines that a second gesture operation instruction has been received.
Illustratively, the user places two fingers on the photo in the deleted state, with one finger's contact point in one part of the cut object and the other finger's contact point in the other part, i.e., the two fingers are on the two sides of the cut crack. The two fingers then move toward the crack at the same time; if the movement exceeds the second preset distance, the gesture is recognized as the preset recovery gesture. Meanwhile, the display module shows the two cut parts moving toward the crack as the fingers move, and finally displays the photo in its state before deletion.
Optionally, the terminal may determine whether, within the preset time period, a first contact is detected in the at least two parts with a touch force reaching a fifth preset threshold while a second contact moves within the at least two parts toward the cutting position by more than a third preset distance; if so, the terminal determines that the second gesture operation instruction has been received. That is, one finger of the user may press on one side of the crack with a force reaching the fifth preset threshold, while the other finger touches the other side of the crack and moves toward the crack by more than the third preset distance; the terminal then recognizes the instruction corresponding to this gesture as the second gesture operation instruction and restores the touched object.
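Both recovery variants reduce to checking where the contacts start and how far they travel toward the crack. A hedged sketch of the two-finger variant, assuming a vertical crack at x = crackX and reusing the TouchPoint and Rect helpers from the sketches above:

```kotlin
import kotlin.math.abs

// Sketch of the two-contact restore check: one finger starts in each cut part
// and both move toward the crack by more than the second preset distance.
fun isTwoFingerRestore(
    trackA: List<TouchPoint>, trackB: List<TouchPoint>,
    partA: Rect, partB: Rect, crackX: Float, minDistance: Float
): Boolean {
    if (trackA.isEmpty() || trackB.isEmpty()) return false
    fun inside(p: TouchPoint, r: Rect) =
        p.x in r.left..r.right && p.y in r.top..r.bottom
    if (!inside(trackA.first(), partA) || !inside(trackB.first(), partB)) return false
    // positive value = the finger ended closer to the crack than it started
    fun travelTowardCrack(t: List<TouchPoint>) =
        abs(t.first().x - crackX) - abs(t.last().x - crackX)
    return travelTowardCrack(trackA) > minDistance &&
           travelTowardCrack(trackB) > minDistance
}
```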
508. If so, the terminal displays the state of the deleted object before cutting in a user interface; and if not, the terminal deletes the object to be deleted.
If the terminal determines that the current gesture's instruction is the second gesture operation instruction, the display module may display a restore animation in which the two cut parts close together and the photo is restored to its state before deletion.
Thus, an object can be cut into at least two parts through the first gesture operation instruction, achieving fast deletion, and an object in the deleted state can be restored to its previous state through the second gesture operation instruction, making deletion and restoration more engaging.
In another implementation manner, an embodiment of the present invention provides an object deleting method, as shown in fig. 12, including:
121. the terminal receives a first gesture operation instruction.
When the user finger contacts the screen of the terminal, the acquisition module of the terminal can acquire the gesture of the user and send the gesture to the recognition module, so that the recognition module can recognize the first gesture operation instruction according to the track of the finger contacting the screen.
122. The terminal determines whether the gesture indicated by the first gesture operation instruction comprises a third contact and a fourth contact, whether the touch force of the third contact exceeds a fourth preset threshold, and whether the track of the fourth contact moves continuously toward the edge of the terminal screen by more than a first preset distance.
Illustratively, as shown in fig. 13, a photo (131 in fig. 13) is displayed on the current user interface; one finger of the user presses (132 in fig. 13) a point in the photo, and another finger strokes continuously (133 in fig. 13) toward one side of the photo. The terminal determines whether the force of the third contact, where the first finger touches the screen, exceeds the fourth preset threshold; if so, execution continues, and if not, the flow ends. It then determines whether the other finger's continuous stroke toward the screen edge exceeds the first preset distance; if so, the gesture indicated by the first gesture operation instruction is determined to be the preset deletion gesture, and if not, the flow ends.
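A hedged sketch of this press-and-swipe check, reusing TouchPoint from the module sketch above; the toward-the-edge direction test is omitted for brevity, and all thresholds are parameters:

```kotlin
import kotlin.math.hypot

// Sketch: third contact pressed hard enough, fourth contact swept far enough.
fun isPressAndSwipeDelete(
    press: TouchPoint, sweep: List<TouchPoint>,
    pressThreshold: Float, minTravel: Float
): Boolean {
    if (press.pressure < pressThreshold) return false   // fourth preset threshold
    if (sweep.size < 2) return false
    val s = sweep.first()
    val e = sweep.last()
    return hypot(e.x - s.x, e.y - s.y) > minTravel      // first preset distance
}
```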
123. If so, the terminal determines that the gesture indicated by the first gesture operation instruction is the preset deletion gesture.
124. And the terminal determines an object contacted by the third contact and the fourth contact together as the object to be deleted.
That is, the object touched by the two fingers of the user is the object to be deleted.
125. Taking the middle position of the connecting line between the third contact and the fourth contact as a reference point, the terminal cuts the object to be deleted into at least two parts in the direction perpendicular to that connecting line, and the cutting positions of the at least two parts are displayed in a zigzag manner in the user interface of the terminal.
After the terminal determines that the current gesture is the preset deletion gesture and identifies the current object to be deleted, it can process the object into a torn, jagged effect according to the gesture, and the display module shows this effect on the screen.
For example, as shown in fig. 14, assume the line segment MN runs perpendicular to the line connecting the third contact and the fourth contact. Reference points O, B, D, F, etc. are taken on MN at a certain interval d, where d is a user-set parameter for generating different sawtooth effects; when d is not set, a system default value is used. The coordinates of point A are then determined from the midpoint A' of OB and the sawtooth height h, where h is likewise a user-set parameter for the sawtooth effect and takes a system default value when not set. The equation of the straight line OA can be uniquely determined from the position coordinates of points O and A: line OA: Y = k1X + b1. From the position coordinates of points A and B, the equation of the straight line AB can be uniquely determined: line AB: Y = k2X + b2. Then, based on the positional relationship of parallel lines in a plane, the equation of the straight line BC parallel to line OA can be determined from the equation of line OA: line BC: Y = k1X + b1 + Δd, where Δd is the offset determined by the distance between the two parallel lines. Similarly, the equations of all lines parallel to OA (such as DE) and of all lines parallel to BC (such as CD and EF) can be found. A regular tearing sawtooth pattern can thus be drawn from all the line equations obtained, or an irregularly shaped sawtooth pattern can be created by adding an initial reference line.
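The same zigzag can be produced numerically instead of via explicit line equations: walking along MN in steps of d and lifting a tooth apex of height h above each interval midpoint yields the vertex sequence O, A, B, C, D, ... described above. A hedged sketch, with the vertex-list representation and all names assumed:

import math

def sawtooth_vertices(m, n, d, h):
    # Base points O, B, D, F, ... are taken on MN every d; each tooth
    # apex (A, C, E, ...) sits above the midpoint of an interval,
    # offset by the sawtooth height h perpendicular to MN.
    mx, my = m
    nx, ny = n
    length = math.hypot(nx - mx, ny - my)
    ux, uy = (nx - mx) / length, (ny - my) / length   # unit vector along MN
    px, py = -uy, ux                                  # unit normal to MN
    verts = []
    for i in range(int(length // d)):
        verts.append((mx + ux * i * d, my + uy * i * d))           # base point
        cx, cy = mx + ux * (i + 0.5) * d, my + uy * (i + 0.5) * d  # midpoint
        verts.append((cx + px * h, cy + py * h))                   # tooth apex
    verts.append((nx, ny))
    return verts

# A vertical reference line MN with 20 px tooth spacing and 10 px tooth height.
print(sawtooth_vertices((0.0, 0.0), (0.0, 100.0), d=20.0, h=10.0))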
For example, as shown in fig. 15, ABCD in diagram (1) of fig. 15 represents a photo to be deleted. SE in diagram (2) of fig. 15 represents the line segment generated by the track of the user's first gesture operation instruction, where point S is the start point of the gesture operation, i.e. the position of the heavy press, and point E is the end point, i.e. the position where the stroke ends. Diagram (3) of fig. 15 shows the reference straight line MN for the tearing effect, formed as the perpendicular bisector of the line segment SE generated by the track. Diagram (4) of fig. 15 forms the jagged tearing effect based on the reference line MN, following the principle of generating a sawtooth effect described above. The two parts of the photo are then generated according to the resulting jagged line, rotated and displaced by a certain amount, and a photo with a tearing effect is produced and saved.
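A small sketch of how the reference line MN of diagram (3) could be derived from the gesture segment SE (names assumed; the drawn half-length of MN is an arbitrary choice):

import math

def tear_reference_line(s, e, half_len=200.0):
    # MN passes through the midpoint of SE and is perpendicular to SE;
    # half_len only bounds the drawn segment.
    sx, sy = s
    ex, ey = e
    mx, my = (sx + ex) / 2.0, (sy + ey) / 2.0
    length = math.hypot(ex - sx, ey - sy)
    px, py = -(ey - sy) / length, (ex - sx) / length  # unit normal to SE
    return ((mx - px * half_len, my - py * half_len),
            (mx + px * half_len, my + py * half_len))

# Heavy press at S = (100, 200), stroke ending at E = (300, 200).
print(tear_reference_line((100.0, 200.0), (300.0, 200.0)))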
In this method, the photo with the jagged tearing effect may also be restored to the normal state within the preset time period; for the implementation, refer to step 507 and step 508.
In this way, an object displayed on the user interface can be processed by the first gesture operation instruction into a photo with a jagged tearing effect, indicating that the object is in a deleted state. This avoids the cumbersome photo-deleting process of the prior art and improves photo-deleting efficiency.
In another implementation manner, if at least two objects are displayed in the current user interface, for example when multiple photos are displayed on the screen simultaneously, an embodiment of the present invention further provides an object deleting method, as shown in fig. 16, including:
161. When the terminal detects that the touch force of a contact touching any object in the user interface reaches a sixth preset threshold, or that the time for which the contact touches any object in the user interface exceeds a seventh preset threshold, the terminal reduces the at least two objects according to a preset proportion.
This embodiment differs from the two implementation manners above in that the delete gesture is not applied directly to the object. Instead, the deletion mode is first triggered by long-pressing or heavy-pressing the object to be deleted, and the delete operation is performed afterwards. This suits a scene in which many deletable objects are displayed on the screen: such a scene has a large number of operable objects and is prone to misoperation, so the deletion mode is triggered first. Specifically, when the terminal detects that the touch force of a contact touching any object in the user interface reaches the sixth preset threshold, or that the time in contact with any object exceeds the seventh preset threshold, the terminal determines that the deletion mode is triggered and reduces the at least two objects on the screen by a preset proportion. For example, all photos on the screen may be reduced by a certain proportion about their own centre points; as shown in fig. 17, each image is reduced to 80% of its original size. This prompts the user that the deletion mode has been entered and also leaves operating space for the subsequent delete operation.
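A hedged sketch of this trigger (threshold values, the normalized force scale, and the rectangle representation are assumptions; the 80% factor matches the fig. 17 example):

SIXTH_PRESET_THRESHOLD = 0.7    # normalized force for a heavy press
SEVENTH_PRESET_THRESHOLD = 0.8  # seconds held for a long press
SCALE = 0.8                     # shrink to 80% as in the fig. 17 example

def maybe_enter_delete_mode(touch_force, touch_seconds, photo_rects):
    # Either trigger fires the deletion mode; otherwise nothing changes.
    if (touch_force < SIXTH_PRESET_THRESHOLD
            and touch_seconds <= SEVENTH_PRESET_THRESHOLD):
        return False, photo_rects
    shrunk = []
    for left, top, width, height in photo_rects:
        cx, cy = left + width / 2, top + height / 2   # photo centre point
        nw, nh = width * SCALE, height * SCALE
        shrunk.append((cx - nw / 2, cy - nh / 2, nw, nh))
    return True, shrunk

# A 1-second long press shrinks every photo about its own centre.
print(maybe_enter_delete_mode(0.2, 1.0, [(0, 0, 100, 100), (120, 0, 100, 100)]))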
162. The terminal receives a first gesture operation instruction.
When the user's finger touches the screen of the terminal, the acquisition module of the terminal collects the user's gesture and sends it to the recognition module, so that the recognition module can recognize the first gesture operation instruction from the track of the finger on the screen.
163. The terminal determines whether the fifth contact corresponding to the first gesture operation instruction generates a continuous track in any object, and whether the track extends to the outside of the display area of that object or intersects two different edges of that object.
Illustratively, once the deletion mode has been triggered and the user's finger touches any object again, the terminal determines whether the fifth contact with the screen produces a continuous track that extends all the way to the outside of the photo, or, in the case of a preview, whether the track intersects two different edges of the photo.
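Both alternative tests can be sketched as plain geometry (the rectangle format and the strict segment-crossing test are assumptions):

def _segments_intersect(p1, p2, p3, p4):
    # Strict crossing test via signed-area orientation; touching
    # endpoints are ignored, which is acceptable for a sketch.
    def cross(o, a, b):
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])
    return (cross(p3, p4, p1) * cross(p3, p4, p2) < 0
            and cross(p1, p2, p3) * cross(p1, p2, p4) < 0)

def is_delete_track(track, rect):
    # rect = (left, top, right, bottom) of the object's display area.
    left, top, right, bottom = rect
    leaves_area = any(not (left <= x <= right and top <= y <= bottom)
                      for x, y in track)
    edges = [((left, top), (right, top)), ((right, top), (right, bottom)),
             ((right, bottom), (left, bottom)), ((left, bottom), (left, top))]
    crossed = {i for i, edge in enumerate(edges)
               for a, b in zip(track, track[1:])
               if _segments_intersect(a, b, *edge)}
    return leaves_area or len(crossed) >= 2

# A stroke entering through the left edge and exiting through the right edge.
print(is_delete_track([(-10, 50), (50, 50), (110, 50)], (0, 0, 100, 100)))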
164. If so, the terminal determines that the gesture indicated by the first gesture operation instruction is the preset deleting gesture.
165. The terminal determines that object as the object to be deleted.
That is, the object in which the track continuously generated by the fifth contact of the user's finger is located is the object to be deleted.
166. The terminal fixes the central point of the object to be deleted, and the area of the object to be deleted closest to the fifth contact undergoes viscous deformation following the movement of the fifth contact.
When the fifth contact passes through the object to be deleted, the centre point of the photo to be deleted can be fixed, and the area closest to the fifth contact undergoes viscous deformation, following the movement of the fifth contact, up to a certain range; for example, the deformed area stays within 120% of the display size of the photo.
As shown in diagram (1) of fig. 18, ABCD represents the photo area to be processed. The ABCD rectangular area may be divided evenly into (N+1) × (N+1) small rectangles, where N is greater than or equal to 1. Each dot in fig. 18 indicates a vertex of one of the small rectangles formed by the division. Taking side AB as an example, side AB is divided into vertex A, vertex 1, ..., vertex N-1, vertex N and vertex B; the larger the value of N set during division, the smoother the curve formed between the points. Diagram (2) in fig. 18 shows the change of each vertex when the photo is deformed. Taking the deformation of side AB as an example, assume that vertex 1, ..., vertex N and vertex B are displaced to new coordinate positions; connecting vertex 1, ..., vertex N and vertex B in sequence then forms a new deformation curve. The process by which vertex 1, ..., vertex N and vertex B are displaced to new coordinates according to the user operation is described below with reference to the deformation process.
The principle of calculating the deformation displacement caused by the user's touch operation is shown in fig. 19, and the specific generation process may be: (1) taking the deformation of point B at the upper right corner as an example, the position of vertex B moves to point B' following the movement of the fifth contact; (2) from the coordinate positions of vertex A, vertex B and vertex B', a vector AB and a vector B'B can be determined, and by the definition of a cubic Bezier curve, vertex A, vector AB, vertex B and vector B'B uniquely determine a cubic Bezier curve, so the position of any point on the curve AB can be obtained from the cubic Bezier curve equation; (3) as the fifth contact moves, the coordinates of vertex B' change, updated vertex position information is generated according to step (2), and the deformation curves for the remaining vertices are generated in the same way.
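A hedged sketch of step (2): the control-point placement below is one plausible reading of the patent's description, under the assumption that the deformed edge starts at A along the original edge direction and ends at the displaced vertex B' along the direction from B' back to B; it is not a verified formula from the source:

def bezier3(p0, p1, p2, p3, steps=16):
    # Sample the standard cubic Bezier curve through four control points.
    pts = []
    for i in range(steps + 1):
        t = i / steps
        u = 1.0 - t
        x = u**3 * p0[0] + 3*u*u*t * p1[0] + 3*u*t*t * p2[0] + t**3 * p3[0]
        y = u**3 * p0[1] + 3*u*u*t * p1[1] + 3*u*t*t * p2[1] + t**3 * p3[1]
        pts.append((x, y))
    return pts

def deformed_edge(a, b, b_new):
    # Assumed construction: leave A along the original edge direction
    # (vector AB) and arrive at the displaced vertex B' along B'->B.
    ab = (b[0] - a[0], b[1] - a[1])             # vector AB
    bb = (b[0] - b_new[0], b[1] - b_new[1])     # vector B'B
    p1 = (a[0] + ab[0] / 3.0, a[1] + ab[1] / 3.0)
    p2 = (b_new[0] + bb[0] / 3.0, b_new[1] + bb[1] / 3.0)
    return bezier3(a, p1, p2, b_new)

# Vertex B of edge AB dragged from (100, 0) to B' = (110, 30).
print(deformed_edge((0.0, 0.0), (100.0, 0.0), (110.0, 30.0))[:3])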
167. The terminal cuts the object to be deleted into at least two parts along the direction perpendicular to the track of the fifth contact, and displays the at least two cut parts in the user interface of the terminal.
After the object to be deleted has undergone viscous deformation according to the track of the fifth contact, the object may be cut into at least two parts along a straight line perpendicular to the track of the fifth contact, taken as the position to be cut; the effect on the deleted object is as shown in fig. 8 or fig. 9, and the specific implementation is similar to step 506. Alternatively, the object to be deleted may be torn into a zigzag shape along the direction perpendicular to the track of the fifth contact, taken as the position to be cut; the effect on the deleted object is as shown in fig. 13, and the specific implementation is similar to step 125.
Applying the method of this embodiment, as shown in fig. 20: assume the screen currently displays a photo list interface filled with the preview icons of multiple photos. When one of the photos is long-pressed or heavy-pressed, the preview icons of the photos in the list shrink to the preset proportion. The user then selects one of the photos, touches it, and without lifting the finger moves it to the outside of the photo's preview icon; the preview icon undergoes viscous deformation following the finger's movement, and the deformation stops once it reaches a preset maximum range (for example, the 120% shown in fig. 20). The photo is then rendered with a torn jagged or straight-cut effect to indicate that it has been deleted.
In this way, the object deleting method provided by the present invention can process the object to be deleted into a deleted state without changing the object's display position, that is, the object to be deleted is cut into at least two parts by a preset gesture. This improves object-deleting efficiency, makes deletion more engaging, and optimizes the user experience.
With reference to the schematic structural diagram of the terminal in fig. 3, and in combination with the embodiments of the present invention, the modules in the terminal may be configured as follows:
the acquisition module 301 is used for receiving a first gesture operation instruction;
the identification module 306 is configured to determine whether the gesture indicated by the first gesture operation instruction is a preset deletion gesture, where the deletion gesture is used to indicate deletion of an object in the terminal, the track of the preset deletion gesture includes a contact generating a continuous track, and the track needs to meet at least one of the following preset conditions (a sketch of this test follows the module list below): the included angle between the track and the screen boundary of the terminal is greater than or equal to a first preset threshold; the operation speed corresponding to the track reaches a second preset threshold; the touch force corresponding to the track reaches a third preset threshold;
a processing module 302, configured to: if the determination is yes, determine the object to be deleted and cut it into at least two parts;
a display module 305, configured to display the cut at least two parts in a user interface of the terminal.
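A minimal sketch of the recognition module's three-condition test (the threshold values, the chord-based angle measurement, and the normalized force scale are assumptions):

import math

FIRST_PRESET_THRESHOLD = 30.0    # degrees between track and screen boundary
SECOND_PRESET_THRESHOLD = 800.0  # px/s operation speed
THIRD_PRESET_THRESHOLD = 0.6     # normalized touch force

def meets_preset_conditions(track, duration_s, force):
    # Angle of the track's chord against the horizontal screen boundary.
    (x0, y0), (x1, y1) = track[0], track[-1]
    angle = math.degrees(math.atan2(abs(y1 - y0), abs(x1 - x0)))
    # Operation speed over the whole sampled track.
    length = sum(math.dist(a, b) for a, b in zip(track, track[1:]))
    speed = length / duration_s if duration_s > 0 else 0.0
    # Meeting any one of the three preset conditions qualifies the track.
    return (angle >= FIRST_PRESET_THRESHOLD
            or speed >= SECOND_PRESET_THRESHOLD
            or force >= THIRD_PRESET_THRESHOLD)

# A fast diagonal flick with light pressure qualifies via the speed test.
print(meets_preset_conditions([(0, 0), (100, 120)], 0.1, 0.2))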
Optionally, when the track of the preset delete gesture includes a contact point that generates a continuous track, the recognition module 306 is configured to:
determining whether the track of the gesture corresponding to the first gesture operation instruction is a straight line, where a portion of the track greater than or equal to 1/N of the straight line's length lies in the display area of the object, and N is a positive integer;
if yes, determining that the gesture indicated by the first gesture operation instruction is a preset deleting gesture.
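This straight-line test could be approximated as follows (the straightness tolerance, the sample-fraction approximation of "1/N of the length", and the default N are assumptions):

import math

def _dist_to_line(p, a, b):
    # Distance from point p to the infinite line through a and b.
    (px, py), (ax, ay), (bx, by) = p, a, b
    return (abs((bx - ax) * (ay - py) - (ax - px) * (by - ay))
            / math.hypot(bx - ax, by - ay))

def is_straight_delete_track(track, rect, n=3, tol=8.0):
    # Straightness: every sample stays within tol of the chord between
    # the track's endpoints.
    a, b = track[0], track[-1]
    if any(_dist_to_line(p, a, b) > tol for p in track):
        return False
    # "At least 1/N of the line inside the object" is approximated by
    # the fraction of sampled points inside the display rect.
    left, top, right, bottom = rect
    inside = sum(1 for x, y in track
                 if left <= x <= right and top <= y <= bottom)
    return inside >= len(track) / n

# A nearly straight stroke with two of three samples inside the object.
print(is_straight_delete_track([(0, 50), (60, 52), (120, 50)], (40, 0, 200, 100)))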
Optionally, the processing module 302 is configured to:
determining an object where the track is located as an object to be deleted; determining the position of the object to be deleted to be cut according to the position coordinates contained in the track, or determining the preset cutting position as the position to be cut;
cutting the object to be deleted into at least two parts according to the position to be cut, and rotating and/or moving the at least two parts after cutting in opposite directions;
the display module 305 is configured to: displaying the rotated and/or moved at least two portions in the user interface.
Optionally, the identifying module 306 is configured to:
determining whether the gesture indicated by the first gesture operation instruction comprises a third contact and a fourth contact, and whether the touch force of the third contact exceeds a fourth preset threshold, wherein the track of the fourth contact continuously moves towards the edge direction of the terminal screen by more than a first preset distance;
and if so, determining that the gesture indicated by the first gesture operation instruction is a preset deleting gesture.
Optionally, the processing module 302 is configured to:
determining an object contacted by the third contact and the fourth contact together as an object to be deleted;
cutting the object to be deleted into at least two parts in the vertical direction of the connecting line of the third contact and the fourth contact by taking the middle position of the connecting line of the third contact and the fourth contact as a reference point;
the display module 305 is configured to: and displaying the cutting positions of the at least two parts in a sawtooth shape in a user interface of the terminal.
Optionally, the identifying module 306 is configured to: determining whether a fifth contact point corresponding to the first gesture operation instruction generates a continuous track in any object, wherein the track extends to the outside of a display area of any object, or the track intersects two different edges of any object;
if yes, determining that the gesture indicated by the first gesture operation instruction is a preset deleting gesture.
Optionally, the processing module 302 is configured to: when the recognition module detects that the touch force of a contact touching any object in the user interface reaches a sixth preset threshold, or that the time for which the contact touches any object in the user interface exceeds a seventh preset threshold, reduce at least two objects according to a preset proportion.
Optionally, the processing module 302 is configured to: determining any object as an object to be deleted; fixing the central point of the object to be deleted, and generating viscous deformation by following the movement of the fifth contact in the area, closest to the fifth contact, of the object to be deleted; cutting the object to be deleted into at least two parts in the vertical direction of the track of the fifth contact;
the display module 305 is configured to: and displaying the at least two cut parts in a user interface of the terminal.
Optionally, the identifying module 306 is configured to: determining whether a second gesture operation instruction is received within a preset time period, wherein a gesture indicated by the second gesture operation instruction is a preset recovery gesture, and the recovery gesture is used for indicating recovery of the deleted object;
the display module 305 is configured to: if yes, displaying the state of the deleted object before cutting in a user interface; the processing module is used for: and if not, deleting the object to be deleted.
Optionally, the identifying module 306 is configured to: determine whether two contacts are detected simultaneously in the at least two parts within a preset time period and both move more than a second preset distance towards the cutting position; if so, the terminal determines that a second gesture operation instruction is received;
or, the identification module 306 is configured to: determine whether a first contact whose touch force reaches a fifth preset threshold is detected in the at least two parts within a preset time period while a second contact in the at least two parts moves more than a third preset distance towards the cutting position; if so, determine that a second gesture operation instruction is received.
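Both recovery-gesture variants can be sketched over a vertical cut (the thresholds, the net-distance measure, and the input format are assumptions):

SECOND_PRESET_DISTANCE = 40.0   # both contacts converge on the cut
THIRD_PRESET_DISTANCE = 40.0    # single moving contact variant
FIFTH_PRESET_THRESHOLD = 0.6    # force for the pressing variant

def _toward_cut(track, cut_x):
    # Net distance a contact moved toward the vertical cut at x = cut_x.
    return abs(track[0][0] - cut_x) - abs(track[-1][0] - cut_x)

def is_recovery_gesture(tracks, forces, cut_x):
    # tracks: one sampled track per contact, each within one cut part.
    if len(tracks) != 2:
        return False
    d0, d1 = _toward_cut(tracks[0], cut_x), _toward_cut(tracks[1], cut_x)
    # Variant 1: both contacts move toward the cutting position.
    if d0 > SECOND_PRESET_DISTANCE and d1 > SECOND_PRESET_DISTANCE:
        return True
    # Variant 2: one contact presses hard while the other moves to the cut.
    return ((forces[0] >= FIFTH_PRESET_THRESHOLD and d1 > THIRD_PRESET_DISTANCE)
            or (forces[1] >= FIFTH_PRESET_THRESHOLD and d0 > THIRD_PRESET_DISTANCE))

# Two fingers pinching the halves back together across the cut at x = 100.
print(is_recovery_gesture([[(10, 50), (60, 50)], [(190, 50), (140, 50)]],
                          [0.2, 0.2], cut_x=100))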
Optionally, the processing module 302 is further configured to: querying a display state of the at least one object before displaying the at least one object in the user interface, the display state including a normal state and a deleted state;
the display module 305 is further configured to: if the display state of a first object in the at least one object is a normal state, displaying the first object in the user interface; and if the display state of a second object in the at least one object is a deletion state, displaying the second object in the user interface in at least two cut parts.
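A minimal sketch of this state query (the state store, object ids, and rendering hooks are assumptions):

from enum import Enum

class DisplayState(Enum):
    NORMAL = "normal"
    DELETED = "deleted"

# Assumed persistent store mapping object ids to their display state.
STATE_STORE = {"photo_1": DisplayState.NORMAL, "photo_2": DisplayState.DELETED}

def render_objects(object_ids):
    # Query each object's state before drawing it in the user interface.
    for oid in object_ids:
        state = STATE_STORE.get(oid, DisplayState.NORMAL)
        if state is DisplayState.NORMAL:
            print(f"draw {oid} intact")
        else:
            print(f"draw {oid} as its cut parts")  # deleted state

render_objects(["photo_1", "photo_2"])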
In this way, the terminal provided by the present invention identifies whether the user's first gesture operation instruction is the preset deleting gesture; if so, the terminal cuts the object to be deleted into at least two parts and displays them in the user interface of the terminal, thereby improving the efficiency with which the user deletes objects.
An embodiment of the present invention further provides a terminal 21, as shown in fig. 21, including: a memory 211, a processor 212, a receiver 213, a transmitter 214, and a display 215. The processor 212 is used for controlling and managing the actions of the terminal. For example, the processor 212 is configured to support the terminal in performing processes 401-403 in fig. 4, processes 501-508 in fig. 5, processes 121-125 in fig. 12, processes 161-167 in fig. 16, and/or other processes for the techniques described in the embodiments of the present invention. The memory 211 is used for storing program code and data of the terminal. The network interface is used to support communication between the terminal and other network entities and includes the receiver 213 and the transmitter 214; for example, the network interface is used to support communication between the terminal and a server. The display 215 is used to display the operation interface of the terminal. The display 215 may include a touch screen, also referred to as a touch panel, which can collect touch operations by the user on or near it (for example, operations performed by the user on or near the touch screen using a finger, a stylus, or any other suitable object or accessory) and drive the corresponding connection device according to a preset program. Optionally, the touch screen may include two parts: a touch detection device and a touch controller. The touch detection device detects the touch position of the user, detects the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into contact coordinates, sends these to the processor 212, and can receive and execute commands sent by the processor 212.
Specifically, in this embodiment of the present invention, the processor 212 is configured to: receive a first gesture operation instruction; determine whether the gesture indicated by the first gesture operation instruction is a preset deleting gesture, where the deleting gesture is used to indicate deletion of an object in the terminal, the track of the preset deleting gesture includes a contact generating a continuous track, and the track needs to meet at least one of the following preset conditions: the included angle between the track and the screen boundary of the terminal is greater than or equal to a first preset threshold; the operation speed corresponding to the track reaches a second preset threshold; the touch force corresponding to the track reaches a third preset threshold; and, if so, determine the object to be deleted and cut it into at least two parts. The display 215 is used to display the at least two cut parts in the user interface of the terminal.
For other functions of the processor 212 and the display 215 in the embodiments of the present invention, refer to the foregoing description; details are not repeated here.
In this way, the terminal provided by the present invention identifies whether the user's first gesture operation instruction is the preset deleting gesture; if so, the terminal cuts the object to be deleted into at least two parts and displays them in the user interface of the terminal, thereby improving the efficiency with which the user deletes objects.
In the several embodiments provided in the present application, it should be understood that the disclosed terminal and method can be implemented in other manners. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the modules is merely a logical division, and in actual implementation, there may be other divisions, for example, multiple modules or components may be combined or integrated into another system, or some features may be omitted, or not implemented. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or modules, and may be in an electrical, mechanical or other form.
The modules described as separate parts may or may not be physically separate, and parts displayed as modules may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional modules in the embodiments of the present invention may be integrated into one processing module, or each module may be physically included alone, or two or more modules may be integrated into one module. The integrated module can be realized in a hardware form, and can also be realized in a form of hardware and a software functional module.
The integrated module implemented in the form of a software functional module may be stored in a computer-readable storage medium. The software functional module is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute some steps of the methods according to the embodiments of the present invention. And the aforementioned storage medium includes: various media capable of storing program codes, such as a usb disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
Finally, it should be noted that: the above examples are only intended to illustrate the technical solution of the present invention, but not to limit it; although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions of the embodiments of the present invention.

Claims (12)

1. An object deletion method, comprising:
the terminal receives a first gesture operation instruction;
the terminal determines whether the gesture indicated by the first gesture operation instruction is a preset deleting gesture or not, and the method comprises the following steps: when the track of the preset deleting gesture comprises a contact point generating a continuous track, if the terminal determines that the contact point corresponding to the first gesture operation instruction generates the continuous track in any object, and the track extends to the outside of a display area of the any object, or the track intersects with two different edges of the any object, the terminal determines that the gesture indicated by the first gesture operation instruction is the preset deleting gesture;
the deleting gesture is used for indicating deletion of an object in the terminal, the track of the preset deleting gesture comprises a contact generating a continuous track, and the track needs to meet at least one of the following preset conditions: an included angle between the track and the screen boundary of the terminal is greater than or equal to a first preset threshold; the operation speed corresponding to the track reaches a second preset threshold; the touch force corresponding to the track reaches a third preset threshold; and the objects in the terminal are a plurality of objects;
if yes, the terminal determines the object to be deleted, where the object to be deleted is the any object in which the track continuously generated by the contact of the user's finger is located, cuts the object to be deleted into at least two parts, and displays the at least two cut parts in a user interface of the terminal.
2. The method of claim 1, wherein when at least two objects are simultaneously displayed in the user interface, prior to cutting the object to be deleted, the method further comprises:
and when the terminal detects that the touch force of the contact point contacted with any object in the user interface reaches a sixth preset threshold value, or the time of the contact point contacted with any object in the user interface exceeds a seventh preset threshold value, the terminal reduces the at least two objects according to a preset proportion.
3. The method according to claim 1 or 2, wherein the terminal determines an object to be deleted, cuts the object to be deleted into at least two parts, and displays the at least two cut parts in a user interface of the terminal comprises:
the terminal determines any object as the object to be deleted;
the terminal fixes the central point of the object to be deleted, and the area, closest to the contact, in the object to be deleted generates viscous deformation along with the movement of the contact;
and the terminal cuts the object to be deleted into the at least two parts in the vertical direction of the track of the contact point and displays the at least two cut parts in a user interface of the terminal.
4. The method according to claim 1 or 2, characterized in that the method further comprises:
the terminal determines whether a second gesture operation instruction is received within a preset time period, wherein a gesture indicated by the second gesture operation instruction is a preset recovery gesture, and the recovery gesture is used for indicating recovery of a deleted object;
if so, the terminal displays the state of the deleted object before cutting in the user interface; and if not, the terminal deletes the object to be deleted.
5. The method according to claim 4, wherein the terminal determines whether a second gesture operation instruction is received within a preset time period, the gesture indicated by the second gesture operation instruction is a preset recovery gesture, and the recovery gesture is used for indicating to recover the deleted object and comprises:
the terminal determines whether two contacts are detected simultaneously in the at least two parts within the preset time period and whether the two contacts move more than a second preset distance towards the cutting position, and if so, the terminal determines that the second gesture operation instruction is received;
or, the terminal determines whether a first contact whose touch force reaches a fifth preset threshold is detected in the at least two parts within the preset time period and a second contact in the at least two parts moves more than a third preset distance towards the cutting position, and if so, the terminal determines that the second gesture operation instruction is received.
6. The method according to claim 1 or 2, characterized in that the method further comprises:
the terminal inquires the display state of at least one object before the at least one object is displayed in the user interface, wherein the display state comprises a normal state and a deletion state;
if the display state of a first object in the at least one object is the normal state, the terminal displays the first object in the user interface; and if the display state of a second object in the at least one object is the deletion state, the terminal displays the second object in the user interface in the at least two cut parts.
7. A terminal, comprising:
the acquisition module is used for receiving a first gesture operation instruction;
the recognition module is configured to determine whether a gesture indicated by the first gesture operation instruction is a preset deletion gesture, and, when the track of the preset deletion gesture includes a contact generating a continuous track, if it is determined that the contact corresponding to the first gesture operation instruction generates a continuous track in any object and the track extends to the outside of the display area of the any object or the track intersects two different edges of the any object, determine that the gesture indicated by the first gesture operation instruction is the preset deletion gesture; the deletion gesture is used for indicating deletion of an object in the terminal, the track of the preset deletion gesture comprises a contact generating a continuous track, and the track needs to meet at least one of the following preset conditions: an included angle between the track and the screen boundary of the terminal is greater than or equal to a first preset threshold; the operation speed corresponding to the track reaches a second preset threshold; the touch force corresponding to the track reaches a third preset threshold; and the objects in the terminal are a plurality of objects;
the processing module is used for determining an object to be deleted and cutting the object to be deleted into at least two parts if the determination is positive, wherein the object to be deleted is any object where a track continuously generated by a contact of a finger of a user is located;
and the display module is used for displaying the at least two cut parts in a user interface of the terminal.
8. The terminal of claim 7, wherein the processing module is configured to: when the recognition module detects that the touch force of a contact touching any object in the user interface reaches a sixth preset threshold, or that the time for which the contact touches any object in the user interface exceeds a seventh preset threshold, reduce the at least two objects according to a preset proportion.
9. The terminal of claim 7 or 8, wherein the processing module is configured to: determining any object as the object to be deleted; fixing the central point of the object to be deleted, and generating viscous deformation in an area, closest to the contact, of the object to be deleted along with the movement of the contact; the object to be deleted is cut into the at least two parts in the vertical direction of the locus of the contact points;
the display module is used for: and displaying the at least two cut parts in a user interface of the terminal.
10. The terminal of claim 7 or 8, wherein the identification module is configured to: determining whether a second gesture operation instruction is received within a preset time period, wherein a gesture indicated by the second gesture operation instruction is a preset recovery gesture, and the recovery gesture is used for indicating recovery of a deleted object;
the display module is used for: if yes, displaying the state of the deleted object before cutting in the user interface; the processing module is used for: and if not, deleting the object to be deleted.
11. The terminal of claim 10, wherein the identification module is configured to: determine whether two contacts are detected simultaneously in the at least two parts within the preset time period, wherein the two contacts move more than a second preset distance towards the cutting position; if so, the terminal determines that the second gesture operation instruction is received;
or, the identification module is configured to: determining whether the first contact is detected in the at least two parts within the preset time period, the touch force reaches a fifth preset threshold value, and the second contact moves in the at least two parts towards the cutting position by more than a third preset distance, if so, determining that the second gesture operation instruction is received;
the display module is used for: and displaying the at least two cut parts in a user interface of the terminal.
12. The terminal of claim 7 or 8, wherein the processing module is further configured to: querying a display status of at least one object before displaying the at least one object in the user interface, the display status including a normal status and a deleted status;
the display module is further configured to: if the display state of a first object in the at least one object is the normal state, displaying the first object in the user interface; and if the display state of a second object in the at least one object is the deletion state, displaying the second object in the user interface in the at least two cut parts.
CN201610388214.XA 2016-06-01 2016-06-01 Object deleting method and terminal Active CN107450824B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610388214.XA CN107450824B (en) 2016-06-01 2016-06-01 Object deleting method and terminal

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201610388214.XA CN107450824B (en) 2016-06-01 2016-06-01 Object deleting method and terminal

Publications (2)

Publication Number Publication Date
CN107450824A CN107450824A (en) 2017-12-08
CN107450824B true CN107450824B (en) 2020-10-16

Family

ID=60485293

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610388214.XA Active CN107450824B (en) 2016-06-01 2016-06-01 Object deleting method and terminal

Country Status (1)

Country Link
CN (1) CN107450824B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111077984A (en) * 2018-10-19 2020-04-28 北京微播视界科技有限公司 Man-machine interaction method and device, electronic equipment and computer storage medium
CN110275660A (en) * 2019-06-18 2019-09-24 深圳市趣创科技有限公司 A kind of method that fingerprint fast deletes photo
CN110716678B (en) * 2019-10-08 2022-09-23 江苏集萃有机光电技术研究所有限公司 Local deletion method and processing system for display picture and display equipment
CN111700380B (en) * 2020-06-23 2022-03-18 卢孟茜 Intelligent office table
CN113127425B (en) * 2021-03-12 2023-04-21 维沃移动通信(杭州)有限公司 Picture processing method and device and electronic equipment

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103853458A (en) * 2012-12-04 2014-06-11 华为技术有限公司 Method for clearing contents in intelligent terminal and intelligent terminal
CN104898924A (en) * 2015-05-22 2015-09-09 小米科技有限责任公司 Elimination method and device for pictures of application programs
CN104978124A (en) * 2015-06-30 2015-10-14 广东欧珀移动通信有限公司 Picture display method for terminal and terminal
CN105224206A (en) * 2014-06-30 2016-01-06 联想(北京)有限公司 A kind of method of operation input and electronic equipment
CN105302444A (en) * 2015-10-30 2016-02-03 努比亚技术有限公司 Picture processing method and apparatus

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4059802B2 (en) * 2003-04-17 2008-03-12 株式会社サピエンス Image display method

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103853458A (en) * 2012-12-04 2014-06-11 华为技术有限公司 Method for clearing contents in intelligent terminal and intelligent terminal
CN105224206A (en) * 2014-06-30 2016-01-06 联想(北京)有限公司 A kind of method of operation input and electronic equipment
CN104898924A (en) * 2015-05-22 2015-09-09 小米科技有限责任公司 Elimination method and device for pictures of application programs
CN104978124A (en) * 2015-06-30 2015-10-14 广东欧珀移动通信有限公司 Picture display method for terminal and terminal
CN105302444A (en) * 2015-10-30 2016-02-03 努比亚技术有限公司 Picture processing method and apparatus

Also Published As

Publication number Publication date
CN107450824A (en) 2017-12-08

Similar Documents

Publication Publication Date Title
CN107450824B (en) Object deleting method and terminal
US8675113B2 (en) User interface for a digital camera
KR101580914B1 (en) Electronic device and method for controlling zooming of displayed object
US9317196B2 (en) Automatic zooming for text selection/cursor placement
US8947375B2 (en) Information processing device, information processing method, and information processing program
EP2927792B1 (en) Mobile terminal allowing selection of part of the screen for screen capture
US20110072345A1 (en) Mobile terminal and operating method thereof
US9880721B2 (en) Information processing device, non-transitory computer-readable recording medium storing an information processing program, and information processing method
US9569099B2 (en) Method and apparatus for displaying keypad in terminal having touch screen
EP2824905B1 (en) Group recording method, machine-readable storage medium, and electronic device
US20150063785A1 (en) Method of overlappingly displaying visual object on video, storage medium, and electronic device
EP3961364A1 (en) Page operation method and apparatus, and terminal and storage medium
KR20150029463A (en) Method, apparatus and recovering medium for controlling user interface using a input image
CN112099707A (en) Display method and device and electronic equipment
WO2021179803A1 (en) Content sharing method and apparatus, electronic device and storage medium
US10848558B2 (en) Method and apparatus for file management
CN105718204A (en) Touch gesture based control method and device
KR101419871B1 (en) Apparatus and method for editing subtitles
CN109086113B (en) Screen capturing method and device and mobile terminal
JP2015141526A (en) Information processor, information processing method and program
CN104951227A (en) Electronic device of messaging and method thereof
CA2807866C (en) User interface for a digital camera
US9519709B2 (en) Determination of an ordered set of separate videos
CN113485590A (en) Touch operation method and device
CN115268618A (en) Method, device, system and storage medium for migrating tasks across equipment

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant