CN102568403B - Electronic equipment and object deletion method thereof


Info

Publication number
CN102568403B
Application number
CN201010605710.9A
Authority
CN (China)
Prior art keywords
initial position
distance
touch track
initial
final position
Legal status
Active
Other languages
Chinese (zh)
Other versions
CN102568403A
Inventor
王卓
Current Assignee
Lenovo Beijing Ltd
Original Assignee
Lenovo Beijing Ltd
Application filed by Lenovo Beijing Ltd
Priority to CN201010605710.9A
Publication of CN102568403A
Application granted
Publication of CN102568403B

Abstract

The invention provides an electronic equipment and an object deletion method thereof. The object deletion method comprises: sensing a first touch track and a second touch track to obtain a first initial position and a first final position of the first touch track and a second initial position and a second final position of the second touch track; determining a target object according to the first initial position and the second initial position, or according to the first final position and the second final position; generating a delete command according to the first initial position, the second initial position, the first final position and the second final position; and deleting the target object.

Description

Electronic equipment and object deletion method thereof
Technical field
The present invention relates to the field of electronic equipment, and more specifically to an electronic equipment and an object deletion method thereof.
Background technology
In electronic equipment such as mobile phones and portable computers, a user often needs to delete an object displayed on the electronic equipment. The object may be, for example, a picture, text, audio or video. To perform the deletion, the user has to go through a series of steps such as selecting the object and clicking a delete button for the selected object, which is tedious and inconvenient for the user. Moreover, when multiple objects are to be deleted, the user needs to select each of them separately, which further increases the inconvenience.
Summary of the invention
In view of the above, the present invention provides an electronic equipment and an object deletion method thereof, which enable a user to delete objects with a simple and intuitive operation and thereby greatly improve the user experience.
According to one embodiment of the present invention, an object deletion method is provided, comprising: sensing a first touch track and a second touch track to obtain a first initial position and a first final position of the first touch track and a second initial position and a second final position of the second touch track; determining a target object according to the first initial position and the second initial position, or according to the first final position and the second final position; generating a delete command according to the first initial position, the second initial position, the first final position and the second final position; and deleting the target object.
In the object deletion method, the first touch track and the second touch track may be formed simultaneously.
In the object deletion method, generating the delete command may comprise: calculating an initial distance between the first initial position and the second initial position; calculating a termination distance between the first final position and the second final position; judging whether the termination distance is less than the initial distance; and generating the delete command when the termination distance is less than the initial distance.
In the object deletion method, generating the delete command when the termination distance is less than the initial distance may comprise: judging whether the termination distance is less than a first preset distance; and generating the delete command when the termination distance is less than the first preset distance.
In the object deletion method, generating the delete command when the termination distance is less than the initial distance may comprise: calculating a distance difference between the initial distance and the termination distance; judging whether the distance difference is greater than a second preset distance; and generating the delete command when the distance difference is greater than the second preset distance.
In the object deletion method, determining the target object may comprise: judging whether the first initial position and the second initial position correspond to the same object; and, when it is judged that the first initial position and the second initial position correspond to the same object, determining that object as the target object.
In the object deletion method, determining the target object may comprise: judging whether the first initial position and the second initial position correspond to the same object; and, when it is judged that the first initial position corresponds to a first object, the second initial position corresponds to a second object, and the first object is different from the second object, determining the first object, the second object and all objects arranged in order between the first object and the second object as the target objects.
In the object deletion method, determining the target object may comprise: determining the object corresponding to the first touch track and the object corresponding to the second touch track; and determining the object corresponding to the first touch track and the object corresponding to the second touch track as the target objects.
According to another embodiment of the present invention, an electronic equipment is provided, comprising: a sensing unit that senses a first touch track and a second touch track to obtain a first initial position and a first final position of the first touch track and a second initial position and a second final position of the second touch track; an object determining unit that determines a target object according to the first initial position and the second initial position, or according to the first final position and the second final position; a command generation unit that generates a delete command according to the first initial position, the second initial position, the first final position and the second final position; and a delete unit that deletes the target object.
In the electronic equipment, the command generation unit may comprise: a first calculation unit that calculates an initial distance between the first initial position and the second initial position; a second calculation unit that calculates a termination distance between the first final position and the second final position; a first judging unit that judges whether the termination distance is less than the initial distance; and a first generation unit that generates the delete command when the termination distance is less than the initial distance.
In the electronic equipment, the first generation unit may comprise: a second judging unit that judges whether the termination distance is less than a first preset distance; and a second generation unit that generates the delete command when the termination distance is less than the first preset distance.
In the electronic equipment, the first generation unit may comprise: a third calculation unit that calculates a distance difference between the initial distance and the termination distance; a third judging unit that judges whether the distance difference is greater than a second preset distance; and a third generation unit that generates the delete command when the distance difference is greater than the second preset distance.
In the electronic equipment, the object determining unit may comprise: a fourth judging unit that judges whether the first initial position and the second initial position correspond to the same object; and a first determining unit that, when it is judged that the first initial position and the second initial position correspond to the same object, determines that object as the target object.
In the electronic equipment, the object determining unit may comprise: a fourth judging unit that judges whether the first initial position and the second initial position correspond to the same object; and a second determining unit that, when it is judged that the first initial position corresponds to a first object, the second initial position corresponds to a second object, and the first object is different from the second object, determines the first object, the second object and all objects arranged in order between the first object and the second object as the target objects.
In the electronic equipment, the object determining unit may comprise: a third determining unit that determines the object corresponding to the first touch track and the object corresponding to the second touch track; and a fourth determining unit that determines the object corresponding to the first touch track and the object corresponding to the second touch track as the target objects.
With the electronic equipment and the object deletion method thereof according to the embodiments of the present invention, a user can delete objects with a simple and intuitive operation, which greatly improves the user experience.
Brief description of the drawings
Fig. 1 is a flowchart illustrating the steps of the object deletion method according to an embodiment of the present invention;
Fig. 2 is a block diagram illustrating the main configuration of the electronic equipment according to an embodiment of the present invention;
Fig. 3 is a block diagram illustrating in more detail the main configuration of the electronic equipment according to an embodiment of the present invention;
Fig. 4 is a block diagram illustrating in more detail the main configuration of the electronic equipment according to an embodiment of the present invention;
Figs. 5A and 5B are block diagrams illustrating in more detail the main configuration of the electronic equipment according to an embodiment of the present invention;
Figs. 6A-6D are schematic diagrams of the display of an electronic equipment to which the object deletion method according to an embodiment of the present invention is applied.
Detailed description of the embodiments
The electronic equipment and the object deletion method thereof according to the embodiments of the present invention are described in detail below with reference to the accompanying drawings.
The object deletion method according to the embodiment of the present invention can be applied to electronic equipment such as a feature phone or smartphone, a portable computer (a notebook, a tablet computer or an e-book reader), a desktop computer or an all-in-one machine. When the electronic equipment is, for example, a feature phone or smartphone, a portable computer or an all-in-one machine, the electronic equipment may include a display unit; when the electronic equipment is, for example, a desktop computer, it may be connected to a display unit. The electronic equipment further includes a sensing unit for sensing the touch positions at which a user's finger or another pointing object contacts the electronic equipment, thereby obtaining touch tracks. The sensing unit and the display unit may be stacked on the surface of the electronic equipment; for example, the sensing unit may be laminated on top of the display unit. Of course, the sensing unit may also be arranged in a specific region separate from the display unit. In addition, the electronic equipment may include a storage unit in which multiple objects are stored. The objects may be, for example, pictures, text, audio or video. The object deletion method can be applied, for example, to deleting contact information from the address book of the electronic equipment, or to deleting short messages or mails. It should also be pointed out that the storage unit may be included in the electronic equipment or may be separate from it; that is, the electronic equipment itself need not include the storage unit.
The object deletion method according to the embodiment of the present invention is described below with reference to Fig. 1.
As shown in Fig. 1, the object deletion method comprises:
Step S101: sensing a first touch track and a second touch track to obtain a first initial position and a first final position of the first touch track and a second initial position and a second final position of the second touch track.
Specifically, when the user wants to delete an object displayed on the electronic equipment, the user makes a contraction gesture with at least two pointing objects (in the following, the user's fingers are taken as an example). In the embodiments of the present invention, a contraction gesture is a gesture in which the termination distance between the two fingers is less than the initial distance between them; it may include, for example, the following two cases. In the first case, the first gesture made by the first finger is a press gesture, the second gesture made by the second finger is a slide gesture, and the termination distance between the two fingers is less than the initial distance. In the second case, both the first gesture made by the first finger and the second gesture made by the second finger are slide gestures, and the termination distance between the two fingers is less than the initial distance. The first finger and the second finger may make their gestures at different timings, or at the same timing.
The object deletion method senses, for example by means of the sensing unit, the first touch track and the second touch track of the contraction gesture, so as to obtain the first initial position and the first final position of the first touch track and the second initial position and the second final position of the second touch track. In the first case above, the sensing result obtained by the object deletion method is that the first touch track formed by the first gesture comprises only one touch point, while the second touch track formed by the second gesture comprises multiple touch points; that is to say, in this case the first initial position and the first final position are identical. In the second case above, the sensing result is that both the first touch track formed by the first gesture and the second touch track formed by the second gesture comprise multiple touch points. It should be pointed out that in both cases the first touch track and the second touch track may be formed at different timings; preferably, the first touch track and the second touch track are formed simultaneously.
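The following is a minimal sketch of how the initial and final positions of two sensed touch tracks can be read off; the TouchTrack class, point tuples and coordinate values are assumptions introduced for illustration and are not part of the claimed method. It also shows that a press gesture yields a single-point track whose initial and final positions coincide.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

Point = Tuple[float, float]  # (x, y) in sensing-unit coordinates (assumed)

@dataclass
class TouchTrack:
    """One finger's trajectory: an ordered list of sensed touch points."""
    points: List[Point] = field(default_factory=list)

    @property
    def initial_position(self) -> Point:
        return self.points[0]

    @property
    def final_position(self) -> Point:
        # For a press gesture the track has a single point, so the
        # initial and final positions are identical (first case above).
        return self.points[-1]

# Example: first finger presses (one point), second finger slides inward.
track1 = TouchTrack(points=[(120.0, 80.0)])
track2 = TouchTrack(points=[(300.0, 80.0), (260.0, 80.0), (200.0, 80.0)])
print(track1.initial_position, track1.final_position)  # identical
print(track2.initial_position, track2.final_position)  # differ
```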
Step S102: determining a target object according to the first initial position and the second initial position, or according to the first final position and the second final position.
Since, as described above, the sensing unit and the display unit may be stacked on the surface of the electronic equipment, the object deletion method can determine the target object from the sensed touch positions. Specifically, the object deletion method can determine the target object in two ways. In the first way, the object deletion method judges whether the first initial position and the second initial position correspond to the same object. When it is judged that the first initial position and the second initial position correspond to the same object, the object deletion method determines that object as the target object; in this case the target object is a single object. When it is judged that the first initial position and the second initial position do not correspond to the same object, that is, when the first initial position corresponds to a first object, the second initial position corresponds to a second object, and the first object is different from the second object, the object deletion method determines the first object, the second object and all objects arranged in order between the first object and the second object as the target objects; in this case the target objects are multiple objects.
It should be pointed out that the above describes the operation of determining the target object according to the first initial position and the second initial position. The operation of determining the target object according to the first final position and the second final position is similar: the object deletion method judges whether the first final position and the second final position correspond to the same object. When they correspond to the same object, that object is determined as the target object, and the target object is a single object. When they do not correspond to the same object, that is, when the first final position corresponds to a first object, the second final position corresponds to a second object, and the first object is different from the second object, the first object, the second object and all objects arranged in order between them are determined as the target objects, and the target objects are multiple objects.
In the second way, the object deletion method can determine the object corresponding to the first touch track and the object corresponding to the second touch track, and determine both as the target objects. Specifically, from all the touch points included in the first touch track, the object deletion method can determine the objects corresponding to those touch points; similarly, from all the touch points included in the second touch track, it can determine the objects corresponding to those touch points. Thereafter, the objects corresponding to the first touch track and the objects corresponding to the second touch track are determined as the target objects. Figuratively speaking, all the objects that the first touch track "passes through" and all the objects that the second touch track "passes through" are determined as the target objects.
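As an illustration of the two ways just described, the sketch below assumes that each displayed object occupies a rectangular on-screen region; the function names, hit-testing scheme and coordinates are assumptions for illustration only, not the patent's terminology.

```python
from typing import List, Optional, Tuple

Point = Tuple[float, float]
Rect = Tuple[float, float, float, float]  # (left, top, right, bottom), assumed layout

def hit(objects: List[Rect], p: Point) -> Optional[int]:
    """Index of the displayed object whose on-screen region contains point p."""
    for i, (l, t, r, b) in enumerate(objects):
        if l <= p[0] <= r and t <= p[1] <= b:
            return i
    return None

def targets_from_initial_positions(objects: List[Rect], p1: Point, p2: Point) -> List[int]:
    """First way: same object -> that object; different objects -> the two
    objects plus every object listed between them."""
    i, j = hit(objects, p1), hit(objects, p2)
    if i is None or j is None:
        return []
    lo, hi = min(i, j), max(i, j)
    return list(range(lo, hi + 1))

def targets_from_tracks(objects: List[Rect], track1: List[Point], track2: List[Point]) -> List[int]:
    """Second way: every object that either touch track passes through."""
    hits = {hit(objects, p) for p in track1 + track2}
    return sorted(h for h in hits if h is not None)

# Five objects A-E stacked top to bottom, roughly as in Figs. 6A-6D (coordinates assumed).
objects = [(0, i * 100, 400, i * 100 + 90) for i in range(5)]  # index 0 = A ... 4 = E
print(targets_from_initial_positions(objects, (50, 420), (50, 30)))                 # E and A -> [0, 1, 2, 3, 4]
print(targets_from_tracks(objects, [(50, 420), (50, 320)], [(50, 30), (50, 130)]))  # -> [0, 1, 3, 4]
```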
Of course, the way in which the object deletion method determines the target object is not limited to the above two ways. Those skilled in the art can determine the target object with various predefined rules based on the positions of the touch points in the first touch track and the second touch track, which all fall within the scope of the present invention.
In addition, it should also be pointed out that, as described above, the sensing unit and the display unit are preferably stacked on the surface of the electronic equipment. Of course, the sensing unit may also not be stacked with the display unit on the surface of the electronic equipment. In that case, the object deletion method can determine the target object by presetting a correspondence between the sensing region on the sensing unit and the display region on the display unit, which is likewise included within the scope of the present invention.
To describe the operation of step S102 more clearly and intuitively, reference is made below to the schematic diagrams of Fig. 6. Figs. 6A-6D are schematic diagrams of the display of an electronic equipment to which the object deletion method according to an embodiment of the present invention is applied. As shown in Figs. 6A and 6C, five objects A, B, C, D and E are displayed on the electronic equipment, and the user makes a contraction gesture with the thumb and the index finger. In Figs. 6A and 6C, the arrowed curves indicate the directions of motion of the two fingers. Here it is assumed, for example, that the track formed by the gesture of the thumb is the first touch track and the track formed by the gesture of the index finger is the second touch track.
Fig. 6A illustrates the case of deleting a single object with the contraction gesture, and shows the initial positions of the thumb and the index finger. In the first way described above, the object deletion method judges that the first initial position of the first touch track and the second initial position of the second touch track both correspond to the same object A, and thus determines that the target object is object A. Alternatively, in the second way described above, the object deletion method determines that the object corresponding to the first touch track is object A and that the object corresponding to the second touch track is also object A, so that object A is determined as the target object.
Fig. 6C illustrates the case of deleting multiple objects with the contraction gesture, where the initial positions of the thumb and the index finger correspond to object E and object A, respectively. In addition, it is assumed that the final position of the thumb corresponds to object D and the final position of the index finger corresponds to object B. In the first way described above, the object deletion method judges that the first initial position of the first touch track and the second initial position of the second touch track correspond to different objects, and thus determines object A, object E and all objects arranged in order (from top to bottom in this embodiment) between object A and object E (that is, objects B, C and D) as the target objects. Alternatively, in the second way described above, the object deletion method determines that the objects corresponding to the first touch track are objects E and D and that the objects corresponding to the second touch track are objects A and B, so that objects A, B, D and E are determined as the target objects.
Step S103: generating a delete command according to the first initial position, the second initial position, the first final position and the second final position.
Specifically, the object deletion method calculates the initial distance between the first initial position and the second initial position, and the termination distance between the first final position and the second final position. Thereafter, the object deletion method judges whether the termination distance is less than the initial distance. When the termination distance is less than the initial distance, the object deletion method generates a delete command; that is to say, the delete command is generated when the user makes the contraction gesture described above. When the termination distance is greater than or equal to the initial distance, the object deletion method ends.
In addition, when the termination distance is less than the initial distance, the object deletion method may further judge whether the termination distance is less than a first preset distance, and generate the delete command when the termination distance is less than the first preset distance. The first preset distance can be appropriately determined by those skilled in the art according to actual needs and is not specifically limited here.
Alternatively, in this case, the object deletion method may also skip calculating the initial distance and judging whether the termination distance is less than the initial distance. That is to say, the object deletion method may only calculate the termination distance between the first final position and the second final position, judge whether the termination distance is less than the first preset distance, and generate the delete command when the termination distance is less than the first preset distance.
In addition, when the termination distance is less than the initial distance, the object deletion method may further calculate the distance difference between the initial distance and the termination distance and judge whether the distance difference is greater than a second preset distance. When the distance difference is greater than the second preset distance, the object deletion method generates the delete command; that is to say, the delete command is generated when the user makes the contraction gesture described above and the amplitude of the contraction gesture reaches a certain degree. When the distance difference is less than or equal to the second preset distance, the object deletion method ends. Likewise, the second preset distance can be appropriately determined by those skilled in the art according to actual needs and is not specifically limited here.
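The delete-command test of step S103, including the optional first and second preset distances, can be sketched as follows; the function name, parameter names and default values are illustrative assumptions, not part of the claims.

```python
import math

def distance(p, q):
    """Euclidean distance between two touch positions."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

def should_delete(p1_start, p2_start, p1_end, p2_end,
                  first_preset=None, second_preset=None):
    """Generate a delete command when the gesture is a contraction: the
    termination distance is less than the initial distance, optionally also
    less than the first preset distance, and/or the contraction amplitude
    (initial minus termination distance) exceeds the second preset distance."""
    initial = distance(p1_start, p2_start)
    termination = distance(p1_end, p2_end)
    if termination >= initial:
        return False
    if first_preset is not None and termination >= first_preset:
        return False
    if second_preset is not None and (initial - termination) <= second_preset:
        return False
    return True

print(should_delete((100, 0), (300, 0), (150, 0), (250, 0), second_preset=50))  # True: pinched by 100
```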
Step S104: deleting the target object.
After the delete command for the target object has been generated as described above, the object deletion method deletes the target object at step S104. The object deletion method may permanently delete the target object from the storage unit, or may, for example, delete the target object from the current storage area in the form of a "recycle bin" known to those skilled in the art. For example, as shown in Figs. 6B and 6D, the object deletion method may display a recycle bin icon on the electronic equipment and transfer the target object directly into the recycle bin. Of course, after displaying the recycle bin icon on the electronic equipment, the object deletion method may also not transfer the target object into the "recycle bin" directly, but transfer it into the recycle bin when it is sensed that the position at which the user's gesture ends corresponds to the "recycle bin". The operation by which the object deletion method deletes the target object is known to those skilled in the art and is not described in detail here.
It should be pointed out that, preferably, when deleting the target object, the object deletion method may display a still image or a dynamic image. For example, the still image may be an image of a crumpled paper ball. The dynamic image may depict the process of a sheet of paper carrying a picture being crumpled into a paper ball, so that the user's impression is more intuitive and the experience is richer.
Preferably, the image may also be determined according to the type of the target object. Specifically, in the process of generating the paper ball image, the object deletion method obtains the file attribute of the target object and determines the image according to the file attribute. For example, when the file attribute of the target object obtained by the object deletion method is audio, the object deletion method determines the image to be a musical note image; when the file attribute is video, the image is determined to be a film image. In addition, when the object deletion method determines that multiple target objects are to be deleted, it may obtain the file attribute of each target object separately, count each different file attribute, and determine the image according to the file attribute with the highest count. Alternatively, the object deletion method may obtain the file attribute of each target object separately and determine multiple images according to the obtained multiple file attributes, thereby displaying multiple dynamic or still images of sheets of paper, each carrying one of the images, being crumpled into paper balls. Of course, the object deletion method may also combine the multiple images into one image and display a dynamic or still image of a sheet of paper carrying the combined image being crumpled into a paper ball.
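A minimal sketch of the attribute-to-image selection just described, assuming a hypothetical mapping from file attributes to icons; for multiple targets the most frequent attribute wins, as in the counting scheme above.

```python
from collections import Counter

ICON_FOR_ATTRIBUTE = {  # illustrative mapping only
    "audio": "musical note",
    "video": "film",
}

def deletion_image(target_attributes):
    """Pick the image drawn on the paper ball: for several targets, count
    each file attribute and use the one with the highest count."""
    if not target_attributes:
        return "generic document"
    attribute, _ = Counter(target_attributes).most_common(1)[0]
    return ICON_FOR_ATTRIBUTE.get(attribute, "generic document")

print(deletion_image(["audio", "video", "audio"]))  # -> "musical note"
```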
More preferably, the crease degree and the size of the paper ball can be determined according to the first touch track and the second touch track. For example, the crease degree and the size of the paper ball can be determined according to the distance difference between the initial distance and the termination distance: the larger the distance difference between the initial distance and the termination distance, the greater the crease degree of the paper ball and the smaller its size, and vice versa. Specifically, suppose for example that the distance difference is p cm and the diameter of the generated paper ball image is d cm. In the process of generating the paper ball image, the object deletion method can calculate the distance difference and, based on it, determine the paper ball diameter by the following formula (1):
d = k × p + D    (1)
where k is a scale factor and D is a constant, whose values can be appropriately determined by those skilled in the art according to actual needs and are not specifically limited here.
Alternatively, the object deletion method can calculate the distance difference and, based on it, determine the paper ball diameter by the following formula (2):
d = d_1    (0 ≤ p < P_1)
d = d_2    (P_1 ≤ p < P_2)
...
d = d_n    (P_(n-1) ≤ p ≤ P_n)
d = d_n    (p ≥ P_n)    (2)
where d_1, d_2, ..., d_n are a group of preset paper ball diameter values and P_1, P_2, ..., P_n are a group of preset distance-difference thresholds, whose values can all be determined by those skilled in the art according to actual needs and are not specifically limited here.
After determining the diameter of the paper ball, the object deletion method generates the paper ball according to the diameter. Of course, the object deletion method is not limited to generating the image by formulas (1) and (2) above; those skilled in the art can generate the image in various other ways, all of which fall within the scope of the present invention. In addition, those skilled in the art will appreciate that the operation by which the object deletion method determines the crease degree of the paper ball according to the distance difference is similar to the above and is not described in detail here.
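Formulas (1) and (2) can be sketched as follows; the values of k, D, the thresholds P_i and the diameters d_i are illustrative assumptions (k is taken negative here so that a larger distance difference p gives a smaller paper ball, matching the behaviour described above).

```python
def diameter_linear(p, k=-0.05, D=3.0):
    """Formula (1): d = k * p + D, with assumed k and D in centimetres."""
    return k * p + D

def diameter_stepped(p, thresholds=(1.0, 2.0, 4.0), diameters=(3.0, 2.0, 1.0)):
    """Formula (2): pick the preset diameter d_i for the interval that
    contains the distance difference p; any p >= P_n keeps d_n."""
    for P_i, d_i in zip(thresholds, diameters):
        if p < P_i:
            return d_i
    return diameters[-1]

print(diameter_linear(10.0))   # larger pinch -> smaller ball
print(diameter_stepped(2.5))   # falls in [P_2, P_3) -> d_3 = 1.0
```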
In addition, preferably, a target object deleted by being transferred into the "recycle bin" can be restored to its former storage area. Specifically, when, under the "recycle bin" directory, the object deletion method senses a first touch track and a second touch track and thereby obtains the first initial position and the first final position of the first touch track and the second initial position and the second final position of the second touch track, it determines the target object in the same manner as in step S102 above, which is not described in detail here. Thereafter, the object deletion method generates a restore command in a manner opposite to the operation of step S103. Specifically, the object deletion method calculates the initial distance between the first initial position and the second initial position and the termination distance between the first final position and the second final position, and then judges whether the termination distance is greater than the initial distance. When the termination distance is greater than the initial distance, the object deletion method generates a restore command; that is to say, the restore command is generated when the user makes an expansion gesture opposite to the contraction gesture described above. When the termination distance is less than or equal to the initial distance, the object deletion method ends.
In addition, when the termination distance is greater than the initial distance, the object deletion method may further judge whether the termination distance is greater than a third preset distance, and generate the restore command when the termination distance is greater than the third preset distance. The third preset distance can be appropriately determined by those skilled in the art according to actual needs and is not specifically limited here.
Alternatively, in this case, the object deletion method may also skip calculating the initial distance and judging whether the termination distance is greater than the initial distance. That is to say, the object deletion method may only calculate the termination distance between the first final position and the second final position, judge whether the termination distance is greater than the third preset distance, and generate the restore command when the termination distance is greater than the third preset distance.
In addition, when the termination distance is greater than the initial distance, the object deletion method may further calculate the distance difference between the termination distance and the initial distance and judge whether the distance difference is greater than a fourth preset distance. When the distance difference is greater than the fourth preset distance, the object deletion method generates the restore command; that is to say, the restore command is generated when the user makes the expansion gesture described above and the amplitude of the expansion gesture reaches a certain degree. When the distance difference is less than or equal to the fourth preset distance, the object deletion method ends. Likewise, the fourth preset distance can be appropriately determined by those skilled in the art according to actual needs and is not specifically limited here.
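The restore-command test mirrors the delete-command test with the comparisons reversed; the sketch below, with assumed parameter names for the third and fourth preset distances, is illustrative only.

```python
import math

def should_restore(p1_start, p2_start, p1_end, p2_end,
                   third_preset=None, fourth_preset=None):
    """Generate a restore command for an expansion gesture inside the recycle
    bin: the termination distance exceeds the initial distance, optionally
    also exceeds the third preset distance, and/or the expansion amplitude
    (termination minus initial distance) exceeds the fourth preset distance."""
    initial = math.hypot(p1_start[0] - p2_start[0], p1_start[1] - p2_start[1])
    termination = math.hypot(p1_end[0] - p2_end[0], p1_end[1] - p2_end[1])
    if termination <= initial:
        return False
    if third_preset is not None and termination <= third_preset:
        return False
    if fourth_preset is not None and (termination - initial) <= fourth_preset:
        return False
    return True

print(should_restore((150, 0), (250, 0), (100, 0), (300, 0), fourth_preset=50))  # True: spread by 100
```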
Of course, in the process of restoring an object, the object deletion method may likewise display a still image or a dynamic image. For example, the still image may be an image of a flattened sheet of paper, and the dynamic image may depict the process of a paper ball being unfolded into a flat sheet of paper, so that the user's impression is more intuitive and the experience is richer. The specific operations are similar to the above and are not described in detail here.
The object deletion method according to the embodiment of the present invention has thus been described. In the object deletion method according to the embodiment of the present invention, the first initial position and the first final position of the first touch track and the second initial position and the second final position of the second touch track are obtained; the target object is determined according to the first initial position and the second initial position, or according to the first final position and the second final position; and a delete command is generated according to the first initial position, the second initial position, the first final position and the second final position, so that the target object is deleted. With the object deletion method, a user can delete objects with a simple and intuitive operation, which greatly improves and enriches the user experience.
In addition, with the object deletion method of the embodiment of the present invention, the user can not only delete objects with a simple and intuitive operation, but also restore the deleted objects with an equally simple and intuitive operation, which further improves and enriches the user experience.
In addition, with the object deletion method of the embodiment of the present invention, the user can not only delete one object at a time with a simple and intuitive operation, but also delete multiple objects at a time with this operation, which further improves and enriches the user experience.
The object deletion method according to the embodiment of the present invention has been described above. The electronic equipment according to the embodiment of the present invention is described below with reference to Figs. 2-5.
Fig. 2 is a block diagram illustrating the main configuration of the electronic equipment according to an embodiment of the present invention. As shown in Fig. 2, the electronic equipment 200 according to the embodiment of the present invention comprises a sensing unit 201, an object determining unit 202, a command generation unit 203 and a delete unit 204.
The sensing unit 201 senses a first touch track and a second touch track to obtain a first initial position and a first final position of the first touch track and a second initial position and a second final position of the second touch track. Specifically, when the user wants to delete an object displayed on the electronic equipment, the user makes, with at least two pointing objects (in the following, the user's fingers are taken as an example), the contraction gesture described above with reference to the object deletion method of Fig. 1. In the first case described above, the sensing result obtained by the sensing unit 201 is that the first touch track formed by the first gesture comprises only one touch point, while the second touch track formed by the second gesture comprises multiple touch points; that is to say, in this case the first initial position and the first final position are identical. In the second case described above, the sensing result obtained by the sensing unit 201 is that both the first touch track formed by the first gesture and the second touch track formed by the second gesture comprise multiple touch points. It should be pointed out that in both cases the first touch track and the second touch track may be formed at different timings; preferably, they are formed simultaneously.
The object determining unit 202 determines a target object according to the first initial position and the second initial position, or according to the first final position and the second final position. Specifically, the object determining unit 202 can determine the target object with two configurations, which are described in detail below with reference to Figs. 3 and 4.
In the first configuration, as shown in Fig. 3, the electronic equipment 300 comprises the same sensing unit 201, command generation unit 203 and delete unit 204 as in Fig. 2, which are not described in detail here. The object determining unit 202 in the electronic equipment 300 is described in detail below. The object determining unit 202 comprises: a fourth judging unit 301 that judges whether the first initial position and the second initial position correspond to the same object; and a first determining unit 302 that, when it is judged that the first initial position and the second initial position correspond to the same object, determines that object as the target object. In this case the target object is a single object. In addition, the object determining unit 202 further comprises a second determining unit 303 that, when it is judged that the first initial position and the second initial position do not correspond to the same object, that is, when the first initial position corresponds to a first object, the second initial position corresponds to a second object, and the first object is different from the second object, determines the first object, the second object and all objects arranged in order between the first object and the second object as the target objects. In this case the target objects are multiple objects.
It should be pointed out that the above describes the object determining unit 202 determining the target object according to the first initial position and the second initial position. Of course, the object determining unit 202 may also determine the target object according to the first final position and the second final position: the fourth judging unit 301 judges whether the first final position and the second final position correspond to the same object. When they correspond to the same object, the first determining unit 302 determines that object as the target object, and the target object is a single object. When they do not correspond to the same object, that is, when the first final position corresponds to a first object, the second final position corresponds to a second object, and the first object is different from the second object, the second determining unit 303 determines the first object, the second object and all objects arranged in order between them as the target objects, and the target objects are multiple objects.
In the second configuration, as shown in Fig. 4, the electronic equipment 400 comprises the same sensing unit 201, command generation unit 203 and delete unit 204 as in Fig. 2, which are not described in detail here. The object determining unit 202 in the electronic equipment 400 is described in detail below. The object determining unit 202 comprises: a third determining unit 401 that determines the object corresponding to the first touch track and the object corresponding to the second touch track; and a fourth determining unit 402 that determines the object corresponding to the first touch track and the object corresponding to the second touch track as the target objects. Specifically, from all the touch points included in the first touch track, the third determining unit 401 can determine the objects corresponding to those touch points; similarly, from all the touch points included in the second touch track, the third determining unit 401 can determine the objects corresponding to those touch points. Thereafter, the fourth determining unit 402 determines the objects corresponding to the first touch track and the objects corresponding to the second touch track as the target objects. Figuratively speaking, the fourth determining unit 402 determines all the objects that the first touch track "passes through" and all the objects that the second touch track "passes through" as the target objects.
Of course, the object determining unit 202 is not limited to determining the target object with the above two configurations. Those skilled in the art can determine the target object with various predefined rules based on the positions of the touch points in the first touch track and the second touch track, which all fall within the scope of the present invention.
The command generation unit 203 generates a delete command according to the first initial position, the second initial position, the first final position and the second final position. Specifically, as shown in Figs. 5A and 5B, the command generation unit 203 comprises: a first calculation unit 501 that calculates the initial distance between the first initial position and the second initial position; a second calculation unit 502 that calculates the termination distance between the first final position and the second final position; a first judging unit 503 that judges whether the termination distance is less than the initial distance; and a first generation unit 504 that generates the delete command when the termination distance is less than the initial distance. That is to say, the first generation unit 504 generates the delete command when the user makes the contraction gesture described above; when the termination distance is greater than or equal to the initial distance, the first generation unit 504 does not generate the delete command.
In addition, the first generation unit 504 may further comprise: a second judging unit 5041 that judges whether the termination distance is less than a first preset distance; and a second generation unit 5042 that generates the delete command when the termination distance is less than the first preset distance.
Alternatively, as shown in Fig. 5B, the first generation unit 504 may further comprise: a third calculation unit 5043 that calculates the distance difference between the initial distance and the termination distance; a third judging unit 5044 that judges whether the distance difference is greater than a second preset distance; and a third generation unit 5045 that generates the delete command when the distance difference is greater than the second preset distance.
The first preset distance and the second preset distance are appropriately determined by those skilled in the art according to actual needs and are not specifically limited here.
The delete unit 204 deletes the target object. The delete unit 204 may permanently delete the target object from the storage unit, or may, for example, delete the target object from the current storage area in the form of a "recycle bin" known to those skilled in the art. The operation by which the delete unit 204 deletes the target object is known to those skilled in the art and is not described in detail here.
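How the four units of the device embodiment fit together can be sketched as below; the class name, attribute names and callable signatures are assumptions introduced for illustration, not the patent's terminology, and each unit is represented by a plain callable.

```python
from dataclasses import dataclass
from typing import Callable, List, Tuple

Track = List[Tuple[float, float]]

@dataclass
class ElectronicEquipment:
    """Composition sketch: sensing unit -> object determining unit ->
    command generation unit -> delete unit."""
    sense: Callable[[], Tuple[Track, Track]]            # sensing unit 201
    determine_targets: Callable[[Track, Track], list]   # object determining unit 202
    generate_delete: Callable[[Track, Track], bool]     # command generation unit 203
    delete: Callable[[list], None]                      # delete unit 204

    def on_gesture(self) -> None:
        track1, track2 = self.sense()
        targets = self.determine_targets(track1, track2)
        if self.generate_delete(track1, track2):
            self.delete(targets)
```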
In addition, preferably, a target object deleted by being transferred into the "recycle bin" can be restored to its former storage area. Specifically, the electronic equipment can sense a first touch track and a second touch track by means of the sensing unit 201, thereby obtaining the first initial position and the first final position of the first touch track and the second initial position and the second final position of the second touch track, and determine the target object by means of the object determining unit 202, which is not described in detail here. Thereafter, the command generation unit 203 calculates the initial distance between the first initial position and the second initial position and the termination distance between the first final position and the second final position, and then judges whether the termination distance is greater than the initial distance. When the termination distance is greater than the initial distance, the command generation unit 203 generates a restore command; that is to say, the restore command is generated when the user makes the expansion gesture opposite to the contraction gesture described above. When the termination distance is less than or equal to the initial distance, the command generation unit 203 does not generate the restore command.
In addition, when the termination distance is greater than the initial distance, the command generation unit 203 may further judge whether the termination distance is greater than a third preset distance, and generate the restore command when the termination distance is greater than the third preset distance. The third preset distance is appropriately determined by those skilled in the art according to actual needs and is not specifically limited here.
Alternatively, in this case, the command generation unit 203 may also skip calculating the initial distance and judging whether the termination distance is greater than the initial distance. That is to say, the command generation unit 203 may only calculate the termination distance between the first final position and the second final position, judge whether the termination distance is greater than the third preset distance, and generate the restore command when the termination distance is greater than the third preset distance.
In addition, when the termination distance is greater than the initial distance, the command generation unit 203 may further calculate the distance difference between the termination distance and the initial distance and judge whether the distance difference is greater than a fourth preset distance. When the distance difference is greater than the fourth preset distance, the command generation unit 203 generates the restore command; that is, the restore command is generated when the user makes the expansion gesture described above and its amplitude reaches a certain degree. When the distance difference is less than or equal to the fourth preset distance, the command generation unit 203 does not generate the restore command. Likewise, the fourth preset distance is appropriately determined by those skilled in the art according to actual needs and is not specifically limited here.
The electronic equipment according to the embodiment of the present invention has thus been described. With the electronic equipment, a user can delete objects with a simple and intuitive operation, which greatly improves and enriches the user experience.
In addition, with the electronic equipment of the embodiment of the present invention, the user can not only delete objects with a simple and intuitive operation, but also restore the deleted objects with an equally simple and intuitive operation, which further improves and enriches the user experience.
In addition, with the electronic equipment of the embodiment of the present invention, the user can not only delete one object at a time with a simple and intuitive operation, but also delete multiple objects at a time with this operation, which further improves and enriches the user experience.
The electronic equipment and the object deletion method thereof according to the embodiments of the present invention have been described above with reference to Figs. 1 to 6.
It should be noted that in this specification the terms "comprise", "include" or any other variant thereof are intended to cover non-exclusive inclusion, so that a process, method, article or device that includes a series of elements includes not only those elements but also other elements not expressly listed, or elements inherent to such a process, method, article or device. Without further limitation, an element defined by the statement "comprising a ..." does not exclude the presence of other identical elements in the process, method, article or device that includes the element.
Finally, it should also be noted that the above series of processes includes not only processes performed in time series in the order described here, but also processes performed in parallel or individually rather than in chronological order.
Through the above description of the embodiments, those skilled in the art can clearly understand that the present invention can be implemented by software plus the necessary hardware platform, or, of course, entirely by hardware. Based on such an understanding, all or part of the contribution of the technical solution of the present invention over the background art can be embodied in the form of a software product. The computer software product can be stored in a storage medium, such as a ROM/RAM, a magnetic disk or an optical disc, and includes a number of instructions for causing a computer device (which may be a personal computer, a server, a network device or the like) to perform the method described in each embodiment of the present invention or in some parts of the embodiments.
The present invention has been described in detail above. Specific examples have been used herein to explain the principles and embodiments of the present invention, and the description of the above embodiments is only intended to help understand the method of the present invention and its core idea. Meanwhile, for those of ordinary skill in the art, changes may be made in the specific embodiments and the application scope according to the idea of the present invention. In summary, the contents of this description should not be construed as limiting the present invention.

Claims (9)

1. An object deletion method, comprising:
sensing a first touch track and a second touch track to obtain a first initial position and a first final position of the first touch track and a second initial position and a second final position of the second touch track;
determining a target object according to the first initial position and the second initial position, or according to the first final position and the second final position;
generating a delete command according to the first initial position, the second initial position, the first final position and the second final position; and
deleting the target object;
wherein determining the target object comprises one of the following three modes:
judging whether the first initial position and the second initial position correspond to a same object, and, when the first initial position and the second initial position are judged to correspond to the same object, determining that object as the target object; or
judging whether the first initial position and the second initial position correspond to a same object, and, when the first initial position is judged to correspond to a first object, the second initial position is judged to correspond to a second object, and the first object and the second object are different, determining the first object, the second object and all objects arranged in order between the first object and the second object as target objects; or
determining the object corresponding to the first touch track and the object corresponding to the second touch track, and determining the object corresponding to the first touch track and the object corresponding to the second touch track as target objects.
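Read as pseudocode, the three modes of claim 1 might look like the sketch below. This is only an illustration under assumed data structures: ScreenObject, object_at and determine_target_objects are hypothetical names, and the on-screen objects are assumed to be held in an ordered list; the claim itself does not prescribe any particular code. Mode 3 simply takes, as target objects, the object each touch track itself corresponds to.

```python
from dataclasses import dataclass

@dataclass
class ScreenObject:
    name: str
    bounds: tuple  # (left, top, right, bottom)

    def contains(self, position):
        x, y = position
        left, top, right, bottom = self.bounds
        return left <= x <= right and top <= y <= bottom

def object_at(position, objects):
    """Hypothetical hit test: return the object whose bounds contain position."""
    for obj in objects:
        if obj.contains(position):
            return obj
    return None

def determine_target_objects(first_initial, second_initial, objects):
    """Modes 1 and 2: resolve the target object(s) from the two initial positions."""
    first_obj = object_at(first_initial, objects)
    second_obj = object_at(second_initial, objects)
    if first_obj is None or second_obj is None:
        return []
    if first_obj is second_obj:
        # Mode 1: both touch tracks start on the same object.
        return [first_obj]
    # Mode 2: both objects plus every object arranged in order between them.
    i, j = objects.index(first_obj), objects.index(second_obj)
    lo, hi = min(i, j), max(i, j)
    return objects[lo:hi + 1]
```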
2. The object deletion method as claimed in claim 1, wherein the first touch track and the second touch track are formed simultaneously.
3. The object deletion method as claimed in claim 1, wherein generating the delete command comprises:
calculating an initial distance between the first initial position and the second initial position;
calculating a termination distance between the first final position and the second final position;
judging whether the termination distance is less than the initial distance; and
generating the delete command when the termination distance is less than the initial distance.
4. The object deletion method as claimed in claim 3, wherein generating the delete command when the termination distance is less than the initial distance comprises:
judging whether the termination distance is less than a first preset distance; and
generating the delete command when the termination distance is less than the first preset distance.
5. The object deletion method as claimed in claim 3, wherein generating the delete command when the termination distance is less than the initial distance comprises:
calculating a distance difference between the initial distance and the termination distance;
judging whether the distance difference is greater than a second preset distance; and
generating the delete command when the distance difference is greater than the second preset distance.
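As a rough illustration of claims 3 to 5, the pinch-in check could be expressed as follows. This is a sketch under assumed names (should_generate_delete_command is hypothetical, and first_preset_distance and second_preset_distance stand for the claims' preset distances); the distance between positions is again assumed to be Euclidean.

```python
import math

def distance(p1, p2):
    """Euclidean distance between two (x, y) touch positions."""
    return math.hypot(p1[0] - p2[0], p1[1] - p2[1])

def should_generate_delete_command(first_initial, second_initial,
                                   first_final, second_final,
                                   first_preset_distance=None,
                                   second_preset_distance=None):
    """Claim 3: generate the delete command when the tracks end closer
    together than they began.  Claims 4 and 5 optionally tighten the test."""
    initial_distance = distance(first_initial, second_initial)
    termination_distance = distance(first_final, second_final)
    if termination_distance >= initial_distance:
        return False
    if first_preset_distance is not None:
        # Claim 4: the final positions must also be close in absolute terms.
        return termination_distance < first_preset_distance
    if second_preset_distance is not None:
        # Claim 5: the pinch must have closed by a large enough margin.
        return (initial_distance - termination_distance) > second_preset_distance
    return True
```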
6. Electronic equipment, comprising:
a sensing unit that senses a first touch track and a second touch track to obtain a first initial position and a first final position of the first touch track and a second initial position and a second final position of the second touch track;
an object determining unit that determines a target object according to the first initial position and the second initial position, or according to the first final position and the second final position;
an order generation unit that generates a delete command according to the first initial position, the second initial position, the first final position and the second final position; and
a deletion unit that deletes the target object;
wherein the object determining unit takes one of the following three forms:
the object determining unit comprises a fourth judging unit that judges whether the first initial position and the second initial position correspond to a same object, and a first determining unit that, when the first initial position and the second initial position are judged to correspond to the same object, determines that object as the target object;
the object determining unit comprises: a fourth judging unit that judges whether the first initial position and the second initial position correspond to a same object; and a second determining unit that, when the first initial position is judged to correspond to a first object, the second initial position is judged to correspond to a second object, and the first object and the second object are different, determines the first object, the second object and all objects arranged in order between the first object and the second object as target objects;
the object determining unit comprises: a third determining unit that determines the object corresponding to the first touch track and the object corresponding to the second touch track; and a fourth determining unit that determines the object corresponding to the first touch track and the object corresponding to the second touch track as target objects.
7. The electronic equipment as claimed in claim 6, wherein the order generation unit comprises:
a first calculating unit that calculates an initial distance between the first initial position and the second initial position;
a second calculating unit that calculates a termination distance between the first final position and the second final position;
a first judging unit that judges whether the termination distance is less than the initial distance; and
a first generation unit that generates the delete command when the termination distance is less than the initial distance.
8. The electronic equipment as claimed in claim 7, wherein the first generation unit comprises:
a second judging unit that judges whether the termination distance is less than a first preset distance; and
a second generation unit that generates the delete command when the termination distance is less than the first preset distance.
9. The electronic equipment as claimed in claim 7, wherein the first generation unit comprises:
a third calculating unit that calculates a distance difference between the initial distance and the termination distance;
a third judging unit that judges whether the distance difference is greater than a second preset distance; and
a third generation unit that generates the delete command when the distance difference is greater than the second preset distance.
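The device claims mirror the method claims as cooperating units. A minimal structural sketch, reusing the hypothetical determine_target_objects and should_generate_delete_command helpers from the sketches above, might wire the units together as follows; it illustrates the division of labour only and is not the claimed implementation.

```python
class ObjectDeletionController:
    """Sketch of claims 6 and 7: a sensing step feeds an object determining
    step, an order generation step and a deletion step."""

    def __init__(self, objects, first_preset_distance):
        self.objects = objects                        # ordered on-screen objects
        self.first_preset_distance = first_preset_distance

    def on_two_finger_gesture(self, first_initial, first_final,
                              second_initial, second_final):
        # Object determining unit: pick the target object(s).
        targets = determine_target_objects(first_initial, second_initial,
                                           self.objects)
        # Order generation unit: decide whether a delete command is generated.
        delete = should_generate_delete_command(
            first_initial, second_initial, first_final, second_final,
            first_preset_distance=self.first_preset_distance)
        # Deletion unit: delete the target object(s).
        if delete and targets:
            for obj in targets:
                self.objects.remove(obj)
            return targets
        return []
```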
CN201010605710.9A 2010-12-24 2010-12-24 Electronic instrument and object deletion method thereof Active CN102568403B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201010605710.9A CN102568403B (en) 2010-12-24 2010-12-24 Electronic instrument and object deletion method thereof

Publications (2)

Publication Number Publication Date
CN102568403A CN102568403A (en) 2012-07-11
CN102568403B (en) 2014-06-04

Family

ID=46413685

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201010605710.9A Active CN102568403B (en) 2010-12-24 2010-12-24 Electronic instrument and object deletion method thereof

Country Status (1)

Country Link
CN (1) CN102568403B (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103019592B (en) * 2012-11-30 2017-04-05 北京小米科技有限责任公司 A kind of method of selection interface mark, device and mobile terminal
CN103870165B (en) * 2012-12-17 2017-06-27 联想(北京)有限公司 A kind of information processing method and electronic equipment
CN104346033B (en) * 2013-08-09 2017-09-22 联想(北京)有限公司 A kind of information processing method and electronic equipment
CN104407784A (en) * 2013-10-29 2015-03-11 贵阳朗玛信息技术股份有限公司 Information transmitting and presenting method and device
US10095389B2 (en) * 2014-08-22 2018-10-09 Business Objects Software Ltd. Gesture-based on-chart data filtering
CN106066755A (en) * 2016-05-30 2016-11-02 乐视控股(北京)有限公司 A kind of control method and device
CN106598466A (en) * 2016-12-20 2017-04-26 珠海市魅族科技有限公司 Information list control method and system

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101086565A (en) * 2006-02-24 2007-12-12 三星电子株式会社 Display device and method of controlling touch detection unit
CN101458591A (en) * 2008-12-09 2009-06-17 三星电子(中国)研发中心 Mobile phone input system with multi-point touch screen hardware structure
CN101464749A (en) * 2008-10-03 2009-06-24 友达光电股份有限公司 Method for processing touch control type input signal, its processing apparatus and computer system

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2007086386A1 (en) * 2006-01-27 2007-08-02 Matsushita Electric Industrial Co., Ltd. Device with touch sensor
TWM361674U (en) * 2009-02-19 2009-07-21 Sentelic Corp Touch control module

Also Published As

Publication number Publication date
CN102568403A (en) 2012-07-11

Similar Documents

Publication Publication Date Title
CN102568403B (en) Electronic instrument and object deletion method thereof
US20220326817A1 (en) User interfaces for playing and managing audio items
CN107797658B (en) Equipment, method and graphic user interface for tactile mixing
KR102000253B1 (en) Device, method, and graphical user interface for navigating user interface hierachies
KR101755029B1 (en) Device, method, and graphical user interface for forgoing generation of tactile output for a multi-contact gesture
KR101933201B1 (en) Input device and user interface interactions
KR101958517B1 (en) Device, method, and graphical user interface for transitioning between touch input to display output relationships
JP6359820B2 (en) Coding dynamic haptic effects
CN105264477B (en) Equipment, method and graphic user interface for mobile user interface object
AU2014221602B2 (en) Apparatus and method for providing haptic feedback to input unit
TWI431510B (en) Terminal and method for performing function therein
CN107402906B (en) Dynamic content layout in grid-based applications
JP2019079574A (en) Information processing apparatus, information processing method, and computer program
US10331297B2 (en) Device, method, and graphical user interface for navigating a content hierarchy
TW201232354A (en) Electronic device and information display method thereof
TW201227460A (en) Method and apparatus for providing different user interface effects for different implementation characteristics of a touch event
CN102541319A (en) Electronic equipment and display processing method thereof
CN101923423A (en) Graphical operation interface of electronic device and implementation method thereof
CN102541399A (en) Electronic equipment and display switching method thereof
CN105378769A (en) Navigating a calendar
CN105094620A (en) Information browsing method and mobile terminal
CN106896998A (en) A kind of processing method and processing device of operation object
CN107690614A (en) Movement between multiple views
CN103034419A (en) Method for classlessly zooming digital map on mobile equipment
CN102799343A (en) Method and device for displaying index icon content

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant