CN101685343B - Method, device and electronic aid for realizing gesture identification - Google Patents


Info

Publication number
CN101685343B
CN101685343B · CN2008102230572A · CN200810223057A
Authority
CN
China
Prior art keywords
gesture
described
corresponding
instruction
new
Prior art date
Application number
CN2008102230572A
Other languages
Chinese (zh)
Other versions
CN101685343A (en
Inventor
李艳
李雪莲
Original Assignee
联想(北京)有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 联想(北京)有限公司
Priority to CN2008102230572A
Publication of CN101685343A
Application granted
Publication of CN101685343B


Abstract

The invention discloses a method, a device and an electronic device for realizing gesture recognition. The method comprises the following steps: executing the instruction corresponding to a first gesture; and, during a first process of executing the instruction corresponding to the first gesture, judging whether a new gesture is produced; if so, executing the instruction corresponding to the new gesture, and if not, continuing to execute the instruction corresponding to the first gesture, wherein the new gesture is obtained during the first process of executing the first instruction. The device comprises an execution unit and a first judging unit. The invention lets users experience continuous operation, reduces the users' operation burden, saves the users' time and improves operation efficiency, bringing users a natural and vivid operating experience.

Description

Method and device for realizing gesture recognition

Technical field

The present invention relates to the field of gesture recognition technology, and in particular to a method and device for realizing gesture recognition.

Background technology

In gesture recognition technology, each gesture corresponds to an instruction. If the user's gesture changes while the system is executing the current instruction, so that another instruction must be executed, the system must abandon the current instruction and then execute the other one. For example, while browsing pictures, the user clicks the "next" button, and the system executes the next-picture instruction in response to the click. If, while browsing, the user wants to delete a picture, the user must move a finger from the "next" button to the "delete" button, click the "delete" button so that the system executes the delete instruction, and finally confirm the system's execution of the delete instruction.

Alternatively, when browsing pictures, the user performs the "next" operation with a horizontal sliding gesture. If, while flipping through the current photo, the user suddenly wants to delete it, the user must first interrupt the current horizontal sliding gesture, abandon the current operation, and then perform the delete gesture instead.

In the course of research, the inventors found that existing gesture recognition technology has at least the following shortcoming: the user's operating experience is discontinuous. Specifically, when the user switches gestures, some unnecessary operations are required, such as clicking the "delete" button and confirming the system's execution of the delete instruction as described above, or interrupting the current horizontal sliding gesture and performing the delete gesture anew. These unnecessary operations not only increase the user's operation burden but also waste the user's time and reduce operation efficiency.

Summary of the invention

In view of this, the present invention provides a method and device for realizing gesture recognition, so that the user experiences continuous operation.

A method for realizing gesture recognition, the method comprising:

executing the instruction corresponding to a first gesture;

during a first process of executing the instruction corresponding to the first gesture, judging whether a new gesture is produced; if so, executing the instruction corresponding to the new gesture; otherwise, continuing to execute the instruction corresponding to the first gesture, wherein the new gesture is obtained by continuous performance of the first gesture.
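The judging step above can be sketched in code. This is a minimal illustration only, under the assumption that "executing an instruction" proceeds in small steps between which a new gesture may be detected; the function and callback names are invented, not taken from the patent:

```python
# Illustrative sketch: while the instruction for the first gesture runs,
# each step checks whether a new gesture has been produced and, if so,
# switches to that gesture's instruction without any extra user action.
def run_gesture(first_gesture, get_new_gesture, execute_step, max_steps=100):
    """Execute the instruction for `first_gesture`, switching to a new
    gesture's instruction as soon as one is detected."""
    current = first_gesture
    for _ in range(max_steps):
        new_gesture = get_new_gesture()   # returns None when no new gesture
        if new_gesture is not None:
            current = new_gesture         # switch mid-execution
        execute_step(current)             # one step of the current instruction
    return current
```

A caller would supply `get_new_gesture` from whatever sensor input is available; the point is simply that the switch happens inside the execution loop rather than after an explicit interrupt.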

Optionally, during the first process, or during a second process of executing the instruction corresponding to the new gesture, the method further comprises:

displaying the current interface feedback corresponding to the current state of the first gesture or the new gesture, according to a first correspondence table between gesture states and interface feedback, wherein the gesture states are the various states of the current gesture; the current gesture is a gesture performing a page turn, and its various states comprise: one corner of the page lifted, a larger part lifted, and fully open.

Optionally, before executing the instruction corresponding to the first gesture, or before executing the instruction corresponding to the new gesture, the method further comprises:

judging, according to a second correspondence table between gestures and instructions, whether the second correspondence table includes a gesture corresponding to the first gesture or the new gesture;

when the second correspondence table includes the gesture corresponding to the first gesture or the new gesture, executing the instruction corresponding to that gesture.
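The lookup in the second correspondence table can be sketched as a simple dictionary check. The table contents here are illustrative assumptions (the patent's example gestures are a page turn and a delete), not the patent's actual data structures:

```python
# Hypothetical gesture -> instruction correspondence table
# (the "second corresponding table" of the method).
GESTURE_TO_INSTRUCTION = {
    "A": "turn_page",            # assumed mapping for gesture A
    "B": "delete_current_page",  # assumed mapping for gesture B
}

def instruction_for(gesture):
    """Return the instruction for `gesture`, or None when the table
    contains no corresponding gesture (in which case nothing runs)."""
    return GESTURE_TO_INSTRUCTION.get(gesture)
```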

Preferably, the first gesture and the new gesture are obtained by a sensor capturing the user's operation.

Preferably, the sensor is a touch screen, a camera or a stylus.

A device for realizing gesture recognition, the device comprising:

an execution unit, configured to execute the instruction corresponding to a first gesture;

a first judging unit, configured to judge, during a first process in which the execution unit executes the instruction corresponding to the first gesture, whether a new gesture is produced; if so, the execution unit executes the instruction corresponding to the new gesture; otherwise, the execution unit continues to execute the instruction corresponding to the first gesture, wherein the new gesture is obtained by continuous performance of the first gesture.

Optionally, the device further comprises:

a first checking unit, configured to check, during the first process, or during a second process in which the execution unit executes the instruction corresponding to the new gesture, the first correspondence table between gesture states and interface feedback, wherein the gesture states are the various states of the current gesture; the current gesture is a gesture performing a page turn, and its various states comprise: one corner of the page lifted, a larger part lifted, and fully open;

a display unit, configured to display, according to the check performed by the first checking unit and the current state of the first gesture or the new gesture, the current interface feedback corresponding to the current state.

Optionally, the device further comprises:

a second checking unit, configured to check the second correspondence table between gestures and instructions;

a second judging unit, configured to judge, according to the check performed by the second checking unit, whether the second correspondence table includes a gesture corresponding to the first gesture or the new gesture;

when the second correspondence table includes the gesture corresponding to the first gesture or the new gesture, the execution unit executes the instruction corresponding to that gesture.

Preferably, the first gesture and the new gesture are obtained by a sensor capturing the user's operation.

Preferably, the sensor is a touch screen, a camera or a stylus.

As can be seen, because the new gesture is obtained by continuous performance of the first gesture, the system can judge, while executing the instruction corresponding to the first gesture, whether a new gesture is produced; if so, the system performs the corresponding operation according to the new gesture without requiring any unnecessary operations from the user. The user thus experiences continuous operation, the user's operation burden is reduced, the user's time is saved, and operation efficiency is improved.

In addition, because the various gesture states of the first gesture or the new gesture all receive interface feedback in real time, the user is given a natural and vivid operating experience.

Description of drawings

Fig. 1 is a flowchart of the method of the present invention;

Fig. 2 is a flowchart of a specific embodiment of the method of the present invention;

Fig. 3 is a schematic diagram of gesture A and gesture B;

Fig. 4 is a structural diagram of the device of the present invention.

Embodiment

To make the above features and advantages of the present invention more apparent, the present invention is described in detail below in conjunction with embodiments.

Please refer to Fig. 1, a flowchart of the method of the present invention, which may comprise:

Step 101: executing the instruction corresponding to a first gesture;

Step 102: during a first process of executing the instruction corresponding to the first gesture, judging whether a new gesture is produced; if so, executing the instruction corresponding to the new gesture; otherwise, continuing to execute the instruction corresponding to the first gesture, wherein the new gesture is obtained during the first process of executing the first instruction.

Each step shown in Fig. 1 is described in detail below in conjunction with a specific embodiment of the method.

Gesture recognition technology requires a gesture-and-instruction correspondence table, so that the system can look up the table according to the user's gesture and perform the operation corresponding to that gesture. Please refer to Table 1, a gesture-and-instruction correspondence table. In Table 1, it can be set in advance that gesture B is obtained by continuous performance of gesture A, and that continuous performance of gesture C yields a subsequent gesture. It should be pointed out that the predefined gestures in Table 1 are not limited to this; for example, it can also be preset that gesture C is obtained by continuous performance of gesture B.

In this specific embodiment of the method, a gesture-state-and-interface-feedback correspondence table is also established. A gesture state is one of the various states of the current gesture; for example, a continuously performed page-turn gesture can be given several different states, including: one corner of the page just lifted, a gradually larger part lifted, fully open, and so on. The more gesture states there are, the more interface feedback there is; the interface feedback displays the various gesture states so that the user's visual experience is more natural and vivid. For the gesture-state-and-interface-feedback correspondence table, please refer to Table 2.
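The gesture-state-to-feedback mapping described above can be sketched as a small lookup. The state and feedback names here are illustrative assumptions for the page-turn example (the patent names the states but not the feedback identifiers):

```python
# Hypothetical gesture-state -> interface-feedback table for a page-turn
# gesture: one corner lifted, a larger part lifted, fully open.
STATE_TO_FEEDBACK = {
    "A1": "render_corner_lifted",   # one corner of the page lifted
    "A2": "render_partially_open",  # a larger part of the page lifted
    "A3": "render_fully_open",      # page fully open
}

def feedback_for(state):
    # States not in the table produce no feedback; execution continues.
    return STATE_TO_FEEDBACK.get(state)
```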

Table 1 (gestures and instructions): gesture A → instruction a; gesture B → instruction b. Table 2 (gesture states and interface feedback): A1 → a1; A2 → a2; A3 → a3.

In Table 2, gesture states A1, A2 and A3 are the various states into which gesture A of Table 1 is divided; A1 produces interface feedback a1, A2 produces a2, and A3 produces a3. Likewise, gestures B and C can each have various states and produce multiple kinds of interface feedback according to those states.

The specific embodiment of the method may comprise the following steps:

Step 201: the user inputs a gesture on the touch screen;

Step 202: the system checks the gesture-and-instruction correspondence table;

Step 203: the system judges whether the gesture currently input by the user is included in the gesture-and-instruction correspondence table; if so, proceed to step 204; otherwise, end the flow;

Step 204: the system executes the instruction corresponding to the gesture currently input by the user;

Step 205: during step 204, the system judges whether the user has input a new gesture; if so, return to step 202 with the user's new gesture as the currently input gesture; otherwise, proceed to step 206;

Step 206: the system checks the gesture-state-and-interface-feedback correspondence table in real time and judges whether the gesture state of the current gesture has changed; if so, proceed to step 207; otherwise, continue executing the instruction corresponding to the current gesture;

Step 207: the system checks whether the changed gesture state of the current gesture is included in the gesture-state-and-interface-feedback correspondence table; if so, proceed to step 208; otherwise, continue executing the instruction corresponding to the current gesture;

Step 208: the system displays the interface feedback corresponding to the current gesture state and returns to step 202.
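Steps 201 to 208 can be condensed into a compact sketch. The two tables and all names here are illustrative assumptions; the function processes one input gesture, restarts with a newly produced gesture when there is one, and returns the executed instruction together with any interface feedback:

```python
# Compact sketch of steps 201-208 (illustrative, not the patent's code).
def process(gesture, tables, new_gesture=None, state=None):
    gesture_table, feedback_table = tables
    if gesture not in gesture_table:      # steps 202-203: unknown gesture ends the flow
        return None
    if new_gesture is not None:           # step 205: restart with the new gesture
        return process(new_gesture, tables)
    executed = gesture_table[gesture]     # step 204: execute the instruction
    feedback = feedback_table.get(state)  # steps 206-208: state feedback, if any
    return executed, feedback
```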

Steps 201 to 208 describe in detail the case where the user inputs gestures on a touch screen. It should be noted that, besides the touch-screen sensor, the method provided by the invention can also perform gesture recognition through sensors such as a camera or a stylus; in that case the steps performed are similar to steps 201 to 208, the difference being that, for gesture recognition through a camera, the user makes gestures in front of the camera lens rather than inputting them on a touch screen.

Steps 201 to 208 are described in detail below in conjunction with a specific example of the method. In this example, suppose gesture A represents a page turn, continuous performance of gesture A yields gesture B, and gesture B represents deleting the current page. Please refer to Fig. 3, a schematic diagram of gesture A and gesture B. For the interface feedback of gesture A, please refer to (1); for the interface feedback of gesture B, please refer to (2).
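The assumed relation "gesture B is obtained by continuing gesture A" can be illustrated with a toy classifier: a horizontal slide that continues past a threshold is reclassified as the delete gesture. The threshold, units, and names are all invented for illustration; the patent does not specify how continuation is detected:

```python
# Toy illustration: a slide (gesture A) that keeps going past an
# assumed threshold is treated as the continued gesture B.
def classify(slide_distance, threshold=100):
    """Classify a continuous horizontal slide by its accumulated distance."""
    if slide_distance <= 0:
        return None   # no gesture yet
    if slide_distance <= threshold:
        return "A"    # page-turn gesture
    return "B"        # continued past the threshold: delete gesture
```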

This specific example may comprise the following steps:

Step A1: the user inputs gesture A;

Step A2: the system checks the gesture-and-instruction correspondence table;

Step A3: the system judges whether gesture A input by the user is included in the gesture-and-instruction correspondence table; the judgment finds that gesture A is included;

Step A4: the system performs the corresponding operation according to instruction a, which corresponds to gesture A;

Step A5: during step A4, the system judges whether the user has input a new gesture; if so, proceed to step A9; otherwise, proceed to step A6;

Step A6: the system checks the gesture-state-and-interface-feedback correspondence table in real time and judges whether the gesture state of gesture A has changed; if so, proceed to step A7; otherwise, continue executing instruction a;

Step A7: the system checks whether the changed gesture state of gesture A is included in the gesture-state-and-interface-feedback correspondence table; if so, proceed to step A8; otherwise, continue executing instruction a;

Step A8: the system displays the interface feedback corresponding to the current gesture state of gesture A, and the flow ends;

Step A9: the system checks the gesture-and-instruction correspondence table;

Step A10: the system judges whether gesture B, newly input by the user, is included in the gesture-and-instruction correspondence table; the judgment finds that gesture B is included;

Step A11: the system performs the corresponding operation according to instruction b, which corresponds to gesture B;

Step A12: during step A11, the system checks the gesture-state-and-interface-feedback correspondence table in real time and judges whether the gesture state of gesture B has changed; if so, proceed to step A13; otherwise, continue executing instruction b;

Step A13: the system checks whether the changed gesture state of gesture B is included in the gesture-state-and-interface-feedback correspondence table; if so, proceed to step A14; otherwise, continue executing instruction b;

Step A14: the system displays the interface feedback corresponding to the current gesture state of gesture B.

As can be seen, because gesture B is obtained by continuous performance of gesture A, the system can judge, while executing the instruction corresponding to gesture A, whether gesture B is produced; if so, the system performs the corresponding operation according to gesture B without requiring any unnecessary operations from the user. The user thus experiences continuous operation, the user's operation burden is reduced, the user's time is saved, and operation efficiency is improved.

In addition, because the various gesture states of gesture A and gesture B all receive interface feedback in real time, the user is given a natural and vivid operating experience.

Please refer to Fig. 4, a structural diagram of the device of the present invention, comprising:

an execution unit 401, configured to execute the instruction corresponding to a first gesture;

a first judging unit 402, configured to judge, during a first process in which the execution unit 401 executes the instruction corresponding to the first gesture, whether a new gesture is produced; if so, the execution unit 401 executes the instruction corresponding to the new gesture; otherwise, the execution unit 401 continues to execute the instruction corresponding to the first gesture, wherein the new gesture is obtained during the first process of executing the first instruction.

In conjunction with steps 201 to 208, the device shown in Fig. 4 may further comprise:

a first checking unit 403, configured to check, during the first process performed by the execution unit 401, or during a second process in which the execution unit 401 executes the instruction corresponding to the new gesture, the first correspondence table between gesture states and interface feedback;

a display unit 404, configured to display, according to the check performed by the first checking unit 403 and the current state of the first gesture or the new gesture, the current interface feedback corresponding to the current state.

a second checking unit 405, configured to check the second correspondence table between gestures and instructions;

a second judging unit 406, configured to judge, according to the check performed by the second checking unit 405, whether the second correspondence table includes a gesture corresponding to the first gesture or the new gesture;

when the second correspondence table includes the gesture corresponding to the first gesture or the new gesture, the execution unit 401 executes the instruction corresponding to that gesture.

The operations performed by the above units are described in detail below in conjunction with steps 201 to 208.

When the user inputs a gesture on the touch screen, the second checking unit 405 checks the gesture-and-instruction correspondence table. According to that check, the second judging unit 406 judges whether the gesture currently input by the user is included in the table; if so, the execution unit 401 executes the instruction corresponding to that gesture. While the execution unit 401 is executing the instruction, the first judging unit 402 judges whether the user has input a new gesture. If the user has not input a new gesture, the execution unit 401 continues executing the current instruction, and the first checking unit 403 checks the gesture-state-and-interface-feedback correspondence table in real time; if the gesture state has changed and the changed state is included in the table, the display unit 404 displays the gesture state in real time.
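The interaction between the units described above can be sketched as a small class. The class and method names are invented for illustration only; each method corresponds loosely to the units named in the description:

```python
# Rough object sketch of the device's units: the checking/judging units
# become table lookups, the execution unit becomes handle_input, and the
# display unit becomes a callback. All names are illustrative.
class GestureDevice:
    def __init__(self, gesture_table, feedback_table, display):
        self.gesture_table = gesture_table    # second correspondence table
        self.feedback_table = feedback_table  # first correspondence table
        self.display = display                # display unit callback
        self.current = None

    def handle_input(self, gesture):
        # second checking unit + second judging unit: is the gesture known?
        if gesture in self.gesture_table:
            self.current = gesture            # execution unit takes over
            return self.gesture_table[gesture]
        return None

    def handle_state(self, state):
        # first checking unit + display unit: show feedback for known states
        feedback = self.feedback_table.get(state)
        if feedback is not None:
            self.display(feedback)
        return feedback
```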

In conjunction with the specific embodiment of the method, the device of the present invention can also be supplied with the user's gestures through sensors such as a camera or a stylus.

As can be seen, while the instruction corresponding to the current gesture is being executed, the first judging unit 402 can judge whether a new gesture is produced; if so, the execution unit 401 performs the corresponding operation according to the new gesture without requiring any unnecessary operations from the user. The user thus experiences continuous operation, the user's operation burden is reduced, the user's time is saved, and operation efficiency is improved.

In addition, because all the various gesture states receive interface feedback in real time, the user is given a natural and vivid operating experience.

The present invention also provides an electronic device for gesture recognition. The electronic device comprises the device provided by the invention, which may comprise:

an execution unit, configured to execute the instruction corresponding to a first gesture;

a first judging unit, configured to judge, during a first process in which the execution unit executes the instruction corresponding to the first gesture, whether a new gesture is produced; if so, the execution unit executes the instruction corresponding to the new gesture; otherwise, the execution unit continues to execute the instruction corresponding to the first gesture, wherein the new gesture is obtained during the first process of executing the first instruction.

The electronic device provided by the invention may be a personal computer, a mobile terminal or a personal digital assistant (PDA).

In conjunction with steps 201 to 208 and Fig. 4, the device in the electronic device provided by the invention may further comprise units other than the execution unit 401 and the first judging unit 402 of Fig. 4; the operations performed by each unit are identical to those described in the specific embodiment of the method and in the device section.

Finally, it should also be noted that, herein, relational terms such as first and second are used only to distinguish one entity or operation from another, and do not necessarily require or imply any such actual relationship or order between these entities or operations. Moreover, the terms "comprise", "include" or any other variant thereof are intended to cover non-exclusive inclusion, so that a process, method, article or apparatus comprising a series of elements includes not only those elements but also other elements not expressly listed, or elements inherent to such a process, method, article or apparatus. In the absence of further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of other identical elements in the process, method, article or apparatus that comprises the element.

Through the above description of the embodiments, those skilled in the art can clearly understand that the present invention can be realized by software plus the necessary hardware platform, and of course can also be implemented entirely in hardware, although in many cases the former is the better embodiment. Based on this understanding, all or part of the contribution of the technical solution of the present invention over the background art can be embodied in the form of a software product. This computer software product can be stored in a storage medium, such as ROM/RAM, a magnetic disk or an optical disc, and includes a number of instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) to perform the method described in the embodiments, or parts of the embodiments, of the present invention.

A method, device and electronic device for realizing gesture recognition provided by the present invention have been described in detail above. Specific examples are used herein to set forth the principle and embodiments of the present invention; the description of the above embodiments is only meant to help in understanding the method of the present invention and its core idea. Meanwhile, for those of ordinary skill in the art, changes can be made in specific embodiments and the scope of application according to the idea of the present invention. In summary, this description should not be construed as limiting the present invention.

Claims (10)

1. A method for realizing gesture recognition, characterized in that the method comprises:
executing the instruction corresponding to a first gesture;
during a first process of executing the instruction corresponding to the first gesture, judging whether a new gesture is produced; if so, executing the instruction corresponding to the new gesture; otherwise, continuing to execute the instruction corresponding to the first gesture, wherein the new gesture is obtained by continuous performance of the first gesture.
2. The method according to claim 1, characterized in that, during the first process, or during a second process of executing the instruction corresponding to the new gesture, the method further comprises:
displaying the current interface feedback corresponding to the current state of the first gesture or the new gesture, according to a first correspondence table between gesture states and interface feedback, wherein the gesture states are the various states of the current gesture; the current gesture is a gesture performing a page turn, and its various states comprise: one corner of the page lifted, a larger part lifted, and fully open.
3. The method according to claim 1, characterized in that, before executing the instruction corresponding to the first gesture, or before executing the instruction corresponding to the new gesture, the method further comprises:
judging, according to a second correspondence table between gestures and instructions, whether the second correspondence table includes a gesture corresponding to the first gesture or the new gesture;
when the second correspondence table includes the gesture corresponding to the first gesture or the new gesture, executing the instruction corresponding to that gesture.
4. The method according to any one of claims 1 to 3, characterized in that the first gesture and the new gesture are obtained by a sensor capturing the user's operation.
5. The method according to claim 4, characterized in that the sensor is a touch screen, a camera or a stylus.
6. A device for realizing gesture recognition, characterized in that the device comprises:
an execution unit, configured to execute the instruction corresponding to a first gesture;
a first judging unit, configured to judge, during a first process in which the execution unit executes the instruction corresponding to the first gesture, whether a new gesture is produced; if so, the execution unit executes the instruction corresponding to the new gesture; otherwise, the execution unit continues to execute the instruction corresponding to the first gesture, wherein the new gesture is obtained by continuous performance of the first gesture.
7. The device according to claim 6, characterized in that the device further comprises:
a first checking unit, configured to check, during the first process, or during a second process in which the execution unit executes the instruction corresponding to the new gesture, the first correspondence table between gesture states and interface feedback, wherein the gesture states are the various states of the current gesture; the current gesture is a gesture performing a page turn, and its various states comprise: one corner of the page lifted, a larger part lifted, and fully open;
a display unit, configured to display, according to the check performed by the first checking unit and the current state of the first gesture or the new gesture, the current interface feedback corresponding to the current state.
8. The device according to claim 6, characterized in that the device further comprises:
a second checking unit, configured to check the second correspondence table between gestures and instructions;
a second judging unit, configured to judge, according to the check performed by the second checking unit, whether the second correspondence table includes a gesture corresponding to the first gesture or the new gesture;
when the second correspondence table includes the gesture corresponding to the first gesture or the new gesture, the execution unit executes the instruction corresponding to that gesture.
9. The device according to any one of claims 6 to 8, characterized in that the first gesture and the new gesture are obtained by a sensor capturing the user's operation.
10. The device according to claim 9, characterized in that the sensor is a touch screen, a camera or a stylus.
CN2008102230572A 2008-09-26 2008-09-26 Method, device and electronic aid for realizing gesture identification CN101685343B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN2008102230572A CN101685343B (en) 2008-09-26 2008-09-26 Method, device and electronic aid for realizing gesture identification

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN2008102230572A CN101685343B (en) 2008-09-26 2008-09-26 Method, device and electronic aid for realizing gesture identification

Publications (2)

Publication Number Publication Date
CN101685343A CN101685343A (en) 2010-03-31
CN101685343B true CN101685343B (en) 2011-12-28

Family

ID=42048524

Family Applications (1)

Application Number Title Priority Date Filing Date
CN2008102230572A CN101685343B (en) 2008-09-26 2008-09-26 Method, device and electronic aid for realizing gesture identification

Country Status (1)

Country Link
CN (1) CN101685343B (en)

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101853071B (en) * 2010-05-13 2012-12-05 重庆大学 Gesture identification method and system based on visual sense
CN101901350B (en) * 2010-07-23 2012-05-16 北京航空航天大学 Characteristic vector-based static gesture recognition method
KR20120024247A (en) * 2010-09-06 2012-03-14 삼성전자주식회사 Method for operating a mobile device by recognizing a user gesture and the mobile device thereof
JP5609507B2 (en) * 2010-10-04 2014-10-22 ソニー株式会社 Information processing apparatus, information processing method, and program
US20120110517A1 (en) * 2010-10-29 2012-05-03 Honeywell International Inc. Method and apparatus for gesture recognition
JP5997699B2 (en) * 2010-11-01 2016-09-28 トムソン ライセンシングThomson Licensing Method and apparatus for detecting gesture input
CN102855073B (en) * 2011-06-30 2016-03-30 联想(北京)有限公司 Electronic equipment and information processing method thereof
US9884257B2 (en) 2013-03-06 2018-02-06 Tencent Technology (Shenzhen) Company Limited Method for preventing misoperations of intelligent terminal, and intelligent terminal
CN104035702B (en) * 2013-03-06 2016-08-17 腾讯科技(深圳)有限公司 A kind of method preventing intelligent terminal's maloperation and intelligent terminal
CN103488296B (en) * 2013-09-25 2016-11-23 华为软件技术有限公司 Body feeling interaction gestural control method and device


Similar Documents

Publication Publication Date Title
KR101873908B1 (en) Method and Apparatus for Providing User Interface of Portable device
US10572556B2 (en) Systems and methods for facilitating enhancements to search results by removing unwanted search results
US9928232B2 (en) Topically aware word suggestions
JP5923184B2 (en) Apparatus and method for providing remote user interaction
US10248305B2 (en) Manipulating documents in touch screen file management applications
RU2595634C2 (en) Touch screens hover input handling
US9372594B2 (en) Method and apparatus for adding icon to interface of system, and mobile terminal
US10222974B2 (en) Method and apparatus for providing quick access to device functionality
TWI468957B (en) Device, method, and graphical user interface for navigating a list of identifiers
US9542013B2 (en) Method and apparatus for determining recipients of a sharing operation based on an indication associated with a tangible object
CN102819555B (en) A kind of method and apparatus carrying out recommendation information loading in the reading model of webpage
CN102612679B (en) The method of project that touch screen user interface is rolled
CN102681829B (en) A kind of screenshot method, device and telecommunication customer end
KR101381063B1 (en) Methods and graphical user interfaces for editing on a multifunction device with a touch screen display
EP2502130B1 (en) System and method of controlling three dimensional virtual objects on a portable computing device
CN105068723B (en) Information processing method and electronic equipment
EP3540608A1 (en) Managing of local and remote media items
CN103345356B (en) A kind of method and terminal controlling locking screen interface
US9626094B2 (en) Communication device and electronic device
US20140019912A1 (en) System and method for processing sliding operations on portable terminal devices
CN103177073B (en) Classification search method and the mobile device for being suitable for the classification search method
EP2490130B1 (en) Quick text entry on a portable electronic device
JP4536638B2 (en) Display information selection apparatus and method, program, and recording medium
WO2014130480A1 (en) Natural language document search
US20150127677A1 (en) Enterprise graph search based on object and actor relationships

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant