CN104169858A - Method and device of using terminal device to identify user gestures - Google Patents

Method and device of using terminal device to identify user gestures

Info

Publication number
CN104169858A
CN104169858A CN201380005552.5A
Authority
CN
China
Prior art keywords
program
sensor
recognition mode
recognition
terminal device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201380005552.5A
Other languages
Chinese (zh)
Other versions
CN104169858B (en)
Inventor
吴昊
楚庆
钟山
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Publication of CN104169858A publication Critical patent/CN104169858A/en
Application granted granted Critical
Publication of CN104169858B publication Critical patent/CN104169858B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer

Abstract

The present invention discloses a method and a device for identifying user gestures with a terminal device, so as to improve the accuracy with which the terminal identifies user gestures. The method comprises: a sensor of the terminal device acquires sensing data; the terminal device matches the sensing data against pre-registered recognition modes; if the sensing data matches one or more of the recognition modes, the terminal device acquires the response operation of the matched recognition mode; and the terminal device executes the response operation.

Description

Method and apparatus for a terminal device to identify user gestures
Technical field
The present invention relates to the field of communications technologies, and in particular to a method and an apparatus for a terminal device to identify user gestures.
Background art
At present, information interaction through a touch screen has become the mainstream way of interacting with terminals, and the mainstream smartphones and tablet computers currently on the market are all equipped with touch screens. After years of cultivation, users have gradually learned and become accustomed to interacting with terminal devices through gestures. With single-point and multi-point touch, combined with all kinds of intuitive and vivid gestures, users can easily complete various tasks on a mobile phone.
In the prior art, identification of user gestures mainly falls into the following cases:
(1) Gesture recognition through the touch screen, by identifying touch positions and displacements.
Existing touch-screen gestures all use one or more fingers to tap and move on the screen. The main touch-screen gestures include: tapping a specific position of the screen with a finger and lifting it (Tap); pressing and holding a specific position of the screen with a finger (Long press); tapping the screen with a finger, sliding in a specific direction and then lifting it (Swipe); pressing two fingers on the screen, moving them outward or inward and then lifting them (Pinch); and pressing two fingers on the screen, rotating them and then lifting them (Rotate).
However, touch-screen recognition cannot escape the restriction that a gesture must contact the screen, and when the screen or the user's hand is dirty the accuracy and usability of gesture recognition drop. Moreover, the limitations of touch-screen recognition force restrictions on gesture design, which harms the richness, usability and ease of use of gestures. The kinds of operations users want to perform on terminals keep growing, yet the restriction of touch-screen recognition prevents many more intuitive gestures from being recognized.
(2) To avoid the restrictions of touch-screen recognition, Microsoft uses a CMOS infrared sensor, which can perceive the external environment regardless of the ambient lighting conditions. The sensor perceives the environment in the form of a black-and-white spectrum: pure black represents infinite distance, pure white represents infinitely near, and the gray area between corresponds to the physical distance from an object to the sensor. It collects every point within its field of view and forms a depth image representing the surrounding environment. The sensor generates a depth image stream at 30 frames per second and reproduces the surrounding environment in 3D in real time. On this basis, human gestures can be identified and an operation instruction set that does not conflict with natural human gestures can be created.
However, this kind of gesture recognition is based on a living-room scenario in which the distance between the operator and the sensor is usually more than 10 inches; it is therefore not suited to the operation scenario of a handheld terminal. In addition, the sensor it uses is more complex and consumes more power than the sensors in existing handheld terminals, so it is not suitable for adoption in a handheld terminal.
It can be seen that, in the prior art, the accuracy with which a terminal identifies user gestures is still relatively low.
Summary of the invention
Embodiments of the present invention provide a method and an apparatus for a terminal device to identify user gestures, so as to improve the accuracy with which a terminal identifies user gestures.
To solve the foregoing technical problem, the embodiments of the present invention disclose the following technical solutions:
In a first aspect, a method for a terminal device to identify user gestures is provided, comprising:
a sensor of the terminal device acquires sensing data;
the terminal device matches the sensing data against pre-registered recognition modes;
if the sensing data matches one or more of the recognition modes, the terminal device acquires the response operation of the matched recognition mode;
the terminal device executes the response operation.
With reference to the first aspect, in a first possible implementation, the recognition mode comprises a recognition rule and the response operation;
wherein the recognition rule comprises a single-sensor recognition rule and/or a multi-sensor recognition rule;
wherein the single-sensor recognition rule is a rule established for the sensing data acquired by one sensor;
and the multi-sensor recognition rule is a composite rule established for the sensing data acquired by multiple sensors.
With reference to the first aspect or the first possible implementation, in a second possible implementation, that the terminal device matches the sensing data against the pre-registered recognition modes specifically comprises:
the terminal device matches the sensing data against the single-sensor recognition rule and/or the multi-sensor recognition rule of the recognition mode;
and that the sensing data matches one or more of the recognition modes specifically comprises:
the sensing data meets the single-sensor recognition rule and/or the multi-sensor recognition rule of the recognition mode.
With reference to the first aspect or the first or second possible implementation, in a third possible implementation, before the sensor of the terminal device acquires the sensing data, the method further comprises:
the terminal device turns on the sensor;
wherein the method by which the terminal device turns on the sensor comprises:
the terminal device starts a program;
the terminal device registers, according to the program, the recognition mode corresponding to the program;
the terminal device turns on the sensor according to the recognition mode.
With reference to the first aspect or the third possible implementation, in a fourth possible implementation, that the terminal device executes the response operation specifically comprises:
the started program of the terminal device executes the response operation of the recognition mode corresponding to the program;
wherein the response operation is an operation that the started program is capable of executing.
With reference to the first aspect or the third or fourth possible implementation, in a fifth possible implementation, the program comprises a system-level program and/or an application-level program;
wherein the system-level program is a program that is started when the terminal device is powered on;
the application-level program is a program that the terminal device starts according to user input;
and when the program is an application-level program, the method further comprises:
the terminal device closes the application-level program;
the terminal device unregisters the recognition mode corresponding to the closed application-level program;
the terminal device turns off the sensor corresponding to the unregistered recognition mode.
With reference to the first aspect or the third to fifth possible implementations, in a sixth possible implementation, that the terminal device registers, according to the program, the recognition mode corresponding to the program specifically comprises:
the terminal device registers, according to the program, the recognition mode corresponding to the program in an application framework.
In a second aspect, a terminal is provided, comprising:
a sensing data acquisition module, configured to acquire sensing data through a sensor;
a matching module, configured to match the sensing data against pre-registered recognition modes;
a response operation acquisition module, configured to acquire the response operation of the matched recognition mode if the sensing data matches one or more of the recognition modes;
a response operation execution module, configured to execute the response operation.
With reference to the second aspect, in a first possible implementation, the recognition mode comprises a recognition rule and the response operation;
wherein the recognition rule comprises a single-sensor recognition rule and/or a multi-sensor recognition rule;
wherein the single-sensor recognition rule is a rule established for the sensing data acquired by one sensor;
and the multi-sensor recognition rule is a composite rule established for the sensing data acquired by multiple sensors.
With reference to the second aspect or the first possible implementation, in a second possible implementation, the matching module comprises:
a first matching module, configured to match the sensing data against the single-sensor recognition rule of the recognition mode;
a second matching module, configured to match the sensing data against the multi-sensor recognition rule of the recognition mode.
With reference to the second aspect or the first or second possible implementation, in a third possible implementation, the terminal further comprises:
a sensor opening module, configured to turn on the sensor;
wherein the sensor opening module specifically comprises:
a program opening unit, configured to start a program;
a recognition mode registering unit, configured to register, according to the program, the recognition mode corresponding to the program;
a sensor opening unit, configured to turn on the sensor according to the recognition mode.
With reference to the second aspect or the third possible implementation, in a fourth possible implementation, the program comprises a system-level program and/or an application-level program;
wherein the system-level program is a program that is started when the terminal device is powered on;
the application-level program is a program that the terminal device starts according to user input;
and when the program is an application-level program, the terminal further comprises:
a stop module, configured to close the application-level program;
a recognition mode unregistering module, configured to unregister the recognition mode corresponding to the closed application-level program;
a sensor closing module, configured to turn off the sensor corresponding to the unregistered recognition mode.
In a third aspect, a terminal is provided, comprising a processor, a memory, a bus and a sensor, wherein the processor, the memory and the sensor are interconnected through the bus; the sensor is configured to acquire sensing data; the memory is configured to store computer-executable instructions; and when the terminal runs, the processor executes the computer-executable instructions stored in the memory to: match the sensing data against pre-registered recognition modes; if the sensing data matches one or more of the recognition modes, acquire the response operation of the matched recognition mode; and execute the response operation.
With reference to the third aspect, in a first possible implementation, the recognition mode comprises a recognition rule and the response operation;
wherein the recognition rule comprises a single-sensor recognition rule and/or a multi-sensor recognition rule;
wherein the single-sensor recognition rule is a rule established for the sensing data acquired by one sensor;
and the multi-sensor recognition rule is a composite rule established for the sensing data acquired by multiple sensors.
With reference to the third aspect or the first possible implementation, in a second possible implementation, that the processor matches the sensing data against the pre-registered recognition modes specifically comprises:
the processor matches the sensing data against the single-sensor recognition rule and/or the multi-sensor recognition rule of the recognition mode;
and that the sensing data matches one or more of the recognition modes specifically comprises:
the sensing data meets the single-sensor recognition rule and/or the multi-sensor recognition rule of the recognition mode.
With reference to the third aspect or the first or second possible implementation, in a third possible implementation, the processor is further configured to turn on the sensor;
wherein the method by which the processor turns on the sensor comprises:
the processor starts a program;
the processor registers, according to the program, the recognition mode corresponding to the program;
the processor turns on the sensor according to the recognition mode.
With reference to the third aspect or the third possible implementation, in a fourth possible implementation, that the processor executes the response operation specifically comprises:
the started program executes the response operation of the recognition mode corresponding to the program;
wherein the response operation is an operation that the started program is capable of executing.
With reference to the third aspect or the third or fourth possible implementation, in a fifth possible implementation, the program comprises a system-level program and/or an application-level program;
wherein the system-level program is a program that is started when the terminal device is powered on;
the application-level program is a program that the terminal device starts according to user input;
and when the program is an application-level program, the processor is further configured to:
close the application-level program;
unregister the recognition mode corresponding to the closed application-level program;
turn off the sensor corresponding to the unregistered recognition mode.
With reference to the third aspect or the third to fifth possible implementations, in a sixth possible implementation, that the processor registers, according to the program, the recognition mode corresponding to the program specifically comprises:
the processor registers, according to the program, the recognition mode corresponding to the program in an application framework.
In the embodiments of the present invention, a sensor is provided inside the terminal device and corresponding recognition modes are pre-registered inside the terminal. When the user makes an in-air gesture in front of the terminal device to operate the terminal, the terminal device matches the gesture sensing data received by the sensor against the pre-registered recognition modes; if the sensing data matches one or more of the recognition modes, the terminal can identify the current user gesture and execute the corresponding response operation according to the gesture. It can be seen that, because user gestures are received through sensors, the user does not need to touch the screen with a finger, so gesture operation is no longer restricted by the requirement of touching the screen, and at the same time the accuracy with which the terminal identifies user gestures is greatly improved.
Brief description of the drawings
To describe the technical solutions in the embodiments of the present invention or in the prior art more clearly, the accompanying drawings needed for describing the embodiments or the prior art are briefly introduced below. Apparently, the accompanying drawings described below show merely some embodiments of the present invention, and a person of ordinary skill in the art may still derive other drawings from them without creative effort.
Fig. 1 is a flowchart of a method for identifying user gestures according to an embodiment of the present invention;
Fig. 2 is a schematic diagram of in-air gestures in an embodiment of the present invention;
Fig. 3 is a flowchart of another method for identifying user gestures according to an embodiment of the present invention;
Fig. 4 is a flowchart of a method by which a terminal device turns on a sensor in an embodiment of the present invention;
Fig. 5 is a schematic structural diagram of a terminal according to an embodiment of the present invention;
Fig. 6 is a schematic structural diagram of another terminal according to an embodiment of the present invention;
Fig. 7 is a schematic structural diagram of yet another terminal according to an embodiment of the present invention;
Fig. 8 is a schematic diagram of the software and hardware structure of a terminal in an embodiment of the present invention;
Fig. 9 is a flowchart of gesture recognition based on the terminal of Fig. 8;
Fig. 10 is a flowchart of identifying a user gesture for camera control according to an embodiment of the present invention;
Fig. 11 is a schematic diagram of a terminal that identifies user gestures based on a computer system according to an embodiment of the present invention.
Embodiments
To enable a person skilled in the art to better understand the technical solutions in the embodiments of the present invention, and to make the foregoing objectives, features and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention are described in further detail below with reference to the accompanying drawings.
The technical solutions of the present invention are described in detail below with reference to the accompanying drawings, starting with the method for a terminal device to identify user gestures provided by the present invention.
Referring to Fig. 1, an embodiment of the method for a terminal device to identify user gestures provided by the present invention comprises the following steps:
Step 101: a sensor of the terminal device acquires sensing data.
In this embodiment of the present invention, the terminal device is provided with various sensors capable of sensing user gestures. When using the terminal, the user does not need to touch the screen with a finger but only needs to make a corresponding in-air gesture within a preset distance in front of the terminal; the sensor arranged inside the terminal receives the gesture and converts it into corresponding sensing data.
In a specific implementation, the in-air gestures may be as shown in Fig. 2 and include waving, clapping, lifting up and pressing down. In different application scenarios, the operating system and application programs of the terminal may assign different meanings to these gestures, so that a richer set of human-machine interaction gestures is added without affecting the recognition and use of existing touch-screen gestures.
The sensor may be built into the hardware of the terminal, or may be arranged outside the terminal and connected to it through an interface provided on the terminal. Specifically, the sensor may include a distance sensor, a light sensor, a camera, a gyroscope, a three-dimensional accelerometer, and so on. By using multiple types of sensors to identify user gestures and actions comprehensively, gesture recognition is no longer constrained by a single component such as one sensor or the touch screen, and the user's gesture operation becomes convenient and intuitive.
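As an illustration only, the following sketch (not part of the original disclosure) shows one way the readings of such heterogeneous sensors might be represented uniformly before matching; all class and field names here are hypothetical.

```java
import java.util.Map;

// Hypothetical uniform representation of one reading from any sensor type.
public class SensorSample {
    public enum SensorType { DISTANCE, LIGHT, FRONT_CAMERA, GYROSCOPE, ACCELEROMETER }

    public final SensorType type;             // which physical sensor produced the reading
    public final long timestampMillis;        // when the reading was taken
    public final Map<String, Double> values;  // e.g. "distanceCm" -> 12.0, "lux" -> 85.0

    public SensorSample(SensorType type, long timestampMillis, Map<String, Double> values) {
        this.type = type;
        this.timestampMillis = timestampMillis;
        this.values = values;
    }

    @Override
    public String toString() {
        return type + "@" + timestampMillis + " " + values;
    }
}
```

With such a representation, a distance-sensor reading and a light-sensor reading share one data path into the matching step described next.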
Step 102: the terminal device matches the sensing data against the pre-registered recognition modes.
In this embodiment of the present invention, recognition modes are pre-registered inside the terminal device; the different pre-registered recognition modes record the different user gestures that the terminal device can identify.
When the terminal device receives the sensing data corresponding to a user gesture acquired by the sensor, the sensing data needs to be matched against the pre-registered recognition modes to determine whether the currently received user gesture is a gesture that the terminal can identify.
Step 103: if the sensing data matches one or more of the recognition modes, the terminal device acquires the response operation of the matched recognition mode.
In this step, by matching the sensing data against the pre-registered recognition modes, it can be determined whether the currently received user gesture is a gesture that the terminal can identify. Specifically, if the sensing data matches one or more of the recognition modes, the terminal can identify the current user gesture and then acquire the corresponding response operation according to the identified gesture; if the sensing data matches none of the pre-registered recognition modes, the terminal cannot identify the current user gesture, and no response operation needs to be performed for it.
Step 104: the terminal device executes the response operation.
In this step, after the terminal acquires the matched response operation, it performs the corresponding operation according to the specific content of the response operation.
The corresponding operation is, for example: controlling the terminal, according to the user gesture, to implement a corresponding adjustment function, such as zooming the camera according to the user gesture.
In this embodiment of the present invention, a sensor is provided inside the terminal device and corresponding recognition modes are pre-registered inside the terminal. When the user makes an in-air gesture in front of the terminal device to operate the terminal, the terminal device matches the gesture sensing data received by the sensor against the pre-registered recognition modes; if the sensing data matches one or more of the recognition modes, the terminal can identify the current user gesture and execute the corresponding response operation according to the gesture. It can be seen that, because user gestures are received through sensors, the user does not need to touch the screen with a finger, so gesture operation is no longer restricted by the requirement of touching the screen, and at the same time the accuracy with which the terminal identifies user gestures is greatly improved.
To facilitate a further understanding of the technical solution of the present invention, the implementations involved in the technical solution are described in detail below.
In this embodiment of the present invention, the recognition mode may specifically include a recognition rule and the response operation. The recognition rule specifically includes a single-sensor recognition rule and/or a multi-sensor recognition rule; here, the single-sensor recognition rule is a rule established for the sensing data acquired by one sensor, and the multi-sensor recognition rule is a composite rule established for the sensing data acquired by multiple sensors.
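A minimal sketch of how such a recognition mode could be held in memory, assuming the rule/response split described above; the interfaces and names are illustrative, not the patented implementation.

```java
import java.util.List;
import java.util.Map;

// A recognition rule is a predicate over sensing data.
interface RecognitionRule {
    // `readings` maps a sensor name (e.g. "light", "distance") to its latest samples.
    boolean matches(Map<String, double[]> readings);
}

// Single-sensor rule: established over the data of exactly one sensor.
class SingleSensorRule implements RecognitionRule {
    private final String sensorName;
    private final double min, max; // a simple threshold window as an example

    SingleSensorRule(String sensorName, double min, double max) {
        this.sensorName = sensorName;
        this.min = min;
        this.max = max;
    }

    public boolean matches(Map<String, double[]> readings) {
        double[] v = readings.get(sensorName);
        return v != null && v.length > 0 && v[0] >= min && v[0] <= max;
    }
}

// Multi-sensor rule: a composite rule over the data of several sensors.
class MultiSensorRule implements RecognitionRule {
    private final List<RecognitionRule> parts;

    MultiSensorRule(List<RecognitionRule> parts) { this.parts = parts; }

    public boolean matches(Map<String, double[]> readings) {
        return parts.stream().allMatch(r -> r.matches(readings));
    }
}

// A recognition mode pairs its rule with the response operation to execute on a match.
class RecognitionMode {
    final String id;                  // e.g. "MR001"
    final RecognitionRule rule;       // single- or multi-sensor rule
    final Runnable responseOperation; // the operation the matched program can execute

    RecognitionMode(String id, RecognitionRule rule, Runnable responseOperation) {
        this.id = id;
        this.rule = rule;
        this.responseOperation = responseOperation;
    }
}
```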
In this embodiment of the present invention, that the terminal device matches the sensing data against the pre-registered recognition modes specifically includes: the terminal device matches the sensing data against the single-sensor recognition rule and/or the multi-sensor recognition rule of the recognition mode;
and that the sensing data matches one or more of the recognition modes specifically includes: the sensing data meets the single-sensor recognition rule and/or the multi-sensor recognition rule of the recognition mode.
In a specific application, a recognition mode may correspond to turning on multiple sensors, as shown in Table 1.
Table 1: Correspondence between recognition modes and the sensors to turn on
Table 2 shows the recognition rules of each sensor under recognition mode MR001.
Table 2: Recognition rules of each sensor under recognition mode MR001
Each recognition mode (for example, MR001) may have multiple sub-modes, which are the different results produced after recognition with the same sensor combination.
Taking recognition mode MR001 as an example: if the light sensor meets rule 1 and the distance sensor meets rule 1, the terminal continues to judge which rule the front camera meets; if the front camera also meets rule 1, sub-mode 1 is matched successfully; otherwise, the mode matching fails. Likewise, if the light sensor meets rule 2 and the distance sensor meets rule 2, the terminal continues to judge which rule the front camera meets; if the front camera also meets rule 2, sub-mode 2 is matched successfully; otherwise, the mode matching fails.
Then, according to Table 3, the response operation of the corresponding sub-mode is executed.
Table 3: Sub-mode response operations under recognition mode MR001
| Sub-mode   | Response operation | Description/remarks               |
| Sub-mode 1 | Zoom in            | Hand moves toward the terminal    |
| Sub-mode 2 | Zoom out           | Hand moves away from the terminal |
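The MR001 decision chain described above (light sensor, then distance sensor, then front camera) can be sketched in plain Java as follows; the rule numbers map to the zoom-in/zoom-out sub-modes of Table 3, and the helper predicates are assumptions used only for illustration, since the actual thresholds live in Table 2.

```java
// Hypothetical sketch of matching recognition mode MR001 to one of its sub-modes.
public class Mr001Matcher {

    enum SubMode { ZOOM_IN, ZOOM_OUT, NO_MATCH }

    // Each "meetsRuleN" call stands in for the per-sensor rule of Table 2,
    // e.g. "value decreasing" for rule 1 and "value increasing" for rule 2.
    SubMode match(SensorReadings r) {
        if (meetsRule1(r.light) && meetsRule1(r.distance) && meetsRule1(r.frontCamera)) {
            return SubMode.ZOOM_IN;   // sub-mode 1: hand moving toward the terminal
        }
        if (meetsRule2(r.light) && meetsRule2(r.distance) && meetsRule2(r.frontCamera)) {
            return SubMode.ZOOM_OUT;  // sub-mode 2: hand moving away from the terminal
        }
        return SubMode.NO_MATCH;      // mode matching fails
    }

    // Placeholder predicates; real rules would encode the thresholds of Table 2.
    private boolean meetsRule1(double[] samples) { return trend(samples) < 0; }
    private boolean meetsRule2(double[] samples) { return trend(samples) > 0; }

    private double trend(double[] samples) {
        return samples.length < 2 ? 0 : samples[samples.length - 1] - samples[0];
    }

    static class SensorReadings {
        double[] light;        // light sensor samples
        double[] distance;     // distance sensor samples
        double[] frontCamera;  // e.g. estimated hand size from the front camera
    }
}
```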
To avoid turning on the sensor by mistake, in an embodiment of the present invention, as shown in Fig. 3, before the sensor of the terminal device acquires the sensing data, the method further comprises:
Step 105: the terminal device turns on the sensor.
Through this step, control over turning on the sensor is achieved.
Specifically, as shown in Fig. 4, the method by which the terminal device turns on the sensor may comprise:
Step 401: the terminal device starts a program;
Step 402: the terminal device registers, according to the program, the recognition mode corresponding to the program;
Step 403: the terminal device turns on the sensor according to the recognition mode.
In this implementation, the response operation of the recognition mode corresponding to the program is executed by the started program of the terminal, and the response operation is an operation that the started program is capable of executing.
It should be noted that, in this embodiment of the present invention, the program specifically includes a system-level program and an application-level program; the system-level program is a program that is started when the terminal device is powered on, and the application-level program is a program that the terminal device starts according to user input.
When the program is an application-level program and the terminal device no longer needs to identify user gestures, the terminal device may close the application-level program, unregister the recognition mode corresponding to the closed application-level program, and finally turn off the sensor corresponding to the unregistered recognition mode.
In addition, in this embodiment of the present invention, the terminal device registers, according to the program, the recognition mode corresponding to the program in an application framework.
In this implementation, an application framework may be provided inside the terminal device. The application framework is usually arranged above the operating system of the terminal; the system-level programs and application-level programs built into the terminal device interact with the application framework, so that the various functions of both kinds of programs can be invoked through it.
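The registration path of steps 401–403 could be sketched as below, under assumed names: a started program registers its recognition mode with the application framework, and the framework turns on only the sensors that the registered mode needs, turning them off again when nothing uses them.

```java
import java.util.*;

// Hypothetical application framework registry; not the patented implementation.
public class ApplicationFramework {
    private final Map<String, Set<String>> sensorsByMode = new HashMap<>();   // mode id -> sensors it needs
    private final Map<String, Set<String>> modesByProgram = new HashMap<>();  // program id -> registered modes
    private final SensorDriver driver; // assumed interface to the OS-level sensor driver

    public ApplicationFramework(SensorDriver driver) { this.driver = driver; }

    // Step 402: register the recognition mode corresponding to the started program.
    public void registerMode(String programId, String modeId, Set<String> requiredSensors) {
        sensorsByMode.put(modeId, requiredSensors);
        modesByProgram.computeIfAbsent(programId, k -> new HashSet<>()).add(modeId);
        // Step 403: turn on the sensors required by the newly registered mode.
        for (String sensor : requiredSensors) {
            driver.turnOn(sensor);
        }
    }

    // Unregistration when an application-level program is closed.
    public void unregisterProgram(String programId) {
        Set<String> modes = modesByProgram.remove(programId);
        if (modes == null) return;
        for (String modeId : modes) {
            Set<String> sensors = sensorsByMode.remove(modeId);
            if (sensors == null) continue;
            for (String sensor : sensors) {
                if (!stillNeeded(sensor)) driver.turnOff(sensor); // close sensors nobody uses
            }
        }
    }

    private boolean stillNeeded(String sensor) {
        return sensorsByMode.values().stream().anyMatch(s -> s.contains(sensor));
    }

    interface SensorDriver {
        void turnOn(String sensorName);
        void turnOff(String sensorName);
    }
}
```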
Correspondingly, the present invention further provides a terminal.
Fig. 5 is a schematic structural diagram of a terminal in an embodiment of the present invention. The terminal may specifically comprise:
a sensing data acquisition module 501, configured to acquire sensing data through a sensor;
a matching module 502, configured to match the sensing data against pre-registered recognition modes;
a response operation acquisition module 503, configured to acquire the response operation of the matched recognition mode if the sensing data matches one or more of the recognition modes;
a response operation execution module 504, configured to execute the response operation.
The terminal device is provided with various sensors capable of sensing user gestures. When using the terminal, the user does not need to touch the screen with a finger but only needs to make a corresponding in-air gesture within a preset distance in front of the terminal; the sensor arranged inside the terminal receives the gesture and converts it into corresponding sensing data. The sensor may be built into the hardware of the terminal, or may be arranged outside the terminal and connected to it through an interface provided on the terminal. Specifically, the sensor may include a distance sensor, a light sensor, a camera, a gyroscope, a three-dimensional accelerometer, and so on.
Recognition modes are pre-registered inside the terminal device; the different pre-registered recognition modes record the different user gestures that the terminal device can identify. After the sensing data acquisition module acquires the sensing data, the matching module matches the sensing data against the pre-registered recognition modes to determine whether the currently received user gesture is a gesture that the terminal can identify. If the sensing data matches one or more of the recognition modes, the terminal can identify the current user gesture, and the response operation acquisition module acquires the corresponding response operation according to the identified gesture. Finally, the response operation execution module performs the corresponding operation according to the specific content of the response operation.
In this embodiment of the present invention, a sensor is provided inside the terminal device and corresponding recognition modes are pre-registered inside the terminal. When the user makes an in-air gesture in front of the terminal device to operate the terminal, the terminal device matches the gesture sensing data received by the sensor against the pre-registered recognition modes; if the sensing data matches one or more of the recognition modes, the terminal can identify the current user gesture and execute the corresponding response operation according to the gesture. It can be seen that, because user gestures are received through sensors, the user does not need to touch the screen with a finger, so gesture operation is no longer restricted by the requirement of touching the screen, and at the same time the accuracy with which the terminal identifies user gestures is greatly improved.
For the foregoing terminal, the recognition mode may specifically include a recognition rule and the response operation. The recognition rule specifically includes a single-sensor recognition rule and/or a multi-sensor recognition rule; here, the single-sensor recognition rule is a rule established for the sensing data acquired by one sensor, and the multi-sensor recognition rule is a composite rule established for the sensing data acquired by multiple sensors.
In a specific embodiment of the present invention, a recognition mode may correspond to turning on one or more sensors. Accordingly, the matching module may specifically comprise:
a first matching module, configured to match the sensing data against the single-sensor recognition rule of the recognition mode;
a second matching module, configured to match the sensing data against the multi-sensor recognition rule of the recognition mode.
To avoid turning on the sensor by mistake, in another terminal embodiment, as shown in Fig. 6, the terminal may further comprise:
a sensor opening module 505, configured to turn on the sensor.
By providing the sensor opening module, control over turning on the sensor is achieved: only after the sensor opening module turns the sensor on does the sensor provided in the terminal begin to collect user gestures.
In a specific implementation, the sensor opening module may comprise:
a program opening unit, configured to start a program;
a recognition mode registering unit, configured to register, according to the program, the recognition mode corresponding to the program;
a sensor opening unit, configured to turn on the sensor according to the recognition mode.
Here, the response operation of the recognition mode corresponding to the program is executed by the started program of the terminal; the response operation is an operation that the started program is capable of executing.
It should be noted that, in this embodiment of the present invention, the program specifically includes a system-level program and an application-level program; the system-level program is a program that is started when the terminal device is powered on, and the application-level program is a program that the terminal device starts according to user input. The terminal device registers, according to the program, the recognition mode corresponding to the program in an internally provided application framework.
When the program is an application-level program, as shown in Fig. 7, the terminal may further be provided with:
a stop module 506, configured to close the application-level program;
a recognition mode unregistering module 507, configured to unregister the recognition mode corresponding to the closed application-level program;
a sensor closing module 508, configured to turn off the sensor corresponding to the unregistered recognition mode.
To facilitate the understanding of the present invention, the foregoing technical solution is further explained below through a specific application scenario.
Fig. 8 is a schematic diagram of the software and hardware structure of the terminal. The hardware sensors may include a distance sensor, a light sensor, a camera, a gyroscope, a three-dimensional accelerometer, and so on. The sensor driver is the sensor driver contained in the bottom layer of the OS (Operating System); it parses the data returned by the sensors and converts it into a data format that upper-layer applications can identify. The application framework holds preset gesture mode data; after collecting the sensing data produced by the sensor driver, it determines, according to the system settings and the registration status of system programs and application programs, whether the current sensing data meets a specific recognition mode, and when it does, the application framework sends a gesture event to the application program or system program that has registered that mode. A system program is a system-level application program resident in the terminal memory; it starts automatically at power-on and registers the corresponding gesture events with the application framework when it starts. An application program is a non-resident, ordinary application program, generally started manually by the user; when the user starts it, it registers the corresponding gesture events with the application framework.
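To make the layering of Fig. 8 concrete, here is an illustrative (assumed, not authoritative) sketch of the hand-off between the sensor driver layer and the application framework layer: the driver filters and normalizes raw readings, and the framework only ever sees the normalized format.

```java
import java.util.*;

// Hypothetical sketch of the Fig. 8 layering: hardware sensors -> sensor driver -> application framework.
public class GestureStack {

    // Raw value as it arrives from a hardware sensor.
    record RawReading(String sensorName, long timestampMillis, double value) {}

    // Normalized format that upper layers (the application framework) can interpret.
    record NormalizedReading(String sensorName, long timestampMillis, double value) {}

    // Sensor driver layer: screens out useless data and converts the rest to the upper-layer format.
    static class SensorDriverLayer {
        private final double noiseFloor;

        SensorDriverLayer(double noiseFloor) { this.noiseFloor = noiseFloor; }

        List<NormalizedReading> process(List<RawReading> raw) {
            List<NormalizedReading> out = new ArrayList<>();
            for (RawReading r : raw) {
                if (Math.abs(r.value()) < noiseFloor) continue;  // discard useless data
                out.add(new NormalizedReading(r.sensorName(), r.timestampMillis(), r.value()));
            }
            return out;
        }
    }

    // Application framework layer: matches normalized readings against preset gesture mode data
    // and notifies the programs that registered for the matched mode.
    interface ApplicationFrameworkLayer {
        void onSensingData(List<NormalizedReading> readings);
    }
}
```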
Based on the terminal shown in Fig. 8, the gesture recognition procedure of this terminal is shown in Fig. 9 and comprises:
Step 901: each hardware sensor collects the sensing data corresponding to a user gesture and sends it to the sensor driver;
Step 902: the sensor driver screens out the useless part of the sensing data according to preset accuracy requirements;
Step 903: the sensor driver identifies the valid part of the sensing data, organizes this valid sensing data into a specific format, and sends it to the application framework;
Step 904: the application framework performs the corresponding mode recognition and matching according to the registration status of the system programs and application programs;
Step 905: after the mode recognition and matching, the application framework sends the identified gesture event to the corresponding system program or application program;
Step 906: the corresponding system program or application program executes the response operation according to the received gesture event.
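A compact sketch of steps 901–906 as a single pass through the pipeline; the method and type names are placeholders chosen for illustration, not names defined by the patent.

```java
import java.util.*;

// Hypothetical end-to-end pass corresponding to steps 901-906 of Fig. 9.
public class RecognitionPipeline {

    interface Driver    { List<double[]> screenAndFormat(List<double[]> raw); }        // steps 902-903
    interface Framework { Optional<String> matchRegisteredModes(List<double[]> data); } // step 904
    interface Program   { void onGestureEvent(String gestureEvent); }                   // step 906

    private final Driver driver;
    private final Framework framework;
    private final Map<String, List<Program>> programsByGesture; // registration lookup for step 905

    public RecognitionPipeline(Driver driver, Framework framework,
                               Map<String, List<Program>> programsByGesture) {
        this.driver = driver;
        this.framework = framework;
        this.programsByGesture = programsByGesture;
    }

    // Step 901: each hardware sensor hands its raw data to this entry point.
    public void onRawSensorData(List<double[]> raw) {
        List<double[]> formatted = driver.screenAndFormat(raw);          // steps 902-903
        framework.matchRegisteredModes(formatted)                        // step 904
                 .ifPresent(gesture -> programsByGesture
                         .getOrDefault(gesture, List.of())
                         .forEach(p -> p.onGestureEvent(gesture)));      // steps 905-906
    }
}
```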
Generally, a system program or an application program can receive the corresponding gesture events only after it has registered with the application framework. Taking an application program as an example, the procedure for registering with the application framework is as follows.
When the application program starts for the first time, it registers the relevant gesture mode data with the application framework; the application framework checks whether the required sensors are turned on and, if not, turns them on. When it is detected that the required sensors are on and the gesture mode data has been registered successfully, the application framework sends a corresponding feedback message to the application program that completed the registration.
When the application program switches to the background, its registration with the application framework can be cancelled; the application framework then checks whether any turned-on sensor is no longer in use and, if so, sends an instruction to the corresponding sensor driver to turn that sensor off. When the application switches back to the foreground, it needs to send registration information to the application framework again; the detailed procedure is the same as above and is not repeated here.
When the application program finally needs to exit, its registration with the application framework also needs to be cancelled; the application framework checks whether any turned-on sensor is no longer in use and, if so, sends an instruction to the corresponding sensor driver to turn that sensor off.
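From the application program's side, the registration lifecycle just described could look roughly like the following; the framework API shown here is assumed for the sketch and is not defined by the patent.

```java
// Hypothetical view of an application program's registration lifecycle with the framework.
public class GestureAwareApp {

    interface FrameworkClient {
        boolean register(String appId, String gestureModeId);   // true once sensors are on and mode is registered
        void unregister(String appId);
    }

    private final FrameworkClient framework;
    private final String appId;
    private final String gestureModeId;

    public GestureAwareApp(FrameworkClient framework, String appId, String gestureModeId) {
        this.framework = framework;
        this.appId = appId;
        this.gestureModeId = gestureModeId;
    }

    // First start: register the gesture mode; the framework opens any sensor that is still off
    // and sends a feedback message (modeled here as the boolean return value).
    public void onFirstStart() {
        boolean registered = framework.register(appId, gestureModeId);
        if (!registered) {
            // e.g. retry later, or fall back to touch-screen input
        }
    }

    // Switched to the background: cancel the registration so unused sensors can be turned off.
    public void onMovedToBackground() { framework.unregister(appId); }

    // Switched back to the foreground: register again, exactly as on first start.
    public void onMovedToForeground() { framework.register(appId, gestureModeId); }

    // Final exit: cancel the registration as well.
    public void onExit() { framework.unregister(appId); }
}
```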
For the application framework, the programs that have completed registration with it can be stored in the form of a program registration list, as shown in Table 4.
Table 4: Program registration list
| Program ID | Program state  | Recognition enabled | MR001 | MR002 | MR003 | … |
| P001       | Foreground     | Yes                 | Yes   |       |       |   |
| P002       | Background     | No                  |       | Yes   |       |   |
| P003       | System program | Yes                 |       |       | Yes   |   |
| P004       | System program | Yes                 |       | Yes   | Yes   |   |
Here, MR001, MR002, MR003, ... represent the different gesture modes registered for the programs. When the application framework identifies a corresponding gesture mode, it finds, according to this list, all the programs that have registered that mode and can therefore send the gesture event to each of them.
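The lookup implied by Table 4 — identify the gesture mode, then find every registered program whose entry enables that mode — can be sketched as follows; the row structure mirrors the table, while the names are illustrative.

```java
import java.util.*;

// Hypothetical dispatch over the Table 4 program registration list.
public class RegistrationList {

    // One row of the registration list.
    static class Entry {
        final String programId;           // e.g. "P001"
        final String state;               // "foreground", "background", "system"
        final boolean recognitionEnabled; // the "Recognition enabled" column
        final Set<String> registeredModes; // e.g. {"MR002", "MR003"}

        Entry(String programId, String state, boolean recognitionEnabled, Set<String> modes) {
            this.programId = programId;
            this.state = state;
            this.recognitionEnabled = recognitionEnabled;
            this.registeredModes = modes;
        }
    }

    private final List<Entry> entries = new ArrayList<>();

    public void add(Entry e) { entries.add(e); }

    // When the framework identifies gesture mode `modeId`, send the event to every
    // program that registered the mode and currently has recognition enabled.
    public List<String> programsToNotify(String modeId) {
        List<String> targets = new ArrayList<>();
        for (Entry e : entries) {
            if (e.recognitionEnabled && e.registeredModes.contains(modeId)) {
                targets.add(e.programId);
            }
        }
        return targets;
    }
}
```

With the rows of Table 4 loaded, programsToNotify("MR002") would, under this sketch, return P004 but not P002, since P002 is in the background with recognition disabled.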
Different gesture modes may require different sensors.
Taking the scenario in which the user controls the camera zoom of the terminal through an in-air gesture as an example, the whole process of identifying this user gesture is described below, as shown in Fig. 10:
Step 1001: the user starts the camera program;
Step 1002: the camera program registers the gesture zoom mode data with the application framework;
Step 1003: the application framework sends a start instruction to the relevant sensor driver;
Step 1004: the relevant sensor driver turns on the corresponding sensors;
Step 1005: the relevant sensor driver sends a feedback message to the application framework indicating that the sensors started successfully;
Step 1006: the application framework sends a feedback message to the camera program indicating that the registration succeeded;
Step 1007: the user makes a zoom-out gesture in front of the camera;
Step 1008: the corresponding sensors send the user's gesture signals to the sensor driver;
Step 1009: the sensor driver organizes the signals transmitted by each sensor and passes them to the application framework;
Step 1010: the application framework matches the sensing data against the preset gesture mode data and successfully identifies the user gesture;
Step 1011: the application framework feeds the recognition result back to the camera program;
Step 1012: the camera program feeds the user gesture back by changing the display interface.
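Putting steps 1001–1012 together, the camera zoom scenario might be driven roughly as below; every name here is a stand-in used for illustration rather than an API defined by the patent.

```java
// Hypothetical walk-through of the Fig. 10 camera zoom scenario.
public class CameraZoomScenario {

    interface Framework {
        boolean registerZoomMode(CameraProgram camera);   // steps 1002-1006
        void onSensorSignals(double[] distanceSamples);    // steps 1008-1010
    }

    interface CameraProgram {
        void onZoomGesture(boolean zoomOut);                // steps 1011-1012
    }

    public static void run(Framework framework, CameraProgram camera) {
        // Steps 1001-1006: the user opens the camera; it registers the gesture zoom mode,
        // the framework starts the required sensors and confirms the registration.
        if (!framework.registerZoomMode(camera)) {
            return; // registration failed; gesture zoom stays unavailable
        }

        // Steps 1007-1010: the user makes a zoom-out gesture (hand moving away);
        // the sensors report increasing distance and the framework matches the mode.
        framework.onSensorSignals(new double[]{8.0, 12.0, 17.0, 23.0});

        // Steps 1011-1012 happen inside the framework/camera callbacks: the framework feeds
        // the recognition result back, and the camera program updates its display interface.
    }
}
```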
As shown in Fig. 11, the present invention further provides a terminal that identifies user gestures based on a computer system. In a specific implementation, the terminal of this embodiment of the present invention may comprise a processor 1101, a memory 1102, a bus 1103 and a sensor 1104; the processor 1101, the memory 1102 and the sensor 1104 are interconnected through the bus 1103; the sensor 1104 is configured to acquire sensing data; the memory 1102 is configured to store computer-executable instructions; and when the terminal runs, the processor 1101 executes the computer-executable instructions stored in the memory 1102, so that the terminal performs the following operations: matching the sensing data against pre-registered recognition modes; if the sensing data matches one or more of the recognition modes, acquiring the response operation of the matched recognition mode; and executing the response operation.
For this terminal, the recognition mode comprises a recognition rule and the response operation; the recognition rule comprises a single-sensor recognition rule and/or a multi-sensor recognition rule, where the single-sensor recognition rule is a rule established for the sensing data acquired by one sensor, and the multi-sensor recognition rule is a composite rule established for the sensing data acquired by multiple sensors.
In a specific implementation, that the processor matches the sensing data against the pre-registered recognition modes specifically comprises: the processor matches the sensing data against the single-sensor recognition rule and/or the multi-sensor recognition rule of the recognition mode;
and that the sensing data matches one or more of the recognition modes specifically comprises:
the sensing data meets the single-sensor recognition rule and/or the multi-sensor recognition rule of the recognition mode.
In addition, the processor is further configured to turn on the sensor.
The method by which the processor turns on the sensor comprises:
the processor starts a program;
the processor registers, according to the program, the recognition mode corresponding to the program;
the processor turns on the sensor according to the recognition mode.
That the processor executes the response operation specifically comprises: the started program executes the response operation of the recognition mode corresponding to the program; wherein the response operation is an operation that the started program is capable of executing.
It should be noted that the program comprises a system-level program and/or an application-level program; the system-level program is a program that is started when the terminal device is powered on, and the application-level program is a program that the terminal device starts according to user input.
When the program is an application-level program, the processor is further configured to:
close the application-level program;
unregister the recognition mode corresponding to the closed application-level program;
turn off the sensor corresponding to the unregistered recognition mode.
Moreover, that the processor registers, according to the program, the recognition mode corresponding to the program specifically comprises: the processor registers, according to the program, the recognition mode corresponding to the program in an application framework.
In the embodiments of the present invention, the processor may be a central processing unit (CPU), an application-specific integrated circuit (ASIC), or the like.
A computer storage medium may store a program which, when executed, may include some or all of the steps of the method embodiments provided by the embodiments of the present invention. The storage medium may be a magnetic disk, an optical disc, a read-only memory (ROM), a random access memory (RAM), or the like.
A person of ordinary skill in the art may be aware that the units and algorithm steps of the examples described with reference to the embodiments disclosed herein can be implemented by electronic hardware or by a combination of computer software and electronic hardware. Whether these functions are performed by hardware or software depends on the particular application and the design constraints of the technical solution. Skilled persons may implement the described functions with different methods for each particular application, but such implementation should not be considered beyond the scope of the present invention.
It may be clearly understood by a person skilled in the art that, for convenience and brevity of description, for the specific working processes of the foregoing systems, apparatuses and units, reference may be made to the corresponding processes in the foregoing method embodiments, and details are not described here again.
In the several embodiments provided in this application, it should be understood that the disclosed systems, apparatuses and methods may be implemented in other ways. For example, the described apparatus embodiments are merely illustrative; the division of the units is merely a division of logical functions, and there may be other ways of division in actual implementation: multiple units or components may be combined or integrated into another system, or some features may be ignored or not performed. In addition, the displayed or discussed mutual couplings, direct couplings or communication connections may be implemented through some interfaces, and the indirect couplings or communication connections between apparatuses or units may be electrical, mechanical or in other forms.
The units described as separate components may or may not be physically separate, and the components displayed as units may or may not be physical units; they may be located in one place or distributed on multiple network elements. Some or all of the units may be selected according to actual needs to achieve the objectives of the solutions of the embodiments.
In addition, the functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may physically exist separately, or two or more units may be integrated into one unit.
If the functions are implemented in the form of software functional units and sold or used as independent products, they may be stored in a computer-readable storage medium. Based on such an understanding, the part of the technical solution of the present invention that contributes to the prior art, or a part of the technical solution, may be embodied in the form of a software product; the computer software product is stored in a storage medium and includes several instructions for instructing a computer device (which may be a personal computer, a server, a network device or the like) or a processor to perform all or some of the steps of the methods described in the embodiments of the present invention. The aforementioned storage medium includes any medium that can store program code, such as a USB flash drive, a portable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk or an optical disc.
The foregoing descriptions are merely specific embodiments of the present invention, but the protection scope of the present invention is not limited thereto. Any change or replacement that can be readily figured out by a person skilled in the art within the technical scope disclosed by the present invention shall fall within the protection scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (19)

1. A method for a terminal device to identify user gestures, characterized by comprising:
acquiring, by a sensor of the terminal device, sensing data;
matching, by the terminal device, the sensing data against pre-registered recognition modes;
if the sensing data matches one or more of the recognition modes, acquiring, by the terminal device, the response operation of the matched recognition mode; and
executing, by the terminal device, the response operation.
2. The method according to claim 1, characterized in that the recognition mode comprises a recognition rule and the response operation;
wherein the recognition rule comprises a single-sensor recognition rule and/or a multi-sensor recognition rule;
wherein the single-sensor recognition rule is a rule established for the sensing data acquired by one sensor; and
the multi-sensor recognition rule is a composite rule established for the sensing data acquired by multiple sensors.
3. The method according to claim 2, characterized in that the matching, by the terminal device, of the sensing data against the pre-registered recognition modes specifically comprises:
matching, by the terminal device, the sensing data against the single-sensor recognition rule and/or the multi-sensor recognition rule of the recognition mode;
and that the sensing data matches one or more of the recognition modes specifically comprises:
the sensing data meets the single-sensor recognition rule and/or the multi-sensor recognition rule of the recognition mode.
4. The method according to any one of claims 1 to 3, characterized in that, before the sensor of the terminal device acquires the sensing data, the method further comprises:
turning on, by the terminal device, the sensor;
wherein the turning on, by the terminal device, of the sensor comprises:
starting, by the terminal device, a program;
registering, by the terminal device according to the program, the recognition mode corresponding to the program; and
turning on, by the terminal device, the sensor according to the recognition mode.
5. The method according to claim 4, characterized in that the executing, by the terminal device, of the response operation specifically comprises:
executing, by the started program of the terminal device, the response operation of the recognition mode corresponding to the program;
wherein the response operation is an operation that the started program is capable of executing.
6. The method according to claim 4 or 5, characterized in that the program comprises a system-level program and/or an application-level program;
wherein the system-level program is a program that is started when the terminal device is powered on;
the application-level program is a program that the terminal device starts according to user input;
and when the program is an application-level program, the method further comprises:
closing, by the terminal device, the application-level program;
unregistering, by the terminal device, the recognition mode corresponding to the closed application-level program; and
turning off, by the terminal device, the sensor corresponding to the unregistered recognition mode.
7. The method according to any one of claims 4 to 6, characterized in that the registering, by the terminal device according to the program, of the recognition mode corresponding to the program specifically comprises:
registering, by the terminal device according to the program, the recognition mode corresponding to the program in an application framework.
8. A terminal, characterized by comprising:
a sensing data acquisition module, configured to acquire sensing data through a sensor;
a matching module, configured to match the sensing data against pre-registered recognition modes;
a response operation acquisition module, configured to acquire the response operation of the matched recognition mode if the sensing data matches one or more of the recognition modes; and
a response operation execution module, configured to execute the response operation.
9. The terminal according to claim 8, wherein the recognition mode comprises: a recognition rule and the response operation;
wherein the recognition rule comprises: a single-sensor recognition rule and/or a multi-sensor recognition rule;
wherein the single-sensor recognition rule is a rule established over the sensing data obtained by one corresponding sensor; and
the multi-sensor recognition rule is a composite rule established over multiple pieces of sensing data obtained by multiple corresponding sensors.
10. The terminal according to claim 9, wherein the matching module comprises:
a first matching module, configured to match the sensing data against the single-sensor recognition rule of the recognition mode; and
a second matching module, configured to match the sensing data against the multi-sensor recognition rule of the recognition mode.
11. The terminal according to any one of claims 8 to 10, further comprising:
a sensor opening module, configured to open the sensor;
wherein the sensor opening module specifically comprises:
a program opening unit, configured to open a program;
a recognition mode registering unit, configured to register, according to the program, the recognition mode corresponding to the program; and
a sensor opening unit, configured to open the sensor according to the recognition mode.
12. The terminal according to claim 11, wherein the program comprises: a system-level program and/or an application-level program;
wherein the system-level program is a program started at the same time the terminal device is powered on;
the application-level program is a program opened by the terminal device according to a user input; and
when the program is an application-level program, the terminal further comprises:
a stop module, configured to close the application-level program;
a recognition mode deregistering module, configured to deregister the recognition mode corresponding to the closed application-level program; and
a sensor closing module, configured to close the sensor corresponding to the deregistered recognition mode.
13. A terminal, comprising: a processor, a memory, a bus and a sensor, wherein the processor, the memory and the sensor are interconnected through the bus; the sensor is configured to obtain sensing data; the memory is configured to store computer-executable instructions; and when the terminal runs, the processor executes the computer-executable instructions stored in the memory to match the sensing data against pre-registered recognition modes, obtain, when the sensing data matches one or more of the recognition modes, the response operation of the matched recognition mode, and execute the response operation.
14. The terminal according to claim 13, wherein the recognition mode comprises: a recognition rule and the response operation;
wherein the recognition rule comprises: a single-sensor recognition rule and/or a multi-sensor recognition rule;
wherein the single-sensor recognition rule is a rule established over the sensing data obtained by one corresponding sensor; and
the multi-sensor recognition rule is a composite rule established over multiple pieces of sensing data obtained by multiple corresponding sensors.
15. The terminal according to claim 14, wherein the matching, by the processor, of the sensing data against the pre-registered recognition modes specifically comprises:
matching, by the processor, the sensing data against the single-sensor recognition rule and/or the multi-sensor recognition rule of the recognition mode;
and the sensing data matching one or more of the recognition modes specifically comprises:
the sensing data satisfying the single-sensor recognition rule and/or the multi-sensor recognition rule of the recognition mode.
16. The terminal according to any one of claims 13 to 15, wherein the processor is further configured to open the sensor;
wherein opening the sensor by the processor comprises:
opening, by the processor, a program;
registering, by the processor according to the program, the recognition mode corresponding to the program; and
opening, by the processor, the sensor according to the recognition mode.
17. The terminal according to claim 16, wherein the executing, by the processor, of the response operation specifically comprises:
executing, by the opened program, the response operation of the recognition mode corresponding to the program;
wherein the response operation is an operation that the opened program is able to execute.
18. The terminal according to claim 16 or 17, wherein the program comprises: a system-level program and/or an application-level program;
wherein the system-level program is a program started at the same time the terminal device is powered on;
the application-level program is a program opened by the terminal device according to a user input; and
when the program is an application-level program, the processor is further configured to:
close the application-level program;
deregister the recognition mode corresponding to the closed application-level program; and
close the sensor corresponding to the deregistered recognition mode.
19. The terminal according to any one of claims 16 to 18, wherein the registering, by the processor according to the program, of the recognition mode corresponding to the program specifically comprises:
registering, by the processor according to the program, the recognition mode corresponding to the program in a program framework.
CN201380005552.5A 2013-12-03 2013-12-03 Method and device of using terminal device to identify user gestures Active CN104169858B (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2013/088389 WO2015081485A1 (en) 2013-12-03 2013-12-03 Method and device for terminal device to identify user gestures

Publications (2)

Publication Number Publication Date
CN104169858A true CN104169858A (en) 2014-11-26
CN104169858B CN104169858B (en) 2017-04-26

Family

ID=51912340

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201380005552.5A Active CN104169858B (en) 2013-12-03 2013-12-03 Method and device of using terminal device to identify user gestures

Country Status (2)

Country Link
CN (1) CN104169858B (en)
WO (1) WO2015081485A1 (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104536565A (en) * 2014-12-18 2015-04-22 深圳市酷商时代科技有限公司 Application program control method and device
CN106650346A (en) * 2015-10-29 2017-05-10 阿里巴巴集团控股有限公司 Password input method and equipment
CN110609751A (en) * 2018-06-14 2019-12-24 珠海市魅族科技有限公司 Terminal device control method and device, terminal device and computer readable storage medium
CN110618874A (en) * 2018-06-20 2019-12-27 珠海市魅族科技有限公司 Terminal device control method and device, terminal device and computer readable storage medium

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102662462A (en) * 2012-03-12 2012-09-12 中兴通讯股份有限公司 Electronic device, gesture recognition method and gesture application method
US20130307775A1 (en) * 2012-05-15 2013-11-21 Stmicroelectronics R&D Limited Gesture recognition
CN103067630A (en) * 2012-12-26 2013-04-24 刘义柏 Method of generating wireless control command through gesture movement of mobile phone
CN103226386A (en) * 2013-03-13 2013-07-31 广东欧珀移动通信有限公司 Gesture identification method and system based on mobile terminal
CN103279714A (en) * 2013-06-19 2013-09-04 深圳市中兴移动通信有限公司 Mobile terminal as well as data encryption and decryption method
CN103294201A (en) * 2013-06-27 2013-09-11 深圳市中兴移动通信有限公司 Mobile terminal and gesture controlling method thereof
CN103399633A (en) * 2013-07-17 2013-11-20 北京小米科技有限责任公司 Wireless remote control method and mobile terminal

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104536565A (en) * 2014-12-18 2015-04-22 深圳市酷商时代科技有限公司 Application program control method and device
CN104536565B (en) * 2014-12-18 2019-01-11 深圳市酷商时代科技有限公司 Application control method and device
CN106650346A (en) * 2015-10-29 2017-05-10 阿里巴巴集团控股有限公司 Password input method and equipment
CN110609751A (en) * 2018-06-14 2019-12-24 珠海市魅族科技有限公司 Terminal device control method and device, terminal device and computer readable storage medium
CN110609751B (en) * 2018-06-14 2024-01-23 珠海市魅族科技有限公司 Terminal equipment control method and device, terminal equipment and computer readable storage medium
CN110618874A (en) * 2018-06-20 2019-12-27 珠海市魅族科技有限公司 Terminal device control method and device, terminal device and computer readable storage medium
CN110618874B (en) * 2018-06-20 2023-11-07 珠海市魅族科技有限公司 Terminal equipment control method and device, terminal equipment and computer readable storage medium

Also Published As

Publication number Publication date
WO2015081485A1 (en) 2015-06-11
CN104169858B (en) 2017-04-26

Similar Documents

Publication Publication Date Title
KR102129374B1 (en) Method for providing user interface, machine-readable storage medium and portable terminal
US20150077362A1 (en) Terminal with fingerprint reader and method for processing user input through fingerprint reader
US9916514B2 (en) Text recognition driven functionality
KR102199786B1 (en) Information Obtaining Method and Apparatus
US10579152B2 (en) Apparatus, method and recording medium for controlling user interface using input image
KR102496531B1 (en) Method for providing fingerprint recognition, electronic apparatus and storage medium
US9229552B2 (en) System and method for synchronized operation of touch device
CN104049738A (en) Method and apparatus for operating sensors of user device
US9588678B2 (en) Method of operating electronic handwriting and electronic device for supporting the same
US20200249773A1 (en) Electronic device and method for mapping function of electronic device to operation of stylus pen
CN111857508B (en) Task management method and device and electronic equipment
CN108885615A (en) For the ink input of browser navigation
CN102023735A (en) Touch input equipment, electronic equipment and mobile phone
US20230019876A1 (en) Electronic device comprising a plurality of touch screen displays and screen division method
CN104169858A (en) Method and device of using terminal device to identify user gestures
US10055092B2 (en) Electronic device and method of displaying object
CN106569716B (en) Single-hand control method and control system
CN106598422B (en) hybrid control method, control system and electronic equipment
CN105183217A (en) Touch display device and touch display method
CN113486738A (en) Fingerprint identification method and device, electronic equipment and readable storage medium
CN106033286A (en) A projection display-based virtual touch control interaction method and device and a robot
WO2017143575A1 (en) Method for retrieving content of image, portable electronic device, and graphical user interface
CN104484078A (en) Man-machine interactive system and method based on radio frequency identification
US20180188822A1 (en) Electronic device having auxiliary device and method for receiving characters using same
US9740923B2 (en) Image gestures for edge input

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant