CN108139798A - Interaction method and wearable device - Google Patents

Interaction method and wearable device

Info

Publication number
CN108139798A
CN108139798A (application CN201680056170.9A)
Authority
CN
China
Prior art keywords
application
character
wearable device
sensor
motion trajectory
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201680056170.9A
Other languages
Chinese (zh)
Inventor
赵心宇
贺真
陈虎生
陈运哲
张泽狮
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Publication of CN108139798A
Legal status: Pending

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]

Abstract

An interaction method and a wearable device (100). The motion trajectory of a gesture is collected and recognized within the effective recognition region of the wearable device (100), and the character formed by the trajectory is used to query a stored mapping table of characters and applications, so that the application corresponding to the character is determined. In this method, the applications of the wearable device (100) are invoked by different gestures, which reduces the number of steps in which the user interacts with the wearable device (100), lowering both operation complexity and operation time. In addition, the effective recognition region gives the user a larger operating space, and the display screen (14) of the wearable device (100) is not blocked while the user operates.

Description

Interaction method and wearable device
Technical field
Embodiments of the present application relate to wearable device technology, and in particular to an interaction method and a wearable device.
Background
At present, wearable devices offer more and more functions, and a user can invoke different applications of a wearable device as needed, for example opening different application programs (applications, APPs), finding a designated contact, unlocking the screen lock, or taking a screenshot.
To invoke an application of the wearable device, the user generally has to perform several operations on the user interface. For example, when the user wants to open an APP that is located on the first-level interface, the following actions are performed in sequence: wake up the wearable device → open the application interface in which the APP can be selected → slide the menu → select the APP. As another example, when the APP to be opened is located on a second-level interface, the actions are: wake up the wearable device → switch to the application interface → open the interface in which the APP can be selected → slide the menu → select the APP.
Moreover, the operation interface of a wearable device is limited. For example, the dial of a smartwatch is usually designed with reference to a conventional wristwatch, so its diameter is small, typically around 45 millimetres. In the above interaction mode the user has to perform a series of operations on this limited interface; the process is cumbersome, its complexity is high, and the operation takes a long time.
Summary of the invention
Embodiments of the present application provide an interaction method and a wearable device, in which functions of the wearable device are invoked by different gestures. This reduces the number of steps in which the user interacts with the wearable device, lowering both operation complexity and operation time.
According to a first aspect, an embodiment of the present application provides an interaction method, described from the perspective of the wearable device. In this method, the wearable device collects, through a sensor, the motion trajectory of the user's gesture within an effective recognition region; the processor uses the character formed by the motion trajectory to query a stored mapping table of characters and applications, thereby determining the application corresponding to the character, and opens that application.
In this method, the applications of the wearable device are invoked by different gestures, which reduces the number of steps in which the user interacts with the wearable device, lowering both operation complexity and operation time. In addition, the effective recognition region gives the user a larger operating space, and the display screen of the wearable device is not blocked while the user operates.
In one feasible design, the mapping table is a one-to-one or one-to-many mapping table of characters and applications. When the mapping table is one-to-one, the wearable device directly opens the unique application after determining the application corresponding to the character. When the mapping table is one-to-many, the wearable device displays the application list corresponding to the character and, according to the user's operation, selects the target application from the list and starts it.
In this way, the correspondence between characters and applications can be set flexibly.
In one feasible design, when the mapping table is a one-to-many mapping table of characters and applications, the wearable device determines the priority of each application corresponding to the character and displays the application list corresponding to the character on the display screen in priority order.
In this way, the order in which applications are shown on the display screen can be set flexibly, which makes selection easier for the user.
In one feasible design, the wearable device is a wrist-worn device, the sensor is integrated in the wrist-worn device at a position close to the back of the wearer's hand, and the effective recognition region includes the back of the wearer's hand and the area around it.
By deploying the sensor at the position of the wrist-worn device close to the back of the hand, the effective recognition region is placed closer to the user, which makes the device easier to use.
In one feasible design, the sensor is an ultrasonic sensor, and the wearable device identifies the motion trajectory of the gesture within the effective recognition region from the ultrasonic waves emitted by the ultrasonic sensor.
According to a second aspect, an embodiment of the present application provides a wearable device, comprising:
an identification module, configured to identify the motion trajectory of a gesture within an effective recognition region, wherein the motion trajectory is a character and the effective recognition region is the sensing region of the sensor in the wearable device;
a processing module, configured to determine, according to a mapping table of characters and applications, the application corresponding to the character;
a starting module, configured to start the application corresponding to the character.
In one feasible design, the mapping table is a one-to-one or one-to-many mapping table of characters and applications.
In one feasible design, the starting module is specifically configured to display the application list corresponding to the character, select the target application from the list, and start the target application.
In one feasible design, when displaying the application list corresponding to the character, the starting module is specifically configured to display the list in priority order.
In one feasible design, the wearable device is a wrist-worn device, the sensor is integrated in the wrist-worn device at a position close to the back of the wearer's hand, and the effective recognition region includes the back of the wearer's hand and the area around it.
In one feasible design, when the sensor is an ultrasonic sensor, the identification module is specifically configured to identify the motion trajectory of the gesture within the effective recognition region according to the ultrasonic waves emitted by the ultrasonic sensor.
According to a third aspect, an embodiment of the present application provides a wearable device, comprising a sensor, a processor, a memory and a display screen. The sensor is configured to collect the motion trajectory of a gesture within the effective recognition region; the memory is configured to store the instructions and data to be executed by the processor; the processor is configured to execute the instructions in the memory to identify the motion trajectory, the motion trajectory being a character, and to determine, according to a mapping table of characters and applications, the application corresponding to the character and open the application; the display screen is configured to display the opened application.
According to a fourth aspect, an embodiment of the present invention provides a computer storage medium for storing the computer software instructions used by the wearable device of the first aspect, including the program designed to perform the above aspects.
According to a fifth aspect, an embodiment of the present invention provides a chip system, comprising at least one processor, a memory, an input/output interface and a bus. The at least one processor obtains instructions from the memory over the bus, so as to implement the functions of the wearable device involved in the method of the first aspect.
Embodiments of the present application provide an interaction method and a wearable device. The motion trajectory of a gesture is collected and recognized within the effective recognition region of the wearable device, and the character formed by the trajectory is used to query a stored mapping table of characters and applications, thereby determining the application corresponding to the character. In this method, the applications of the wearable device are invoked by different gestures, which reduces the number of steps in which the user interacts with the wearable device, lowering both operation complexity and operation time. In addition, the effective recognition region gives the user a larger operating space, and the display screen of the wearable device is not blocked while the user operates.
Brief description of the drawings
Fig. 1 is a schematic structural diagram of a wearable device to which the interaction method of the embodiments of the present application is applicable;
Fig. 2 is a flowchart of Embodiment 1 of the interaction method of the present application;
Fig. 3 is a schematic diagram of the process of starting the application corresponding to a gesture in the interaction method of the present application;
Fig. 4 is a flowchart of Embodiment 2 of the interaction method of the present application;
Fig. 5 is a schematic diagram of quickly finding a contact with the interaction method of the present application;
Fig. 6 is a schematic diagram of quickly starting an APP with the interaction method of the present application;
Fig. 7 is a schematic diagram of quickly unlocking with the interaction method of the present application;
Fig. 8 is a schematic structural diagram of Embodiment 2 of the wearable device of the present application.
Detailed description of embodiments
The terms "first", "second", "third", "fourth" and the like (if any) in the specification, claims and drawings of the embodiments of the present application are used to distinguish similar objects and are not intended to describe a particular order or sequence. It should be understood that data used in this way are interchangeable where appropriate, so that the embodiments described herein can also be implemented in orders other than those illustrated or described here. Furthermore, the terms "comprise" and "have" and any variants thereof are intended to cover non-exclusive inclusion; for example, a process, method, system, product or device that contains a series of steps or units is not necessarily limited to the steps or units explicitly listed, but may include other steps or units that are not explicitly listed or that are inherent to the process, method, product or device.
At present, when the user needs to open different applications of a wearable device, several operations have to be performed on the user interface. For example, when the user wants to open an APP located on the first-level interface, the following actions are performed in sequence: wake up the wearable device → open the application interface in which the APP can be selected → slide the menu → select the APP, that is, four operations. As another example, when the APP to be opened is located on a second-level interface, the actions are: wake up the wearable device → switch to the application interface → open the interface in which the APP can be selected → slide the menu → select the APP, that is, five operations. However, the operation interface of a wearable device is limited; in this interaction mode the user has to perform a series of operations on the limited interface, and the process is cumbersome, highly complex and time-consuming.
In view of this, embodiments of the present application provide an interaction method and a wearable device, in which functions of the wearable device are invoked by different gestures. This reduces the number of steps in which the user interacts with the wearable device, lowering both operation complexity and operation time.
Fig. 1 is a schematic structural diagram of a wearable device to which the interaction method of the embodiments of the present application is applicable. Referring to Fig. 1, the wearable device 100 includes a sensor 11, a processor 12, a memory 13 and a display screen 14. The sensor 11 collects the motion trajectory of a gesture within the effective recognition region; the memory 13 stores the instructions and data to be executed by the processor 12; the processor 12 executes the instructions in the memory 13 to identify the motion trajectory, which is a character, and to determine, according to a mapping table of characters and applications, the application corresponding to the character and open the application; the display screen 14 displays the opened application.
The wearable device involved in the embodiments of the present application may have more or fewer components than shown in Fig. 1, may combine two or more components, or may have a different configuration or arrangement of components. The components may be implemented in hardware, in software, or in a combination of hardware and software including one or more signal-processing and/or application-specific integrated circuits.
The interaction method described in the embodiments of the present application is explained in detail below on the basis of Fig. 1. Specifically, refer to Fig. 2, a flowchart of Embodiment 1 of the interaction method of the present application, which comprises:
101. Identify the motion trajectory of a gesture within the effective recognition region, wherein the motion trajectory is a character and the effective recognition region is the sensing region of the sensor in the wearable device.
In the embodiments of the present application, a sensor capable of recognizing the trajectory of a finger or an object, such as an ultrasonic, infrared (structured light), radar or camera sensor, can be integrated in the wearable device. The trajectory traced by a finger or an object (for example, a pen) is the gesture. When the user wears the wearable device, the sensor forms a sensing region around itself, and this sensing region is the effective recognition region. Taking a smartwatch as an example, a smartwatch is usually worn on the wrist above the back of the hand, so the sensor is deployed on the side of the smartwatch facing the back of the hand. The sensor then forms a sensing region over the back of the hand up to a certain height, and this region is the effective recognition region. Specifically, refer to Fig. 3, a schematic diagram of the process of starting the application corresponding to a gesture in the interaction method of the present application.
Referring to Fig. 3, the smartwatch is worn on the user's wrist facing the back of the hand. No matter how the user's wrist rotates, the sensor forms the effective recognition region over the back of the hand up to a certain height; it can collect the motion trajectory of a gesture within the effective recognition region and cannot collect it outside this region. For example, when the fist eye faces the user, the effective recognition region is above the back of the hand and the sensor collects the motion trajectory of the gesture in the plane formed by the x-axis and the z-axis; when the fist eye faces the ground, the effective recognition region is over the back of the hand and the sensor collects the motion trajectory of the gesture in the plane formed by the x-axis and the y-axis. After the sensor has collected the motion trajectory, the processor identifies it and determines the corresponding character. When collecting motion trajectories within the effective recognition region, the sensor can collect trajectories in different numbers of dimensions. How the sensor collects the motion trajectory of a gesture in the effective recognition region is described in detail below.
Specifically, in the embodiments of the present application there is at least one sensor. With at least one sensor, a one-dimensional motion trajectory can be collected; with at least two sensors, a two-dimensional trajectory can be collected; with at least three sensors, a three-dimensional trajectory can be collected.
For the one-dimensional (linear) case, the motion trajectory can be collected with a single sensor. During collection, the sensor emits infrared light or ultrasound and measures the time it takes for the signal to return after hitting an obstacle; the distance between the sensor and the measured object is calculated from this round-trip time. By recording the distances between the sensor and the measured object at two different times t1 and t2, the distance the object moves along the line during the period t1 to t2 can be derived, which gives the motion trajectory.
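As a concrete illustration of this one-dimensional case, the sketch below converts an ultrasonic round-trip time into a distance and derives the displacement between the two sampling instants t1 and t2. It is a minimal sketch assuming sound travels at roughly 343 m/s in air; the function names are illustrative and not taken from the patent.

```python
SPEED_OF_SOUND_M_S = 343.0  # approximate speed of sound in air at room temperature


def distance_from_echo(round_trip_time_s: float) -> float:
    """Distance between the sensor and the measured object from one echo."""
    # The pulse travels to the object and back, so the one-way path is half.
    return SPEED_OF_SOUND_M_S * round_trip_time_s / 2.0


def linear_displacement(round_trip_t1_s: float, round_trip_t2_s: float) -> float:
    """Displacement of the object along the sensing axis between t1 and t2."""
    return distance_from_echo(round_trip_t2_s) - distance_from_echo(round_trip_t1_s)


# Echoes of 1.2 ms at t1 and 0.9 ms at t2: the object moved about 5 cm closer.
print(linear_displacement(1.2e-3, 0.9e-3))  # roughly -0.051 (metres)
```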
For the two-dimensional (planar) case, with an X-axis and a Y-axis, at least two sensors are needed. For example, with sensors A and B and a known distance L between them, sensor A and sensor B measure the distances L11 and L21 to the measured object at time t1, and the distances L12 and L22 at time t2. The offsets of the object along the X-axis and Y-axis in the plane can then be calculated from L, L11, L21, L12 and L22, and continuous measurement determines the motion trajectory of the object in the plane.
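The sketch below shows one standard way to turn the two distance readings into planar coordinates: sensor A is placed at the origin, sensor B at (L, 0), and the per-axis offsets over t1 to t2 are simply coordinate differences. The geometry, and the assumption that the object lies on the positive-Y side of the A-B baseline, are illustrative choices rather than the patent's own algorithm.

```python
import math


def position_2d(d_a: float, d_b: float, baseline: float) -> tuple[float, float]:
    """Locate an object in the plane from its distances to two sensors.

    Sensor A is assumed at (0, 0) and sensor B at (baseline, 0); the object
    is assumed to lie on the positive-Y side of the A-B baseline.
    """
    x = (d_a ** 2 - d_b ** 2 + baseline ** 2) / (2.0 * baseline)
    y = math.sqrt(max(d_a ** 2 - x ** 2, 0.0))
    return x, y


L = 0.04                               # 4 cm between sensors A and B
x1, y1 = position_2d(0.10, 0.090, L)   # distances L11 and L21 measured at t1
x2, y2 = position_2d(0.08, 0.085, L)   # distances L12 and L22 measured at t2
print(x2 - x1, y2 - y1)                # offsets along X and Y over t1..t2
```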
For the three-dimensional case, with X-, Y- and Z-axes, at least three sensors are needed, for example sensors A, B and C, which are not collinear and whose pairwise distances are known. At time t1 the distances between sensors A, B and C and the measured object are L11, L21 and L31 respectively, and at time t2 they are L12, L22 and L32. The offsets of the object along the X-, Y- and Z-axes during the period t1 to t2 can then be calculated from L11, L21, L31, L12, L22 and L32 and the fixed distances between each pair of sensors, and continuous measurement determines the motion trajectory of the object in space.
It should be noted that, for ease of calculation in the three-dimensional case, sensor A can be placed at the intersection of the X/Z plane and the X/Y plane, sensor B in the X/Y plane and sensor C in the X/Z plane. The offsets of the object along the X-axis and Y-axis during the period t1 to t2 can then be measured by sensors A and B, and the offsets along the X-axis and Z-axis by sensors A and C, which together give the offsets of the object along the X-, Y- and Z-axes.
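With the simplified placement described above, the three distances can be combined into a spatial position by standard trilateration, as sketched below. The concrete sensor coordinates chosen here (A at the origin, B on the X-axis, C in the X/Z plane) and the choice of the non-negative Y root (the object is assumed to be above the sensor plane) are assumptions added for illustration; they are one convenient special case of the placement the text describes.

```python
import math


def position_3d(d_a: float, d_b: float, d_c: float,
                b: float, cx: float, cz: float) -> tuple[float, float, float]:
    """Locate an object in space from its distances to three sensors.

    Assumed placement: sensor A at (0, 0, 0), sensor B at (b, 0, 0) on the
    X-axis, sensor C at (cx, 0, cz) in the X/Z plane with cz != 0.  The
    object is assumed to lie on the positive-Y side of the sensor plane.
    """
    x = (d_a ** 2 - d_b ** 2 + b ** 2) / (2.0 * b)
    z = (d_a ** 2 - d_c ** 2 + cx ** 2 + cz ** 2 - 2.0 * cx * x) / (2.0 * cz)
    y = math.sqrt(max(d_a ** 2 - x ** 2 - z ** 2, 0.0))
    return x, y, z


# Offsets along X, Y and Z between t1 and t2 are again coordinate differences.
p1 = position_3d(0.10, 0.090, 0.11, b=0.04, cx=0.02, cz=0.03)  # L11, L21, L31
p2 = position_3d(0.08, 0.085, 0.09, b=0.04, cx=0.02, cz=0.03)  # L12, L22, L32
print(tuple(c2 - c1 for c1, c2 in zip(p1, p2)))
```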
It should also be noted that in the embodiments of the present application the effective recognition region is the sensing region of the sensor; it is unrelated to the user's body part and depends only on the wearing position. For example, when the wearable device is worn on the user's wrist facing the palm and the sensor is integrated on the side of the device close to the palm, the sensor forms the effective recognition region over the palm up to a certain height no matter how the wrist rotates; when the device is worn on the wrist facing the back of the hand and the sensor is integrated on the side close to the back of the hand, the sensor forms the effective recognition region over the back of the hand up to a certain height no matter how the wrist rotates.
In addition, since the sensing region of the sensor is relatively large, collecting motion trajectories over the whole sensing region and having the processor identify them would increase the power consumption of the wearable device. To avoid this, in the embodiments of the present application the effective recognition region may be only a part of the sensing region, and the sensor collects motion trajectories only within that part of the sensing region.
102. Determine, according to the mapping table of characters and applications, the application corresponding to the character.
After the character corresponding to the motion trajectory has been identified, the processor queries the stored mapping table of characters and applications with that character, thereby determining the corresponding application. Specifically, the processor processes the motion trajectory of the finger or object with a software algorithm, recognizes the character formed by the trajectory, such as a letter, a digit or a Chinese character, and queries the mapping table with that character to determine the corresponding application. The mapping table stores the correspondence between characters and applications; different characters correspond to different applications, and the mapping tables of different scenes differ. When a correspondence needs to be added to or deleted from the mapping table, the table can be updated dynamically.
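A minimal sketch of such a mapping table with its lookup and dynamic update is shown below; the table contents and function names are illustrative assumptions, not part of the patent.

```python
# Illustrative character-to-application mapping table; a character may map to
# one application or to several (one-to-one or one-to-many).
mapping_table: dict[str, list[str]] = {
    "A": ["Alipay"],
    "C": ["Contacts"],
    "L": ["Unlock"],
}


def lookup_applications(character: str) -> list[str]:
    """Return the applications mapped to a recognized character."""
    return mapping_table.get(character, [])


def update_mapping(character: str, apps: list[str]) -> None:
    """Dynamically add or replace an entry in the mapping table."""
    mapping_table[character] = apps


def remove_mapping(character: str) -> None:
    """Dynamically delete an entry from the mapping table."""
    mapping_table.pop(character, None)


print(lookup_applications("A"))  # ['Alipay']
```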
103. Start the application corresponding to the character.
After determining the application corresponding to the character, the processor opens it. Here an application includes opening an APP, finding a designated contact, unlocking the screen lock, taking a screenshot, entering a specific mode, and so on. Continuing with Fig. 3, the processor identifies the character corresponding to the motion trajectory as the letter "A" through a software algorithm, queries the mapping table and determines that the application corresponding to "A" is opening Alipay, and then opens Alipay. As can be seen from Fig. 3, the user performs only one step, inputting a gesture in the effective recognition region, to open Alipay.
In the interaction method provided by the embodiments of the present application, the motion trajectory of a gesture is collected and recognized within the effective recognition region of the wearable device, and the character formed by the trajectory is used to query the stored mapping table of characters and applications, thereby determining the application corresponding to the character. In this method, the applications of the wearable device are invoked by different gestures, which reduces the number of steps in which the user interacts with the wearable device, lowering both operation complexity and operation time. In addition, the effective recognition region gives the user a larger operating space, and the display screen of the wearable device is not blocked while the user operates.
The above interaction method is described in detail below for the case where an ultrasonic sensor is arranged on the wearable device. Specifically, refer to Fig. 4, a flowchart of Embodiment 2 of the interaction method of the present application, which comprises:
201. Ultrasonic data collection.
In this step, the ultrasonic transceivers of the ultrasonic sensor collect the distances and times related to the movement of the finger or object within the effective recognition region. For details refer to the description of step 101, which is not repeated here.
202. Ultrasonic recognition processing.
In this step, the processor performs noise reduction on the ultrasonic data collected in step 201, then integrates the distance and time samples collected by each ultrasonic transceiver to generate the motion trajectory of the finger or object, and recognizes the gesture, i.e. the character, such as a digit, a letter or a Chinese character, from the trajectory.
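The sketch below illustrates one simple way to carry out this step: a moving-average filter smooths the raw distance samples of each transceiver, and the smoothed readings at each time step are combined into a sequence of planar points using the two-sensor geometry from step 101 (the `position_2d` helper sketched earlier). The filter choice and the downstream character recognizer are assumptions; the patent leaves the concrete algorithm open.

```python
from statistics import mean


def smooth(samples: list[float], window: int = 3) -> list[float]:
    """Moving-average noise reduction over raw ultrasonic distance samples."""
    half = window // 2
    return [mean(samples[max(0, i - half):i + half + 1]) for i in range(len(samples))]


def build_trajectory(dist_a: list[float], dist_b: list[float],
                     baseline: float) -> list[tuple[float, float]]:
    """Combine the smoothed readings of transceivers A and B into 2-D points."""
    return [position_2d(d_a, d_b, baseline)          # helper from the earlier sketch
            for d_a, d_b in zip(smooth(dist_a), smooth(dist_b))]


# trajectory = build_trajectory(raw_a, raw_b, baseline=0.04)
# character = recognize_character(trajectory)  # hypothetical template or ML classifier
```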
203. Determine, according to the mapping table of characters and applications, the application corresponding to the character.
In the embodiments of the present application the mapping table has the following characteristics. First, the mapping table is predefined by the system or customized by the user for a scene. Second, the mapping table of each scene is independent, and the system and the user can set it dynamically as needed; taking a smartwatch as an example, when the smartwatch is currently on the watch-face interface, the character A can be set to start the Alipay application, and when it is currently on the dialing interface, the character A can be set to dial Anny, and so on. Third, the mapping table is a one-to-one mapping table of characters and applications, or a one-to-many mapping table of characters and applications.
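A sketch of such per-scene tables, using the smartwatch example above, is given below; the scene names and entries come from the text and are otherwise arbitrary illustrations.

```python
# Each scene keeps its own, independently configurable mapping table.
scene_tables: dict[str, dict[str, list[str]]] = {
    "watch_face": {"A": ["Alipay"]},          # on the watch face, 'A' starts Alipay
    "dialing":    {"A": ["Anny", "Andred"]},  # on the dialing interface, 'A' maps to contacts
}


def lookup_in_scene(scene: str, character: str) -> list[str]:
    """Query the mapping table of the current scene for a recognized character."""
    return scene_tables.get(scene, {}).get(character, [])


def set_mapping(scene: str, character: str, apps: list[str]) -> None:
    """System- or user-defined dynamic configuration of a scene's table."""
    scene_tables.setdefault(scene, {})[character] = apps


print(lookup_in_scene("dialing", "A"))  # ['Anny', 'Andred']
```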
204. Determine whether the application corresponding to the character is unique; if it is unique, perform 205; if not, perform 206.
205. Quickly start the application.
In this step, for a specific scene, if the application corresponding to the gesture is unique, that application is started directly. For example, the user inputs the character "A" in the effective recognition region, and under the dialing scene the character "A" has only one candidate mapping, for example Anny; the smartwatch then dials Anny directly.
206. Display the application list corresponding to the character.
In this step the mapping table is a one-to-many mapping table of characters and applications. For a specific scene, if the character corresponds to at least two applications, the application list corresponding to the character is shown on the display screen. For example, the user inputs the character "A" in the effective recognition region, and under the dialing scene the character "A" has several candidate mappings, such as Anny and Andred; the application list is then displayed, and the user decides from the list whom to dial.
207. The user selects the target application from the application list, and the target application is started.
In this step the user makes a secondary selection, for example with an "up/down" gesture or by inputting the first character that differs between the candidates, thereby selecting the target application from the list, and the wearable device starts the target application. For example, when the character "A" has several candidate mappings under the dialing scene, the user's selection determines the contact to be dialed.
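Steps 204 to 207 amount to a few lines of control flow, sketched below on top of the per-scene lookup above. The `launch`, `show_list` and `wait_for_user_choice` helpers are placeholders for device-specific operations and are assumptions, not APIs from the patent.

```python
def launch(app: str) -> None:
    print(f"starting {app}")           # placeholder for the device's app-launch call


def show_list(apps: list[str]) -> None:
    print("candidates:", apps)         # placeholder for drawing the list on the display


def wait_for_user_choice(apps: list[str]) -> str:
    return apps[0]                     # placeholder: up/down gesture or a further character


def handle_character(scene: str, character: str) -> None:
    """Steps 204-207: start a unique match directly, otherwise let the user pick."""
    candidates = lookup_in_scene(scene, character)   # from the previous sketch
    if not candidates:
        return                                       # no mapping configured
    if len(candidates) == 1:
        launch(candidates[0])                        # 205: quick start
    else:
        show_list(candidates)                        # 206: display the application list
        launch(wait_for_user_choice(candidates))     # 207: start the selected target


handle_character("dialing", "A")
```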
The above interaction method is described below with several specific embodiments. Specifically, see Fig. 5, Fig. 6 and Fig. 7: Fig. 5 is a schematic diagram of quickly finding a contact with the interaction method of the present application, Fig. 6 of quickly starting an APP, and Fig. 7 of quickly unlocking.
Referring to Fig. 5, when the wearable device is in standby mode, the user writes the letter "C" (or another letter, digit or other gesture) over the back of the hand. The sensor collects the motion trajectory of the gesture, the processor identifies it as the letter "C", queries the mapping table and determines that the application corresponding to "C" is the contact list, and the contact list (contacts) of the wearable device is opened directly. The user then makes another gesture input in contact-list mode, for example "Z"; the sensor collects the motion trajectory, the processor identifies it as the character "Z" and directly selects the contacts whose names start with "Z". If there is only one such contact, the user is prompted to dial, or the number is dialed directly; if there are at least two, the display screen first shows those contacts and the processor dials the target contact selected by the user, for example through a sliding operation.
In the quick contact lookup above, when there are at least two matching contacts the processor displays them in priority order, so that the contacts shown on the display screen appear in a definite order. Specifically, see Table 1, which shows the priorities of contacts when quickly finding a contact.
Table 1
Referring to Table 1, when there are at least two matching contacts: if the priority is user-defined, the order of the contacts is adjusted according to the user's sliding or other operations; if the priority is the number of calls, the processor sorts the contacts by call count and shows them on the display screen.
Referring to Fig. 6, when the wearable device is in standby mode, the user writes the letter "A" (or another letter, digit or other gesture) over the back of the hand. The sensor collects the motion trajectory of the gesture, the processor identifies it as the character "A", queries the mapping table and determines that the application corresponding to "A" is Alipay, and then opens Alipay directly.
When the character "A" corresponds to several APPs, for example Alipay, iqiyi.com and Alibaba are all present on the wearable device, i.e. the character "A" and the applications have a one-to-many correspondence, the processor displays those APPs in priority order, so that the APPs shown on the display screen appear in a definite order. Specifically, see Table 2, which shows the priorities of APPs when quickly starting an APP.
Table 2
Priority      User-defined
Priority 1    Usage count
Priority 2    Importance, for example whether the APP involves money or property
From Fig. 6 and Table 2: when the user inputs the character "A" in the effective recognition region, the processor quickly starts an APP. Which APP is started is determined by the priority. For example, when the priority is user-defined and the APP customized by the user is Alipay, the processor quickly invokes Alipay; as another example, when the priority is the number of uses, the processor determines the APP with the highest usage count and quickly invokes that APP.
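As a combined illustration of Tables 1 and 2, the sketch below orders candidate contacts or APPs by a user-defined list when one exists and otherwise falls back to a call or usage counter; the data shapes and the counter source are assumptions.

```python
from typing import Optional


def order_by_priority(candidates: list[str],
                      user_order: Optional[list[str]],
                      usage_count: dict[str, int]) -> list[str]:
    """Order candidates as in Tables 1 and 2: a user-defined order takes
    precedence; otherwise sort by how often each entry was called or used."""
    if user_order:
        ranked = [app for app in user_order if app in candidates]
        return ranked + [app for app in candidates if app not in ranked]
    return sorted(candidates, key=lambda app: usage_count.get(app, 0), reverse=True)


# 'A' maps to several APPs; with no user-defined order, usage count decides.
print(order_by_priority(["Alipay", "iqiyi.com", "Alibaba"],
                        user_order=None,
                        usage_count={"Alipay": 42, "iqiyi.com": 7, "Alibaba": 3}))
```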
Referring to Fig. 7, when the wearable device is in standby mode, the user writes the letter "L" (or another letter, digit or other gesture) over the back of the hand. The sensor collects the motion trajectory of the gesture, the processor identifies it as the character "L", queries the mapping table and determines that the application corresponding to "L" is the unlock operation, and the wearable device is then unlocked directly.
This method ensures the privacy of the unlocking process: no operation trace is revealed during the operation, which protects the unlock gesture to a large extent.
Fig. 8 is a schematic structural diagram of Embodiment 2 of the wearable device of the embodiments of the present invention. The wearable device provided by this embodiment can perform the steps of the method applied to a wearable device provided by any embodiment of the present invention. Specifically, the wearable device 200 provided by this embodiment includes:
an identification module 21, configured to identify the motion trajectory of a gesture within the effective recognition region, wherein the motion trajectory is a character and the effective recognition region is the sensing region of the sensor in the wearable device;
a processing module 22, configured to determine, according to a mapping table of characters and applications, the application corresponding to the character;
a starting module 23, configured to start the application corresponding to the character.
In the wearable device provided by the embodiments of the present application, the motion trajectory of a gesture is collected and recognized within the effective recognition region of the wearable device, and the character formed by the trajectory is used to query the stored mapping table of characters and applications, thereby determining the application corresponding to the character. In this way, the applications of the wearable device are invoked by different gestures, which reduces the number of steps in which the user interacts with the wearable device, lowering both operation complexity and operation time. In addition, the effective recognition region gives the user a larger operating space, and the display screen of the wearable device is not blocked while the user operates.
Optionally, in an embodiment of the present application, the mapping table is a one-to-one or one-to-many mapping table of characters and applications.
Optionally, in an embodiment of the present application, the starting module 23 is specifically configured to display the application list corresponding to the character, select the target application from the application list, and start the target application.
Optionally, in an embodiment of the present application, when displaying the application list corresponding to the character, the starting module 23 is specifically configured to display the list in priority order.
Optionally, in an embodiment of the present application, the wearable device is a wrist-worn device, the sensor is integrated in the wrist-worn device at a position close to the back of the wearer's hand, and the effective recognition region includes the back of the wearer's hand and the area around it.
Optionally, in an embodiment of the present application, when the sensor is an ultrasonic sensor, the identification module 21 is specifically configured to identify the motion trajectory of the gesture within the effective recognition region according to the ultrasonic waves emitted by the ultrasonic sensor.
A person skilled in the art can clearly understand that, for convenience and brevity of description, the specific working processes of the system, apparatus and units described above may refer to the corresponding processes in the foregoing method embodiments and are not repeated here.
In the several embodiments provided in this application, it should be understood that the interaction method and the wearable device may be implemented in other ways. For example, the apparatus embodiments described above are merely illustrative; the division of the units is only a division by logical function, and there may be other divisions in actual implementation; for example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not implemented. Furthermore, the mutual couplings, direct couplings or communication connections shown or discussed may be indirect couplings or communication connections through some interfaces, apparatuses or units, and may be electrical, mechanical or in other forms.
The units described as separate components may or may not be physically separate, and components shown as units may or may not be physical units; they may be located in one place or distributed over multiple network units. Some or all of the units may be selected as needed to achieve the purpose of the solution of this embodiment.
In addition, the functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist physically alone, or two or more units may be integrated into one unit. The integrated unit may be implemented in the form of hardware or in the form of a software functional unit.
A person of ordinary skill in the art will understand that all or part of the steps of the above method embodiments may be implemented by hardware related to program instructions. The aforementioned program may be stored in a processor-readable storage medium; when the program is executed, the steps of the above method embodiments are performed. The aforementioned storage medium includes various media that can store program code, such as a ROM, a RAM, a magnetic disk or an optical disk.
Finally, it should be noted that the above embodiments are only intended to illustrate the technical solutions of this application, not to limit them. Although this application has been described in detail with reference to the foregoing embodiments, a person skilled in the art should understand that the technical solutions described in the foregoing embodiments can still be modified, or some or all of their technical features can be replaced by equivalents, and such modifications or replacements do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of this application.

Claims (13)

  1. An interaction method, applicable to a wearable device, characterized in that the method comprises:
    identifying the motion trajectory of a gesture within an effective recognition region, wherein the motion trajectory is a character, and the effective recognition region is a sensing region of a sensor in the wearable device;
    determining, according to a mapping table of characters and applications, the application corresponding to the character;
    starting the application corresponding to the character.
  2. The method according to claim 1, characterized in that
    the mapping table is a one-to-one or one-to-many mapping table of characters and applications.
  3. The method according to claim 2, characterized in that starting the application corresponding to the character comprises:
    displaying an application list corresponding to the character;
    selecting a target application from the application list;
    starting the target application.
  4. The method according to claim 3, characterized in that displaying the application list corresponding to the character comprises:
    displaying the application list corresponding to the character in priority order.
  5. The method according to any one of claims 1 to 4, characterized in that the wearable device is a wrist-worn device, the sensor is integrated in the wrist-worn device at a position close to the back of the wearer's hand, and the effective recognition region includes the back of the wearer's hand and the area around it.
  6. The method according to any one of claims 1 to 5, characterized in that the sensor is an ultrasonic sensor, and identifying the motion trajectory of a gesture within the effective recognition region comprises:
    identifying the motion trajectory of the gesture within the effective recognition region according to the ultrasonic waves emitted by the ultrasonic sensor.
  7. A wearable device, characterized by comprising:
    an identification module, configured to identify the motion trajectory of a gesture within an effective recognition region, wherein the motion trajectory is a character, and the effective recognition region is a sensing region of a sensor in the wearable device;
    a processing module, configured to determine, according to a mapping table of characters and applications, the application corresponding to the character;
    a starting module, configured to start the application corresponding to the character.
  8. The device according to claim 7, characterized in that
    the mapping table is a one-to-one or one-to-many mapping table of characters and applications.
  9. The device according to claim 8, characterized in that
    the starting module is specifically configured to display an application list corresponding to the character, select a target application from the application list, and start the target application.
  10. The device according to claim 9, characterized in that
    when displaying the application list corresponding to the character, the starting module is specifically configured to display the application list corresponding to the character in priority order.
  11. The device according to any one of claims 7 to 10, characterized in that the wearable device is a wrist-worn device, the sensor is integrated in the wrist-worn device at a position close to the back of the wearer's hand, and the effective recognition region includes the back of the wearer's hand and the area around it.
  12. The device according to any one of claims 7 to 11, characterized in that
    when the sensor is an ultrasonic sensor, the identification module is specifically configured to identify the motion trajectory of the gesture within the effective recognition region according to the ultrasonic waves emitted by the ultrasonic sensor.
  13. A wearable device, characterized by comprising a sensor, a processor, a memory and a display screen, wherein the sensor is configured to collect the motion trajectory of a gesture within an effective recognition region; the memory is configured to store instructions and data to be executed by the processor; the processor is configured to execute the instructions in the memory to identify the motion trajectory, the motion trajectory being a character, and to determine, according to a mapping table of characters and applications, the application corresponding to the character and open the application; and the display screen is configured to display the opened application.
CN201680056170.9A 2016-09-26 2016-12-19 Interaction method and wearable device Pending CN108139798A (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
CN2016108532570 2016-09-26
CN201610853257 2016-09-26
PCT/CN2016/110873 WO2018053956A1 (en) 2016-09-26 2016-12-19 Interaction method, and wearable device

Publications (1)

Publication Number Publication Date
CN108139798A true CN108139798A (en) 2018-06-08

Family

ID=61689813

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201680056170.9A Pending CN108139798A (en) 2016-09-26 2016-12-19 Interaction method and wearable device

Country Status (2)

Country Link
CN (1) CN108139798A (en)
WO (1) WO2018053956A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110244843B (en) * 2019-06-03 2023-12-08 努比亚技术有限公司 Wearable device control method, wearable device and computer readable storage medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2014125294A1 (en) * 2013-02-15 2014-08-21 Elliptic Laboratories As Touchless user interfaces
CN104267898A (en) * 2014-09-16 2015-01-07 北京数字天域科技股份有限公司 Method and device for quick triggering application program or application program function
CN105183331A (en) * 2014-05-30 2015-12-23 北京奇虎科技有限公司 Method and device for controlling gesture on electronic device
CN105718064A (en) * 2016-01-22 2016-06-29 南京大学 Gesture recognition system and method based on ultrasonic waves
CN105807900A (en) * 2014-12-30 2016-07-27 丰唐物联技术(深圳)有限公司 Non-contact type gesture control method and intelligent terminal

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8830181B1 (en) * 2008-06-01 2014-09-09 Cypress Semiconductor Corporation Gesture recognition system for a touch-sensing surface
US9372535B2 (en) * 2013-09-06 2016-06-21 Thalmic Labs Inc. Systems, articles, and methods for electromyography-based human-electronics interfaces
CN103558920B (en) * 2013-11-15 2018-06-19 努比亚技术有限公司 The processing method and processing device of Non-contact posture
EP3637227B1 (en) * 2014-05-20 2023-04-12 Huawei Technologies Co., Ltd. Method for performing operation on intelligent wearing device by using gesture, and intelligent wearing device
CN105389003A (en) * 2015-10-15 2016-03-09 广东欧珀移动通信有限公司 Control method and apparatus for application in mobile terminal

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117055738A (en) * 2023-10-11 2023-11-14 湖北星纪魅族集团有限公司 Gesture recognition method, wearable device and storage medium
CN117055738B (en) * 2023-10-11 2024-01-19 湖北星纪魅族集团有限公司 Gesture recognition method, wearable device and storage medium

Also Published As

Publication number Publication date
WO2018053956A1 (en) 2018-03-29


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20180608