WO2018053956A1 - Interaction Method and Wearable Device - Google Patents


Info

Publication number
WO2018053956A1
Authority
WO
WIPO (PCT)
Prior art keywords
character
application
wearable device
sensor
area
Prior art date
Application number
PCT/CN2016/110873
Other languages
English (en)
French (fr)
Inventor
赵心宇
贺真
陈虎生
陈运哲
张泽狮
Original Assignee
华为技术有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co., Ltd. (华为技术有限公司)
Priority to CN201680056170.9A priority Critical patent/CN108139798A/zh
Publication of WO2018053956A1 publication Critical patent/WO2018053956A1/zh

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]

Definitions

  • the embodiments of the present application relate to wearable device technologies, and in particular, to an interaction method and a wearable device.
  • at present, wearable devices provide more and more functions, and the user can invoke different applications of the wearable device as needed, such as opening different applications (Application, APP), finding a specified contact, unlocking the screen, and taking a screenshot.
  • in order to invoke an application of a wearable device, in general, the user needs to perform a number of steps on the operation interface. For example, when the user needs to open an APP located in the first-level interface, the following actions are performed in sequence: wake up the wearable device → open the application interface to display selectable APPs → slide the menu → select the APP. As another example, when the APP to be opened is located in a second-level interface, the following actions are performed in sequence: wake up the wearable device → switch to the application interface → open the application interface to display selectable APPs → slide the menu → select the APP.
  • the operating interface of the wearable device is limited.
  • the dial size of the smart watch is mostly designed with reference to a conventional watch.
  • the diameter of the dial is relatively small, usually around 45 mm.
  • An embodiment of the present invention provides an interaction method and a wearable device, which invokes a function of a wearable device through different gestures, reduces steps of interaction between the user and the wearable device, reduces operation complexity, and reduces operation time.
  • in a first aspect, an embodiment of the present application provides an interaction method, described from the perspective of the wearable device, in which the wearable device collects a motion track of a user's gesture in an effective identification area by using a sensor, and the processor queries a stored correspondence table between characters and applications with the character formed by the motion track, thereby determining the application corresponding to the character and starting the application.
  • the application of the wearable device is invoked through different gestures, the steps of the user interacting with the wearable device are reduced, the operation complexity is reduced, and the operation time is reduced.
  • the effective recognition area provides the user with a larger operation space, and the display screen of the wearable device is not blocked during the user operation.
  • the correspondence table is a one-to-one or one-to-many correspondence table between characters and applications.
  • when the correspondence table is a one-to-one correspondence table between characters and applications, the wearable device directly opens the unique application after determining the application corresponding to the character; when the correspondence table is a one-to-many correspondence table, the wearable device displays an application list corresponding to the character, then selects a target application from the application list according to a user operation and starts it.
  • when the correspondence table is one-to-many, the wearable device determines the priority of each application corresponding to the character, and displays the application list corresponding to the character on the display screen according to the priority.
  • the order of the applications displayed on the display screen can be flexibly set to facilitate the user's selection.
  • the wearable device is a wrist wearable device
  • the sensor is integrated at a portion of the wrist wearable device near the back side of the wearer's hand, and the effective identification area includes the area where the back of the wearer's hand is located and the area surrounding the back of the hand.
  • the sensor is disposed on a portion of the wrist wearable device near the back side of the wearer's hand, so that the effective identification area is close to the user and convenient to operate.
  • the sensor is an ultrasonic sensor
  • the wearable device recognizes a motion trajectory of the gesture in the effective recognition zone according to the ultrasonic wave emitted by the ultrasonic sensor.
  • the embodiment of the present application provides a wearable device, including:
  • An identification module configured to identify a motion track of a gesture in the effective recognition area, wherein the motion track is a character; the valid recognition area is a sensing area of a sensor in the wearable device;
  • a processing module configured to determine, according to a correspondence table between the character and the application, an application corresponding to the character
  • the startup module is configured to start an application corresponding to the character.
  • the correspondence table is a one-to-one or one-to-many correspondence table between characters and applications.
  • the startup module is specifically configured to display an application list corresponding to the character, select a target application from the application list, and start the target application.
  • the startup module is specifically configured to display an application list corresponding to the character according to the priority.
  • the wearable device is a wrist wearable device
  • the sensor is integrated at a portion of the wrist wearable device near the back side of the wearer's hand, the effective identification area including the wearer The area where the back of the hand is located and the area around the back of the hand.
  • the identification module is specifically configured to identify a motion trajectory of the gesture in the effective recognition area according to the ultrasonic wave emitted by the ultrasonic sensor.
  • in a third aspect, an embodiment of the present application provides a wearable device, including a sensor, a processor, a memory, and a display screen. The sensor is configured to collect a motion track of a gesture in the effective recognition area; the memory is configured to store instructions and data to be executed by the processor; the processor is configured to execute the instructions in the memory to identify the motion track, where the motion track is a character, and to determine, according to a correspondence table between characters and applications, the application corresponding to the character and open that application; and the display screen is configured to display the opened application.
  • an embodiment of the present invention provides a computer storage medium for storing computer software instructions for the wearable device in the first aspect, which includes a program designed to perform the above aspects.
  • in a fifth aspect, an embodiment of the present invention provides a chip system including at least one processor, a memory, an input/output part, and a bus; the at least one processor acquires instructions from the memory through the bus to implement the design functions of the wearable device involved in the method of the first aspect.
  • An embodiment of the present invention provides an interaction method and a wearable device: the motion track of a gesture in the effective recognition area of the wearable device is collected and recognized, and a stored correspondence table between characters and applications is queried with the character formed by the motion track, thereby determining the application corresponding to the character. In the method, applications of the wearable device are invoked through different gestures, so the steps of interaction between the user and the wearable device are reduced, operation complexity is lowered, and operation time is shortened.
  • the effective recognition area provides the user with a larger operation space, and the display screen of the wearable device is not blocked during the user operation.
  • FIG. 1 is a schematic structural diagram of a wearable device to which an interaction method is applied in an embodiment of the present application
  • FIG. 2 is a flowchart of Embodiment 1 of an interaction method according to an embodiment of the present application
  • FIG. 3 is a schematic diagram of a process for starting an application corresponding to a gesture in an interaction method according to an embodiment of the present disclosure
  • FIG. 4 is a flowchart of Embodiment 2 of an interaction method according to an embodiment of the present application.
  • FIG. 5 is a schematic diagram of a process of quickly finding a contact by an interaction method according to an embodiment of the present application
  • FIG. 6 is a schematic diagram of a process for quickly launching an application APP by an interaction method according to an embodiment of the present application
  • FIG. 7 is a schematic diagram of a process for quickly unlocking an interaction method according to an embodiment of the present application.
  • FIG. 8 is a schematic structural diagram of Embodiment 2 of a wearable device according to an embodiment of the present application.
  • the embodiment of the present application provides an interaction method and a wearable device, which invoke functions of the wearable device through different gestures, reduce the steps for the user to interact with the wearable device, and lower operation complexity while shortening operation time.
  • FIG. 1 is a schematic structural diagram of a wearable device to which an interaction method is applied in an embodiment of the present application.
  • the wearable device 100 to which the embodiment of the present application is applied includes: a sensor 11 , a processor 12 , a memory 13 , and a display screen 14 , wherein the sensor 11 is configured to collect a motion track of a gesture in the effective recognition area.
  • the memory 13 is configured to store instructions and data to be executed by the processor 12; the processor 12 is configured to execute the instructions in the memory 13 to identify the motion trajectory, where the motion trajectory is a character, and to determine, according to a correspondence table between characters and applications, the application corresponding to the character and open it; the display 14 is used to display the opened application.
  • the wearable device may have more or less components than those shown in FIG. 1, may combine two or more components, or may have different component configurations or settings, and each component may be Hardware, software, or a combination of hardware and software, including one or more signal processing and/or application specific integrated circuits.
  • FIG. 2 is a flowchart of Embodiment 1 of the interaction method of the present application, including:
  • a sensor capable of recognizing the trajectory of a finger or an object, such as an ultrasonic, infrared (structured light), radar, or camera sensor, is integrated in the wearable device. Herein, the trajectory of a finger or an object is referred to as a gesture.
  • the sensor in the wearable device forms a sensing area around the sensor, and the sensing area is an effective identification area.
  • a smart watch is usually worn on the user's wrist and close to the back of the hand, so the sensor is deployed on the back side of the smart watch.
  • the sensor forms a sensing area over the back of the hand and within a certain height above it.
  • the sensing area is an effective identification area.
  • FIG. 3 is a schematic diagram of a process corresponding to the application of the startup gesture in the interaction method of the present application.
  • the sensor forms an effective recognition area over the back of the user's hand and within a certain height above it; the sensor can collect the motion track of a gesture inside the effective recognition area, but cannot collect the motion track of a gesture outside the area.
  • for example, when the opening of the fist (the "fist eye") faces the user, the effective recognition area is located above the back of the hand, and the sensor can capture the motion trajectory of a gesture in the plane formed by the x-axis and z-axis; when the fist eye faces the ground, the effective recognition area is located over the back of the hand, and the sensor can collect the motion trajectory of a gesture in the plane formed by the x-axis and y-axis.
  • the motion track is recognized by the processor, and the character corresponding to the motion track is identified.
  • when the sensor collects the motion track of the gesture in the effective recognition area, the motion track can be collected in different dimensions. The following describes in detail how the sensor collects the motion trajectory of a gesture in the effective recognition area.
  • there may be at least one sensor. When there is one sensor, a one-dimensional motion trajectory can be collected; when there are at least two sensors, a two-dimensional motion trajectory can be collected; and when there are at least three sensors, a three-dimensional motion trajectory can be collected.
  • the acquisition of motion trajectories can be achieved with one sensor.
  • the sensor emits infrared or ultrasonic waves, measures the time for the waves to return after encountering an obstacle, and calculates the distance between the sensor and the measured object from that round-trip time. Continuously deriving the distance between the sensor and the measured object yields the motion trajectory.
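  As a minimal sketch (not from the patent), the time-of-flight distance calculation described above can be written as follows, assuming sound travels at roughly 343 m/s in air at room temperature; the constant and function name are illustrative assumptions:

```python
# Speed of sound in air at ~20 °C; an assumption for this illustration.
SPEED_OF_SOUND_M_PER_S = 343.0

def distance_from_echo(round_trip_time_s: float) -> float:
    """Distance from sensor to obstacle: the wave travels out and back,
    so the one-way distance is half the round-trip path."""
    return SPEED_OF_SOUND_M_PER_S * round_trip_time_s / 2.0
```

  Sampling this distance repeatedly over time gives the one-dimensional trajectory the text describes.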
  • determining positions along the X-axis and Y-axis requires at least two sensors, such as sensor A and sensor B, with a known distance L between them. The distances L11 and L21 between the sensors and the measured object are measured by sensor A and sensor B respectively at time t1, and the distances L12 and L22 are measured respectively at time t2. From these, the offset of the measured object in the X-axis and Y-axis directions on the plane can be calculated, and continuous measurement can determine the motion track of the measured object in the plane.
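  The two-sensor case above amounts to intersecting two circles. A minimal sketch, under the assumption (not fixed by the patent) that sensor A sits at the origin and sensor B at (L, 0) on the X-axis:

```python
import math

def position_2d(d_a: float, d_b: float, baseline: float) -> tuple:
    """Locate the measured object in the plane from its distances to
    sensor A at (0, 0) and sensor B at (baseline, 0).
    Solves the intersection of the two range circles; the sign of y
    is ambiguous, so the positive solution is returned."""
    x = (d_a**2 - d_b**2 + baseline**2) / (2.0 * baseline)
    y = math.sqrt(max(d_a**2 - x**2, 0.0))
    return x, y

def offset_2d(p1: tuple, p2: tuple) -> tuple:
    """Offset between two positions sampled at times t1 and t2."""
    return p2[0] - p1[0], p2[1] - p1[1]
```

  Calling `position_2d` at t1 and t2 and differencing the results gives the per-axis offsets the text refers to.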
  • determining positions along the X-axis, Y-axis, and Z-axis requires at least three sensors, such as sensor A, sensor B, and sensor C. The three sensors are not collinear, and the distance between each pair of sensors is known.
  • at time t1, the distances between sensor A, sensor B, and sensor C and the measured object are L11, L21, and L31 respectively; at time t2, the distances are L12, L22, and L32 respectively. From these, the offsets of the measured object in the X-axis, Y-axis, and Z-axis directions in the period from t1 to t2 can be calculated, and continuous measurement can determine the motion trajectory of the measured object in space.
  • for example, sensor A can be disposed at the intersection of the X/Z plane and the X/Y plane, sensor B in the X/Y plane, and sensor C in the X/Z plane. Sensor A and sensor B can measure the offset of the measured object in the X-axis and Y-axis directions in the period from t1 to t2, and sensor A and sensor C can measure the offset in the X-axis and Z-axis directions, thereby obtaining the offset of the measured object on the X-axis, the Y-axis, and the Z-axis.
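  The three-sensor case can be sketched as standard trilateration. The placement below (A at the origin, B on the X-axis, C on the Y-axis) is an assumption chosen to keep the algebra short, not the patent's specific layout:

```python
import math

def position_3d(d_a: float, d_b: float, d_c: float,
                b_x: float, c_y: float) -> tuple:
    """Locate the measured object from its distances to three sensors:
    A at (0, 0, 0), B at (b_x, 0, 0), C at (0, c_y, 0).
    Derived by subtracting pairs of sphere equations; the sign of z
    is ambiguous, so the positive solution is returned."""
    x = (d_a**2 - d_b**2 + b_x**2) / (2.0 * b_x)
    y = (d_a**2 - d_c**2 + c_y**2) / (2.0 * c_y)
    z = math.sqrt(max(d_a**2 - x**2 - y**2, 0.0))
    return x, y, z
```

  As in the two-dimensional case, sampling this position at t1 and t2 yields the per-axis offsets, and continuous sampling yields the spatial trajectory.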
  • the effective identification area is the sensing area of the sensor; it is independent of the user's body part and related only to the wearing position. Regardless of how the user's wrist rotates, the sensor forms an effective recognition area within a certain height above the hand.
  • if the sensing area of the sensor is large, having the sensor collect motion tracks across the entire sensing area and the processor identify them would increase the power consumption of the wearable device. Therefore, the effective identification area may also be only part of the sensing area, and the sensor collects motion trajectories only within that part.
  • after identifying the character corresponding to the motion track, the processor queries the stored correspondence table between characters and applications with the character, thereby determining the application corresponding to the character. Specifically, the processor processes the motion track of the finger or object through a software algorithm, recognizes the character, in the form of a letter, number, Chinese character, or the like, corresponding to the motion track, and queries the correspondence table with the character to determine the corresponding application.
  • the correspondence between the characters and the application is stored in the correspondence table, and the applications corresponding to different characters are different, and the correspondence tables in different scenarios are different.
  • the correspondence table may be dynamically updated.
  • the processor After determining the application corresponding to the character, the processor turns on the application corresponding to the character.
  • the applications include opening an APP, finding a contact, unlocking the screen, taking a screenshot, entering a specific mode, and the like.
  • the processor recognizes that the character corresponding to the motion track is the letter “A” through the software algorithm, and the query correspondence table determines that the application corresponding to the letter “A” is Alipay, and the processor turns on the Alipay.
  • the user can thus open Alipay by performing only a single step: inputting a gesture in the effective recognition area.
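  A minimal sketch of the character-to-application lookup described above. The table contents and the `resolve` helper are illustrative assumptions (only the "A" → Alipay mapping appears in the text; the one-to-many entry is invented to show the list branch):

```python
# Hypothetical correspondence table: character -> list of applications.
# A single-entry list models one-to-one; a longer list models one-to-many.
CORRESPONDENCE = {
    "A": ["Alipay"],          # one-to-one: open directly
    "Z": ["Zack", "Zoe"],     # one-to-many: show a list for secondary selection
}

def resolve(character: str) -> tuple:
    """Return the action the device should take for a recognized character."""
    apps = CORRESPONDENCE.get(character, [])
    if len(apps) == 1:
        return "launch", apps[0]       # unique mapping: start immediately
    if apps:
        return "choose", apps          # ambiguous: display the list
    return "none", None                # unmapped character: do nothing
```

  The "choose" branch corresponds to the secondary-selection flow described later, where the user picks the target from a displayed list.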
  • the interaction method provided by the embodiment of the present application collects and recognizes the motion track of a gesture in the effective recognition area of the wearable device, and queries the stored correspondence table between characters and applications with the character formed by the motion track, thereby determining the application corresponding to the character.
  • the application of the wearable device is invoked by different gestures, the steps of the user interacting with the wearable device are reduced, the operation complexity is reduced, and the operation time is reduced.
  • the effective recognition area provides the user with a larger operation space, and the display screen of the wearable device is not blocked during the user operation.
  • FIG. 4 is a flowchart of Embodiment 2 of an interaction method according to an embodiment of the present application, including:
  • the distance and amount of time related to the movement of the finger or the object in the effective recognition area are collected by the ultrasonic transceiver on the ultrasonic sensor.
  • the processor performs noise reduction on the ultrasonic data collected in 201, integrates the distances collected by each ultrasonic transceiver over time to generate the motion trajectory of the finger or object, and then recognizes the gesture according to the trajectory, that is, the character, such as a number, letter, or Chinese character.
  • the correspondence table has the following characteristics: first, the correspondence table is predefined by the system or customized by the user according to the scenario; second, the correspondence table of each scenario is independent, and the system and the user can set it dynamically as needed.
  • for example, the wearable device is a smart watch. When the smart watch is currently on the desktop interface, the character A can be set to start the Alipay application; when the smart watch is currently on the dialing interface, the character A can be set to dial a contact such as Anny.
  • the correspondence table is a one-to-one correspondence table between characters and applications; or, the correspondence table is a one-to-many correspondence table between characters and applications.
  • the application corresponding to the gesture is unique, the application is directly started.
  • the user enters the character "A" in the effective identification area, and the character "A" has only one candidate mapping, for example Anny.
  • the smart watch directly dials Anny.
  • when the correspondence table is a one-to-many correspondence table between characters and applications, at least two applications correspond to the character, and the application list corresponding to the character is displayed on the display screen.
  • the user inputs the character "A" in the effective identification area, and the character "A" has multiple candidate mappings, such as Anny and Andred; the application list is then displayed, and the user determines from the list whom to dial.
  • the user selects a target application from the application list, and starts the target application.
  • the user performs a secondary selection by using an "up, down, left, right" gesture or by inputting a further distinguishing character, thereby selecting a target application from the application list, and the wearable device starts the target application.
  • the character "A" has multiple candidate mappings, and the user selects to determine the object that needs to be dialed.
  • FIG. 5 is a schematic diagram of a process for quickly searching for a contact according to an interaction method according to an embodiment of the present disclosure.
  • FIG. 6 is a schematic diagram of a process for quickly starting an APP by using an interaction method according to an embodiment of the present application.
  • as shown in FIG. 5, when the wearable device is in standby mode, the user writes the letter "C" (or another letter, number, or gesture) above the back of the hand; the sensor collects the motion track of the gesture, the processor recognizes it as the character "C", queries the correspondence table, determines that the corresponding application is the address book, and directly opens the contacts of the wearable device. The user then performs another gesture in address-book mode: the sensor collects the motion track, and if the processor recognizes it as the character "Z", the contacts starting with "Z" are selected directly. If only one contact starts with "Z", the user is prompted to dial, or the number is dialed directly; if at least two contacts start with "Z", the contacts are first displayed on the display, and the processor selects the target contact to dial according to the user's sliding operation.
  • when displaying at least two contacts, the processor displays them according to priority, so that the contacts shown on the display are sorted; see Table 1, a priority map for quickly finding contacts. If the priority is user-defined, the order of the contacts is adjusted according to the user's setting; if the priority is the number of calls, the processor sorts the contacts by call count and displays them on the display.
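  One possible priority rule mentioned above, sorting by call count, can be sketched as follows. The names and counts are invented for illustration; the patent does not specify a data structure:

```python
def sort_by_call_count(contacts: list, call_counts: dict) -> list:
    """Order contacts most-called first, one priority rule from the text.
    Contacts absent from call_counts default to zero calls; Python's
    sort is stable, so equal counts keep their original relative order."""
    return sorted(contacts, key=lambda c: call_counts.get(c, 0), reverse=True)
```

  The same rule could order a one-to-many application list by launch count before it is shown on the display.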
  • as shown in FIG. 6, when the wearable device is in standby mode, the user writes the letter "A" (or another letter, number, or gesture) above the back of the hand; the sensor collects the trajectory of the gesture, the processor recognizes the motion track as the character "A", queries the correspondence table, determines that the application corresponding to "A" is Alipay, and directly opens Alipay.
  • when multiple APPs correspond to the character "A", for example when the wearable device has Alipay, iQiyi, and Facebook, the character "A" and the applications are in a one-to-many correspondence. In this case, the processor displays the APPs by priority, so that the APPs shown on the display screen are sorted; for details see Table 2, which shows a priority map for quickly starting an APP. When the user inputs the character "A" in the effective identification area, the processor quickly starts an APP; which APP is started is determined by the priority. For example, when the priority is user-defined and the user-defined APP is Alipay, the processor quickly opens Alipay; when the priority is usage frequency, the processor determines the most frequently used APP and quickly opens it.
  • as shown in FIG. 7, when the wearable device is in standby mode, the user writes the letter "L" (or another letter, number, or gesture) above the back of the hand; the sensor collects the motion track of the gesture, and the processor recognizes the motion track as the character "L". If the application corresponding to the character "L" is the unlocking operation, the unlocking operation is performed on the wearable device.
  • FIG. 8 is a schematic structural diagram of Embodiment 2 of a wearable device according to an embodiment of the present invention.
  • the wearable device provided in this embodiment can implement various steps of the method applied to the wearable device provided by any embodiment of the present invention.
  • the wearable device 200 provided in this embodiment includes:
  • the identification module 21 is configured to identify a motion track of the gesture in the effective recognition area, wherein the motion track is a character; the valid recognition area is a sensing area of the sensor in the wearable device;
  • the processing module 22 is configured to determine an application corresponding to the character according to a correspondence table between the character and the application;
  • the startup module 23 is configured to start an application corresponding to the character.
  • the wearable device collects and recognizes the motion track of a gesture in the effective recognition area, and queries the stored correspondence table between characters and applications with the character formed by the motion track, thereby determining the application corresponding to the character.
  • the application of the wearable device is invoked by different gestures, the steps of the user interacting with the wearable device are reduced, the operation complexity is reduced, and the operation time is reduced.
  • the effective recognition area provides the user with a larger operation space, and the display screen of the wearable device is not blocked during the user operation.
  • the correspondence relationship table is a one-to-one or one-to-many correspondence table between characters and applications.
  • the startup module 23 is specifically configured to display an application list corresponding to the character, select a target application from the application list, and start the target application.
  • the startup module 23 when the application list corresponding to the character is displayed, the startup module 23 is specifically configured to display an application list corresponding to the character according to a priority.
  • the wearable device is a wrist wearable device
  • the sensor is integrated at a position of the wrist wearable device near the back side of the wearer's hand, and the effective identification area includes the area where the back of the wearer's hand is located and the area surrounding the back of the hand.
  • the identification module 21 is specifically configured to identify a motion trajectory of the gesture in the effective recognition area according to the ultrasonic wave emitted by the ultrasonic sensor.
  • the interaction method and the wearable device may be implemented in other manners.
  • the device embodiments described above are merely illustrative.
  • the division of units is only a logical functional division; in actual implementation there may be another division manner. For example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not executed.
  • the mutual coupling or direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection through some interface, device or unit, and may be in an electrical, mechanical or other form.
  • the units described as separate components may or may not be physically separated, and the components displayed as units may or may not be physical units, that is, may be located in one place, or may be distributed to multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of the embodiment.
  • each functional unit in each embodiment of the present invention may be integrated into one processing unit, or each unit may exist physically separately, or two or more units may be integrated into one unit.
  • the above integrated unit can be implemented in the form of hardware or in the form of a software functional unit.

Abstract

An interaction method and a wearable device (100): the motion track of a gesture within the effective recognition area of the wearable device (100) is collected and recognized, and a stored correspondence table between characters and applications is queried with the character formed by the motion track, thereby determining the application corresponding to the character. In this method, applications of the wearable device (100) are invoked through different gestures, reducing the steps of interaction between the user and the wearable device (100) and lowering operation complexity while shortening operation time. Moreover, the effective recognition area provides the user with a larger operation space, and the display screen (14) of the wearable device (100) is not blocked during operation.

Description

Interaction Method and Wearable Device

Technical Field
The embodiments of the present application relate to wearable device technologies, and in particular to an interaction method and a wearable device.
Background
At present, wearable devices provide more and more functions, and the user can invoke different applications of the wearable device as needed, such as opening different applications (Application, APP), finding a specified contact, unlocking the screen, and taking screenshots.
To invoke an application of a wearable device, the user usually needs to perform several steps on the operation interface. For example, when the user needs to open an APP located in the first-level interface, the following actions are performed in sequence: wake up the wearable device → open the application interface to show selectable APPs → slide the menu → select the APP. As another example, when the APP to be opened is located in a second-level interface, the following actions are performed in sequence: wake up the wearable device → switch to the application interface → open the application interface to show selectable APPs → slide the menu → select the APP.
The operation interface of a wearable device is usually limited. For example, the dial size of a smart watch is mostly designed with reference to a conventional watch, and the dial diameter is relatively small, usually around 45 mm. In the above interaction mode, the user needs to perform a series of operations on a limited operation interface; the process is tedious, complex, and time-consuming.
Summary
The embodiments of the present application provide an interaction method and a wearable device that invoke functions of the wearable device through different gestures, reducing the steps of interaction between the user and the wearable device and lowering operation complexity while shortening operation time.
In a first aspect, an embodiment of the present application provides an interaction method, described from the perspective of the wearable device. In this method, the wearable device collects, by using a sensor, the motion track of a user's gesture within an effective recognition area, and the processor queries a stored correspondence table between characters and applications with the character formed by the motion track, thereby determining the application corresponding to the character and opening that application.
In the above method, applications of the wearable device are invoked through different gestures, reducing the steps of interaction between the user and the wearable device and lowering operation complexity while shortening operation time. Moreover, the effective recognition area provides the user with a larger operation space, and the display screen of the wearable device is not blocked during operation.
In one feasible design, the correspondence table is a one-to-one or one-to-many correspondence table between characters and applications. When the table is one-to-one, the wearable device directly opens the unique application after determining the application corresponding to the character; when the table is one-to-many, the wearable device displays an application list corresponding to the character, and selects and starts a target application from the list according to a user operation.
In the above method, the correspondence between characters and applications can be set flexibly.
In one feasible design, when the correspondence table is a one-to-many correspondence table between characters and applications, the wearable device determines the priority of each application corresponding to the character and displays the application list corresponding to the character on the display screen according to the priority.
In the above method, the order of the applications shown on the display screen can be set flexibly, which facilitates the user's selection.
In one feasible design, the wearable device is a wrist-worn wearable device, the sensor is integrated at a portion of the wrist-worn wearable device near the back of the wearer's hand, and the effective recognition area includes the area where the back of the hand is located and the area surrounding it.
In the above method, deploying the sensor at a portion of the wrist-worn device near the back of the wearer's hand brings the effective recognition area close to the user and makes it convenient to operate.
In one feasible design, the sensor is an ultrasonic sensor, and the wearable device recognizes the motion track of a gesture in the effective recognition area according to the ultrasonic waves emitted by the ultrasonic sensor.
In a second aspect, an embodiment of the present application provides a wearable device, including:

a recognition module, configured to recognize the motion trajectory of a gesture within an effective recognition zone, where the motion trajectory is a character and the effective recognition zone is the sensing area of a sensor in the wearable device;

a processing module, configured to determine, according to a character-to-application mapping table, the application corresponding to the character; and

a launch module, configured to launch the application corresponding to the character.

In one feasible design, the mapping table is a one-to-one or one-to-many mapping table between characters and applications.

In one feasible design, the launch module is specifically configured to display a list of applications corresponding to the character, select a target application from the list, and launch the target application.

In one feasible design, when displaying the list of applications corresponding to the character, the launch module is specifically configured to display the list according to priorities.

In one feasible design, the wearable device is a wrist-worn wearable device, the sensor is integrated in the part of the wrist-worn wearable device close to the back of the wearer's hand, and the effective recognition zone includes the area of the back of the wearer's hand and the area around it.

In one feasible design, when the sensor is an ultrasonic sensor, the recognition module is specifically configured to recognize the motion trajectory of the gesture within the effective recognition zone based on the ultrasonic waves emitted by the ultrasonic sensor.

In a third aspect, an embodiment of the present application provides a wearable device including a sensor, a processor, a memory, and a display, where the sensor is configured to capture the motion trajectory of a gesture within an effective recognition zone; the memory is configured to store the instructions and data to be executed by the processor; the processor is configured to execute the instructions in the memory to recognize the motion trajectory, the motion trajectory being a character, and to determine, according to a character-to-application mapping table, the application corresponding to the character and launch the application; and the display is configured to show the launched application.

In a fourth aspect, an embodiment of the present invention provides a computer storage medium for storing the computer software instructions used by the wearable device of the first aspect, including a program designed to perform the above aspects.

In a fifth aspect, an embodiment of the present invention provides a chip system including at least one processor, a memory, an input/output part, and a bus; the at least one processor fetches instructions from the memory over the bus to implement the design functions of the wearable device involved in the method of the first aspect.

Embodiments of the present application provide an interaction method and a wearable device: the motion trajectory of a gesture within the effective recognition zone of the wearable device is captured and recognized, and a stored character-to-application mapping table is queried with the character formed by the trajectory to determine the corresponding application. In this method, applications of the wearable device are invoked with different gestures, reducing the number of interaction steps, lowering operational complexity, and shortening operation time. In addition, the effective recognition zone gives the user a larger operating space, and the user does not block the display of the wearable device during operation.
Brief description of the drawings

FIG. 1 is a schematic structural diagram of a wearable device to which the interaction method of an embodiment of the present application applies;

FIG. 2 is a flowchart of Embodiment 1 of the interaction method of an embodiment of the present application;

FIG. 3 is a schematic diagram of the process of launching the application corresponding to a gesture in the interaction method of an embodiment of the present application;

FIG. 4 is a flowchart of Embodiment 2 of the interaction method of an embodiment of the present application;

FIG. 5 is a schematic diagram of quickly finding a contact with the interaction method of an embodiment of the present application;

FIG. 6 is a schematic diagram of quickly launching an APP with the interaction method of an embodiment of the present application;

FIG. 7 is a schematic diagram of quick unlocking with the interaction method of an embodiment of the present application;

FIG. 8 is a schematic structural diagram of Embodiment 2 of the wearable device of an embodiment of the present application.
Detailed description

The terms "first", "second", "third", "fourth", and so on (if any) in the specification, claims, and drawings of the embodiments of the present application are used to distinguish similar objects and are not necessarily used to describe a particular order or sequence. It should be understood that data used in this way are interchangeable where appropriate, so that the embodiments of the present application described herein can also be implemented in ways other than those illustrated or described here. Moreover, the terms "include" and "have" and any variants thereof are intended to cover non-exclusive inclusion; for example, a process, method, system, product, or device that includes a series of steps or units is not necessarily limited to the steps or units explicitly listed, but may include other steps or units that are not explicitly listed or that are inherent to the process, method, product, or device.

At present, if a user needs to open different applications of a wearable device, the user has to perform several steps on the operation interface. For example, to open an APP located on a first-level interface, the user performs the following actions in sequence: wake the wearable device → open the application interface to show the selectable APPs → swipe through the menu → select the APP — four operations in total. Likewise, when the APP to be opened is on a second-level interface, the sequence becomes: wake the wearable device → switch to the application interface → open the application interface to show the selectable APPs → swipe through the menu → select the APP — five operations in total. However, the operation interface of a wearable device is limited, and in the above interaction mode the user has to perform a series of actions on this limited interface; the process is cumbersome, complex, and time-consuming.
In view of this, embodiments of the present application provide an interaction method and a wearable device that invoke functions of the wearable device with different gestures, reducing the number of steps in the user's interaction with the wearable device, lowering operational complexity, and shortening operation time.

FIG. 1 is a schematic structural diagram of a wearable device to which the interaction method of an embodiment of the present application applies. Referring to FIG. 1, the wearable device 100 includes a sensor 11, a processor 12, a memory 13, and a display 14. The sensor 11 is configured to capture the motion trajectory of a gesture within the effective recognition zone; the memory 13 is configured to store the instructions and data to be executed by the processor 12; the processor 12 is configured to execute the instructions in the memory 13 to recognize the trajectory, which is specifically a character, to determine the application corresponding to the character according to a character-to-application mapping table, and to launch that application; the display 14 is configured to show the launched application.

The wearable device involved in the embodiments of the present application may have more or fewer components than shown in FIG. 1, may combine two or more components, or may have a different component configuration or arrangement; each component may be implemented in hardware, software, or a combination of hardware and software including one or more signal-processing and/or application-specific integrated circuits.

The interaction method of the embodiments of the present application is described in detail below on the basis of FIG. 1. Specifically, refer to FIG. 2, a flowchart of Embodiment 1 of the interaction method of the present application, which includes:
101. Recognize the motion trajectory of a gesture within the effective recognition zone, where the motion trajectory is a character; the effective recognition zone is the sensing area of the sensor in the wearable device.

In the embodiments of the present application, the wearable device integrates a sensor capable of recognizing the trajectory of a finger or an object, such as an ultrasonic sensor, an infrared (structured-light) sensor, a radar, or a camera. The trajectory along which the finger or object (for example, a pen) moves constitutes the gesture. When the user wears the device, the sensor forms a sensing area around itself, and this sensing area is the effective recognition zone. Taking a smart watch as an example, the watch is usually worn on the wrist facing the back of the hand, so the sensor is deployed on the side of the watch near the back of the hand. The sensor then forms a sensing area over the back of the hand up to a certain height, and this sensing area is the effective recognition zone. Specifically, see FIG. 3, a schematic diagram of the process of launching the application corresponding to a gesture in the interaction method of the present application.

Referring to FIG. 3, when the smart watch is worn on the user's wrist facing the back of the hand, no matter how the wrist rotates, the sensor forms an effective recognition zone on the back of the hand and up to a certain height above it. The sensor can capture gesture trajectories inside the effective recognition zone but not outside it. For example, when the fist faces the user, the effective recognition zone is above the back of the hand, and the sensor captures trajectories in the plane formed by the x axis and the z axis; when the fist faces the ground, the effective recognition zone lies on the back of the hand, and the sensor captures trajectories in the plane formed by the x axis and the y axis. After the sensor captures a trajectory, the processor recognizes it and identifies the character it represents. When capturing trajectories within the effective recognition zone, the sensor can capture them in different numbers of dimensions, as described in detail below.

Specifically, in the embodiments of the present application there is at least one sensor. With at least one sensor, one-dimensional trajectories can be captured; with at least two sensors, two-dimensional trajectories can be captured; with at least three sensors, three-dimensional trajectories can be captured.

For the one-dimensional linear case, a single sensor suffices to capture the trajectory. During capture, the sensor emits infrared light or ultrasonic waves and measures the time for the signal to return after hitting an obstacle, and the distance between the sensor and the measured object is calculated from this round-trip time. By recording the distances at two different times t1 and t2, the distance the object moves along the straight line during the period t1–t2 can be derived, yielding the motion trajectory.
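As a hedged illustration (not part of the patent text), the one-dimensional round-trip calculation can be sketched as follows; the speed of sound and the sample times are assumed values for the example:

```python
SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 °C; an assumed constant

def echo_distance(round_trip_s: float) -> float:
    """Sensor-to-object distance from an ultrasonic round-trip time.
    The pulse travels out and back, so the one-way path is half."""
    return SPEED_OF_SOUND * round_trip_s / 2.0

def linear_displacement(echo_t1_s: float, echo_t2_s: float) -> float:
    """Straight-line movement between the samples taken at t1 and t2
    (positive means the object moved away from the sensor)."""
    return echo_distance(echo_t2_s) - echo_distance(echo_t1_s)

# Example: echoes of 1 ms and 2 ms -> the object moved about 17.15 cm away.
moved = linear_displacement(0.001, 0.002)
```

An echo of 1 ms corresponds to about 17.15 cm; the difference between two such readings is the one-dimensional displacement the text describes.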
For the two-dimensional planar case, with the X and Y axes determined, at least two sensors are needed, for example sensor A and sensor B, with a known distance L between them. When measuring the object, sensors A and B measure distances L11 and L21 to the object at time t1, and distances L12 and L22 at time t2. From L, L11, L21, L12, and L22, the object's offsets along the X and Y axes of the plane can then be calculated, and continuous measurement determines the object's trajectory in the plane.
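This is ordinary two-circle trilateration. The sketch below is an illustration under stated assumptions, not the patent's algorithm: it places sensor A at the origin and sensor B at (L, 0), and resolves the two-intersection ambiguity by assuming the object lies on the positive-Y side:

```python
import math

def locate_2d(L, dA, dB):
    """Object position from its distances to A=(0, 0) and B=(L, 0).
    Subtracting the two circle equations isolates x; Pythagoras gives y."""
    x = (dA ** 2 - dB ** 2 + L ** 2) / (2 * L)
    y = math.sqrt(max(dA ** 2 - x ** 2, 0.0))  # positive root by assumption
    return x, y

def offset_2d(L, dA1, dB1, dA2, dB2):
    """X/Y offsets between the measurement at t1 (L11, L21)
    and the measurement at t2 (L12, L22)."""
    x1, y1 = locate_2d(L, dA1, dB1)
    x2, y2 = locate_2d(L, dA2, dB2)
    return x2 - x1, y2 - y1
```

Chaining `offset_2d` over consecutive measurement pairs reproduces the "continuous measurement determines the trajectory" step.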
For the three-dimensional case, with the X, Y, and Z axes determined, at least three sensors are needed, for example sensors A, B, and C; the three sensors are not collinear, and the distance between each pair of sensors is known. At time t1, the distances from sensors A, B, and C to the object are L11, L21, and L31, respectively; at time t2 they are L12, L22, and L32. From L11, L21, L31, L12, L22, and L32 and the fixed pairwise sensor distances, the object's offsets along the X, Y, and Z axes during the period t1–t2 can be calculated, and continuous measurement determines the object's trajectory in space.

It should be noted that, for the three-dimensional case, to simplify the calculation, sensor A may be placed at the intersection of the X/Z plane and the X/Y plane, sensor B in the X/Y plane, and sensor C in the X/Z plane. Then sensors A and B measure the object's offsets along the X and Y axes during the period t1 to t2, and sensors A and C measure its offsets along the X and Z axes, yielding the object's offsets along the X, Y, and Z axes.
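The same difference-of-squares trick extends to three sensors. The placement below is one concrete choice assumed purely for the sketch (A at the origin, B on the Y axis, C on the Z axis); it is in the spirit of, but not identical to, the simplified placement described above:

```python
import math

def locate_3d(Lb, Lc, dA, dB, dC):
    """Trilateration with sensors at A=(0,0,0), B=(0,Lb,0), C=(0,0,Lc).
    Each difference of squared distances isolates one coordinate;
    the remaining one follows from Pythagoras (object assumed on the
    positive-X side of the sensor plane)."""
    y = (dA ** 2 - dB ** 2 + Lb ** 2) / (2 * Lb)
    z = (dA ** 2 - dC ** 2 + Lc ** 2) / (2 * Lc)
    x = math.sqrt(max(dA ** 2 - y ** 2 - z ** 2, 0.0))
    return x, y, z
```

Evaluating `locate_3d` at t1 and t2 and differencing the results gives the X/Y/Z offsets over the interval, as in the two-dimensional case.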
It should be noted that, in the embodiments of the present application, the effective recognition zone is the sensing area of the sensor; it is unrelated to the user's body part and depends only on the wearing position. For example, when the wearable device is worn on the wrist facing the palm and the sensor is integrated on the side of the device near the palm, the sensor forms an effective recognition zone on the palm and up to a certain height above it no matter how the wrist rotates; when the device is worn on the wrist facing the back of the hand and the sensor is integrated on the side near the back of the hand, the sensor likewise forms an effective recognition zone on the back of the hand and up to a certain height above it no matter how the wrist rotates.

It should also be noted that, since the sensing area of the sensor is relatively large, capturing trajectories over the whole sensing area and having the processor recognize them would increase the power consumption of the wearable device. To avoid this, in the embodiments of the present application the effective recognition zone may also be only a part of the sensing area, with the sensor capturing trajectories within that part alone.

102. Determine, according to the character-to-application mapping table, the application corresponding to the character.

After the character corresponding to the trajectory is recognized, the processor queries the stored character-to-application mapping table with the character to determine the corresponding application. Specifically, the processor processes the trajectory of the finger or object with a software algorithm, recognizes the character it represents — a letter, a digit, a Chinese character, or the like — and looks the character up in the mapping table to determine the corresponding application. The mapping table stores the correspondence between characters and applications; different characters correspond to different applications, and different scenarios use different mapping tables. In addition, the mapping table can be updated dynamically when a correspondence needs to be added or removed.
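A minimal sketch of such a per-scenario table in Python — the scenario names, characters, and bindings below are invented for illustration:

```python
# One independent character -> application table per scenario,
# updatable at runtime as described above.
mapping_tables = {
    "home":   {"A": "open Alipay", "C": "open Contacts"},
    "dialer": {"A": "dial Anny"},
}

def lookup(scenario, char):
    """Application bound to `char` in the current scenario, or None."""
    return mapping_tables.get(scenario, {}).get(char)

def bind(scenario, char, app):
    """Dynamically add or overwrite a correspondence."""
    mapping_tables.setdefault(scenario, {})[char] = app

def unbind(scenario, char):
    """Dynamically remove a correspondence; missing entries are ignored."""
    mapping_tables.get(scenario, {}).pop(char, None)
```

The same character can therefore trigger different applications depending on the active scenario, matching the per-scenario tables described in the text.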
103. Launch the application corresponding to the character.

After determining the application corresponding to the character, the processor launches it. Here, "application" covers operations such as opening an APP, finding a specified contact, unlocking the screen saver, taking a screenshot, and entering a specific mode. Continuing with FIG. 3, the processor recognizes through a software algorithm that the character corresponding to the trajectory is the letter "A", queries the mapping table, and determines that the application corresponding to "A" is opening Alipay, so the processor opens Alipay. As FIG. 3 shows, the user performs only one step — entering the gesture in the effective recognition zone — to open Alipay.

In the interaction method provided by the embodiments of the present application, the motion trajectory of a gesture within the effective recognition zone of the wearable device is captured and recognized, and the stored character-to-application mapping table is queried with the character formed by the trajectory to determine the corresponding application. In this method, applications of the wearable device are invoked with different gestures, reducing the number of interaction steps, lowering operational complexity, and shortening operation time. In addition, the effective recognition zone gives the user a larger operating space, and the user does not block the display of the wearable device during operation.
The interaction method is described in detail below taking a wearable device equipped with an ultrasonic sensor as an example. Specifically, see FIG. 4, a flowchart of Embodiment 2 of the interaction method of an embodiment of the present application, which includes:

201. Ultrasonic data collection.

In this step, the ultrasonic transceivers on the ultrasonic sensor collect the distance and time quantities related to the movement of a finger or object within the effective recognition zone. For details, see the description of step 101 above, which is not repeated here.

202. Ultrasonic recognition processing.

In this step, the processor denoises the ultrasonic data collected in 201, integrates the distance and time quantities collected by the individual ultrasonic transceivers to generate the trajectory of the finger or object, and then recognizes the gesture — that is, the character, such as a digit, a letter, or a Chinese character — from the trajectory.
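The patent does not name the denoising algorithm, so the following is only an illustrative sketch: a trailing moving average over one transceiver's raw distance readings, followed by per-interval displacements that would then feed trajectory generation:

```python
def moving_average(samples, window=3):
    """Simple trailing-window denoising of raw distance readings."""
    out = []
    for i in range(len(samples)):
        lo = max(0, i - window + 1)
        out.append(sum(samples[lo:i + 1]) / (i + 1 - lo))
    return out

def displacements(samples):
    """Per-interval movement between consecutive (denoised) readings;
    one such series per transceiver is what gets integrated into a path."""
    return [b - a for a, b in zip(samples, samples[1:])]
```

Any real implementation would likely use a proper filter tuned to the transceivers' noise profile; the moving average only stands in for that step.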
203. Determine, according to the character-to-application mapping table, the application corresponding to the character.

In the embodiments of the present application, the mapping table has the following features. First, it is predefined by the system, or customized by the user for a given scenario. Second, the mapping tables of different scenarios are independent, and the system and the user can set them dynamically as needed; taking a smart watch as an example, when the watch is currently on the home screen, the character "A" may be set to launch Alipay, and when the watch is currently on the dialing screen, "A" may be set to dial Anny, and so on. Third, the mapping table may be a one-to-one mapping table between characters and applications, or a one-to-many mapping table.

204. Determine whether the application corresponding to the character is unique; if it is, execute 205; if it is not, execute 206.

205. Quickly launch the application.

In this step, for a specific scenario, if the application corresponding to the gesture is unique, it is launched directly. For example, in the dialing scenario, the user enters the character "A" in the effective recognition zone, and "A" has only one candidate mapping, such as Anny; the smart watch then dials Anny directly.

206. Display the list of applications corresponding to the character.

In this step, the mapping table is a one-to-many mapping table between characters and applications. For a specific scenario, if at least two applications correspond to the character, the list of applications corresponding to the character is shown on the display. For example, in the dialing scenario, the user enters the character "A" in the effective recognition zone, and "A" has multiple candidate mappings, such as Anny and Andred; the application list is then displayed, and the user decides from the list whom to call.

207. The user selects the target application from the application list, and the target application is launched.

In this step, the user makes a second selection input, for example with an "up/down/left/right" gesture or by entering the first differing character, thereby selecting the target application from the application list, and the wearable device launches the target application. For example, in the dialing scenario, when the character "A" has multiple candidate mappings, the user's selection determines whom to dial.
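The "first differing character" refinement can be sketched as a filter over the candidate list; the contact names below are invented examples:

```python
def candidates(entries, first_char):
    """All entries starting with the recognized character."""
    return [e for e in entries if e.upper().startswith(first_char.upper())]

def first_diff_index(cands):
    """Position of the first character at which the candidates differ,
    or None if they never differ within their common length."""
    for i in range(min(len(c) for c in cands)):
        if len({c[i].upper() for c in cands}) > 1:
            return i
    return None

def refine(cands, ch):
    """Second gesture input: keep the candidates whose character at the
    first differing position matches the newly entered character."""
    i = first_diff_index(cands)
    if i is None:
        return cands
    return [c for c in cands if c[i].upper() == ch.upper()]
```

With "Anny" and "Andred" on screen, the first differing position is the third letter, so a second gesture of "d" or "n" is enough to pick one.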
The interaction method is described in detail below with several specific embodiments. Specifically, see FIG. 5, FIG. 6, and FIG. 7: FIG. 5 is a schematic diagram of quickly finding a contact, FIG. 6 of quickly launching an APP, and FIG. 7 of quick unlocking, each using the interaction method of an embodiment of the present application.

Referring to FIG. 5, with the wearable device in standby mode, the user writes the letter "C" on the back of the hand (it may also be another letter, a digit, or another gesture). The sensor captures the trajectory of the gesture, the processor recognizes it as the letter "C", queries the mapping table, and determines that the application corresponding to "C" is the contacts list, so the contacts of the wearable device are opened directly. In contacts mode the user makes another gesture input, such as "Z"; the sensor captures the trajectory, the processor recognizes it as the character "Z", and the mode of contacts whose names begin with "Z" is selected directly. If there is only one contact beginning with "Z", the user is prompted to dial, or the number is dialed directly; if there are at least two, they are first shown on the display, and the processor selects the target contact according to the user's swipe or other operation and then dials.

In the above quick-contact-lookup process, when there are at least two contacts, the processor displays them according to priority, so that the contacts shown on the display are ordered. Specifically, see Table 1, a schematic table of the priorities of specific contacts during quick contact lookup.

Table 1
(Table 1 appears as an image in the original publication; it lists the priority options for contacts, such as a user-defined order and the number of calls.)
Referring to Table 1, when there are at least two contacts, if the priority is user-defined, the order of the contacts is adjusted according to the user's swipe or other operation; if the priority is the highest call count, the processor sorts the contacts by call count and shows them on the display.
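The call-count option reduces to an ordinary descending sort. The contact names and counts below are made up for the example:

```python
call_counts = {"Zhang San": 12, "Zhao Liu": 3, "Zhou Qi": 27}  # invented data

def order_by_calls(matching_contacts):
    """Contacts matching the gesture, most frequently called first;
    contacts with no call history sort last with an implicit count of 0."""
    return sorted(matching_contacts,
                  key=lambda name: call_counts.get(name, 0),
                  reverse=True)
```

The user-defined option would instead sort by an explicit rank stored per contact, but the mechanism is the same `sorted` call with a different key.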
Referring to FIG. 6, with the wearable device in standby mode, the user writes the letter "A" on the back of the hand (it may also be another letter, a digit, or another gesture). The sensor captures the trajectory of the gesture, the processor recognizes it as the character "A", queries the mapping table, and determines that the application corresponding to "A" is Alipay, so the processor opens Alipay directly.

When multiple APPs correspond to the character "A" — for example, the wearable device has Alipay, iQIYI, and Alibaba, i.e. the character "A" maps to applications one-to-many — the processor displays these APPs according to priority, so that the APPs shown on the display are ordered. Specifically, see Table 2, a schematic table of APP priorities for quick APP launch.
Table 2

Priority     User-defined
Priority 1   Number of times the application is used
Priority 2   Importance: e.g., whether property or finance is involved
As FIG. 6 and Table 2 show, when the user enters the character "A" in the effective recognition zone, the processor quickly launches an APP. Which APP is launched is determined by the priority: for example, when the priority is user-defined and the user-defined APP is Alipay, the processor quickly invokes Alipay; when the priority is the most-used APP, the processor determines which APP has been used the most and quickly invokes it.

Referring to FIG. 7, with the wearable device in standby mode, the user writes the letter "L" on the back of the hand (it may also be another letter, a digit, or another gesture). The sensor captures the trajectory of the gesture, the processor recognizes it as the character "L", queries the mapping table, and determines that the application corresponding to "L" is the unlock operation, so the wearable device is unlocked directly.

This method keeps the unlocking process private: no operation traces are exposed during the process, which protects the unlock gesture to a great extent.
FIG. 8 is a schematic structural diagram of Embodiment 2 of the wearable device of an embodiment of the present invention. The wearable device provided by this embodiment can implement the steps of the method applied to a wearable device provided by any embodiment of the present invention. Specifically, the wearable device 200 provided by this embodiment includes:

a recognition module 21, configured to recognize the motion trajectory of a gesture within an effective recognition zone, where the motion trajectory is a character and the effective recognition zone is the sensing area of a sensor in the wearable device;

a processing module 22, configured to determine, according to a character-to-application mapping table, the application corresponding to the character; and

a launch module 23, configured to launch the application corresponding to the character.

The wearable device provided by the embodiments of the present application captures and recognizes the motion trajectory of a gesture within its effective recognition zone and queries the stored character-to-application mapping table with the character formed by the trajectory to determine the corresponding application. In this way, applications of the wearable device are invoked with different gestures, reducing the number of interaction steps, lowering operational complexity, and shortening operation time. In addition, the effective recognition zone gives the user a larger operating space, and the user does not block the display of the wearable device during operation.

Optionally, in an embodiment of the present application, the mapping table is a one-to-one or one-to-many mapping table between characters and applications.

Optionally, in an embodiment of the present application, the launch module 23 is specifically configured to display the list of applications corresponding to the character, select the target application from the list, and launch the target application.

Optionally, in an embodiment of the present application, when displaying the list of applications corresponding to the character, the launch module 23 is specifically configured to display the list according to priorities.

Optionally, in an embodiment of the present application, the wearable device is a wrist-worn wearable device, the sensor is integrated in the part of the wrist-worn wearable device close to the back of the wearer's hand, and the effective recognition zone includes the area of the back of the wearer's hand and the area around it.

Optionally, in an embodiment of the present application, when the sensor is an ultrasonic sensor, the recognition module 21 is specifically configured to recognize the motion trajectory of the gesture within the effective recognition zone based on the ultrasonic waves emitted by the ultrasonic sensor.
Those skilled in the art can clearly understand that, for convenience and brevity of description, for the specific working processes of the systems, devices, and units described above, reference may be made to the corresponding processes in the foregoing method embodiments, which are not repeated here.

In the several embodiments provided in this application, it should be understood that the interaction method and the wearable device may be implemented in other ways. For example, the device embodiments described above are merely illustrative; for instance, the division into units is only a division by logical function, and there may be other division manners in actual implementation: multiple units or components may be combined or integrated into another system, or some features may be ignored or not executed. Furthermore, the mutual coupling, direct coupling, or communication connection shown or discussed may be an indirect coupling or communication connection through some interfaces, devices, or units, and may be electrical, mechanical, or in other forms.

The units described as separate components may or may not be physically separated, and the components shown as units may or may not be physical units; that is, they may be located in one place or distributed across multiple network units. Some or all of the units may be selected according to actual needs to achieve the purposes of the solutions of the embodiments.

In addition, the functional units in the embodiments of the present invention may be integrated into one processing unit, each unit may exist physically on its own, or two or more units may be integrated into one unit. The integrated unit may be implemented in the form of hardware or in the form of a software functional unit.

Those of ordinary skill in the art can understand that all or some of the steps of the above method embodiments may be completed by hardware related to program instructions. The aforementioned program may be stored in a processor-readable storage medium; when executed, the program performs the steps of the above method embodiments. The aforementioned storage medium includes various media that can store program code, such as a ROM, a RAM, a magnetic disk, or an optical disc.

Finally, it should be noted that the above embodiments are merely intended to illustrate the technical solutions of the present application, not to limit them. Although the present application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that they may still modify the technical solutions described in the foregoing embodiments, or make equivalent replacements of some or all of the technical features therein; and such modifications or replacements do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present application.

Claims (13)

  1. An interaction method, applied to a wearable device, the method comprising:
    recognizing the motion trajectory of a gesture within an effective recognition zone, wherein the motion trajectory is a character, and the effective recognition zone is the sensing area of a sensor in the wearable device;
    determining, according to a character-to-application mapping table, the application corresponding to the character; and
    launching the application corresponding to the character.
  2. The method according to claim 1, wherein
    the mapping table is a one-to-one or one-to-many mapping table between characters and applications.
  3. The method according to claim 2, wherein launching the application corresponding to the character comprises:
    displaying a list of applications corresponding to the character;
    selecting a target application from the application list; and
    launching the target application.
  4. The method according to claim 3, wherein displaying the list of applications corresponding to the character comprises:
    displaying the list of applications corresponding to the character according to priorities.
  5. The method according to any one of claims 1 to 4, wherein the wearable device is a wrist-worn wearable device, the sensor is integrated in the part of the wrist-worn wearable device close to the back of the wearer's hand, and the effective recognition zone includes the area of the back of the wearer's hand and the area around the back of the hand.
  6. The method according to any one of claims 1 to 5, wherein the sensor is an ultrasonic sensor, and recognizing the motion trajectory of a gesture within the effective recognition zone comprises:
    recognizing the motion trajectory of the gesture within the effective recognition zone based on the ultrasonic waves emitted by the ultrasonic sensor.
  7. A wearable device, comprising:
    a recognition module, configured to recognize the motion trajectory of a gesture within an effective recognition zone, wherein the motion trajectory is a character, and the effective recognition zone is the sensing area of a sensor in the wearable device;
    a processing module, configured to determine, according to a character-to-application mapping table, the application corresponding to the character; and
    a launch module, configured to launch the application corresponding to the character.
  8. The device according to claim 7, wherein
    the mapping table is a one-to-one or one-to-many mapping table between characters and applications.
  9. The device according to claim 8, wherein
    the launch module is specifically configured to display a list of applications corresponding to the character, select a target application from the application list, and launch the target application.
  10. The device according to claim 9, wherein
    when displaying the list of applications corresponding to the character, the launch module is specifically configured to display the list of applications corresponding to the character according to priorities.
  11. The device according to any one of claims 7 to 10, wherein the wearable device is a wrist-worn wearable device, the sensor is integrated in the part of the wrist-worn wearable device close to the back of the wearer's hand, and the effective recognition zone includes the area of the back of the wearer's hand and the area around the back of the hand.
  12. The device according to any one of claims 7 to 11, wherein
    when the sensor is an ultrasonic sensor, the recognition module is specifically configured to recognize the motion trajectory of the gesture within the effective recognition zone based on the ultrasonic waves emitted by the ultrasonic sensor.
  13. A wearable device, comprising a sensor, a processor, a memory, and a display, wherein the sensor is configured to capture the motion trajectory of a gesture within an effective recognition zone; the memory is configured to store the instructions and data to be executed by the processor; the processor is configured to execute the instructions in the memory to recognize the motion trajectory, the motion trajectory being a character, and to determine, according to a character-to-application mapping table, the application corresponding to the character and launch the application; and the display is configured to show the launched application.
PCT/CN2016/110873 2016-09-26 2016-12-19 Interaction method and wearable device WO2018053956A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201680056170.9A CN108139798A (zh) 2016-09-26 2016-12-19 Interaction method and wearable device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201610853257 2016-09-26
CN201610853257.0 2016-09-26

Publications (1)

Publication Number Publication Date
WO2018053956A1 true WO2018053956A1 (zh) 2018-03-29

Family

ID=61689813

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2016/110873 WO2018053956A1 (zh) 2016-09-26 2016-12-19 交互方法及可穿戴设备

Country Status (2)

Country Link
CN (1) CN108139798A (zh)
WO (1) WO2018053956A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110244843A (zh) * 2019-06-03 2019-09-17 Nubia Technology Co., Ltd. Wearable device control method, wearable device, and computer-readable storage medium

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117055738B (zh) * 2023-10-11 2024-01-19 Hubei Xingji Meizu Group Co., Ltd. Gesture recognition method, wearable device, and storage medium

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103558920A (zh) * 2013-11-15 2014-02-05 Shenzhen ZTE Mobile Telecom Co., Ltd. Method and device for processing contactless gestures
US8830181B1 (en) * 2008-06-01 2014-09-09 Cypress Semiconductor Corporation Gesture recognition system for a touch-sensing surface
WO2015176228A1 (zh) * 2014-05-20 2015-11-26 Huawei Technologies Co., Ltd. Method for operating a smart wearable device by gestures, and smart wearable device
CN105389003A (zh) * 2015-10-15 2016-03-09 Guangdong OPPO Mobile Telecommunications Corp., Ltd. Control method and device for applications of a mobile terminal
US9372535B2 (en) * 2013-09-06 2016-06-21 Thalmic Labs Inc. Systems, articles, and methods for electromyography-based human-electronics interfaces
CN105807900A (zh) * 2014-12-30 2016-07-27 丰唐物联技术(深圳)有限公司 Contactless gesture control method and intelligent terminal

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2014125294A1 (en) * 2013-02-15 2014-08-21 Elliptic Laboratories As Touchless user interfaces
CN105183331A (zh) * 2014-05-30 2015-12-23 Beijing Qihoo Technology Co., Ltd. Method and device for gesture control on an electronic device
CN104267898B (zh) * 2014-09-16 2018-08-28 北京数字天域科技有限责任公司 Method and device for quickly triggering an application or an application function
CN105718064A (zh) * 2016-01-22 2016-06-29 Nanjing University Ultrasound-based gesture recognition system and method

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8830181B1 (en) * 2008-06-01 2014-09-09 Cypress Semiconductor Corporation Gesture recognition system for a touch-sensing surface
US9372535B2 (en) * 2013-09-06 2016-06-21 Thalmic Labs Inc. Systems, articles, and methods for electromyography-based human-electronics interfaces
CN103558920A (zh) * 2013-11-15 2014-02-05 Shenzhen ZTE Mobile Telecom Co., Ltd. Method and device for processing contactless gestures
WO2015176228A1 (zh) * 2014-05-20 2015-11-26 Huawei Technologies Co., Ltd. Method for operating a smart wearable device by gestures, and smart wearable device
CN105807900A (zh) * 2014-12-30 2016-07-27 丰唐物联技术(深圳)有限公司 Contactless gesture control method and intelligent terminal
CN105389003A (zh) * 2015-10-15 2016-03-09 Guangdong OPPO Mobile Telecommunications Corp., Ltd. Control method and device for applications of a mobile terminal

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110244843A (zh) * 2019-06-03 2019-09-17 Nubia Technology Co., Ltd. Wearable device control method, wearable device, and computer-readable storage medium
CN110244843B (zh) * 2019-06-03 2023-12-08 Nubia Technology Co., Ltd. Wearable device control method, wearable device, and computer-readable storage medium

Also Published As

Publication number Publication date
CN108139798A (zh) 2018-06-08

Similar Documents

Publication Publication Date Title
US10948949B2 (en) Electronic apparatus having a hole area within screen and control method thereof
CN108664829B (zh) 用于提供与图像中对象有关的信息的设备
US9897808B2 (en) Smart glass
KR101184460B1 (ko) 마우스 포인터 제어 장치 및 방법
CN108513060B (zh) 使用外部电子设备的拍摄方法和支持该方法的电子设备
US9898090B2 (en) Apparatus, method and recording medium for controlling user interface using input image
CN108701043A (zh) 一种显示的处理方法及装置
CN105229582A (zh) 基于近距离传感器和图像传感器的手势检测
KR20150128377A (ko) 지문 처리 방법 및 그 전자 장치
WO2013106169A1 (en) Menu selection using tangible interaction with mobile devices
US10607069B2 (en) Determining a pointing vector for gestures performed before a depth camera
US9779552B2 (en) Information processing method and apparatus thereof
CN113253908B (zh) 按键功能执行方法、装置、设备及存储介质
CN109804618B (zh) 用于显示图像的电子设备和计算机可读记录介质
EP3198389B1 (en) Apparatus and method for identifying object
KR102654621B1 (ko) 객체를 디스플레이하기 위한 방법 및 그 전자 장치
US10438525B2 (en) Method of controlling display of electronic device and electronic device thereof
CN112486394A (zh) 信息处理方法、装置、电子设备及可读存储介质
WO2018053956A1 (zh) Interaction method and wearable device
KR20150020865A (ko) 전자 장치의 입력 처리 방법 및 장치
KR102574772B1 (ko) 생체 데이터를 등록 및 인증하기 위한 방법 및 그 전자 장치
KR102601905B1 (ko) 터치 패드 운용 방법 및 이를 지원하는 전자 장치
KR20150113572A (ko) 영상데이터를 획득하는 전자장치 및 방법
CN108604128A (zh) 一种处理方法及移动设备
WO2017143575A1 (zh) 对图片的内容进行检索的方法、便携式电子设备和图形用户界面

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16916695

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16916695

Country of ref document: EP

Kind code of ref document: A1