WO2017005023A1 - Smart watch gesture input method and smart watch - Google Patents

Smart watch gesture input method and smart watch

Info

Publication number
WO2017005023A1
WO2017005023A1 (PCT/CN2016/078635)
Authority
WO
WIPO (PCT)
Prior art keywords
gesture
user
smart watch
gesture data
data
Prior art date
Application number
PCT/CN2016/078635
Other languages
English (en)
French (fr)
Inventor
黄艳锋
Original Assignee
Huizhou TCL Mobile Communication Co., Ltd. (惠州Tcl移动通信有限公司)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huizhou TCL Mobile Communication Co., Ltd. (惠州Tcl移动通信有限公司)
Priority to US15/308,612 priority Critical patent/US10241585B2/en
Priority to EP16757133.0A priority patent/EP3321771A4/en
Publication of WO2017005023A1 publication Critical patent/WO2017005023A1/zh

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1613Constructional details or arrangements for portable computers
    • G06F1/163Wearable computers, e.g. on a belt
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1613Constructional details or arrangements for portable computers
    • G06F1/1633Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
    • G06F1/1684Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
    • G06F1/1694Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675 the I/O peripheral being a single or a set of motion sensors for pointer control or gesture input obtained by sensing movements of the portable computer

Definitions

  • the invention relates to the field of watches, and in particular to a smart watch gesture input method and a smart watch.
  • smart watches are more convenient than smart phones for incoming calls, text messages, and notification reminders.
  • smart watches, however, are limited by screen size, and their input methods are inherently constrained.
  • replies to text messages, emails, and the like therefore cannot be completed on the watch.
  • existing input methods such as voice input may suit certain groups of people, but for people whose pronunciation is not clear they present an obstacle.
  • future smart watches must support multiple input methods, just as smart phones do today, in order to meet people's needs.
  • a smart watch gesture input method includes the following steps:
  • the step of collecting the gesture data of the user and the duration of the gesture includes:
  • the gesture data of the time period from t1 to t2 is saved as one continuous piece of gesture data.
  • the step of recognizing the user's gesture and converting to the corresponding text includes:
  • the text information closest to the gesture data in the continuous time period is obtained.
  • the smart watch gesture input method further includes:
  • the gesture data after t2 is assigned to the gesture data of the next time period.
  • a smart watch gesture input method includes the steps of:
  • the smart watch gesture input method wherein the step of opening the gesture data collection according to the gesture of the user includes:
  • the gesture data collection is turned on.
  • the smart watch gesture input method further includes:
  • the gesture data after t2 is assigned to the gesture data of the next time period.
  • the smart watch gesture input method wherein the step of opening the gesture data collection according to the gesture of the user further includes:
  • the text information corresponding to each gesture data is stored in advance.
  • the smart watch gesture input method wherein the step of recognizing a user's gesture and converting to a corresponding text includes:
  • the text information closest to the gesture data in the continuous time period is obtained.
  • a smart watch comprising:
  • An acquisition module configured to collect user gesture data and gesture duration
  • An identification module for recognizing the user's gesture and converting it into the corresponding text
  • An output module for outputting text corresponding to the gesture.
  • the smart watch wherein the opening module comprises:
  • a motion data acquiring unit configured to acquire acceleration, direction, and duration of a user's hand motion
  • the opening unit is configured to determine that the user is performing gesture input and to turn on gesture data collection when the duration of the hand movement reaches a predetermined value.
  • the gesture data of the time period from t1 to t2 is saved.
  • the smart watch wherein the smart watch further comprises:
  • a gesture text corresponding module configured to pre-store text information corresponding to each gesture data
  • the smart watch wherein the identification module comprises:
  • a continuous gesture extracting unit configured to extract gesture data in a continuous period of time from the acquired gesture data
  • the gesture text conversion unit is configured to obtain text information that is closest to the gesture data in the continuous time period according to the pre-stored gesture data and the text correspondence relationship.
  • the invention receives the user's gesture action, turns on the gesture data collection function, collects the user's gesture data over a continuous time period, and finds, in the pre-stored correspondence between gesture data and text information, the text information closest to the current gesture data, which is the output text information. The smart watch and gesture input method of the invention can accurately acquire the text the user needs to input without changing the screen size of the watch, thereby meeting the user's text input needs on smart watches.
  • FIG. 1 is a flowchart of the steps of a smart watch gesture input method according to an embodiment of the present invention;
  • FIG. 2 is a flowchart of the steps for turning on gesture data collection in the smart watch gesture input method according to an embodiment of the present invention;
  • FIG. 3 is a flowchart of the steps for converting a gesture into text in the smart watch gesture input method according to an embodiment of the present invention;
  • FIG. 4 is a structural block diagram of a smart watch according to an embodiment of the present invention;
  • FIG. 5 is a structural block diagram of the opening module of the smart watch according to an embodiment of the present invention;
  • FIG. 6 is a structural block diagram of the identification module of the smart watch according to an embodiment of the present invention.
  • the present invention provides a smart watch gesture input method and a smart watch.
  • the present invention will be further described in detail below in order to make the objects, technical solutions, and effects of the present invention clearer. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
  • a smart watch gesture input method wherein the smart watch gesture input method comprises the following steps:
  • S1 Open gesture data collection according to a gesture of the user
  • in operation, when the user makes a gesture, the gesture data collection function of the smart watch is turned on and gesture data is collected.
  • the collection step gathers the gesture data and records the duration of the gesture, which makes it easy to mark the start and end of a gesture action and facilitates gesture analysis. After the gesture data is obtained, the user's gesture is analyzed to obtain the corresponding text information, and finally
  • the text corresponding to the gesture is output, completing the conversion of the user's gesture into text.
  • the above method requires no manual text entry and no change to the smart watch screen; the user only needs to make the corresponding gesture to complete text input.
  • step S1 of the smart watch gesture input method according to the present invention includes:
  • in the step of turning on gesture data collection according to the user's gesture, the acceleration, direction, and duration of the user's hand motion must be acquired, for which a gravity sensor, a gyroscope, and a clock are needed. Only when the duration reaches a certain value, for example when the gesture lasts one second, is the user judged to be performing gesture input and
  • the gesture data collection function enabled to collect gesture data. The above method ensures that gesture data collection
  • is turned on only when the user needs it, and is not turned on erroneously by an accidental operation, which would consume power and affect the user's experience.
  • step S2 comprises: when the user performs gesture input at time t1, stops inputting at time t2, and the time for which input is stopped reaches a preset time value, the gesture data of the time period from t1 to t2 is saved as one continuous piece of gesture data.
  • when the user's gesture data is collected, the start and end of the gesture must be determined to facilitate analysis of the collected data. At time t1 the user starts gesture input, and at time t2 the input pauses. The length of the pause at t2 must then be judged: for example, if the input stops at t2 and resumes quickly, the resumed input
  • belongs to the same coherent gesture begun at t1. Only when the pause at t2 reaches a certain length,
  • preferably more than one second, is the gesture input for that period judged complete; the gesture data from t1 to t2 is then taken as one continuous gesture, and the gesture data after t2
  • belongs to the next time period. This increases the accuracy of gesture analysis over the continuous time period.
  • the smart watch gesture input method wherein before the step S1, the method further includes:
  • the text information corresponding to each gesture data is stored in advance.
  • before collection, the correspondence between gesture data and text must first be recorded; that is, which character a given gesture represents is determined first, so that when subsequent gesture data is analyzed,
  • this correspondence can be consulted to obtain the closest match, increasing the accuracy of the text output.
  • step S3 of the smart watch gesture input method according to the present invention includes:
  • gesture data for a continuous time period must be extracted from the acquired gesture data, since only continuous gesture input can guarantee
  • that the gesture maps accurately to text. The gesture data of the continuous time period is extracted, and the text information closest to it is found in the correspondence pre-stored in the smart watch; this is the final output text. Because even the most advanced gesture input available today cannot guarantee that gestures map to text with 100% accuracy, and some error is inevitable, the above solution of the present invention increases the accuracy of gesture input and text output as much as possible.
  • FIG. 4 is a structural block diagram of the smart watch, which includes:
  • the opening module 100 is configured to turn on gesture data collection according to a gesture of the user
  • the acquisition module 200 is configured to collect the user's gesture data and gesture duration
  • the identification module 300 is configured to recognize the user's gesture and convert it into the corresponding text
  • the output module 400 is configured to output text corresponding to the gesture.
  • FIG. 5 is a structural block diagram of the opening module of the smart watch according to the present invention, wherein the opening module 100 includes:
  • the motion data acquiring unit 101 is configured to acquire acceleration, direction, and duration of the user's hand motion
  • the opening unit 102 is configured to determine that the user is performing gesture input and turning on the gesture data collection when the time of the hand movement reaches a predetermined value.
  • in the acquisition module 200, when the user performs gesture input at time t1, stops inputting at time t2, and the time for which input is stopped reaches a preset time value, the gesture data of the time period from t1 to t2 is saved as one continuous piece of gesture data
  • the smart watch further includes:
  • a gesture text corresponding module 500 configured to pre-store text information corresponding to each gesture data
  • FIG. 6 is a structural block diagram of the identification module of the smart watch according to the present invention, wherein the identification module 300 includes:
  • a continuous gesture extraction unit 301 configured to extract gesture data in a continuous period of time from the acquired gesture data
  • the gesture text conversion unit 302 is configured to obtain the text information that is closest to the gesture data in the continuous time period according to the pre-stored correspondence between gesture data and text.
  • in summary, the present invention receives the user's gesture action, turns on the gesture data collection function, collects the user's gesture data over a continuous time period, and compares the acquired gesture data with the pre-stored correspondence between gesture data and text information.
  • the text information closest to the current gesture data is found, which is the output text information; the smart watch and the gesture input method of the present invention can accurately acquire the text the user needs to input without changing the screen size of the watch, greatly meeting the user's text input needs on smart watches.
  • the above smart watch and the smart watch gesture input method of the above embodiment share the same concept; any method provided in the method embodiment can run on the smart watch, and its specific implementation process is detailed in the method embodiment and is not repeated here.
  • all or part of the flow of the method may be completed by a computer program controlling the relevant hardware; the computer program can be stored in a computer-readable storage medium, for example in a memory of the smart watch, and executed by at least one processor in the smart watch, and its execution may include the flows of the method embodiments described above.
  • the storage medium may be a magnetic disk, an optical disk, a read only memory (ROM), or a random access memory (RAM).
  • each functional module can be integrated into one processing chip, or each module can exist physically separately, or two or more modules can be integrated into one module.
  • the above integrated modules can be implemented in the form of hardware or in the form of software functional modules.
  • the integrated module, if implemented in the form of a software functional module and sold or used as a standalone product, may also be stored in a computer-readable storage medium, such as a read-only memory, a magnetic disk, or an optical disk.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Measurement Of Unknown Time Intervals (AREA)
  • Input From Keyboards Or The Like (AREA)

Abstract

A smart watch gesture input method and a smart watch. By receiving the user's gesture action, the gesture data collection function is turned on and the user's gesture data within a continuous time period is collected; the corresponding text information for the acquired gesture data is then found in a preset correspondence, and this is the output text information. The above scheme accurately acquires the text the user needs to input without changing the screen size, and can meet the user's text input needs on a smart watch.

Description

Smart watch gesture input method and smart watch
Technical Field
The present invention relates to the field of watches, and in particular to a smart watch gesture input method and a smart watch.
Background Art
As more and more users adopt smart watches, smart watches are more convenient than smart phones for incoming calls, text messages, and notification reminders. However, smart watches are limited by screen size and inherently constrained in input methods, so replies to text messages, emails, and the like cannot be completed on the watch. Although some products currently implement voice input, such input methods may suit certain groups of people but present an obstacle for those whose pronunciation is not clear. Future smart watches must support multiple input methods, just as today's smart phones do, in order to meet people's needs.
Therefore, the prior art still needs to be improved and developed.
Technical Problem
In view of the above deficiencies of the prior art, the object of the present invention is to provide a smart watch gesture input method and a smart watch, aiming to solve the problem in the prior art that input on a smart watch is inconvenient.
Technical Solution
A smart watch gesture input method, comprising the following steps:
pre-storing text information corresponding to each piece of gesture data;
acquiring the acceleration, direction, and duration of the user's hand motion;
when the duration of the hand motion reaches a predetermined value, determining that the user is performing gesture input and turning on gesture data collection;
collecting the user's gesture data and the gesture duration;
recognizing the user's gesture according to the pre-stored correspondence between gesture data and text information, and converting it into the corresponding text; and
outputting the text corresponding to the gesture.
In the smart watch gesture input method, the step of collecting the user's gesture data and the gesture duration specifically comprises:
when the user performs gesture input at time t1, stops inputting at time t2, and the time for which input is stopped reaches a preset time value, saving the gesture data of the time period from t1 to t2 as one continuous piece of gesture data.
In the smart watch gesture input method, the step of recognizing the user's gesture and converting it into the corresponding text specifically comprises:
extracting gesture data of a continuous time period from the acquired gesture data;
obtaining, according to the pre-stored correspondence between gesture data and text, the text information closest to the gesture data within the continuous time period.
The smart watch gesture input method further comprises:
assigning the gesture data after t2 to the gesture data of the next time period.
A smart watch gesture input method, comprising the steps of:
turning on gesture data collection according to the user's gesture;
collecting the user's gesture data and the gesture duration;
recognizing the user's gesture and converting it into the corresponding text;
outputting the text corresponding to the gesture.
In the smart watch gesture input method, the step of turning on gesture data collection according to the user's gesture comprises:
acquiring the acceleration, direction, and duration of the user's hand motion;
when the duration of the hand motion reaches a predetermined value, determining that the user is performing gesture input and turning on gesture data collection.
In the smart watch gesture input method, the step of collecting the user's gesture data and the gesture duration comprises: when the user performs gesture input at time t1, stops inputting at time t2, and the time for which input is stopped reaches a preset time value, saving the gesture data of the time period from t1 to t2 as one continuous piece of gesture data.
The smart watch gesture input method further comprises:
assigning the gesture data after t2 to the gesture data of the next time period.
Before the step of turning on gesture data collection according to the user's gesture, the method further comprises:
pre-storing text information corresponding to each piece of gesture data.
In the smart watch gesture input method, the step of recognizing the user's gesture and converting it into the corresponding text comprises:
extracting gesture data of a continuous time period from the acquired gesture data;
obtaining, according to the pre-stored correspondence between gesture data and text, the text information closest to the gesture data within the continuous time period.
A smart watch, comprising:
an opening module, configured to turn on gesture data collection according to the user's gesture;
an acquisition module, configured to collect the user's gesture data and gesture duration;
an identification module, configured to recognize the user's gesture and convert it into the corresponding text;
an output module, configured to output the text corresponding to the gesture.
In the smart watch, the opening module comprises:
a motion data acquiring unit, configured to acquire the acceleration, direction, and duration of the user's hand motion;
an opening unit, configured to determine that the user is performing gesture input and turn on gesture data collection when the duration of the hand motion reaches a predetermined value.
In the smart watch, in the acquisition module, when the user performs gesture input at time t1, stops inputting at time t2, and the time for which input is stopped reaches a preset time value, the gesture data of the time period from t1 to t2 is saved as one continuous piece of gesture data.
The smart watch further comprises:
a gesture-text correspondence module, configured to pre-store text information corresponding to each piece of gesture data.
In the smart watch, the identification module comprises:
a continuous gesture extraction unit, configured to extract gesture data of a continuous time period from the acquired gesture data;
a gesture-text conversion unit, configured to obtain, according to the pre-stored correspondence between gesture data and text, the text information closest to the gesture data within the continuous time period.
Beneficial Effects
By receiving the user's gesture action, the present invention turns on the gesture data collection function, collects the user's gesture data within a continuous time period, and finds, in the pre-stored correspondence between gesture data and text information, the text information closest to the current gesture data, which is the output text information. The smart watch and the gesture input method of the present invention can accurately acquire the text the user needs to input without changing the screen size of the watch, greatly meeting the user's text input needs on a smart watch.
Brief Description of the Drawings
FIG. 1 is a flowchart of the steps of a smart watch gesture input method according to an embodiment of the present invention;
FIG. 2 is a flowchart of the steps for turning on gesture data collection in the smart watch gesture input method according to an embodiment of the present invention;
FIG. 3 is a flowchart of the steps for converting a gesture into text in the smart watch gesture input method according to an embodiment of the present invention;
FIG. 4 is a structural block diagram of a smart watch according to an embodiment of the present invention;
FIG. 5 is a structural block diagram of the opening module of the smart watch according to an embodiment of the present invention;
FIG. 6 is a structural block diagram of the identification module of the smart watch according to an embodiment of the present invention.
Best Mode for Carrying Out the Invention
The present invention provides a smart watch gesture input method and a smart watch. To make the objects, technical solutions, and effects of the present invention clearer, the invention is further described in detail below. It should be understood that the specific embodiments described here merely illustrate the present invention and are not intended to limit it.
Referring to FIG. 1, a smart watch gesture input method comprises the following steps:
S1. Turning on gesture data collection according to the user's gesture;
S2. Collecting the user's gesture data and the gesture duration;
S3. Recognizing the user's gesture and converting it into the corresponding text;
S4. Outputting the text corresponding to the gesture.
In concrete operation of the smart watch gesture input method of the present invention, when the user makes a gesture, the gesture data collection function of the smart watch is first turned on and gesture data collection begins. The collection step then gathers the gesture data of the user's gesture and records the duration of the gesture, which makes it easy to mark the start and end of a gesture action and facilitates gesture analysis. After the gesture data is obtained, the user's gesture is analyzed to obtain the corresponding text information, and finally the text corresponding to the gesture is output, completing the conversion of the user's gesture into text. The above method requires no manual text entry and no change to the smart watch screen; the user only needs to make the corresponding gesture to complete text input, which is convenient to use.
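The four steps S1 to S4 can be sketched end to end in code. This is a hypothetical outline under assumed data formats, timestamped feature samples and a dict mapping each character to a template feature vector; the patent does not specify a feature representation, and all names are illustrative:

```python
import math

def gesture_to_text(samples, templates, pause_limit=1.0):
    """S1-S4 in miniature: `samples` are (timestamp, feature_vector) pairs
    already gated by the opening step; `templates` maps text -> feature vector."""
    if not samples:
        return ""                                    # S1: nothing was collected
    # S2: keep only the first continuous period (a pause > pause_limit ends it)
    gesture = [samples[0]]
    for prev, cur in zip(samples, samples[1:]):
        if cur[0] - prev[0] > pause_limit:
            break                                    # data after t2 belongs to the next period
        gesture.append(cur)
    # S3: average the collected features and find the closest pre-stored text
    dims = len(gesture[0][1])
    mean = [sum(s[1][d] for s in gesture) / len(gesture) for d in range(dims)]
    def dist(vec):
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(vec, mean)))
    return min(templates, key=lambda text: dist(templates[text]))  # S4: output
```

For example, with templates `{"A": [1.0, 0.0], "B": [0.0, 1.0]}`, samples clustered near `[1.0, 0.0]` before a long pause resolve to `"A"`.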
Further, as shown in FIG. 2, which is a flowchart of step S1 of the smart watch gesture input method of the present invention, step S1 comprises:
S101. Acquiring the acceleration, direction, and duration of the user's hand motion;
S102. When the duration of the hand motion reaches a predetermined value, determining that the user is performing gesture input and turning on gesture data collection.
In the step of turning on gesture data collection according to the user's gesture, the smart watch must acquire the acceleration, direction, and duration of the user's hand motion, for which a gravity sensor, a gyroscope, and a clock are needed. Only when the duration reaches a certain value, for example when the gesture lasts one second, indicating that the user is performing gesture input, is the gesture data collection function turned on and gesture data collected. This method ensures that gesture data collection is turned on only when the user needs it, and is not turned on erroneously by an accidental operation, which would consume power and affect the user's experience.
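The gating logic described above, turn collection on only after motion has been sustained for the predetermined duration, can be sketched as follows. The threshold values and the sample format are assumptions for illustration; the patent only specifies a duration check (e.g. one second):

```python
MOTION_THRESHOLD = 1.5   # m/s^2 above rest; assumed value, not from the patent
MIN_DURATION = 1.0       # seconds of sustained motion before collection starts

def should_start_collection(samples):
    """Return True once hand motion has lasted at least MIN_DURATION.

    `samples` is a list of (accel_magnitude, timestamp) tuples ordered by time.
    """
    motion_start = None
    for accel, ts in samples:
        if accel >= MOTION_THRESHOLD:
            if motion_start is None:
                motion_start = ts            # motion just began
            if ts - motion_start >= MIN_DURATION:
                return True                  # sustained long enough: collect
        else:
            motion_start = None              # motion broke off; reset the timer
    return False
```

Resetting `motion_start` whenever the magnitude drops below the threshold is what prevents brief accidental movements from turning collection on.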
Further, in the smart watch gesture input method, step S2 comprises: when the user performs gesture input at time t1, stops inputting at time t2, and the time for which input is stopped reaches a preset time value, the gesture data of the time period from t1 to t2 is saved as one continuous piece of gesture data.
When collecting the user's gesture data, the start and end of the gesture must be determined to facilitate analysis of the collected data. At time t1 the user starts gesture input, and at time t2 the input pauses. The length of the pause at t2 must then be judged: for example, if the input stops at t2 and resumes quickly, the resumed input should be judged to belong to the same coherent gesture begun at t1. Only when the input remains stopped at t2 for a certain length of time, preferably more than one second, is the gesture input for that period judged complete; the gesture data from t1 to t2 is then determined to be one continuous gesture, and the gesture data after t2 belongs to the next time period. This increases the accuracy of gesture analysis over the continuous time period.
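The t1/t2 rule above amounts to splitting a timestamped stream wherever the gap between samples exceeds the pause threshold. A minimal sketch, assuming (timestamp, value) samples, which the patent does not prescribe:

```python
PAUSE_LIMIT = 1.0  # seconds; the patent's preferred stop threshold

def split_gestures(samples):
    """Split timestamped samples into continuous gestures.

    A gap of more than PAUSE_LIMIT between consecutive timestamps closes
    the current gesture (t1..t2) and opens the next time period.
    """
    gestures, current = [], []
    last_ts = None
    for ts, value in samples:
        if last_ts is not None and ts - last_ts > PAUSE_LIMIT:
            gestures.append(current)   # pause exceeded: gesture t1..t2 is complete
            current = []
        current.append((ts, value))
        last_ts = ts
    if current:
        gestures.append(current)       # flush the final, still-open gesture
    return gestures
```

A short pause (below `PAUSE_LIMIT`) leaves the samples in the same gesture, matching the "coherent gesture" case in the text; a longer pause starts the next period.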
Further, before step S1, the smart watch gesture input method further comprises:
S0. Pre-storing text information corresponding to each piece of gesture data.
Before the data collection and opening steps, the correspondence between gesture data and text must first be recorded; that is, which character a given gesture represents is determined first, so that when subsequent gesture data is analyzed, this correspondence can be consulted to obtain the closest match, increasing the accuracy of the text output.
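The enrollment step S0 can be sketched as recording one template per character. Averaging several recordings into a template, and the in-memory dict storage, are assumptions for illustration; the patent only states that the gesture-text correspondence is recorded in advance:

```python
def enroll(store, text, recordings):
    """Average the recorded feature vectors and pre-store them under `text`.

    `recordings` is a list of equal-length feature vectors, one per
    repetition of the same gesture.
    """
    store[text] = [sum(col) / len(col) for col in zip(*recordings)]
    return store

templates = {}
enroll(templates, "OK", [[0.2, 0.9, 0.4], [0.4, 0.7, 0.6]])
enroll(templates, "No", [[0.8, 0.1, 0.3]])
```

Averaging repeated recordings makes the stored template less sensitive to variation in any single performance of the gesture.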
Further, as shown in FIG. 3, which is a flowchart of step S3 of the smart watch gesture input method of the present invention, step S3 comprises:
S301. Extracting gesture data of a continuous time period from the acquired gesture data;
S302. Obtaining, according to the pre-stored correspondence between gesture data and text, the text information closest to the gesture data within the continuous time period.
In the step of analyzing the gesture data to obtain the corresponding text, gesture data of a continuous time period must first be extracted from the acquired gesture data, since only continuous gesture input can guarantee that the gesture maps accurately to text. The gesture data of the continuous time period is extracted, and the text information closest to it is found in the correspondence pre-stored in the smart watch; this is the final output text. Because even the most advanced gesture input available today cannot guarantee that gestures map to text with 100% accuracy, and some error is inevitable, the above solution of the present invention increases the accuracy of gesture input and text output as much as possible.
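The "closest text" lookup can be sketched as a nearest-neighbor search over the pre-stored templates. Representing each gesture as a fixed-length feature vector and using Euclidean distance are assumptions; the patent does not specify a representation or a distance measure:

```python
import math

# Illustrative pre-stored correspondence: character -> feature vector
STORED = {
    "一": [0.9, 0.1, 0.0],
    "二": [0.5, 0.5, 0.1],
    "人": [0.1, 0.8, 0.6],
}

def closest_text(features):
    """Return the pre-stored text whose template vector is nearest to
    the acquired gesture features (smallest Euclidean distance)."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(STORED, key=lambda text: dist(STORED[text], features))
```

Because the nearest template is always returned, the method degrades gracefully: an imperfect gesture still maps to the most similar pre-stored character, consistent with the text's point that 100% accuracy cannot be guaranteed.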
Further, the present invention also provides a smart watch; FIG. 4 is a structural block diagram of the smart watch, which comprises:
an opening module 100, configured to turn on gesture data collection according to the user's gesture;
an acquisition module 200, configured to collect the user's gesture data and gesture duration;
an identification module 300, configured to recognize the user's gesture and convert it into the corresponding text;
an output module 400, configured to output the text corresponding to the gesture.
The functions of the modules of the smart watch of the present invention have been described in detail in the method steps and are therefore not repeated here.
Further, as shown in FIG. 5, which is a structural block diagram of the opening module of the smart watch of the present invention, the opening module 100 comprises:
a motion data acquiring unit 101, configured to acquire the acceleration, direction, and duration of the user's hand motion;
an opening unit 102, configured to determine that the user is performing gesture input and turn on gesture data collection when the duration of the hand motion reaches a predetermined value.
Further, in the smart watch, in the acquisition module 200, when the user performs gesture input at time t1, stops inputting at time t2, and the time for which input is stopped reaches a preset time value, the gesture data of the time period from t1 to t2 is saved as one continuous piece of gesture data.
Further, the smart watch further comprises:
a gesture-text correspondence module 500, configured to pre-store text information corresponding to each piece of gesture data.
Further, as shown in FIG. 6, which is a structural block diagram of the identification module of the smart watch of the present invention, the identification module 300 comprises:
a continuous gesture extraction unit 301, configured to extract gesture data of a continuous time period from the acquired gesture data;
a gesture-text conversion unit 302, configured to obtain, according to the pre-stored correspondence between gesture data and text, the text information closest to the gesture data within the continuous time period.
In summary, by receiving the user's gesture action, the present invention turns on the gesture data collection function, collects the user's gesture data within a continuous time period, and finds, in the pre-stored correspondence between gesture data and text information, the text information closest to the current gesture data, which is the output text information. The smart watch and the gesture input method of the present invention can accurately acquire the text the user needs to input without changing the screen size of the watch, greatly meeting the user's text input needs on a smart watch.
The above smart watch and the smart watch gesture input method of the above embodiment share the same concept; any method provided in the embodiment of the smart watch gesture input method can run on the smart watch, and its specific implementation process is detailed in that embodiment and is not repeated here.
It should be noted that, for the smart watch gesture input method of the embodiment of the present invention, those of ordinary skill in the art will understand that all or part of the flow of the method may be completed by a computer program controlling the relevant hardware. The computer program can be stored in a computer-readable storage medium, for example in a memory of the smart watch, and executed by at least one processor in the smart watch, and its execution may include the flows of the method embodiments described above. The storage medium may be a magnetic disk, an optical disk, a read-only memory (ROM), or a random access memory (RAM).
For the smart watch of the embodiment of the present invention, its functional modules may be integrated into one processing chip, or each module may exist physically separately, or two or more modules may be integrated into one module. The above integrated module may be implemented in the form of hardware or in the form of a software functional module. If the integrated module is implemented in the form of a software functional module and sold or used as a standalone product, it may also be stored in a computer-readable storage medium, such as a read-only memory, a magnetic disk, or an optical disk.
It should be understood that the application of the present invention is not limited to the above examples; those of ordinary skill in the art may make improvements or modifications based on the above description, and all such improvements and modifications shall fall within the protection scope of the appended claims of the present invention.

Claims (15)

  1. A smart watch gesture input method, comprising the following steps:
    pre-storing text information corresponding to each piece of gesture data;
    acquiring the acceleration, direction, and duration of the user's hand motion;
    when the duration of the hand motion reaches a predetermined value, determining that the user is performing gesture input and turning on gesture data collection;
    collecting the user's gesture data and the gesture duration;
    recognizing the user's gesture according to the pre-stored correspondence between gesture data and text information, and converting it into the corresponding text; and
    outputting the text corresponding to the gesture.
  2. The smart watch gesture input method according to claim 1, wherein the step of collecting the user's gesture data and the gesture duration specifically comprises:
    when the user performs gesture input at time t1, stops inputting at time t2, and the time for which input is stopped reaches a preset time value, saving the gesture data of the time period from t1 to t2 as one continuous piece of gesture data.
  3. The smart watch gesture input method according to claim 2, wherein the step of recognizing the user's gesture and converting it into the corresponding text specifically comprises:
    extracting gesture data of a continuous time period from the acquired gesture data;
    obtaining, according to the pre-stored correspondence between gesture data and text, the text information closest to the gesture data within the continuous time period.
  4. The smart watch gesture input method according to claim 2, further comprising:
    assigning the gesture data after t2 to the gesture data of the next time period.
  5. A smart watch gesture input method, comprising the following steps:
    turning on gesture data collection according to the user's gesture;
    collecting the user's gesture data and the gesture duration;
    recognizing the user's gesture and converting it into the corresponding text; and
    outputting the text corresponding to the gesture.
  6. The smart watch gesture input method according to claim 5, wherein the step of turning on gesture data collection according to the user's gesture specifically comprises:
    acquiring the acceleration, direction, and duration of the user's hand motion;
    when the duration of the hand motion reaches a predetermined value, determining that the user is performing gesture input and turning on the gesture data collection.
  7. The smart watch gesture input method according to claim 5, wherein the step of collecting the user's gesture data and the gesture duration specifically comprises:
    when the user performs gesture input at time t1, stops inputting at time t2, and the time for which input is stopped reaches a preset time value, saving the gesture data of the time period from t1 to t2 as one continuous piece of gesture data.
  8. The smart watch gesture input method according to claim 7, further comprising:
    assigning the gesture data after t2 to the gesture data of the next time period.
  9. The smart watch gesture input method according to claim 5, wherein before the step of turning on gesture data collection according to the user's gesture, the method further comprises:
    pre-storing text information corresponding to each piece of gesture data.
  10. The smart watch gesture input method according to claim 5, wherein the step of recognizing the user's gesture and converting it into the corresponding text specifically comprises:
    extracting gesture data of a continuous time period from the acquired gesture data;
    obtaining, according to the pre-stored correspondence between gesture data and text, the text information closest to the gesture data within the continuous time period.
  11. A smart watch, comprising:
    an opening module, configured to turn on gesture data collection according to the user's gesture;
    an acquisition module, configured to collect the user's gesture data and gesture duration;
    an identification module, configured to recognize the user's gesture and convert it into the corresponding text;
    an output module, configured to output the text corresponding to the gesture.
  12. The smart watch according to claim 11, wherein the opening module comprises:
    a motion data acquiring unit, configured to acquire the acceleration, direction, and duration of the user's hand motion;
    an opening unit, configured to determine that the user is performing gesture input and turn on gesture data collection when the duration of the hand motion reaches a predetermined value.
  13. The smart watch according to claim 11, wherein in the acquisition module, when the user performs gesture input at time t1, stops inputting at time t2, and the time for which input is stopped reaches a preset time value, the gesture data of the time period from t1 to t2 is saved as one continuous piece of gesture data.
  14. The smart watch according to claim 11, further comprising:
    a gesture-text correspondence module, configured to pre-store text information corresponding to each piece of gesture data.
  15. The smart watch according to claim 11, wherein the identification module comprises:
    a continuous gesture extraction unit, configured to extract gesture data of a continuous time period from the acquired gesture data;
    a gesture-text conversion unit, configured to obtain, according to the pre-stored correspondence between gesture data and text, the text information closest to the gesture data within the continuous time period.
PCT/CN2016/078635 2015-07-08 2016-04-07 Smart watch gesture input method and smart watch WO2017005023A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US15/308,612 US10241585B2 (en) 2015-07-08 2016-04-07 Smart watch and gesture input method for the smart watch
EP16757133.0A EP3321771A4 (en) 2015-07-08 2016-04-07 GESTURE INPUT METHOD FOR SMART WATCH AND SMART WATCH

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201510396300.0 2015-07-08
CN201510396300.0A CN105045391B (zh) 2015-07-08 2015-07-08 Smart watch gesture input method and smart watch

Publications (1)

Publication Number Publication Date
WO2017005023A1 true WO2017005023A1 (zh) 2017-01-12

Family

ID=54451983

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2016/078635 WO2017005023A1 (zh) 2015-07-08 2016-04-07 Smart watch gesture input method and smart watch

Country Status (4)

Country Link
US (1) US10241585B2 (zh)
EP (1) EP3321771A4 (zh)
CN (1) CN105045391B (zh)
WO (1) WO2017005023A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10990226B2 (en) 2018-03-08 2021-04-27 International Business Machines Corporation Inputting information using a virtual canvas

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105045391B (zh) * 2015-07-08 2019-01-15 深圳市Tcl云创科技有限公司 智能手表手势输入方法及智能手表
CN105975054A (zh) * 2015-11-23 2016-09-28 乐视网信息技术(北京)股份有限公司 一种信息处理方法和装置
CN106331366A (zh) * 2016-09-12 2017-01-11 广州视源电子科技股份有限公司 来电处理方法、可穿戴设备、移动终端及来电处理系统
CN106648076A (zh) * 2016-12-01 2017-05-10 杭州联络互动信息科技股份有限公司 一种智能手表的文字输入方法以及装置
US10845885B2 (en) 2017-02-27 2020-11-24 International Business Machines Corporation Object scrolling and selection on a wearable computing device
CN112068700A (zh) * 2020-09-04 2020-12-11 北京服装学院 一种文字信息输入方法、装置及存储介质

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102999296A (zh) * 2012-12-03 2013-03-27 北京百度网讯科技有限公司 用于移动终端的文本快捷输入方法、装置和移动终端
CN103760970A (zh) * 2013-12-20 2014-04-30 北京智谷睿拓技术服务有限公司 一种穿戴式输入系统及输入方法
CN103793075A (zh) * 2014-02-14 2014-05-14 北京君正集成电路股份有限公司 一种应用在智能手表上的识别方法及智能手表
US20140184495A1 (en) * 2012-12-31 2014-07-03 Joseph Patrick Quin Portable Device Input by Configurable Patterns of Motion
CN105045391A (zh) * 2015-07-08 2015-11-11 惠州Tcl移动通信有限公司 智能手表手势输入方法及智能手表

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8289162B2 (en) * 2008-12-22 2012-10-16 Wimm Labs, Inc. Gesture-based user interface for a wearable portable device
US9174123B2 (en) * 2009-11-09 2015-11-03 Invensense, Inc. Handheld computer systems and techniques for character and command recognition related to human movements
US8571321B2 (en) * 2010-07-26 2013-10-29 Casio Computer Co., Ltd. Character recognition device and recording medium
US9967100B2 (en) * 2013-11-05 2018-05-08 Samsung Electronics Co., Ltd Method of controlling power supply for fingerprint sensor, fingerprint processing device, and electronic device performing the same
CN103576578B (zh) * 2013-11-05 2017-04-12 小米科技有限责任公司 一种采用耳机线对终端进行控制的方法、装置和设备

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102999296A (zh) * 2012-12-03 2013-03-27 北京百度网讯科技有限公司 用于移动终端的文本快捷输入方法、装置和移动终端
US20140184495A1 (en) * 2012-12-31 2014-07-03 Joseph Patrick Quin Portable Device Input by Configurable Patterns of Motion
CN103760970A (zh) * 2013-12-20 2014-04-30 北京智谷睿拓技术服务有限公司 一种穿戴式输入系统及输入方法
CN103793075A (zh) * 2014-02-14 2014-05-14 北京君正集成电路股份有限公司 一种应用在智能手表上的识别方法及智能手表
CN105045391A (zh) * 2015-07-08 2015-11-11 惠州Tcl移动通信有限公司 智能手表手势输入方法及智能手表

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10990226B2 (en) 2018-03-08 2021-04-27 International Business Machines Corporation Inputting information using a virtual canvas

Also Published As

Publication number Publication date
US10241585B2 (en) 2019-03-26
CN105045391B (zh) 2019-01-15
EP3321771A1 (en) 2018-05-16
US20170168583A1 (en) 2017-06-15
EP3321771A4 (en) 2019-02-27
CN105045391A (zh) 2015-11-11

Similar Documents

Publication Publication Date Title
WO2017005023A1 (zh) 智能手表手势输入方法及智能手表
WO2017032187A1 (zh) 自动捕捉目标物的方法、装置及存储介质
WO2014079322A1 (zh) 音频流媒体的跟踪方法及系统、存储介质
WO2020034526A1 (zh) 保险录音的质检方法、装置、设备和计算机存储介质
WO2017067264A1 (zh) 一种降低误识别率的方法、装置及智能移动终端
WO2017067290A1 (zh) 指纹录入方法、装置及终端设备
WO2017012404A1 (zh) 群组管理方法、终端和存储介质
WO2017045517A1 (zh) 基于手势识别的文字输入方法、装置及存储介质
WO2017113974A1 (zh) 一种语音处理的方法、装置以及终端
WO2013163920A1 (zh) 插入或删除电子表格中单元格或行列的方法及其装置
WO2017067259A1 (zh) 一种指纹传感器的校准参数的获取方法、装置及移动终端
WO2015043173A1 (zh) 一种在通话过程中发送联系人信息的方法及系统
WO2018086219A1 (zh) 通话的记录方法、装置和终端
WO2014019317A1 (zh) 基于云的智能人脸识别检索方法
WO2017067263A1 (zh) 一种指纹传感器校准方法、装置及智能移动终端
WO2015009066A1 (en) Method for operating conversation service based on messenger, user interface and electronic device using the same
WO2016123898A1 (zh) 一种短信管理方法及其移动终端
WO2022059969A1 (ko) 심전도 데이터 분류를 위한 심층 신경망 사전 학습 방법
WO2017107367A1 (zh) 用户标识处理的方法、终端和非易失性计算可读存储介质
CN112860169B (zh) 交互方法及装置、计算机可读介质和电子设备
WO2017173838A1 (zh) 基于验证的消息显示方法及通信终端
WO2014180150A1 (en) Method, terminal and computer storage medium for triggering a communication with a contact
CN108108284A (zh) 日志处理方法、装置、终端设备及存储介质
WO2018018819A1 (zh) 应用程序的管理方法、管理装置及终端
WO2015196878A1 (zh) 一种电视虚拟触控方法及系统

Legal Events

Date Code Title Description
REEP Request for entry into the european phase

Ref document number: 2016757133

Country of ref document: EP

WWE Wipo information: entry into national phase

Ref document number: 15308612

Country of ref document: US

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16757133

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

WWE Wipo information: entry into national phase

Ref document number: 2016757133

Country of ref document: EP