CN106249851B - Input information determination method and device - Google Patents


Publication number
CN106249851B
Authority
CN
China
Prior art keywords
gesture
information
ppg
determining
target
Prior art date
Legal status
Active
Application number
CN201510584085.7A
Other languages
Chinese (zh)
Other versions
CN106249851A (en)
Inventor
刘浩
Current Assignee
Beijing Zhigu Ruituo Technology Services Co Ltd
Original Assignee
Beijing Zhigu Ruituo Technology Services Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Zhigu Ruituo Technology Services Co Ltd
Priority to CN201510584085.7A
Publication of CN106249851A
Application granted
Publication of CN106249851B

Abstract

The application provides an input information determination method and device, relating to the field of wearable devices. The method comprises the following steps: acquiring first blood flow information at the thumb in response to a target gesture performed by a user's hand, the target gesture being a thumb-up gesture or a thumb-down gesture; and determining first input information according to the first blood flow information and reference information. A method and device for determining input information based on blood flow information at the thumb are thus provided, effectively enhancing the input capability of electronic equipment such as wearable devices.

Description

Input information determination method and device
Technical Field
The present application relates to the field of wearable devices, and in particular, to a method and a device for determining input information.
Background
With the popularization of electronic equipment, more and more wearable devices are entering people's lives, and these wearable devices generally integrate an increasing number of sensors to monitor the user's health and the like. For example, some smart rings incorporate a PPG sensor to detect the heart rate of the user.
Meanwhile, existing wearable devices are small in size, offer only a small input interface, and have weak interaction capability in general, making user input difficult.
Disclosure of Invention
The purpose of this application is to provide an input information determination method and device, so as to enhance the input capability of an electronic device such as a wearable device.
According to a first aspect of at least one embodiment of the present application, there is provided an input information determination method, including:
acquiring first photoplethysmography (PPG) information at the thumb in response to a target gesture performed by a user's hand, the target gesture being a thumb-up gesture or a thumb-down gesture;
and determining first input information according to the first PPG information and PPG reference information.
With reference to any one of the possible implementation manners of the first aspect, in a second possible implementation manner, the determining first input information according to the first PPG information and PPG reference information includes:
determining a gesture type of the target gesture according to the first PPG information and the PPG reference information;
and determining first input information corresponding to the gesture type.
With reference to any one of the possible implementations of the first aspect, in a third possible implementation, the determining a gesture type of the target gesture according to the first PPG information and the PPG reference information includes:
and determining the gesture type of the target gesture according to the average amplitude value of the first PPG information and a reference threshold.
With reference to any one of the possible implementation manners of the first aspect, in a fourth possible implementation manner, the determining a gesture type of the target gesture according to the average amplitude value of the first PPG information and a reference threshold includes:
determining that the gesture type is a thumb-up gesture in response to the mean amplitude value of the first PPG information being greater than the reference threshold.
With reference to any one of the possible implementation manners of the first aspect, in a fifth possible implementation manner, the determining a gesture type of the target gesture according to the average magnitude value of the first PPG information and a reference threshold includes:
determining that the gesture type is a thumb-down gesture in response to the mean amplitude value of the first PPG information being less than the reference threshold.
With reference to any one of the possible implementations of the first aspect, in a sixth possible implementation, the determining a gesture type of the target gesture according to the first PPG information and the PPG reference information includes:
determining a gesture type of the target gesture according to the average amplitude value of the first PPG information and at least one reference interval.
With reference to any one of the possible implementation manners of the first aspect, in a seventh possible implementation manner, the determining a gesture type of the target gesture according to the average amplitude value of the first PPG information and at least one reference interval includes:
determining that the gesture type is a thumb-up gesture in response to the mean amplitude value of the first PPG information belonging to a first reference interval.
With reference to any one of the possible implementation manners of the first aspect, in an eighth possible implementation manner, the determining a gesture type of the target gesture according to the average amplitude value of the first PPG information and at least one reference interval includes:
determining that the gesture type is a thumb-down gesture in response to the mean amplitude value of the first PPG information belonging to a second reference interval.
With reference to any one of the possible implementation manners of the first aspect, in a ninth possible implementation manner, the method further includes:
acquiring second PPG information adjacent to the first PPG information;
determining a gesture type corresponding to the second PPG information according to the PPG reference information;
in response to that the gesture type corresponding to the second PPG information is the same as the gesture type of the target gesture, determining that the execution duration of the target gesture comprises the time corresponding to the first PPG information and the time corresponding to the second PPG information.
With reference to any one of the possible implementation manners of the first aspect, in a tenth possible implementation manner, the method further includes:
in response to that the gesture type corresponding to the second PPG information is different from the gesture type of the target gesture, determining that the execution duration of the target gesture is equal to the time corresponding to the first PPG information.
With reference to any one of the possible implementation manners of the first aspect, in an eleventh possible implementation manner, the method further includes: determining second input information corresponding to the execution duration of the target gesture.
With reference to any one of the possible implementation manners of the first aspect, in a twelfth possible implementation manner, the method further includes:
acquiring second PPG information adjacent to the first PPG information;
determining a gesture type corresponding to the second PPG information according to the PPG reference information;
determining that the number of execution times of the target gesture is increased once in response to that the gesture type corresponding to the second PPG information is different from the gesture type of the target gesture.
With reference to any one of the possible implementation manners of the first aspect, in a thirteenth possible implementation manner, the method further includes:
in response to that the gesture type corresponding to the second PPG information is the same as the gesture type of the target gesture, acquiring third PPG information adjacent to the second PPG information;
determining a gesture type corresponding to the third PPG information according to the PPG reference information;
determining that the number of execution times of the target gesture is increased once in response to that the gesture type corresponding to the third PPG information is different from the gesture type of the target gesture.
With reference to any one of the possible implementation manners of the first aspect, in a fourteenth possible implementation manner, the method further includes:
and determining third input information corresponding to the execution times of the target gesture.
According to a second aspect of at least one embodiment of the present application, there is provided an input information determination method including:
acquiring first blood flow information at the thumb in response to a target gesture performed by a user's hand, the target gesture being a thumb-up gesture or a thumb-down gesture;
and determining first input information according to the first blood flow information and reference information.
According to a third aspect of at least one embodiment of the present application, there is provided an input information determination apparatus including:
an acquisition module, configured to acquire first PPG information at the thumb in response to a target gesture performed by a user's hand, the target gesture being a thumb-up gesture or a thumb-down gesture;
and the first input information determining module is used for determining first input information according to the first PPG information and PPG reference information.
With reference to any one possible implementation manner of the third aspect, in a second possible implementation manner, the first input information determining module includes:
a gesture type determination submodule for determining a gesture type of the target gesture according to the first PPG information and the PPG reference information;
and the first input information determining submodule is used for determining the first input information corresponding to the gesture type.
With reference to any one of the possible implementation manners of the third aspect, in a third possible implementation manner, the gesture type determination submodule is configured to determine the gesture type of the target gesture according to the average amplitude value of the first PPG information and a reference threshold.
With reference to any one of the possible implementation manners of the third aspect, in a fourth possible implementation manner, the gesture type determination submodule includes:
a first determining unit, configured to determine that the gesture type is a thumb-up gesture in response to the mean amplitude value of the first PPG information being greater than the reference threshold.
With reference to any one of the possible implementation manners of the third aspect, in a fifth possible implementation manner, the gesture type determination submodule includes:
a second determining unit, configured to determine that the gesture type is a thumb-down gesture in response to the mean amplitude value of the first PPG information being less than the reference threshold.
With reference to any one of the possible implementation manners of the third aspect, in a sixth possible implementation manner, the gesture type determination submodule is configured to determine the gesture type of the target gesture according to the average amplitude value of the first PPG information and at least one reference interval.
With reference to any one of the possible implementation manners of the third aspect, in a seventh possible implementation manner, the gesture type determination submodule includes:
a first determining unit, configured to determine that the gesture type is a thumb-up gesture in response to that the average amplitude value of the first PPG information belongs to a first reference interval.
With reference to any one of the possible implementation manners of the third aspect, in an eighth possible implementation manner, the gesture type determination submodule includes:
a second determining unit, configured to determine that the gesture type is a thumb-down gesture in response to that the average amplitude value of the first PPG information belongs to a second reference interval.
With reference to any one of possible implementation manners of the third aspect, in a ninth possible implementation manner, the obtaining module is further configured to obtain second PPG information that is adjacent to the first PPG information;
the apparatus further comprises:
a first determining module, configured to determine, according to the PPG reference information, a gesture type corresponding to the second PPG information;
and a second determining module, configured to determine, in response to that the gesture type corresponding to the second PPG information is the same as the gesture type of the target gesture, that the execution duration of the target gesture includes a time corresponding to the first PPG information and a time corresponding to the second PPG information.
With reference to any one of the possible implementation manners of the third aspect, in a tenth possible implementation manner, the second determining module is further configured to determine, in response to that a gesture type corresponding to the second PPG information is different from a gesture type of the target gesture, that an execution duration of the target gesture is equal to a time corresponding to the first PPG information.
With reference to any one possible implementation manner of the third aspect, in an eleventh possible implementation manner, the apparatus further includes:
and the second input information determining module is used for determining second input information corresponding to the execution duration of the target gesture.
With reference to any one of the possible implementation manners of the third aspect, in a twelfth possible implementation manner, the obtaining module is further configured to obtain second PPG information that is adjacent to the first PPG information;
the apparatus further comprises:
a first determining module, configured to determine, according to the PPG reference information, a gesture type corresponding to the second PPG information;
and a third determining module, configured to determine that the number of times of execution of the target gesture is increased once in response to that the gesture type corresponding to the second PPG information is different from the gesture type of the target gesture.
With reference to any one of the possible implementation manners of the third aspect, in a thirteenth possible implementation manner, the obtaining module is further configured to, in response to that a gesture type corresponding to the second PPG information is the same as a gesture type of the target gesture, obtain third PPG information adjacent to the second PPG information;
the first determining module is further configured to determine a gesture type corresponding to the third PPG information according to the PPG reference information;
the third determining module is further configured to increase the number of times of execution of the target gesture by one time in response to that the gesture type corresponding to the third PPG information is different from the gesture type of the target gesture.
With reference to any one possible implementation manner of the third aspect, in a fourteenth possible implementation manner, the apparatus further includes:
and the third input information determining module is used for determining third input information corresponding to the execution times of the target gesture.
According to a fourth aspect of at least one embodiment of the present application, there is provided an input information determination apparatus including:
an acquisition module, configured to acquire first blood flow information at the thumb in response to a target gesture performed by a user's hand, wherein the target gesture is a thumb-up gesture or a thumb-down gesture;
and the first input information determining module is used for determining first input information according to the first blood flow information and reference information.
According to a fifth aspect of at least one embodiment of the present application, there is provided a user equipment, the equipment comprising:
a PPG sensor;
a memory for storing instructions;
a processor, configured to execute the instructions stored in the memory, the instructions causing the processor to:
acquiring first PPG information at the thumb in response to a target gesture performed by a user's hand, the target gesture being a thumb-up gesture or a thumb-down gesture;
and determining first input information according to the first PPG information and PPG reference information.
According to a sixth aspect of at least one embodiment of the present application, there is provided a user equipment, the equipment comprising:
a blood flow information sensor;
a memory for storing instructions;
a processor, configured to execute the instructions stored in the memory, the instructions causing the processor to:
acquiring first blood flow information at the thumb in response to a target gesture performed by a user's hand, the target gesture being a thumb-up gesture or a thumb-down gesture;
and determining first input information according to the first blood flow information and reference information.
According to the method and device of the present application, in response to a target gesture performed by a user's hand, first blood flow information is acquired at the thumb, the target gesture being a thumb-up gesture or a thumb-down gesture, and first input information can then be determined according to the first blood flow information and reference information. A method and device for determining input information from blood flow information based on user gestures are thus provided, effectively improving the input capability of electronic equipment such as wearable devices.
Drawings
FIG. 1 is a flow chart of an input information determination method according to an embodiment of the present application;
FIG. 2 is a schematic diagram of a thumb-up gesture as described herein;
FIG. 3 is a schematic diagram of a thumb-down gesture as described herein;
fig. 4 is a waveform schematic diagram of PPG information collected at the thumb when a user performs a thumb-up gesture;
fig. 5 is a waveform schematic diagram of PPG information collected at the thumb when a user performs a thumb-down gesture;
fig. 6 is a schematic waveform diagram of PPG information acquired at the thumb when a user performs a thumb-up gesture and a thumb-down gesture, respectively;
FIG. 7 is a schematic diagram of LDF information collected at a thumb and corresponding frequency domain information when a user performs a thumb-up gesture;
FIG. 8 is a schematic diagram of LDF information collected at the thumb and its corresponding frequency domain information when a user performs a thumb-down gesture;
FIG. 9 is a schematic diagram illustrating a calculation of the Doppler frequency shift amount according to the present application;
FIG. 10 is a schematic diagram of LDF information and its corresponding frequency domain information collected at the thumb of a user performing a thumb-up gesture and a thumb-down gesture, respectively;
fig. 11 is a block diagram of an input information determining apparatus according to an embodiment of the present application;
FIG. 12 is a block diagram of the first input information determining module according to an embodiment of the present disclosure;
FIG. 13 is a block diagram of a gesture type determination sub-module according to another embodiment of the present disclosure;
FIG. 14 is a block diagram of a gesture type determination sub-module according to an embodiment of the present disclosure;
fig. 15 is a block diagram of the input information determination device according to another embodiment of the present application;
fig. 16 is a block diagram of the input information determination device according to another embodiment of the present application;
FIG. 17 is a block diagram of a gesture type determination sub-module according to another embodiment of the present disclosure;
fig. 18 is a schematic block configuration diagram of the determination unit according to another embodiment of the present application;
fig. 19 is a schematic block configuration diagram of the determination unit according to another embodiment of the present application;
fig. 20 is a block configuration diagram of the input information determination device according to another embodiment of the present application;
fig. 21 is a block configuration diagram of the input information determination device according to another embodiment of the present application;
fig. 22 is a schematic hardware structure diagram of the user equipment in an embodiment of the present application;
fig. 23 is a schematic hardware structure diagram of the user equipment according to another embodiment of the present application.
Detailed Description
The following detailed description of embodiments of the present application will be made with reference to the accompanying drawings and examples. The following examples are intended to illustrate the present application but are not intended to limit the scope of the present application.
Those skilled in the art will understand that, in the embodiments of the present application, the size of the serial number of each step described below does not mean the execution sequence, and the execution sequence of each step should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present application.
The inventor found during research that when the user's hand performs different gestures, the blood flow information of the hand changes significantly. On this basis, the user's gesture can be recognized according to the blood flow information detected by a sensor, and the input information corresponding to that gesture can then be determined. The blood flow information may be PPG (photoplethysmography) information, or Doppler measurement information, such as LDF (laser Doppler flowmetry) information.
Fig. 1 is a flowchart of an input information determination method according to an embodiment of the present application, which may be implemented on, for example, an input information determination device. As shown in fig. 1, the method includes:
s120: acquiring first blood flow information at the thumb in response to a target gesture performed by a user's hand, the target gesture being a thumb-up gesture or a thumb-down gesture;
s140: and determining first input information according to the first blood flow information and reference information.
According to this method, in response to a target gesture performed by a user's hand, first blood flow information is acquired at the thumb, the target gesture being a thumb-up gesture or a thumb-down gesture, and first input information can then be determined according to the first blood flow information and reference information. A method for determining input information from blood flow information based on user gestures is thus provided, effectively improving the input capability of electronic equipment such as wearable devices.
The functions of steps S120 and S140 will be described in detail below with reference to specific embodiments.
S120: a first blood flow information is acquired at the thumb in response to a target gesture being performed by a user's hand, the target gesture being a thumb-up gesture or a thumb-down gesture.
The blood flow information may be, for example, PPG information or doppler measurement information, which may be acquired by a corresponding sensor, for example, the PPG information may be acquired by a PPG sensor. The first blood flow information may be blood flow information corresponding to a first time period.
The thumb-up gesture, i.e., the thumbs-up gesture, is shown in FIG. 2: when the user performs this gesture, the four fingers make a fist and the thumb points up. The thumb-down gesture, i.e., the thumbs-down gesture, is shown in FIG. 3: when the user performs this gesture, the four fingers make a fist and the thumb points down.
S140: and determining first input information according to the first blood flow information and reference information.
In one embodiment, the step S140 may include:
s141: determining the gesture type of the target gesture according to the first blood flow information and reference information;
s142: and determining the first input information corresponding to the gesture type.
In step S141, the gesture type of the target gesture is determined, that is, it is determined whether the target gesture is a thumb-up gesture or a thumb-down gesture.
As mentioned above, the blood flow information may be PPG information or doppler measurement information, which will be described below.
a) In one embodiment, the first blood flow information is first PPG information, and the corresponding step S141 further includes:
s141 a: and determining the gesture type of the target gesture according to the first PPG information and a PPG reference information.
Fig. 4 is a waveform diagram of PPG information collected at the thumb when a user performs a thumb-up gesture. Fig. 5 is a waveform diagram of PPG information collected at the thumb when a user performs a thumb-down gesture. Comparing the two, it can be seen that when the user performs a thumb-up gesture, the resulting average amplitude of the PPG information is high, and when the user performs a thumb-down gesture, the resulting average amplitude is low. Therefore, the method described herein can identify whether the user is currently performing a thumb-up gesture or a thumb-down gesture according to the average amplitude of the PPG information collected at the thumb. The average amplitude of a piece of PPG information can be determined by summing the amplitudes of all sampling points of the PPG information and dividing the sum by the number of sampling points.
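The averaging step described above can be sketched as follows (a minimal illustration; the list of sample amplitudes passed in is a hypothetical input, not data from the patent):

```python
def mean_amplitude(ppg_samples):
    """Average amplitude of a PPG segment: the sum of the amplitudes
    of all sampling points divided by the number of sampling points."""
    if not ppg_samples:
        raise ValueError("PPG segment is empty")
    return sum(ppg_samples) / len(ppg_samples)
```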
In order to better reflect the average amplitude of the first PPG information, the length of the first PPG information may cover at least one PPG cycle, i.e., be greater than or equal to 0.8 seconds. Meanwhile, if the first time period is too long, the user may have performed both a thumb-up gesture and a thumb-down gesture in succession within it, in which case recognizing it as a single gesture may cause an error; the first time period may therefore be less than or equal to the time taken to perform one gesture, i.e., less than or equal to 1 second. In other words, the first time period may be between 0.8 and 1 second, and for simplicity may be set to one PPG cycle, such as 0.8 seconds.
In one embodiment, the step S141a may include:
s141 a': and determining the gesture type of the target gesture according to the average amplitude value of the first PPG information and a reference threshold.
Wherein the reference threshold may be an amplitude value determined through pre-training. For example, the user performs the thumb-up gesture multiple times, multiple groups of PPG information are collected at the thumb, and an average amplitude value corresponding to the thumb-up gesture is calculated; then the user performs the thumb-down gesture multiple times, multiple groups of PPG information are again collected at the thumb, and an average amplitude value corresponding to the thumb-down gesture is calculated. An amplitude value between the average amplitude value corresponding to the thumb-up gesture and the average amplitude value corresponding to the thumb-down gesture may then be determined as the reference threshold.
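This pre-training step can be sketched as follows. The patent only requires a value between the two trained averages; picking the midpoint, as done here, is one assumed choice:

```python
def train_reference_threshold(up_segments, down_segments):
    """Derive a reference threshold from pre-training data: average the
    per-segment mean amplitudes observed over repeated thumb-up gestures
    and over repeated thumb-down gestures, then take a value between the
    two (the midpoint, an assumption for illustration)."""
    def mean(values):
        return sum(values) / len(values)
    up_mean = mean([mean(seg) for seg in up_segments])      # thumb-up average amplitude
    down_mean = mean([mean(seg) for seg in down_segments])  # thumb-down average amplitude
    return (up_mean + down_mean) / 2.0
```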
In one embodiment, the step S141 a' may further include:
s1411 a': determining that the gesture type is a thumb-up gesture in response to the mean amplitude value of the first PPG information being greater than the reference threshold.
As described above, the average amplitude value of the PPG information corresponding to the thumb-up gesture may be higher than the average amplitude value of the PPG information corresponding to the thumb-down gesture. In this step, the gesture type is either a thumb-up gesture or a thumb-down gesture, and in the case that the average amplitude value of the first PPG information is greater than the reference threshold, the gesture type may be determined to be a thumb-up gesture.
Similarly, the step S141 a' may further include:
s1412 a': determining that the gesture type is a thumb-down gesture in response to the mean amplitude value of the first PPG information being less than the reference threshold.
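Steps S1411a' and S1412a' together amount to a simple threshold comparison, which can be sketched as follows (the behavior when the amplitude equals the threshold exactly is not specified in the description, so it is left undecided here as an assumption):

```python
def classify_gesture(mean_amp, reference_threshold):
    """Threshold-based gesture-type decision: a higher average PPG
    amplitude indicates a thumb-up gesture, a lower one a thumb-down
    gesture."""
    if mean_amp > reference_threshold:
        return "thumb-up"
    if mean_amp < reference_threshold:
        return "thumb-down"
    return None  # exactly on the threshold: undecided (assumption)
```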
In another embodiment, the step S141a may include:
s141a ": determining a gesture type of the target gesture according to the average amplitude value of the first PPG information and at least one reference interval.
Wherein the reference interval is an amplitude value interval, which can be determined through pre-training. For example, the user performs the thumb-up gesture multiple times, multiple groups of PPG information are collected at the thumb, an average amplitude value is calculated for each group, and the reference interval corresponding to the thumb-up gesture is determined from the maximum and minimum of these values; similarly, the user performs the thumb-down gesture multiple times, multiple groups of PPG information are collected at the thumb, an average amplitude value is calculated for each group, and another reference interval corresponding to the thumb-down gesture is determined from the maximum and minimum of these values.
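Training one such reference interval from the minimum and maximum of the per-group mean amplitudes can be sketched as:

```python
def train_reference_interval(segments):
    """Build the reference interval for one gesture type: compute the
    mean amplitude of each recorded PPG group, then take the minimum
    and maximum of those means as the interval bounds."""
    means = [sum(seg) / len(seg) for seg in segments]
    return (min(means), max(means))
```

Calling this once on thumb-up training data and once on thumb-down training data yields the first and second reference intervals used below.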
In one embodiment, the step S141a ″ may include:
s1411a ": determining that the gesture type is a thumb-up gesture in response to the mean amplitude value of the first PPG information belonging to a first reference interval.
As described above, the first reference interval may be a reference interval corresponding to a pre-trained and determined thumb-up gesture, and when the detected average amplitude value of the first PPG information belongs to the first interval, it may be determined that the gesture type is a thumb-up gesture.
Similarly, the step S141a ″ may include:
s1412a ": determining that the gesture type is a thumb-down gesture in response to the mean amplitude value of the first PPG information belonging to a second reference interval.
As described above, the second reference interval may be a reference interval corresponding to a thumb-down gesture determined in advance through training.
Recognizing the gesture type of the target gesture by means of reference intervals involves somewhat higher processing complexity than recognition based on a reference threshold, but it is more accurate, makes it convenient to add recognizable gesture types in the future, and leaves room for upgrading the method.
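The two recognition variants can be sketched minimally as below, assuming a trained threshold and trained intervals as described above; the gesture labels, threshold, and interval values are illustrative:

```python
def classify_by_threshold(avg_amplitude, reference_threshold):
    # S1411a'/S1412a': binary decision against one reference threshold.
    return "thumb-up" if avg_amplitude > reference_threshold else "thumb-down"

def classify_by_intervals(avg_amplitude, intervals):
    # S1411a''/S1412a'': match against pre-trained reference intervals;
    # further gesture types can be supported simply by adding intervals.
    for gesture, (low, high) in intervals.items():
        if low <= avg_amplitude <= high:
            return gesture
    return None  # outside every interval: no gesture recognized
```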
In the step S142, a corresponding relationship between the gesture type of the target gesture and the first input information may be preset, for example, an opening command corresponds to a thumb-up gesture, and a closing command corresponds to a thumb-down gesture.
Besides determining the gesture type of the target gesture and then determining the input information corresponding to that gesture type, the method may further determine the execution duration of the target gesture, with different execution durations corresponding to different input information. In one embodiment, the method may further comprise:
S150a: acquiring second PPG information adjacent to the first PPG information;
S160a: determining a gesture type corresponding to the second PPG information according to the PPG reference information;
S170a: in response to the gesture type corresponding to the second PPG information being the same as the gesture type of the target gesture, determining that the execution duration of the target gesture comprises the time corresponding to the first PPG information and the time corresponding to the second PPG information.
The second PPG information is the PPG information of a second time period. The length of the second time period may be equal or close to that of the first time period.
The gesture type determined in step S141a may also be understood as a gesture type corresponding to the first PPG information. The implementation principle of step S160a may be the same as that of step S141a, and is not described again.
In step S170a, if the gesture type corresponding to the second PPG information is the same as the gesture type of the target gesture, it indicates that the user has kept the target gesture in the time range of the first time period and the second time period, that is, the execution duration of the target gesture includes the time corresponding to the first PPG information and the time corresponding to the second PPG information.
Those skilled in the art will understand that, to obtain an accurate value of the execution duration of the target gesture, the method should successively acquire and determine the gesture type corresponding to the second PPG information adjacent to the first PPG information, the gesture type corresponding to the third PPG information, and so on, until a piece of PPG information whose gesture type differs from that of the target gesture is obtained, which may be referred to as the cut-off PPG information. The length of time from the time corresponding to the first PPG information to the time immediately before the cut-off PPG information is the execution duration of the target gesture.
Thus, in one embodiment, the method may further comprise:
S180a: in response to the gesture type corresponding to the second PPG information being different from the gesture type of the target gesture, determining that the execution duration of the target gesture is equal to the time corresponding to the first PPG information.
That is, if the second PPG information is the cutoff PPG information, the duration of execution of the target gesture is just the time corresponding to the first PPG information.
For further illustration, reference may be made to the PPG waveform shown in fig. 6. As shown in fig. 6, assume that PPG information of the C1 time period is acquired first in an experiment, and the gesture type corresponding to C1 is determined to be a thumb-up gesture according to its average amplitude value; next, PPG information of the C2 time period is acquired and, after processing, its gesture type is also determined to be a thumb-up gesture, so the gesture type corresponding to the C3 time period needs to be further determined; next, PPG information of the C3 time period is acquired and, after processing, its gesture type is also determined to be a thumb-up gesture, so the gesture type corresponding to the C4 time period needs to be further determined; finally, PPG information of the C4 time period is acquired and its gesture type is determined to be a thumb-down gesture, so the execution duration of the thumb-up gesture is determined to be the sum of the times corresponding to C1, C2 and C3.
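The duration determination of steps S150a through S180a can be sketched as below, assuming each piece of PPG information is reduced to an (average amplitude, duration) pair and classified with the threshold variant; the threshold value 0.61 and the segment values mirroring the fig. 6 walkthrough are illustrative assumptions:

```python
def execution_duration(segments, classify, target):
    """Sum the durations of consecutive leading segments whose gesture type
    matches the target, stopping at the first mismatch (the cut-off)."""
    total = 0.0
    for avg_amplitude, seconds in segments:
        if classify(avg_amplitude) != target:
            break  # cut-off PPG information: the gesture has ended
        total += seconds
    return total

# Threshold classifier as in S141a' (0.61 is an assumed trained threshold).
classify = lambda a: "thumb-up" if a > 0.61 else "thumb-down"

# C1-C4 as in the fig. 6 walkthrough: three thumb-up segments, then a cut-off.
segments = [(0.82, 1.0), (0.81, 1.0), (0.83, 1.0), (0.40, 1.0)]
duration = execution_duration(segments, classify, "thumb-up")
```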
Through the processing, the method can finally obtain the execution duration of the target gesture, and the execution duration can correspond to different input information. Thus, in one embodiment, the method may further comprise:
S190a: determining second input information corresponding to the execution duration of the target gesture.
The correspondence between the execution duration of the target gesture and the second input information may be preset, for example, as shown in Table 1 below. Taking the first row of Table 1 as an example, when the target gesture is thumb-up and the execution duration is less than 1.5 seconds, the second input information may be an open command, such as turning on a television; as another example, when the target gesture is thumb-up and the execution duration is longer than 2 seconds, the second input information may be a forward switching command, such as switching the television channel toward smaller channel numbers.
TABLE 1
Target gesture | Execution duration | Second input information
Thumb up | Less than 1.5 seconds | Open
Thumb up | More than 2 seconds | Forward switching
Thumb down | Less than 1.5 seconds | Close
Thumb down | More than 2 seconds | Backward switching
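Table 1 can be encoded as a simple lookup; note that the table leaves durations between 1.5 and 2 seconds unassigned, and this sketch preserves that gap (the command strings are illustrative):

```python
def second_input_info(gesture, duration_s):
    """Map (target gesture, execution duration) to the second input
    information per Table 1; unmapped cases return None."""
    if gesture == "thumb-up":
        if duration_s < 1.5:
            return "open"
        if duration_s > 2.0:
            return "forward switching"
    elif gesture == "thumb-down":
        if duration_s < 1.5:
            return "close"
        if duration_s > 2.0:
            return "backward switching"
    return None  # durations in [1.5 s, 2 s] are not assigned a command
```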
In addition to determining the gesture type and the execution duration of the target gesture, the method may also determine the number of times the target gesture is executed, with different execution counts corresponding to different input information. In one embodiment, the method may further comprise:
S200a: acquiring second PPG information adjacent to the first PPG information;
S210a: determining a gesture type corresponding to the second PPG information according to the PPG reference information;
S220a: determining that the number of executions of the target gesture is increased by one in response to the gesture type corresponding to the second PPG information being different from the gesture type of the target gesture.
The implementation principle of the step S200a is the same as that of the step S150a, and is not described again.
The implementation principle of the step S210a is the same as that of the step S160a, and is not described again.
In step S220a, the number of times the target gesture has been executed may have an initial value, which may be 0. In this step, the method may successively acquire and determine the gesture type corresponding to the second PPG information adjacent to the first PPG information, the gesture type corresponding to the third PPG information, and so on, until a piece of PPG information whose gesture type differs from that of the target gesture is obtained, namely the cut-off PPG information. From the time corresponding to the first PPG information to the time immediately before the cut-off PPG information, the user has been performing the same gesture, and the target gesture is completed exactly once at the time immediately before the cut-off PPG information. Therefore, in step S220a, if the gesture type corresponding to the second PPG information is different from the gesture type of the target gesture, it indicates that the user has completed the target gesture once, so the number of executions of the target gesture is increased by one, that is, from zero to 1.
In addition, if the gesture type corresponding to the second PPG information is the same as the gesture type of the target gesture, it indicates that the user is still performing the target gesture within the time corresponding to the second PPG information. Thus, the method may further comprise:
S230a: in response to the gesture type corresponding to the second PPG information being the same as the gesture type of the target gesture, acquiring third PPG information adjacent to the second PPG information;
S240a: determining a gesture type corresponding to the third PPG information according to the PPG reference information;
S250a: determining that the number of executions of the target gesture is increased by one in response to the gesture type corresponding to the third PPG information being different from the gesture type of the target gesture.
That is to say, in the case that the gesture type corresponding to the second PPG information is the same as the gesture type of the target gesture, the gesture type corresponding to the third PPG information needs to be further determined. If the gesture type corresponding to the third PPG information is different from the gesture type of the target gesture, the user has completed the target gesture once, and the number of executions of the target gesture is therefore increased by one; conversely, if the gesture type corresponding to the third PPG information is also the same as the gesture type of the target gesture, the gesture type corresponding to the fourth PPG information after the third needs to be further determined, and so on, until a piece of PPG information whose gesture type differs from that of the target gesture is found, namely the cut-off PPG information, at which point the number of executions of the target gesture changes from zero to 1.
Similarly, after the cut-off PPG information, when the gesture type of a subsequently appearing piece of PPG information is again the same as the gesture type of the target gesture, that PPG information may be marked as a new first PPG information, and the above steps may be repeated to find a new cut-off PPG information, at which point the number of executions of the target gesture is increased by one again, i.e., from 1 to 2. By analogy, the method can obtain the total number of times the user executed the target gesture within a preset period of time according to the user's requirement, i.e., the finally determined number of executions of the target gesture.
To further illustrate how the number of executions of the target gesture is determined, reference may still be made to the PPG waveform shown in fig. 6. As shown in FIG. 6, assume that the user needs to obtain the number of times that the thumb-up gesture is performed in the period of C1-C6. Then, as described above, when it is determined that the gesture type corresponding to C4 is the thumb-down gesture, it is determined that the number of times of performing the thumb-up gesture increases once, i.e., changes from 0 to 1; then continuing to determine the gesture type corresponding to the C5, and finding that the gesture type is a thumb-up gesture; then, the gesture type corresponding to C6 is further determined, and the gesture type is found to be a thumb-down gesture, so that the number of times of performing the thumb-up gesture is determined to increase once, that is, to change from 1 to 2, and finally the number of times of performing the thumb-up gesture is determined to be 2.
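The counting logic of steps S200a through S250a can be sketched as below; following the description, an execution is counted only when cut-off information appears, so a run still in progress at the end of the observation window is not counted (the labels are illustrative):

```python
def count_executions(gesture_types, target):
    """Count completed executions of the target gesture: each maximal run of
    consecutive target-type segments counts once, at the moment a segment of
    a different type (the cut-off information) appears."""
    count = 0
    in_run = False
    for g in gesture_types:
        if g == target:
            in_run = True
        else:
            if in_run:
                count += 1  # cut-off information ends one execution
            in_run = False
    return count
```

Applied to the C1-C6 walkthrough of fig. 6, the sequence up, up, up, down, up, down yields a count of 2, matching the text.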
Through the above processing, the method can finally obtain the number of executions of the target gesture, and different numbers of executions can correspond to different input information. Thus, in one embodiment, the method may further comprise:
S260a: determining third input information corresponding to the number of executions of the target gesture.
The correspondence between the number of executions of the target gesture and the third input information may be preset, for example, as shown in Table 2 below. Taking the first row of Table 2 as an example, when the target gesture is thumb-up and the number of executions is 1, the third input information may be an open command, such as turning on a television; as another example, when the target gesture is thumb-up and the number of executions is 2, the third input information may be a volume-up command.
TABLE 2
Target gesture | Number of executions | Third input information
Thumb up | 1 | Open
Thumb up | 2 | Volume up
Thumb down | 1 | Close
Thumb down | 2 | Volume down
In addition to the correspondences shown in Table 2, the third input information corresponding to the number of executions of the target gesture may be determined by the wearable device itself after further processing. For example, the user returns home at night and controls the smart lamps in the home through thumb-up gestures, turning on one lamp each time the thumb-up gesture is executed; later, when ready to sleep, the user turns the lamps off through thumb-down gestures, turning off one lamp each time the thumb-down gesture is executed. The method may therefore take the number of times the user performed the thumb-up gesture as a reference value and judge whether the number of thumb-down gestures is smaller than that reference value; if so, some lamps have not been turned off, and a flashing command may be input to the lamps that remain on, which then flash for 5 seconds to remind the user to turn them off.
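The lamp-reminder example can be sketched as follows; the command dictionary format and the 5-second flash duration follow the text, while the function names are illustrative assumptions:

```python
def reminder_command(up_count, down_count):
    """Compare the thumb-down count against the thumb-up reference value;
    if lamps remain on, return a 5-second flash command for them."""
    remaining = max(up_count - down_count, 0)  # lamps turned on but not off
    if remaining > 0:
        return {"command": "flash", "duration_s": 5, "lamps": remaining}
    return None  # every lamp that was turned on has been turned off
```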
b) In another embodiment, the first blood flow information is first Doppler measurement information, which may be, for example, LDF (Laser Doppler Flowmetry) information, LDV (Laser Doppler Velocimetry) information, ultrasonic Doppler information, etc. Such information comprises a series of envelope signals, which may be subjected to, for example, a fast Fourier transform to obtain corresponding frequency domain signals, wherein the Doppler shift amount in the frequency domain signal is proportional to the blood flow velocity. Therefore, the lower the amplitude value of the corresponding PPG information, the higher the blood flow velocity and the larger the Doppler shift amount in the frequency domain signal; the higher the amplitude value of the corresponding PPG information, the lower the blood flow velocity and the smaller the Doppler shift amount in the frequency domain signal.
Correspondingly, the step S141 further includes:
S141b: determining the gesture type of the target gesture according to the first Doppler measurement information and Doppler reference information.
In one embodiment, the step S141b may further include:
S1411b: performing frequency domain conversion on the first Doppler measurement information to obtain first frequency domain information;
S1412b: determining the gesture type according to the Doppler shift amount of the first frequency domain information and frequency shift reference information.
In step S1411b, the first frequency domain information corresponding to the first doppler measurement information may be obtained by a fast fourier transform or the like, for example. The first doppler measurement information is doppler measurement information of a first time period, and the first time period may be between 0.8 second and 1 second.
Fig. 7 is a schematic diagram of LDF information collected at a thumb and corresponding frequency domain information when a user performs a thumb-up gesture. Wherein, the LDF waveform is above the dotted line, and the corresponding frequency domain information is below the dotted line.
Fig. 8 is a schematic diagram of LDF information collected at a thumb and corresponding frequency domain information when a user performs a thumb-down gesture. Wherein, the LDF waveform is above the dotted line, and the corresponding frequency domain information is below the dotted line.
Fig. 9 is a diagram of a conventional method for calculating the Doppler shift amount. Taking the frequency domain waveform shown by the solid line in fig. 9 as an example, when calculating the Doppler shift amount, first the frequency with the highest energy is determined, i.e., the frequency corresponding to point A in the figure; then the energy intensity at point A is determined; next, the frequency at which the energy intensity is 3 dB lower than that at point A is determined, i.e., the frequency corresponding to point B in the figure; the difference between the frequency corresponding to point B and the frequency corresponding to point A is then calculated, which is the Doppler shift amount of the solid-line waveform, i.e., f1 in fig. 9. Similarly, the Doppler shift amount of the frequency domain waveform shown by the dotted line in fig. 9 can be obtained as f2. As can be seen in fig. 9, f2 is greater than f1. The calculation of the Doppler shift amount is not limited to the above method; it is not the focus of the present application and is not described further.
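The 3 dB calculation described for fig. 9 can be sketched as below; the sample spectrum is an illustrative assumption, not measured data:

```python
def doppler_shift_3db(freqs, energies_db):
    """Locate the peak-energy frequency (point A), then the first higher
    frequency whose energy has dropped at least 3 dB below the peak
    (point B); the shift is f_B - f_A. Assumes freqs is sorted ascending."""
    peak_idx = max(range(len(energies_db)), key=lambda i: energies_db[i])
    peak_db = energies_db[peak_idx]
    for i in range(peak_idx + 1, len(freqs)):
        if energies_db[i] <= peak_db - 3.0:
            return freqs[i] - freqs[peak_idx]
    return None  # the spectrum never falls 3 dB below the peak

# Illustrative spectrum: peak at 100 Hz (20 dB); energy first drops 3 dB
# below the peak at 300 Hz, giving a shift of 200 Hz.
shift = doppler_shift_3db([0, 100, 200, 300, 400], [10, 20, 18, 16, 12])
```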
The inventors found in the experiments shown in fig. 7 and 8 that: when the user executes a thumb-up gesture, as shown in fig. 7, the doppler shift amount of the frequency domain information corresponding to the obtained LDF information is small; when the user performs a thumb-down gesture, as shown in fig. 8, the obtained doppler shift amount of the frequency domain information corresponding to the LDF information is large. Therefore, the method can identify whether the user currently performs a thumb-up gesture or a thumb-down gesture according to the doppler frequency shift amount of the frequency domain information corresponding to the LDF information collected at the thumb.
In one embodiment, the step S1412b may include:
S1412b': determining the gesture type of the target gesture according to the Doppler shift amount and a reference threshold.
The reference threshold may be a doppler shift amount determined by pre-training. For example, the user executes the thumb-up gesture for multiple times, acquires multiple groups of LDF information at the thumb, performs frequency domain conversion, calculates corresponding doppler shift amounts, and calculates an average value of the doppler shift amounts to obtain an average doppler shift amount corresponding to the thumb-up gesture; then, the user executes the thumb-down gesture for multiple times, acquires multiple groups of LDF information at the thumb respectively, performs frequency domain conversion respectively, calculates corresponding Doppler frequency shift amount, and then calculates the average value of the Doppler frequency shift amount to obtain the average Doppler frequency shift amount corresponding to the thumb-down gesture. Further, a value between an average doppler shift amount corresponding to the thumb-up gesture and an average doppler shift amount corresponding to the thumb-down gesture may be determined as the reference threshold.
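This training procedure can be sketched as below, taking the midpoint as one possible value between the two average shifts; the function name and sample shift values are illustrative assumptions:

```python
def train_doppler_threshold(up_shifts, down_shifts):
    """Reference threshold between the average Doppler shift of the thumb-up
    training runs and that of the thumb-down runs (here: their midpoint)."""
    up_avg = sum(up_shifts) / len(up_shifts)      # thumb-up: smaller shifts
    down_avg = sum(down_shifts) / len(down_shifts)  # thumb-down: larger shifts
    return (up_avg + down_avg) / 2
```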
In one embodiment, the step S1412b' may further include:
S14121b': determining that the gesture type is a thumb-up gesture in response to the Doppler shift amount being less than the reference threshold.
As described above, the Doppler shift amount corresponding to the thumb-up gesture is lower than the Doppler shift amount corresponding to the thumb-down gesture. In this step, the gesture type is either a thumb-up gesture or a thumb-down gesture, and in the case that the Doppler shift amount is less than the reference threshold, the gesture type may be determined to be a thumb-up gesture.
Similarly, the step S1412b' may further include:
S14122b': determining that the gesture type is a thumb-down gesture in response to the Doppler shift amount being greater than the reference threshold.
In another embodiment, the step S1412b may include:
S1412b″: determining the gesture type of the target gesture according to the Doppler shift amount and at least one reference interval.
The reference interval is a doppler shift interval, which can be determined by pre-training. For example, the user executes the thumb-up gesture for multiple times, acquires multiple groups of LDF information at the thumb, calculates a doppler frequency shift amount according to the frequency domain information corresponding to each group of LDF information, and determines a reference interval corresponding to the thumb-up gesture according to the maximum value and the minimum value; similarly, the user executes the thumb-down gesture for multiple times, acquires multiple groups of LDF information at the thumb respectively, calculates a Doppler frequency shift amount according to the frequency domain information corresponding to each group of LDF information respectively, and then determines another reference interval corresponding to the thumb-down gesture according to the maximum value and the minimum value.
In one embodiment, the step S1412b″ may include:
S14121b″: determining that the gesture type is a thumb-up gesture in response to the Doppler shift amount belonging to a first reference interval.
As described above, the first reference interval may be a reference interval corresponding to the thumb-up gesture determined in advance through training.
Similarly, the step S1412b″ may include:
S14122b″: determining that the gesture type is a thumb-down gesture in response to the Doppler shift amount belonging to a second reference interval.
As described above, the second reference interval may be a reference interval corresponding to a thumb-down gesture determined in advance through training.
Recognizing the gesture type of the target gesture by means of reference intervals involves somewhat higher processing complexity than recognition based on a reference threshold, but it is more accurate, makes it convenient to add recognizable gesture types in the future, and leaves room for upgrading the method.
In the step S142, a corresponding relationship between the gesture type of the target gesture and the first input information may be preset, for example, an opening command corresponds to a thumb-up gesture, and a closing command corresponds to a thumb-down gesture.
Besides determining the gesture type of the target gesture and then determining the input information corresponding to that gesture type, the method may further determine the execution duration of the target gesture, with different execution durations corresponding to different input information. In one embodiment, the method may further comprise:
S150b: acquiring second Doppler measurement information adjacent to and after the first Doppler measurement information;
S160b: determining a gesture type corresponding to the second Doppler measurement information according to the Doppler reference information;
S170b: in response to the gesture type corresponding to the second Doppler measurement information being the same as the gesture type of the target gesture, determining that the execution duration of the target gesture comprises the time corresponding to the first Doppler measurement information and the time corresponding to the second Doppler measurement information.
The second Doppler measurement information is the Doppler measurement information of a second time period. The length of the second time period may be equal or close to that of the first time period.
The gesture type determined in step S141b may also be understood as the gesture type corresponding to the first Doppler measurement information. The implementation principle of step S160b may be the same as that of step S141b, and is not described again.
In step S170b, if the gesture type corresponding to the second doppler measurement information is the same as the gesture type of the target gesture, it indicates that the user keeps the target gesture in the time range of the first time period and the second time period, that is, the execution duration of the target gesture includes the time corresponding to the first doppler measurement information and the time corresponding to the second doppler measurement information.
Those skilled in the art will understand that, to obtain an accurate value of the execution duration of the target gesture, the method should successively acquire and determine the gesture type corresponding to the second Doppler measurement information after the first Doppler measurement information, the gesture type corresponding to the third Doppler measurement information, and so on, until a piece of Doppler measurement information whose gesture type differs from that of the target gesture is obtained, which may be referred to as the cut-off Doppler measurement information. The length of time from the time corresponding to the first Doppler measurement information to the time immediately before the cut-off Doppler measurement information is the execution duration of the target gesture.
Thus, in one embodiment, the method may further comprise:
S180b: in response to the gesture type corresponding to the second Doppler measurement information being different from the gesture type of the target gesture, determining that the execution duration of the target gesture is equal to the time corresponding to the first Doppler measurement information.
That is, if the second doppler measurement information is the cutoff doppler measurement information, the execution duration of the target gesture is just the time corresponding to the first doppler measurement information.
For further explanation, reference may be made to the LDF waveform and its corresponding frequency domain information shown in fig. 10. As shown in fig. 10, assume that LDF information of the C1 time period is acquired first in an experiment, the corresponding frequency domain information (shown below the dotted line) is obtained through frequency domain conversion, and the gesture type corresponding to C1 is determined to be a thumb-up gesture according to the Doppler shift amount of that frequency domain information; next, LDF information of the C2 time period is acquired and, after processing, its gesture type is also determined to be a thumb-up gesture, so the gesture type corresponding to the C3 time period needs to be further determined; next, LDF information of the C3 time period is acquired and, after processing, its gesture type is also determined to be a thumb-up gesture, so the gesture type corresponding to the C4 time period needs to be further determined; finally, LDF information of the C4 time period is acquired and its gesture type is determined to be a thumb-down gesture, so the execution duration of the thumb-up gesture is determined to be the sum of the times corresponding to C1, C2 and C3.
Through the processing, the method can finally obtain the execution duration of the target gesture, and the execution duration can correspond to different input information. Thus, in one embodiment, the method may further comprise:
S190b: determining second input information corresponding to the execution duration of the target gesture.
The correspondence between the execution duration of the target gesture and the second input information may be preset, for example, as shown in Table 1 above.
In addition to determining the gesture type and duration of execution of the target gesture, the method may also determine a number of executions of the target gesture. In one embodiment, the method may further comprise:
S200b: acquiring second Doppler measurement information adjacent to and after the first Doppler measurement information;
S210b: determining a gesture type corresponding to the second Doppler measurement information according to the Doppler reference information;
S220b: determining that the number of executions of the target gesture is increased by one in response to the gesture type corresponding to the second Doppler measurement information being different from the gesture type of the target gesture.
The implementation principle of the step S200b is the same as that of the step S150b, and is not described again.
The implementation principle of the step S210b is the same as that of the step S160b, and is not described again.
In step S220b, the number of times the target gesture has been executed may have an initial value, which may be 0. In this step, the method may successively acquire and determine the gesture type corresponding to the second Doppler measurement information after the first Doppler measurement information, the gesture type corresponding to the third Doppler measurement information, and so on, until a piece of Doppler measurement information whose gesture type differs from that of the target gesture is obtained, namely the cut-off Doppler measurement information. From the time corresponding to the first Doppler measurement information to the time immediately before the cut-off Doppler measurement information, the user has been performing the same gesture, and the target gesture is completed exactly once at the time immediately before the cut-off Doppler measurement information. Therefore, in step S220b, if the gesture type corresponding to the second Doppler measurement information is different from the gesture type of the target gesture, it indicates that the user has completed the target gesture once, so the number of executions of the target gesture is increased by one, that is, from zero to 1.
In addition, if the gesture type corresponding to the second doppler measurement information is the same as the gesture type of the target gesture, it indicates that the user is still performing the target gesture within the time corresponding to the second doppler measurement information. Thus, the method may further comprise:
S230b: in response to the gesture type corresponding to the second Doppler measurement information being the same as the gesture type of the target gesture, acquiring third Doppler measurement information adjacent to and after the second Doppler measurement information;
S240b: determining a gesture type corresponding to the third Doppler measurement information according to the Doppler reference information;
S250b: determining that the number of executions of the target gesture is increased by one in response to the gesture type corresponding to the third Doppler measurement information being different from the gesture type of the target gesture.
That is to say, in the case that the gesture type corresponding to the second Doppler measurement information is the same as the gesture type of the target gesture, the gesture type corresponding to the third Doppler measurement information needs to be further determined. If the gesture type corresponding to the third Doppler measurement information is different from the gesture type of the target gesture, the user has completed the target gesture once, and the number of executions of the target gesture is therefore increased by one; conversely, if it is the same, the gesture type corresponding to the fourth Doppler measurement information after the third needs to be further determined, and so on, until a piece of Doppler measurement information whose gesture type differs from that of the target gesture is found, namely the cut-off Doppler measurement information, at which point the number of executions of the target gesture changes from zero to 1.
Similarly, after the cutoff Doppler measurement information, when the gesture type of subsequent Doppler measurement information again matches the gesture type of the target gesture, that Doppler measurement information may be marked as new first Doppler measurement information, and the above steps repeated to find new cutoff Doppler measurement information, at which point the number of executions of the target gesture is increased by one again, i.e., from 1 to 2. By analogy, the method can obtain, according to the user's requirement, the total number of times the user executes the target gesture within a predetermined period of time.
To further illustrate how the number of executions of the target gesture is determined, reference may again be made to the LDF waveform and its corresponding frequency domain information shown in fig. 10. Assume that the user needs to obtain the number of times the thumb-up gesture is performed during the period C1-C6. Then, as described above, when the gesture type corresponding to C4 is determined to be the thumb-down gesture, the number of times the thumb-up gesture has been performed increases by one, i.e., changes from 0 to 1; the gesture type corresponding to C5 is then determined and found to be the thumb-up gesture; the gesture type corresponding to C6 is determined next and found to be the thumb-down gesture, so the number of times the thumb-up gesture has been performed increases by one again, i.e., changes from 1 to 2. The number of times the thumb-up gesture was performed during C1-C6 is thus finally determined to be 2.
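The run-and-cutoff counting described above can be sketched in Python (an illustrative, non-authoritative sketch; the function name and gesture labels are hypothetical, not from the patent):

```python
def count_target_gesture(window_types, target="thumb_up"):
    """Count completed executions of the target gesture.

    window_types: the gesture type classified for each successive
    measurement window (e.g. C1..C6 in fig. 10). A run of windows
    matching the target counts as one execution; the first window of a
    different type (the "cutoff" measurement information) ends the run.
    """
    count = 0
    in_run = False
    for t in window_types:
        if t == target:
            in_run = True          # still performing the target gesture
        elif in_run:
            count += 1             # cutoff window: one execution completed
            in_run = False
    return count

# Matching the fig. 10 example: C1-C3 thumb-up, C4 thumb-down,
# C5 thumb-up, C6 thumb-down -> the thumb-up gesture was performed twice.
print(count_target_gesture(
    ["thumb_up", "thumb_up", "thumb_up",
     "thumb_down", "thumb_up", "thumb_down"]))  # -> 2
```

Note that, consistent with the text, a run is only counted once its cutoff window has been observed.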
Through the processing, the method can finally obtain the execution times of the target gesture, and the execution times can correspond to different input information. Thus, in one embodiment, the method may further comprise:
S260b: determining third input information corresponding to the number of executions of the target gesture.
The corresponding relationship between the number of execution times of the target gesture and the third input information may be preset, for example, the corresponding relationship may be as shown in table 2.
In addition to the information shown in table 2, the third input information corresponding to the number of executions of the target gesture may be determined by the wearable device through its own arithmetic processing. For example, the user returns home at night and controls the intelligent electric lamps in the home through the thumb-up gesture, turning on one lamp each time the thumb-up gesture is performed; then, when the user is ready to sleep, the lamps are controlled through the thumb-down gesture, turning off one lamp each time the thumb-down gesture is performed. The method can thus count the number of times lamps were turned on and off, so as to remind the user of any lamp left on.
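The lamp example can be sketched as a small counter (illustrative Python; the class and method names are hypothetical):

```python
class LampGestureCounter:
    """Track lamps turned on/off via thumb gestures (hypothetical sketch)."""

    def __init__(self):
        self.lamps_on = 0

    def on_gesture(self, gesture_type):
        if gesture_type == "thumb_up":       # each thumb-up turns one lamp on
            self.lamps_on += 1
        elif gesture_type == "thumb_down":   # each thumb-down turns one lamp off
            self.lamps_on = max(0, self.lamps_on - 1)

    def lamps_left_on(self):
        return self.lamps_on

counter = LampGestureCounter()
for g in ["thumb_up", "thumb_up", "thumb_up", "thumb_down"]:
    counter.on_gesture(g)
print(counter.lamps_left_on())  # -> 2, i.e. two lamps still on: remind the user
```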
Those skilled in the art will understand that, in the above embodiments, the gesture type is in essence determined according to the first blood flow information, and the first input information is then determined; that is, the first input information is determined according to a first correspondence among the first blood flow information, the gesture type, and the first input information. In fact, neither the method nor a device performing the method needs to know the gesture type at all; that is, the first input information can be determined directly from a second correspondence between the first blood flow information and the first input information. Of course, the user still needs to know the first correspondence in order to perform different gestures to input different information.
Furthermore, an embodiment of the present application also provides a computer-readable medium comprising computer-readable instructions that, when executed, perform the operations of steps S120 and S140 of the method in the embodiment shown in fig. 1 described above.
In summary, the method in the embodiment of the application can determine corresponding input information according to the blood flow information at the thumb of the user, based on the gesture of the user, thereby effectively improving the input capability of electronic equipment such as wearable equipment.
Fig. 11 is a schematic structural diagram of the modules of an input information determining device according to an embodiment of the present application. The input information determining device may be installed in a wearable device as a functional module, or may itself serve as an independent wearable device for the user. As shown in fig. 11, the apparatus 1100 may include:
an obtaining module 1110, configured to obtain a first blood flow information at the thumb in response to a target gesture performed by a hand of a user, the target gesture being a thumb-up gesture or a thumb-down gesture;
a first input information determining module 1120, configured to determine a first input information according to the first blood flow information and a reference information.
In response to a target gesture performed by the user's hand, the device of the embodiment of the application acquires first blood flow information at the thumb, and then determines first input information according to the first blood flow information and reference information. A device is thereby provided that determines input information based on the blood flow information at the thumb: the user can change the blood flow information by performing different gestures so as to input different information, effectively improving the input interaction capability of wearable equipment and the like.
The functions of the acquiring module 1110 and the first input information determining module 1120 will be described in detail below with reference to specific embodiments.
The obtaining module 1110 is configured to acquire first blood flow information at the thumb in response to a target gesture performed by the user's hand, where the target gesture is a thumb-up gesture or a thumb-down gesture.
The blood flow information may be, for example, PPG information or Doppler measurement information, which may be acquired by a corresponding sensor; for example, the PPG information may be acquired by a PPG sensor. The first blood flow information may be the blood flow information corresponding to a first time period.
The thumb-up gesture is shown in fig. 2: when the user performs the gesture, the four fingers make a fist and the thumb points up. The thumb-down gesture is shown in fig. 3: when the user performs the gesture, the four fingers make a fist and the thumb points down.
The first input information determining module 1120 is configured to determine first input information according to the first blood flow information and reference information.
In one embodiment, referring to fig. 12, the first input information determining module 1120 may include:
a gesture type determining submodule 1121 configured to determine a gesture type of the target gesture according to the first blood flow information and the reference information;
a first input information determining submodule 1122 for determining the first input information corresponding to the gesture type.
As mentioned above, the blood flow information may be PPG information or doppler measurement information, which will be described below.
a) In an embodiment, the first blood flow information is first PPG information, and the gesture type determination submodule 1121 is configured to determine the gesture type of the target gesture according to the first PPG information and a PPG reference information.
The first PPG information is PPG information of a first time period, and the length of the first time period may be set to 0.8 seconds, for example.
In an embodiment, the gesture type determining sub-module 1121 is configured to determine the gesture type of the target gesture according to the average amplitude value of the first PPG information and a reference threshold.
Wherein the reference threshold may be an amplitude value determined by pre-training. For example, the user performs the thumb-up gesture multiple times, multiple groups of PPG information are acquired at the thumb, and an average amplitude value corresponding to the thumb-up gesture is calculated; then the user performs the thumb-down gesture multiple times, multiple groups of PPG information are again acquired at the thumb, and an average amplitude value corresponding to the thumb-down gesture is calculated. Further, an amplitude value between the average amplitude value corresponding to the thumb-up gesture and the average amplitude value corresponding to the thumb-down gesture may be determined as the reference threshold.
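This pre-training procedure can be sketched in Python (a non-authoritative illustration; the function names are hypothetical, and taking the midpoint is merely one way of choosing an amplitude value between the two averages):

```python
import numpy as np

def train_reference_threshold(up_sessions, down_sessions):
    """Derive the amplitude reference threshold from training sessions.

    up_sessions / down_sessions: lists of PPG amplitude arrays recorded
    while the user held the thumb-up / thumb-down gesture.
    """
    up_mean = np.mean([np.mean(s) for s in up_sessions])
    down_mean = np.mean([np.mean(s) for s in down_sessions])
    # midpoint between the two per-gesture average amplitudes
    return (up_mean + down_mean) / 2.0

def classify(ppg_window, threshold):
    # thumb-up raises the average PPG amplitude above the threshold
    return "thumb_up" if np.mean(ppg_window) > threshold else "thumb_down"

th = train_reference_threshold(
    up_sessions=[[1.0, 1.2], [0.9, 1.1]],    # hypothetical training data
    down_sessions=[[0.4, 0.5], [0.3, 0.4]])
print(classify([1.0, 1.1], th))  # -> thumb_up
```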
In one embodiment, referring to fig. 13, the gesture type determination sub-module 1121 may further include:
a first determining unit 11211a, configured to determine that the gesture type is a thumb-up gesture in response to the mean amplitude value of the first PPG information being greater than the reference threshold.
In another embodiment, still referring to fig. 13, the gesture type determination sub-module 1121 may further include:
a second determining unit 11212a, configured to determine that the gesture type is a thumb-down gesture in response to the mean amplitude value of the first PPG information being smaller than the reference threshold.
In another embodiment, the gesture type determining sub-module 1121 is configured to determine the gesture type of the target gesture according to the average amplitude value of the first PPG information and at least one reference interval.
Wherein the reference interval is an amplitude value interval, which can be determined by pre-training. For example, the user performs the thumb-up gesture multiple times, multiple groups of PPG information are acquired at the thumb, an average amplitude value is calculated from each group of PPG information, and a reference interval corresponding to the thumb-up gesture is determined from the maximum and minimum of those averages; similarly, the user performs the thumb-down gesture multiple times, multiple groups of PPG information are acquired at the thumb, an average amplitude value is calculated from each group of PPG information, and another reference interval corresponding to the thumb-down gesture is determined from the maximum and minimum of those averages.
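The interval-based variant can be sketched as follows (illustrative Python; function names are hypothetical, and returning None for an amplitude outside both trained intervals is an assumption the patent does not specify):

```python
import numpy as np

def train_reference_interval(sessions):
    """Build an amplitude reference interval from training sessions.

    Each session's PPG window is reduced to its average amplitude; the
    interval spans the minimum to maximum of those averages.
    """
    means = [np.mean(s) for s in sessions]
    return (min(means), max(means))

def classify_by_interval(ppg_window, up_interval, down_interval):
    m = np.mean(ppg_window)
    if up_interval[0] <= m <= up_interval[1]:
        return "thumb_up"
    if down_interval[0] <= m <= down_interval[1]:
        return "thumb_down"
    return None  # outside both trained intervals (assumed behaviour)

up_iv = train_reference_interval([[1.0, 1.2], [0.9, 1.1]])    # -> (1.0, 1.1)
down_iv = train_reference_interval([[0.4, 0.5], [0.3, 0.4]])  # -> (0.35, 0.45)
print(classify_by_interval([1.0, 1.1], up_iv, down_iv))  # -> thumb_up
```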
In one embodiment, referring to fig. 14, the gesture type determination submodule 1121 may include:
a first determining unit 11211 a' configured to determine that the gesture type is a thumb-up gesture in response to that the average amplitude value of the first PPG information belongs to the first reference interval.
In another embodiment, still referring to fig. 14, the gesture type determination sub-module 1121 may further include:
a second determining unit 11212 a' configured to determine that the gesture type is a thumb-down gesture in response to that the average amplitude value of the first PPG information belongs to a second reference interval.
The first input information determining submodule 1122 is configured to determine the first input information corresponding to the gesture type.
The corresponding relationship between the gesture type of the target gesture and the first input information may be preset, for example, an opening command corresponds to a thumb-up gesture, and a closing command corresponds to a thumb-down gesture.
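The preset correspondence can be expressed as a simple lookup, for example (illustrative Python; the command strings are hypothetical stand-ins for the opening and closing commands):

```python
# Preset correspondence between gesture type and first input information,
# per the example: thumb-up -> opening command, thumb-down -> closing command.
GESTURE_TO_INPUT = {
    "thumb_up": "OPEN",
    "thumb_down": "CLOSE",
}

def first_input_information(gesture_type):
    return GESTURE_TO_INPUT.get(gesture_type)

print(first_input_information("thumb_up"))  # -> OPEN
```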
Besides determining the gesture type of the target gesture, and thereby the input information corresponding to the gesture type, the device may further determine the execution duration of the target gesture. In an embodiment, the obtaining module 1110 is further configured to obtain second PPG information adjacent to the first PPG information;
referring to fig. 15, the apparatus 1100 further comprises:
a first determining module 1130a, configured to determine a gesture type corresponding to the second PPG information according to the PPG reference information;
a second determining module 1140a, configured to determine that the execution duration of the target gesture includes the time corresponding to the first PPG information and the time corresponding to the second PPG information, in response to that the gesture type corresponding to the second PPG information is the same as the gesture type of the target gesture.
The second PPG information is the PPG information of the second time period. The second time period may be the same as or close to the first time period.
The implementation principle of the first determining module 1130a determining the gesture type corresponding to the second PPG information may be the same as the implementation principle of the gesture type determining sub-module 1121 determining the gesture type corresponding to the first PPG information, and is not described again.
As understood by those skilled in the art, if the gesture type corresponding to the second PPG information is the same as the gesture type of the target gesture, it indicates that the user has kept the target gesture for the time range of the first time period and the second time period, that is, the execution duration of the target gesture includes the time corresponding to the first PPG information and the time corresponding to the second PPG information.
In order to obtain an accurate value of the execution duration of the target gesture, the device should sequentially acquire and determine the gesture type corresponding to the second PPG information, the gesture type corresponding to the third PPG information, and so on after the first PPG information, until PPG information whose gesture type differs from that of the target gesture is obtained, which may be called the cutoff PPG information; the time length from the time corresponding to the first PPG information to the time immediately before the cutoff PPG information is the execution duration of the target gesture.
Therefore, in one embodiment, the second determining module 1140a is further configured to determine that the execution duration of the target gesture is equal to the time corresponding to the first PPG information in response to that the gesture type corresponding to the second PPG information is different from the gesture type of the target gesture.
That is, if the second PPG information is the cutoff PPG information, the second determining module 1140a may determine that the execution duration of the target gesture is exactly the time corresponding to the first PPG information. Otherwise, the obtaining module 1110 continues to obtain the subsequent adjacent PPG information and the first determining module 1130a continues to determine the gesture type corresponding to it; if that gesture type is different from the gesture type of the target gesture, the second determining module 1140a determines the execution duration of the target gesture, and if it is the same, the above operations are repeated until the second determining module 1140a determines the execution duration of the target gesture.
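The window-by-window duration accumulation described above can be sketched as follows (illustrative Python; the function name is hypothetical, and the 0.8 s window length follows the example time period given earlier):

```python
def execution_duration(window_types, target, window_seconds=0.8):
    """Accumulate the execution duration of the target gesture.

    Starting from the first window (the first PPG information), each
    adjacent window's time is added while its gesture type matches the
    target; the first non-matching window is the cutoff PPG information
    and ends the measurement.
    """
    duration = 0.0
    for t in window_types:
        if t != target:
            break                     # cutoff PPG information reached
        duration += window_seconds    # window length, e.g. 0.8 s
    return duration

print(execution_duration(
    ["thumb_up", "thumb_up", "thumb_down"], "thumb_up"))  # -> 1.6
```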
Through the processing, the device can finally obtain the execution duration of the target gesture, and the execution duration can correspond to different input information. Thus, in one embodiment, still referring to fig. 15, the device 1100 may further comprise:
a second input information determining module 1150a, configured to determine second input information corresponding to the execution duration of the target gesture.
In addition to determining the gesture type and duration of execution of the target gesture, the device 1100 may also determine the number of executions of the target gesture. In an embodiment, the obtaining module 1110 is further configured to obtain a second PPG information adjacent to the first PPG information;
the first determining module 1130a is configured to determine, according to the PPG reference information, a gesture type corresponding to the second PPG information;
referring to fig. 16, the apparatus 1100 further comprises:
a third determining module 1160a, configured to determine that the number of times of execution of the target gesture is increased once in response to that the gesture type corresponding to the second PPG information is different from the gesture type of the target gesture.
The number of executions of the target gesture may have an initial value, which may be 0. The device may sequentially acquire and determine the gesture type corresponding to the second PPG information, the gesture type corresponding to the third PPG information, and so on after the first PPG information, until PPG information whose gesture type differs from that of the target gesture is obtained, which may be referred to as the cutoff PPG information. The user performs the same gesture throughout the interval from the time corresponding to the first PPG information to the time immediately before the cutoff PPG information, completing the target gesture exactly once. Therefore, for the third determining module 1160a, if the gesture type corresponding to the second PPG information is different from the gesture type of the target gesture, the user has completed the target gesture once, so it may be determined that the number of executions of the target gesture is increased by one, that is, changes from 0 to 1.
In addition, if the gesture type corresponding to the second PPG information is the same as the gesture type of the target gesture, it indicates that the user is still performing the target gesture within the time corresponding to the second PPG information. Accordingly:
the obtaining module 1110 is further configured to, in response to that the gesture type corresponding to the second PPG information is the same as the gesture type of the target gesture, obtain third PPG information adjacent to the second PPG information;
the first determining module 1130a is further configured to determine a gesture type corresponding to the third PPG information according to the PPG reference information;
the third determining module 1160a is further configured to, in response to that the gesture type corresponding to the third PPG information is different from the gesture type of the target gesture, increase the number of times of execution of the target gesture by one.
That is to say, in a case where the gesture type corresponding to the second PPG information is the same as the gesture type of the target gesture, the gesture type corresponding to the third PPG information needs to be determined further. If the gesture type corresponding to the third PPG information is different from the gesture type of the target gesture, the user has completed the target gesture once, and therefore the number of executions of the target gesture is increased by one. Conversely, if the gesture type corresponding to the third PPG information is also the same as the gesture type of the target gesture, the gesture type corresponding to fourth PPG information after the third PPG information needs to be determined further, and so on, until PPG information whose gesture type differs from that of the target gesture is found, that is, the cutoff PPG information; at this time, the third determining module 1160a determines that the number of executions of the target gesture changes from 0 to 1.
Similarly, after the cutoff PPG information, when the gesture type of subsequent PPG information again matches the gesture type of the target gesture, that PPG information may be marked as new first PPG information, and the above steps repeated to find new cutoff PPG information, at which point the third determining module 1160a may determine that the number of executions of the target gesture is increased by one again, i.e., changes from 1 to 2. By analogy, the device 1100 may obtain, according to the user's requirement, the total number of times the user executes the target gesture within a predetermined time, that is, the number of executions finally determined by the third determining module 1160a.
Through the above processing, the device 1100 may finally obtain the number of times of execution of the target gesture, and the number of times of execution may correspond to different input information. Thus, in one embodiment, still referring to fig. 16, the device 1100 further comprises:
a third input information determining module 1170a, configured to determine third input information corresponding to the number of times of executing the target gesture.
b) In another embodiment, the first blood flow information is first Doppler measurement information, which may be, for example, LDF or LDV information, or an ultrasonic Doppler shift measurement. The Doppler measurement information comprises a series of envelope wave signals, which are subjected to, for example, a fast Fourier transform to obtain corresponding frequency domain signals; the Doppler shift amount in a frequency domain signal is proportional to the blood flow velocity. Accordingly, a lower amplitude value of the corresponding PPG information indicates a higher blood flow velocity, and hence a larger Doppler shift amount in the frequency domain signal; a higher amplitude value of the corresponding PPG information indicates a lower blood flow velocity, and hence a smaller Doppler shift amount in the frequency domain signal.
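This frequency domain processing can be sketched in Python (a non-authoritative illustration: the power-weighted mean frequency used here is one common estimator of a Doppler shift amount, an assumption rather than the patent's prescribed computation, and the signals and sampling rate are synthetic):

```python
import numpy as np

def doppler_shift_estimate(envelope, fs):
    """Estimate a Doppler shift amount from an LDF envelope signal.

    The envelope is transformed to the frequency domain with an FFT and
    the power-weighted mean frequency (first spectral moment) is taken
    as the shift estimate; this moment grows with blood flow velocity.
    """
    envelope = np.asarray(envelope, dtype=float)
    envelope = envelope - envelope.mean()        # drop the DC component
    spectrum = np.abs(np.fft.rfft(envelope)) ** 2
    freqs = np.fft.rfftfreq(len(envelope), d=1.0 / fs)
    if spectrum.sum() == 0:
        return 0.0
    return float((freqs * spectrum).sum() / spectrum.sum())

# A faster-oscillating envelope (higher blood velocity) yields a larger
# shift estimate than a slower one, matching the relationship above.
fs = 1000.0
t = np.arange(0, 0.8, 1.0 / fs)      # a 0.8 s measurement window
slow = np.sin(2 * np.pi * 30 * t)    # e.g. thumb-up: lower velocity
fast = np.sin(2 * np.pi * 120 * t)   # e.g. thumb-down: higher velocity
print(doppler_shift_estimate(slow, fs) < doppler_shift_estimate(fast, fs))  # -> True
```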
The gesture type determining submodule 1121 is configured to determine a gesture type of the target gesture according to the first doppler measurement information and doppler reference information.
The first doppler measurement information is also doppler measurement information of a first time period, and the length of the first time period may be set to 0.8 second, for example.
In one embodiment, referring to fig. 17, the gesture type determination sub-module 1121 may further include:
a converting unit 11211b, configured to perform frequency domain conversion on the first doppler measurement information to obtain first frequency domain information;
a determining unit 11212b, configured to determine the gesture type according to the doppler shift amount of the first frequency domain information and a frequency shift reference information.
The converting unit 11211b may obtain the first frequency domain information corresponding to the first Doppler measurement information by means of a fast Fourier transform or the like. The first Doppler measurement information is the Doppler measurement information of a first time period, and the first time period may be between 0.8 second and 1 second.
In one embodiment, the determining unit 11212b is configured to determine a gesture type of the target gesture according to the doppler shift amount and a reference threshold.
The reference threshold may be a Doppler shift amount determined by pre-training. For example, the user performs the thumb-up gesture multiple times, multiple groups of LDF information are acquired at the thumb, frequency domain conversion is performed, the corresponding Doppler shift amounts are calculated, and their average is taken to obtain the average Doppler shift amount corresponding to the thumb-up gesture; then the user performs the thumb-down gesture multiple times, multiple groups of LDF information are again acquired at the thumb, frequency domain conversion is performed, the corresponding Doppler shift amounts are calculated, and their average is taken to obtain the average Doppler shift amount corresponding to the thumb-down gesture. Further, a value between the average Doppler shift amount corresponding to the thumb-up gesture and the average Doppler shift amount corresponding to the thumb-down gesture may be determined as the reference threshold.
In one embodiment, referring to fig. 18, the determining unit 11212b may further include:
a first determining subunit 112121b, configured to determine that the gesture type is a thumb-up gesture in response to the doppler shift amount being less than the reference threshold.
In another embodiment, still referring to fig. 18, the determining unit 11212b may further include:
a second determining subunit 112122b, configured to determine that the gesture type is a thumb-down gesture in response to the doppler shift amount being greater than the reference threshold.
In another embodiment, the determining unit 11212b is configured to determine a gesture type of the target gesture according to the doppler shift amount and at least one reference interval.
The reference interval is a Doppler shift amount interval, which can be determined by pre-training. For example, the user performs the thumb-up gesture multiple times, multiple groups of LDF information are acquired at the thumb, a Doppler shift amount is calculated from the frequency domain information corresponding to each group of LDF information, and a reference interval corresponding to the thumb-up gesture is determined from the maximum and minimum of those amounts; similarly, the user performs the thumb-down gesture multiple times, multiple groups of LDF information are acquired at the thumb, a Doppler shift amount is calculated from the frequency domain information corresponding to each group, and another reference interval corresponding to the thumb-down gesture is determined from the maximum and minimum of those amounts.
In one embodiment, referring to fig. 19, the determining unit 11212b may include:
a first determining subunit 112121 b' is configured to determine that the gesture type is a thumb-up gesture in response to the doppler shift amount belonging to the first reference interval.
In another embodiment, still referring to fig. 19, the determining unit 11212b may further include:
a second determining subunit 112122 b' for determining that the gesture type is a thumb-down gesture in response to the doppler shift amount belonging to a second reference interval.
The first input information determining submodule 1122 is configured to determine the first input information corresponding to the gesture type.
The corresponding relationship between the gesture type of the target gesture and the first input information may be preset, for example, an opening command corresponds to a thumb-up gesture, and a closing command corresponds to a thumb-down gesture.
Besides determining the gesture type of the target gesture, and thereby the input information corresponding to the gesture type, the device may further determine the execution duration of the target gesture. In one embodiment, the obtaining module 1110 is further configured to obtain second Doppler measurement information adjacent to the first Doppler measurement information;
referring to fig. 20, the apparatus 1100 further comprises:
a first determining module 1130b, further configured to determine a gesture type corresponding to the second doppler measurement information according to the doppler reference information;
a second determining module 1140b, configured to determine that the execution duration of the target gesture includes the time corresponding to the first doppler measurement information and the time corresponding to the second doppler measurement information in response to that the gesture type corresponding to the second doppler measurement information is the same as the gesture type of the target gesture.
The second doppler measurement information is also doppler measurement information of a second time period. The second time period may be the same as or close to the first time period.
The implementation principle of the first determining module 1130b determining the gesture type corresponding to the second doppler measurement information may be the same as the implementation principle of the gesture type determining sub-module 1121 determining the gesture type corresponding to the first doppler measurement information, and is not repeated here.
As understood by those skilled in the art, if the gesture type corresponding to the second doppler measurement information is the same as the gesture type of the target gesture, it indicates that the user keeps the target gesture in the time range of the first time period and the second time period, that is, the execution duration of the target gesture includes the time corresponding to the first doppler measurement information and the time corresponding to the second doppler measurement information.
In order to obtain an accurate value of the execution duration of the target gesture, the device should sequentially acquire and determine the gesture type corresponding to the second Doppler measurement information, the gesture type corresponding to the third Doppler measurement information, and so on after the first Doppler measurement information, until Doppler measurement information whose gesture type differs from that of the target gesture is obtained, which may be referred to as the cutoff Doppler measurement information; the time length from the time corresponding to the first Doppler measurement information to the time immediately before the cutoff Doppler measurement information is the execution duration of the target gesture.
Therefore, in one embodiment, the second determining module 1140b is further configured to determine that the execution duration of the target gesture is equal to the time corresponding to the first Doppler measurement information in response to the gesture type corresponding to the second Doppler measurement information being different from the gesture type of the target gesture.
That is, if the second Doppler measurement information is the cutoff Doppler measurement information, the execution duration of the target gesture is exactly the time corresponding to the first Doppler measurement information. Otherwise, the obtaining module 1110 continues to obtain the subsequent adjacent Doppler measurement information and the first determining module 1130b continues to determine the gesture type corresponding to it; if that gesture type is different from the gesture type of the target gesture, the second determining module 1140b determines the execution duration of the target gesture, and if it is the same, the above operations are repeated until the second determining module 1140b determines the execution duration of the target gesture.
Through the processing, the device can finally obtain the execution duration of the target gesture, and the execution duration can correspond to different input information. Thus, in one embodiment, still referring to fig. 20, the apparatus may further comprise:
a second input information determining module 1150b, configured to determine second input information corresponding to the execution duration of the target gesture.
In addition to determining the gesture type and duration of execution of the target gesture, the device 1100 may also determine the number of executions of the target gesture. In one embodiment, the obtaining module 1110 is further configured to obtain a second doppler measurement information adjacent to the first doppler measurement information;
the first determining module 1130b is further configured to determine a gesture type corresponding to the second doppler measurement information according to the doppler reference information;
referring to fig. 21, the apparatus 1100 further comprises:
a third determining module 1160b, configured to determine that the number of times of execution of the target gesture is increased once in response to that the gesture type corresponding to the second doppler measurement information is different from the gesture type of the target gesture.
The number of executions of the target gesture may have an initial value, which may be 0. The device may successively acquire the second doppler measurement information, the third doppler measurement information, and so on after the first doppler measurement information, determining the gesture type corresponding to each, until doppler measurement information whose gesture type differs from that of the target gesture is obtained; this may be referred to as the cutoff doppler measurement information. From the time corresponding to the first doppler measurement information until the time immediately before the cutoff doppler measurement information, the user has been performing the same gesture, and the target gesture is completed exactly once at the time immediately before the cutoff doppler measurement information. Therefore, for the third determining module 1160b, if the gesture type corresponding to the second doppler measurement information differs from the gesture type of the target gesture, the user has completed the target gesture once, so the number of executions of the target gesture is increased by one, that is, from 0 to 1.
In addition, if the gesture type corresponding to the second doppler measurement information is the same as the gesture type of the target gesture, the user is still performing the target gesture during the time corresponding to the second doppler measurement information. Thus:
the obtaining module 1110 is further configured to obtain, in response to the gesture type corresponding to the second doppler measurement information being the same as the gesture type of the target gesture, third doppler measurement information adjacent to the second doppler measurement information;
the first determining module 1130b is further configured to determine a gesture type corresponding to the third doppler measurement information according to the doppler reference information;
the third determining module 1160b is further configured to determine, in response to the gesture type corresponding to the third doppler measurement information being different from the gesture type of the target gesture, that the number of executions of the target gesture is increased by one.
That is to say, when the gesture type corresponding to the second doppler measurement information is the same as the gesture type of the target gesture, the gesture type corresponding to the third doppler measurement information must be determined next. If the gesture type corresponding to the third doppler measurement information differs from the gesture type of the target gesture, the user has completed the target gesture once, and the number of executions of the target gesture is therefore increased by one. Otherwise, the gesture type corresponding to the fourth doppler measurement information, after the third, must be judged in turn, and so on, until doppler measurement information whose gesture type differs from that of the target gesture is found, that is, the cutoff doppler measurement information; at this point the number of executions of the target gesture changes from 0 to 1.
Similarly, after the cutoff doppler measurement information, when doppler measurement information whose gesture type is the same as that of the target gesture appears again, it may be marked as new first doppler measurement information, and the above steps are repeated to find new cutoff doppler measurement information, at which point the number of executions of the target gesture is increased by one again, that is, from 1 to 2. By analogy, the device can obtain the total number of times the user performs the target gesture within a preset period of time, according to the user's requirements.
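The execution-count procedure described in the preceding paragraphs can be sketched as follows. This is an illustrative reading of the text, not the claimed implementation; the `classify` helper and the frame representation are assumptions. Each run of frames of the target gesture type is counted once, at the cutoff frame that ends it.

```python
# Sketch: count how many times the target gesture is completed within a
# sequence of Doppler measurement frames. A run of frames whose gesture type
# matches the target ends at a "cutoff" frame of a different type, which marks
# one completed execution; a later frame of the target type starts a new run.

def execution_count(frames, classify, target_type):
    count = 0
    in_gesture = False
    for frame in frames:
        if classify(frame) == target_type:
            in_gesture = True   # target gesture ongoing
        elif in_gesture:
            count += 1          # cutoff frame observed: one execution completed
            in_gesture = False
    return count


# Example: two separate runs of the thumb-up gesture, each ended by a cutoff
print(execution_count(['up', 'up', 'rest', 'up', 'rest'], lambda f: f, 'up'))  # 2
```

As in the text, an execution is only counted once its cutoff frame has been observed; a run still in progress at the end of the sequence is not yet counted.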
Through the above processing, the device 1100 may finally obtain the number of times of execution of the target gesture, and the number of times of execution may correspond to different input information. Thus, in one embodiment, still referring to fig. 21, the device 1100 further comprises:
a third input information determining module 1170b, configured to determine third input information corresponding to the number of times of executing the target gesture.
In summary, the device according to the embodiments of the present application can determine the corresponding input information according to the blood flow information at the user's thumb while the user performs the corresponding gesture, so that the input capability of electronic devices such as wearable devices is effectively enhanced.
The hardware structure of a user equipment in one embodiment of the present application is shown in fig. 22. The specific embodiment of the present application does not limit the specific implementation of the user equipment, and referring to fig. 22, the apparatus 2200 may include:
a processor 2210, a communication interface 2220, a memory 2230, a blood flow information sensor, and a communication bus 2240, wherein:
processor 2210, communication interface 2220, and memory 2230 communicate with each other via a communication bus 2240.
The communication interface 2220 is used for communicating with other network elements.
The processor 2210 is configured to execute the program 2232, and may specifically perform the relevant steps of the method embodiment shown in fig. 1 above.
In particular, the program 2232 may include program code that includes computer operating instructions.
The processor 2210 may be a central processing unit (CPU), an application-specific integrated circuit (ASIC), or one or more integrated circuits configured to implement the embodiments of the present application.
A memory 2230 for storing programs 2232. The memory 2230 may include high-speed RAM memory and may also include non-volatile memory, such as at least one disk memory. The program 2232 may specifically perform the following steps:
acquiring first blood flow information at the thumb in response to a target gesture performed by a user's hand, the target gesture being a thumb-up gesture or a thumb-down gesture;
and determining input information according to the first blood flow information and reference information.
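The two program steps above can be sketched as follows, assuming (as in the threshold-based variant recited in the claims) that the gesture type is decided by comparing the average amplitude of the blood flow signal against a reference threshold. The threshold value, the sample data, and the mapping of gesture types to input information are all illustrative assumptions, not values given by the patent.

```python
# Sketch: determine input information from blood flow samples acquired at the
# thumb while the user performs a thumb-up or thumb-down gesture.

REFERENCE_THRESHOLD = 0.6  # assumed reference value, e.g. from calibration


def determine_input(samples):
    """samples: amplitudes of the blood flow signal during the gesture."""
    mean_amplitude = sum(samples) / len(samples)
    # Compare the average amplitude against the reference threshold to pick
    # the gesture type (which gesture yields the higher amplitude is assumed).
    gesture = 'thumb-up' if mean_amplitude >= REFERENCE_THRESHOLD else 'thumb-down'
    # Map the gesture type to input information, e.g. confirm/cancel commands.
    return {'thumb-up': 'confirm', 'thumb-down': 'cancel'}[gesture]


print(determine_input([0.7, 0.8, 0.75]))  # confirm
print(determine_input([0.2, 0.3, 0.25]))  # cancel
```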
For specific implementation of each step in the program 2232, reference may be made to corresponding steps or modules in the foregoing embodiments, which are not described herein again. It can be clearly understood by those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described devices and modules may refer to the corresponding process descriptions in the foregoing method embodiments, and are not described herein again.
The hardware structure of a user equipment in another embodiment of the present application is shown in fig. 23. The specific embodiment of the present application does not limit the specific implementation of the user equipment, and referring to fig. 23, the device 2300 may include:
a processor 2310, a communication interface 2320, a memory 2330, a doppler measurement sensor, and a communication bus 2340, wherein:
the processor 2310, communication interface 2320, and memory 2330 communicate with each other via a communication bus 2340.
The communication interface 2320 is used for communicating with other network elements.
The processor 2310 is configured to execute the program 2332, which may specifically perform the steps associated with the method embodiment illustrated in fig. 1.
In particular, the programs 2332 may include program code that includes computer operational instructions.
The processor 2310 may be a central processing unit (CPU), an application-specific integrated circuit (ASIC), or one or more integrated circuits configured to implement the embodiments of the present application.
A memory 2330 for storing programs 2332. Memory 2330 may include high-speed RAM memory and may also include non-volatile memory, such as at least one disk memory. The program 2332 may specifically perform the following steps:
acquiring first PPG information at the thumb in response to a target gesture performed by a user's hand, the target gesture being a thumb-up gesture or a thumb-down gesture;
and determining input information according to the first PPG information and PPG reference information.
The specific implementation of each step in the program 2332 can refer to the corresponding step or module in the above embodiments, which is not described herein again. It can be clearly understood by those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described devices and modules may refer to the corresponding process descriptions in the foregoing method embodiments, and are not described herein again.
Those of ordinary skill in the art will appreciate that the various illustrative elements and method steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present application or portions thereof that substantially contribute to the prior art may be embodied in the form of a software product stored in a storage medium and including instructions for causing a computer device (which may be a personal computer, a controller, or a network device) to execute all or part of the steps of the method according to the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
The above embodiments are merely illustrative, and not restrictive, and those skilled in the relevant art can make various changes and modifications without departing from the spirit and scope of the present application, and therefore all equivalent technical solutions also fall within the scope of the present application, and the scope of the present application is defined by the appended claims.

Claims (9)

1. A method for determining input information, the method comprising:
acquiring first PPG information at a thumb in response to a target gesture performed by a hand of a user, the target gesture being a thumb-up gesture or a thumb-down gesture;
determining first input information according to the first PPG information and PPG reference information;
the determining a first input information according to the first PPG information and a PPG reference information comprises:
determining a gesture type of the target gesture according to the first PPG information and the PPG reference information;
determining the first input information corresponding to the gesture type;
the method further comprises the following steps:
acquiring second PPG information adjacent to the first PPG information;
determining a gesture type corresponding to the second PPG information according to the PPG reference information;
in response to that the gesture type corresponding to the second PPG information is the same as the gesture type of the target gesture, determining that the execution duration of the target gesture comprises a time corresponding to the first PPG information and a time corresponding to the second PPG information;
in response to that the gesture type corresponding to the second PPG information is different from the gesture type of the target gesture, determining that the execution duration of the target gesture is equal to the time corresponding to the first PPG information;
and determining second input information corresponding to the execution duration of the target gesture.
2. The method of claim 1, wherein the determining a gesture type of the target gesture from the first PPG information and the PPG reference information comprises:
and determining the gesture type of the target gesture according to the average amplitude value of the first PPG information and a reference threshold.
3. The method of claim 1, wherein the determining a gesture type of the target gesture from the first PPG information and the PPG reference information comprises:
determining a gesture type of the target gesture according to the average amplitude value of the first PPG information and at least one reference interval.
4. A method for determining input information, the method comprising:
acquiring first blood flow information at a thumb in response to a target gesture performed by a user's hand, the target gesture being a thumb-up gesture or a thumb-down gesture;
determining first input information according to the first blood flow information and reference information;
determining a first input information according to the first blood flow information and a reference information, comprising:
determining a gesture type of the target gesture according to the first blood flow information and the reference information;
determining the first input information corresponding to the gesture type;
the method further comprises the following steps:
acquiring adjacent second blood flow information after the first blood flow information;
determining a gesture type corresponding to the second blood flow information according to the reference information;
in response to that the gesture type corresponding to the second blood flow information is the same as the gesture type of the target gesture, determining that the execution duration of the target gesture comprises the time corresponding to the first blood flow information and the time corresponding to the second blood flow information;
in response to that the gesture type corresponding to the second blood flow information is different from the gesture type of the target gesture, determining that the execution duration of the target gesture is equal to the time corresponding to the first blood flow information;
and determining second input information corresponding to the execution duration of the target gesture.
5. An input information determination device, characterized in that the device comprises:
the acquisition module is used for responding to a target gesture executed by the hand of a user and acquiring first PPG information at the thumb, wherein the target gesture is a thumb-up gesture or a thumb-down gesture;
a first input information determining module, configured to determine first input information according to the first PPG information and PPG reference information;
the first input information determination module includes:
a gesture type determination submodule for determining a gesture type of the target gesture according to the first PPG information and the PPG reference information;
the first input information determining submodule is used for determining the first input information corresponding to the gesture type;
the acquisition module is further configured to acquire second PPG information adjacent to the first PPG information;
the apparatus further comprises:
a first determining module, configured to determine, according to the PPG reference information, a gesture type corresponding to the second PPG information;
a second determining module, configured to determine, in response to that the gesture type corresponding to the second PPG information is the same as the gesture type of the target gesture, that the execution duration of the target gesture includes a time corresponding to the first PPG information and a time corresponding to the second PPG information;
the second determining module is further configured to determine, in response to that the gesture type corresponding to the second PPG information is different from the gesture type of the target gesture, that the execution duration of the target gesture is equal to the time corresponding to the first PPG information;
the apparatus further comprises:
and the second input information determining module is used for determining second input information corresponding to the execution duration of the target gesture.
6. An input information determination device, characterized in that the device comprises:
the acquisition module is used for responding to a target gesture executed by the hand of a user and acquiring first blood flow information at the thumb, wherein the target gesture is a thumb-up gesture or a thumb-down gesture;
the first input information determining module is used for determining first input information according to the first blood flow information and reference information;
the first input information determination module includes:
a gesture type determination submodule for determining a gesture type of the target gesture according to the first blood flow information and the reference information;
the first input information determining submodule is used for determining the first input information corresponding to the gesture type;
the acquisition module is further configured to acquire second blood flow information adjacent to the first blood flow information;
the apparatus further comprises:
the first determining module is used for determining the gesture type corresponding to the second blood flow information according to the reference information;
a second determining module, configured to determine, in response to that the gesture type corresponding to the second blood flow information is the same as the gesture type of the target gesture, that an execution duration of the target gesture includes a time corresponding to the first blood flow information and a time corresponding to the second blood flow information;
the second determining module is further configured to determine, in response to that the gesture type corresponding to the second blood flow information is different from the gesture type of the target gesture, that the execution duration of the target gesture is equal to the time corresponding to the first blood flow information;
the apparatus further comprises:
and the second input information determining module is used for determining second input information corresponding to the execution duration of the target gesture.
7. A wearable device characterized by comprising the input information determination device according to claim 5 or 6.
8. A user equipment, the device comprising:
a PPG sensor;
a memory for storing instructions;
a processor to execute the memory-stored instructions, the instructions to cause the processor to:
acquiring first PPG information at a thumb in response to a target gesture performed by a hand of a user, the target gesture being a thumb-up gesture or a thumb-down gesture;
determining a gesture type of the target gesture according to the first PPG information and PPG reference information;
determining first input information corresponding to the gesture type;
acquiring second PPG information adjacent to the first PPG information;
determining a gesture type corresponding to the second PPG information according to the PPG reference information;
in response to that the gesture type corresponding to the second PPG information is the same as the gesture type of the target gesture, determining that the execution duration of the target gesture comprises a time corresponding to the first PPG information and a time corresponding to the second PPG information;
in response to that the gesture type corresponding to the second PPG information is different from the gesture type of the target gesture, determining that the execution duration of the target gesture is equal to the time corresponding to the first PPG information;
and determining second input information corresponding to the execution duration of the target gesture.
9. A user equipment, the device comprising:
a blood flow information sensor;
a memory for storing instructions;
a processor to execute the memory-stored instructions, the instructions to cause the processor to:
acquiring first blood flow information at a thumb in response to a target gesture performed by a user's hand, the target gesture being a thumb-up gesture or a thumb-down gesture;
determining the gesture type of the target gesture according to the first blood flow information and reference information;
determining first input information corresponding to the gesture type;
acquiring adjacent second blood flow information after the first blood flow information;
determining a gesture type corresponding to the second blood flow information according to the reference information;
in response to that the gesture type corresponding to the second blood flow information is the same as the gesture type of the target gesture, determining that the execution duration of the target gesture comprises the time corresponding to the first blood flow information and the time corresponding to the second blood flow information;
in response to that the gesture type corresponding to the second blood flow information is different from the gesture type of the target gesture, determining that the execution duration of the target gesture is equal to the time corresponding to the first blood flow information;
and determining second input information corresponding to the execution duration of the target gesture.
CN201510584085.7A 2015-09-15 2015-09-15 Input information determination method and device Active CN106249851B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510584085.7A CN106249851B (en) 2015-09-15 2015-09-15 Input information determination method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201510584085.7A CN106249851B (en) 2015-09-15 2015-09-15 Input information determination method and device

Publications (2)

Publication Number Publication Date
CN106249851A CN106249851A (en) 2016-12-21
CN106249851B true CN106249851B (en) 2020-03-17

Family

ID=57626749

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510584085.7A Active CN106249851B (en) 2015-09-15 2015-09-15 Input information determination method and device

Country Status (1)

Country Link
CN (1) CN106249851B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106249852A (en) * 2015-09-15 2016-12-21 北京智谷睿拓技术服务有限公司 Input information determines method and apparatus
CN111052049A (en) * 2017-10-09 2020-04-21 华为技术有限公司 Action identification method and device and terminal

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CA2075064C (en) * 1990-02-16 2001-07-24 Lars-Goran Lindberg A monitor which analyses pulse frequency by photoplethysmographic measurement and a measuring method therefor
CN104656896A (en) * 2015-02-10 2015-05-27 北京智谷睿拓技术服务有限公司 Method and device for confirming input information

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9049998B2 (en) * 2012-06-22 2015-06-09 Fitbit, Inc. Biometric monitoring device with heart rate measurement activated by a single user-gesture

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CA2075064C (en) * 1990-02-16 2001-07-24 Lars-Goran Lindberg A monitor which analyses pulse frequency by photoplethysmographic measurement and a measuring method therefor
CN104656896A (en) * 2015-02-10 2015-05-27 北京智谷睿拓技术服务有限公司 Method and device for confirming input information

Also Published As

Publication number Publication date
CN106249851A (en) 2016-12-21

Similar Documents

Publication Publication Date Title
CN104052701B (en) A kind of intrapulse modulation characteristic extract real-time and categorizing system realized based on FPGA
CN104656896B (en) The method and apparatus for determining input information
CN104699241B (en) It is determined that the method and apparatus at action and/or action position
CN104615248B (en) The method and apparatus for determining input information
CN107024975B (en) Interaction method and device
CN104699242B (en) It is determined that the method and apparatus at action and/or action position
CN103169456A (en) Processing method and processing system for pulse wave signals
CN104323771B (en) Detect method and the device of P ripple, T ripple in ECG signal
CN104656895B (en) It is determined that the method and apparatus of input information
CN106249851B (en) Input information determination method and device
WO2016004687A1 (en) Method for distinguishing initial time point of ultra-high-frequency partial discharge signal
CN113156396B (en) Method and device for optimizing influence of interference source on laser radar
CN104483602A (en) Local discharge signal identification method and device
CN109214318B (en) Method for searching weak peak of unsteady time sequence
CN106249853B (en) Exchange method and equipment
CN106293023B (en) Attitude determination method and equipment
CN110632563B (en) Intra-pulse frequency coding signal parameter measuring method based on short-time Fourier transform
CN104783786A (en) Feature recognition system and method for electrocardiogram
CN106951151B (en) Key value generation method, device and terminal
CN109117020B (en) Positioning method and device of touch position, storage medium and electronic device
CN108549480B (en) Trigger judgment method and device based on multi-channel data
CN111126616A (en) Method, device and equipment for realizing super-parameter selection
CN105049105A (en) Frequency extraction method of frequency diversity signal
CN115793869A (en) Key state identification method and device
CN106293024B (en) Attitude determination method and equipment

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant