WO2022199312A1 - Gesture data acquisition method, apparatus, terminal and storage medium - Google Patents

手势数据获取方法、装置、终端及存储介质 (Gesture data acquisition method, apparatus, terminal and storage medium)

Info

Publication number
WO2022199312A1
Authority
WO
WIPO (PCT)
Prior art keywords
gesture
signal
feature
stage
point
Prior art date
Application number
PCT/CN2022/077603
Other languages
English (en)
French (fr)
Inventor
陈明 (CHEN, Ming)
曾理 (ZENG, Li)
张晓帆 (ZHANG, Xiaofan)
钟卫东 (ZHONG, Weidong)
Original Assignee
Oppo广东移动通信有限公司 (Guangdong Oppo Mobile Telecommunications Corp., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Oppo广东移动通信有限公司 (Guangdong Oppo Mobile Telecommunications Corp., Ltd.)
Priority to EP22773974.5A (published as EP4270156A4)
Publication of WO2022199312A1
Priority to US18/231,965 (published as US20230384925A1)

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03: Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033: Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346: Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017: Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00: Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16: Constructional details or arrangements
    • G06F1/1613: Constructional details or arrangements for portable computers
    • G06F1/1633: Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
    • G06F1/1684: Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
    • G06F1/1694: Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675 the I/O peripheral being a single or a set of motion sensors for pointer control or gesture input obtained by sensing movements of the portable computer
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F2200/00: Indexing scheme relating to G06F1/04 - G06F1/32
    • G06F2200/16: Indexing scheme relating to G06F1/16 - G06F1/18
    • G06F2200/163: Indexing scheme relating to constructional details of the computer
    • G06F2200/1637: Sensing arrangement for detection of housing movement or orientation, e.g. for controlling scrolling or cursor movement on the display of an handheld computer
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00: Arrangements for program control, e.g. control units
    • G06F9/06: Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44: Arrangements for executing specific programs
    • G06F9/448: Execution paradigms, e.g. implementations of programming paradigms
    • G06F9/4498: Finite state machines

Definitions

  • the present application relates to the field of computer technologies, and in particular, to a method, device, terminal and storage medium for acquiring gesture data.
  • gestures are an important way of human-computer interaction. Gestures not only conform to human interaction habits, but also have the characteristics of simplicity, efficiency and directness.
  • the user usually inputs different gestures in the terminal, and the sensor in the terminal can collect gesture data during the gesture operation, so that the terminal can recognize the gesture and perform an operation corresponding to the gesture.
  • Embodiments of the present application provide a method, device, terminal and storage medium for acquiring gesture data, which can improve the accuracy of gesture data acquisition and at the same time improve the convenience of gesture data acquisition, thereby improving the user's gesture operation experience.
  • the technical solutions of the embodiments of the present application are as follows:
  • an embodiment of the present application provides a method for acquiring gesture data, the method comprising:
  • acquiring a gesture signal collected by a sensor, and acquiring a feature value corresponding to the gesture signal; obtaining a signal state corresponding to a feature point based on the feature value, and adding the feature value and the signal state to a gesture signal set; wherein the feature point is the point of the feature value in the gesture data acquisition stage;
  • in the gesture signal set, acquiring all feature values and all signal states between the gesture start point and the gesture end point in the gesture data acquisition stage, and generating signal change information based on all the feature values and all the signal states;
  • when the signal change information satisfies the signal verification information, determining that the signal change information is the signal data segment of the target gesture.
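As a rough end-to-end illustration of the claimed steps (not the patented implementation; the threshold, state names, and verification rule below are all assumptions):

```python
# Hypothetical sketch: feature values and signal states are appended to a
# gesture signal set, the span between the gesture start point and the
# gesture end point is extracted, and the resulting signal change
# information is checked against stored signal verification information.
THRESHOLD = 1.0  # assumed feature threshold separating noise from gesture

def signal_state(feature_value):
    """Map a feature value to a coarse signal state (assumed rule)."""
    return "gesture" if feature_value >= THRESHOLD else "noise"

def acquire_data_segment(feature_values, verification_states=("gesture",)):
    # Gesture signal set: one (feature value, signal state) entry per point.
    gesture_signal_set = [(v, signal_state(v)) for v in feature_values]

    # Gesture start/end points: first and last points in the gesture state
    # (a simplification of the state machine described in the application).
    gesture_idx = [i for i, (_, s) in enumerate(gesture_signal_set)
                   if s == "gesture"]
    if not gesture_idx:
        return None
    start, end = gesture_idx[0], gesture_idx[-1]

    # Signal change information: all values and states between the points.
    segment = gesture_signal_set[start:end + 1]
    observed_states = {s for _, s in segment}

    # Verification: the observed states must match the stored information.
    if observed_states <= set(verification_states) | {"noise"}:
        return [v for v, _ in segment]
    return None

print(acquire_data_segment([0.1, 0.2, 1.5, 2.0, 1.8, 0.3]))  # [1.5, 2.0, 1.8]
```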
  • an embodiment of the present application provides an apparatus for acquiring gesture data, the apparatus comprising:
  • a signal acquisition unit configured to acquire a gesture signal collected by a sensor, and to acquire a feature value corresponding to the gesture signal;
  • a signal adding unit configured to obtain the signal state corresponding to the feature point based on the feature value, and add the feature value and the signal state to the gesture signal set;
  • wherein the feature point is the point of the feature value in the gesture data acquisition stage;
  • an information acquisition unit configured to acquire, in the gesture signal set, all feature values and all signal states between the gesture start point and the gesture end point in the gesture data acquisition stage, and to generate signal change information based on all the feature values and all the signal states;
  • a data segment acquisition unit configured to determine that the signal change information is a signal data segment of the target gesture when the signal change information satisfies the signal verification information.
  • an embodiment of the present application provides a terminal, which may include: a processor and a memory; wherein, the memory stores a computer program, and the computer program is adapted to be loaded by the processor and execute the above method steps.
  • an embodiment of the present application provides a computer storage medium, where the computer storage medium stores a plurality of instructions, and the instructions are suitable for being loaded by a processor and executing the above method steps.
  • a gesture signal collected by a sensor is acquired, a feature value corresponding to the gesture signal is acquired, a signal state corresponding to a feature point is acquired based on the feature value, and the feature value and the signal state are added to the gesture signal set;
  • from the gesture signal set, all feature values and all signal states between the gesture start point and the gesture end point in the gesture data acquisition stage can be obtained, and signal change information can be generated based on all the feature values and all the signal states;
  • when the signal change information satisfies the signal verification information, the signal change information is determined to be the signal data segment of the target gesture.
  • in this way, the signal data segment of the target gesture can be acquired based only on the signal change information corresponding to the target gesture collected by the sensor, which can not only improve the accuracy of gesture data acquisition but also improve the convenience of gesture data acquisition, thereby improving the user's gesture operation experience.
  • FIG. 1 shows a schematic background diagram of a method for acquiring gesture data according to an embodiment of the present application
  • FIG. 2 shows a schematic background diagram of a method for acquiring gesture data according to an embodiment of the present application
  • FIG. 3 shows a schematic flowchart of a method for acquiring gesture data provided by an embodiment of the present application
  • FIG. 4 shows a system architecture diagram of a method for acquiring gesture data according to an embodiment of the present application
  • FIG. 5 shows a schematic flowchart of a method for acquiring gesture data provided by an embodiment of the present application
  • FIG. 6 shows a schematic diagram of a scene of a method for acquiring gesture data according to an embodiment of the present application
  • FIG. 7 shows a schematic diagram of an example of gesture data in a method for acquiring gesture data according to an embodiment of the present application
  • FIG. 8 shows a schematic diagram of switching a gesture detection state according to an embodiment of the present application.
  • FIG. 9 shows a schematic flowchart of a method for acquiring gesture data provided by an embodiment of the present application.
  • FIG. 10 shows a schematic structural diagram of an apparatus for acquiring gesture data provided by an embodiment of the present application.
  • FIG. 11 shows a schematic structural diagram of an apparatus for acquiring gesture data provided by an embodiment of the present application.
  • FIG. 12 shows a schematic structural diagram of an apparatus for acquiring gesture data provided by an embodiment of the present application
  • FIG. 13 shows a schematic structural diagram of an apparatus for acquiring gesture data provided by an embodiment of the present application
  • FIG. 14 shows a schematic structural diagram of an apparatus for acquiring gesture data provided by an embodiment of the present application.
  • FIG. 15 is a schematic structural diagram of a terminal provided by an embodiment of the present application.
  • FIG. 16 is a schematic structural diagram of an operating system and a user space provided by an embodiment of the present application.
  • FIG. 17 is an architecture diagram of the Android operating system in FIG. 15.
  • FIG. 18 is an architecture diagram of the iOS operating system in FIG. 15.
  • "a plurality" means two or more.
  • "and/or" describes the association relationship between associated objects and indicates that three relationships are possible; for example, "A and/or B" can mean that A exists alone, A and B exist at the same time, or B exists alone.
  • the character "/" generally indicates that the associated objects are in an "or" relationship.
  • Gesture interaction is an important way of human-computer interaction.
  • the terminal can collect gesture data during the gesture operation through its own sensor, so that the user can perform gesture control on the terminal.
  • the power consumption and cost of gesture interaction are relatively low, not affected by lighting conditions, and can be applied to different terminals.
  • the sensor-based gesture interaction process mainly includes two stages: gesture detection and segmentation, and gesture recognition.
  • Gesture detection and segmentation mainly concerns how to segment the complete sensor data of a gesture operation out of the continuous data stream.
  • the methods by which the terminal can obtain gesture data include manual control and segmentation methods, single-threshold detection methods, multi-threshold detection methods, detection methods based on rising and falling edges, model-based detection methods, top-down detection and segmentation methods, and the like.
  • the manual control and segmentation method captures and segments gesture data through user operations. For example, the user may mark the gesture start point by pressing a control, and the user is prohibited from operating the terminal for a preset period before and after the gesture operation. The terminal may use the data between the gesture start point and the gesture end point as gesture data. The manual control and segmentation method therefore requires manual control, which adds steps to gesture data acquisition and degrades the user's gesture operation experience.
  • FIG. 1 shows a schematic background diagram of a method for acquiring gesture data according to an embodiment of the present application.
  • the terminal can set a start threshold and an end threshold, and use a single-threshold detection method to obtain gesture data.
  • if the threshold is set too small, noise data will be included in the gesture data obtained by the terminal; if the threshold is set too large, the gesture start and end points cannot be accurately determined, which makes the gesture data obtained by the terminal inaccurate and degrades the user's gesture operation experience.
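The trade-off can be reproduced with a toy single-threshold segmenter; the stream and threshold values below are invented for illustration and are not from the application:

```python
def single_threshold_segments(samples, threshold):
    """Return (start, end) index pairs where samples stay at/above threshold."""
    segments, start = [], None
    for i, v in enumerate(samples):
        if v >= threshold and start is None:
            start = i                        # candidate gesture start point
        elif v < threshold and start is not None:
            segments.append((start, i - 1))  # candidate gesture end point
            start = None
    if start is not None:
        segments.append((start, len(samples) - 1))
    return segments

stream = [0.2, 0.6, 0.2, 1.4, 2.0, 1.1, 0.3]   # 0.6 is a noise spike
print(single_threshold_segments(stream, 0.5))  # [(1, 1), (3, 5)]: noise kept
print(single_threshold_segments(stream, 1.2))  # [(3, 4)]: gesture clipped
```

A small threshold admits the noise spike as a spurious segment, while a large one clips the true start and end of the gesture, matching the two failure modes described above.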
  • FIG. 2 shows a schematic background diagram of a method for acquiring gesture data according to an embodiment of the present application.
  • the terminal may acquire gesture data by adopting a multi-threshold detection method.
  • the terminal may set a start threshold, an end threshold, a peak threshold of the gesture data, and a trough threshold of the gesture data.
  • the starting threshold and the ending threshold may be the same or different.
  • the start and end thresholds shown in Figure 2 are the same.
  • with this method, however, the terminal may discard data at the initial stage of the gesture and cannot accurately detect the gesture start and end points.
  • when the gesture data fluctuates sharply, the gesture data will be split into multiple pieces of data and the gesture end point cannot be accurately determined, which makes gesture data acquisition inaccurate and degrades the user's gesture operation experience.
  • the terminal may also acquire gesture data by using a rising and falling edge detection method.
  • the terminal can calculate the rising-edge occurrence time T1 and the falling-edge occurrence time T2 of the sensor signal, and use T1 and T2 as the start point and end point of the gesture data, respectively, so that the terminal can obtain the gesture data.
  • when the gesture signal has no distinct rising or falling edges, however, the terminal cannot obtain the start and end points of the gesture data with this method, which makes gesture data acquisition inaccurate and degrades the user's gesture operation experience.
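A minimal sketch of edge-based detection, assuming the edges are taken from the discrete difference of the samples (the application does not give a concrete rule):

```python
def edge_based_bounds(samples, delta=0.5):
    """Use the first rising edge and the last falling edge as gesture bounds.

    Returns (T1, T2) as sample indices, or None when no clear edge exists,
    which is the failure case for slowly varying gesture signals.
    """
    diffs = [b - a for a, b in zip(samples, samples[1:])]
    rising = [i for i, d in enumerate(diffs) if d >= delta]
    falling = [i + 1 for i, d in enumerate(diffs) if d <= -delta]
    if not rising or not falling:
        return None
    return rising[0], falling[-1]

print(edge_based_bounds([0.1, 0.2, 1.5, 1.6, 0.2, 0.1]))  # (1, 4): sharp edges
print(edge_based_bounds([0.1, 0.2, 0.3, 0.4, 0.3, 0.2]))  # None: no clear edge
```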
  • when the terminal adopts the model-based detection method to acquire gesture data, the gesture data needs to be manually labeled, and the method requires a large amount of computation, which reduces the convenience of gesture data acquisition and degrades the user's gesture operation experience.
  • the terminal can also adopt a top-down segmentation method, in which gesture data is obtained by merging the data from the top down.
  • when the terminal adopts the top-down segmentation method, it needs to obtain the data before and after the current gesture operation, covering both gesture data and non-gesture data, which adds steps to gesture data acquisition, reduces the convenience of gesture data acquisition, and degrades the user's gesture operation experience.
  • the present application provides a method for acquiring gesture data, which can improve the convenience of acquiring gesture data while improving the accuracy of acquiring gesture data.
  • a gesture data acquisition method is proposed, which can be implemented by relying on a computer program, and can be run on a gesture data acquisition device including a sensor.
  • the computer program can be integrated into an application or run as a stand-alone utility application.
  • the gesture data acquisition device may be a terminal with a sensor function.
  • the terminal includes, but is not limited to: wearable devices, handheld devices, personal computers, tablet computers, vehicle-mounted devices, smart phones, computing devices, and other processing devices connected to a wireless modem.
  • terminal equipment may be called by different names in different networks, for example: user equipment, access terminal, subscriber unit, subscriber station, mobile station, remote station, remote terminal, mobile device, user terminal, terminal, wireless communication device, user agent or user equipment, cellular phone, cordless phone, personal digital assistant (PDA), or terminal equipment in a 5G network or a future evolved network.
  • the gesture data acquisition method includes:
  • a sensor is a detection device that can sense measured information and transform it, according to a certain rule, into an electrical signal or another required form of information output, so as to meet the requirements of information transmission, processing, storage, display, recording and control.
  • a terminal can usually have a variety of built-in sensors, such as a gyroscope sensor, a magnetic sensor, an acceleration sensor, an orientation sensor, a pressure sensor, a temperature sensor, an angular velocity sensor, and the like.
  • the sensor can monitor whether there is gesture input on the terminal display screen. When the sensor detects that there is a gesture input, the sensor can collect the gesture signal during the gesture operation.
  • the gesture signal changes as the user's input gesture changes.
  • the gesture of the user performing the gesture operation in the three-dimensional space may be consistent with the gesture pre-stored in the terminal, so that the terminal can perform the corresponding operation when recognizing the gesture input by the user.
  • the gestures stored in the terminal may be set based on the terminal factory setting, may also be set by the terminal based on a user's gesture setting instruction, or may be modified by the terminal based on a user's gesture modification instruction.
  • a gesture signal refers to a gesture signal collected by a sensor when a user performs a spatial gesture operation in a three-dimensional space
  • the spatial gesture refers to a gesture action performed in the three-dimensional space when the user performs a gesture operation.
  • the gesture signal corresponds to a spatial gesture.
  • the gesture signal does not specifically refer to a fixed gesture signal.
  • when the user's input gesture changes, the gesture signal changes accordingly.
  • when the type of sensor changes, the gesture signal also changes accordingly.
  • the feature value refers to a feature value corresponding to the gesture signal.
  • the feature value is a numerical value used to represent the gesture signal.
  • the feature value corresponds to the gesture signal, so when the gesture signal changes, the feature value also changes accordingly.
  • the sensor may collect gesture signals during the gesture operation.
  • the number of gesture signals collected by the sensor is at least one.
  • FIG. 4 shows a system architecture diagram of a method for acquiring gesture data according to an embodiment of the present application.
  • the terminal can acquire the gesture signal collected by the sensor, that is, the processor of the terminal can receive the gesture signal collected by the sensor.
  • the terminal may acquire the feature value of the gesture signal. For example, when the terminal acquires gesture signals collected by multiple sensors, the terminal can acquire the feature value corresponding to each gesture signal.
  • the signal state refers to a state corresponding to the feature point, and the signal state is used to represent the state of the feature point during the gesture input process.
  • the signal state includes, but is not limited to, an initial state, a gesture input state, a gesture state, a gesture end state, a noise state, a gesture end determination state, and the like.
  • a feature point corresponds to a signal state.
  • the feature point is the point of the feature value in the gesture data acquisition stage.
  • a feature value corresponds to a feature point, and the feature point includes, but is not limited to, a time point, a moment, and the like.
  • the feature point may be acquired based on the terminal's own time record, or the terminal may determine the feature point corresponding to the feature value based on the current absolute time.
  • the gesture signal set refers to a set formed by summarizing feature values and signal states corresponding to feature points.
  • the gesture signal set includes at least one feature value and a signal state corresponding to the feature point.
  • the set of gesture signals does not specifically refer to a fixed set.
  • the gesture signal changes, the feature value of the gesture signal also changes accordingly, and the signal state corresponding to the feature point obtained based on the feature value also changes accordingly, so the gesture signal set also changes accordingly.
  • when the terminal acquires the gesture signal collected by the sensor, the terminal may acquire the feature value of the gesture signal.
  • the terminal may acquire the feature point corresponding to the feature value.
  • the terminal may acquire the signal state corresponding to the feature point based on the feature value.
  • the terminal may add the feature value and the signal state to the gesture signal set.
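The bookkeeping in the steps above can be pictured as appending one record per feature point; the class and field names below are illustrative assumptions, not structures from the application:

```python
import time
from dataclasses import dataclass, field

@dataclass
class FeaturePoint:
    timestamp: float  # feature point: when the feature value was obtained
    value: float      # feature value derived from the gesture signal
    state: str        # signal state assigned from the feature value

@dataclass
class GestureSignalSet:
    points: list = field(default_factory=list)

    def add(self, value, threshold=1.0, timestamp=None):
        # The state rule is an assumption: above-threshold values are
        # treated as part of a gesture, the rest as noise.
        state = "gesture" if value >= threshold else "noise"
        ts = time.monotonic() if timestamp is None else timestamp
        self.points.append(FeaturePoint(ts, value, state))
        return state

s = GestureSignalSet()
print(s.add(0.4, timestamp=0.00))  # noise
print(s.add(1.7, timestamp=0.01))  # gesture
print(len(s.points))               # 2
```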
  • the gesture data acquisition phase refers to a phase in which the terminal acquires gesture data.
  • the feature values and signal states included in the gesture signal set correspond to all gesture signals collected by the sensor. Since not all gesture signals correspond to the target gesture, the terminal needs to acquire the gesture start point and the gesture end point of the gesture data acquisition stage.
  • the gesture start point is used to represent the start point of the gesture data acquisition phase, that is, the gesture start point is used to represent the point at which the gesture input starts.
  • the gesture starting point is determined based on the user's gesture input operation.
  • the terminal may determine the gesture start point based on the magnitude relationship between the feature value of the gesture signal and a feature threshold.
  • the gesture end point is used to represent the end point of the gesture data acquisition phase, that is to say, the gesture end point is used to represent the point where the gesture ends the input.
  • the gesture end point does not specifically refer to a fixed end point, and the gesture end point is determined based on the user's gesture input operation.
  • the signal change information is used to represent the overall signal change information of the gesture data acquisition stage.
  • the signal change information corresponds to all feature values and all signal states between the gesture start point and the gesture end point in the gesture data acquisition stage.
  • the signal change information does not specifically refer to a certain fixed signal change information. When the gesture start point or the gesture end point changes, the signal change information also changes accordingly.
  • when the terminal acquires the gesture signal collected by the sensor, the terminal may acquire the feature value of the gesture signal.
  • the terminal may acquire the feature point corresponding to the feature value.
  • when the terminal acquires the feature point, the terminal may acquire the signal state corresponding to the feature point based on the feature value.
  • when the terminal acquires the signal state corresponding to the feature point, the terminal may add the feature value and the signal state to the gesture signal set.
  • the terminal may acquire the gesture start point and the gesture end point in the gesture acquisition stage from the gesture signal set, and acquire all feature values and all signal states between the gesture start point and the gesture end point.
  • the terminal may generate signal change information based on all feature values and all signal states between the gesture start point and the gesture end point, that is, the terminal may acquire the signal change information in the gesture data acquisition stage.
  • the signal verification information refers to information used to verify whether the signal change information of the gesture data acquisition stage is consistent with the signal state changes of the target gesture.
  • the signal verification information is information stored in the terminal for verifying signal change information.
  • the signal verification information may be set by the terminal when it leaves the factory, or may be set by the terminal based on a user's setting instruction, or may be set by the terminal based on update information pushed by the server.
  • when the signal change information satisfies the signal verification information, the terminal may determine that the complete gesture is the target gesture.
  • the target gesture may be, for example, a gesture action formed by a user's limb in a three-dimensional space according to a movement trajectory.
  • the target gesture does not specifically refer to a fixed gesture, and the target gesture may be determined based on the gesture input by the user.
  • the signal data segment refers to the segment of gesture data acquired in the gesture data acquisition stage.
  • the gesture signal included in the signal data segment is the gesture signal corresponding to the target gesture.
  • when the terminal acquires the signal change information of the gesture data acquisition stage, the terminal can detect whether the signal change information satisfies the signal verification information.
  • when the terminal detects that the signal change information satisfies the signal verification information, the terminal can determine that the signal change information is the signal data segment of the target gesture; that is, the terminal can determine the data between the gesture start point and the gesture end point in the gesture data acquisition stage as the gesture data corresponding to the target gesture.
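One plausible reading of this verification step (assumed here, since the text does not state the concrete rule) is to collapse the sequence of signal states in the signal change information and compare it with a stored template:

```python
from itertools import groupby

# Hypothetical stored signal verification information: the state sequence
# that a complete target gesture is expected to pass through.
VERIFICATION_INFO = ["initial", "gesture input", "gesture end"]

def matches_verification(signal_change_states):
    """Collapse consecutive duplicate states and compare with the template."""
    collapsed = [state for state, _ in groupby(signal_change_states)]
    return collapsed == VERIFICATION_INFO

states = ["initial", "gesture input", "gesture input", "gesture end"]
print(matches_verification(states))                # True
print(matches_verification(["initial", "noise"]))  # False
```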
  • a gesture signal collected by a sensor is acquired, a feature value corresponding to the gesture signal is acquired, a signal state corresponding to a feature point is acquired based on the feature value, and the feature value and the signal state are added to the gesture signal set;
  • from the gesture signal set, all feature values and all signal states between the gesture start point and the gesture end point in the gesture data acquisition stage can be obtained, and signal change information can be generated based on all the feature values and all the signal states;
  • when the signal change information satisfies the signal verification information, the signal change information is determined to be the signal data segment of the target gesture.
  • in this way, the signal data segment of the target gesture can be acquired based only on the signal change information corresponding to the target gesture collected by the sensor, which can not only improve the accuracy of gesture data acquisition but also improve the convenience of gesture data acquisition, thereby improving the user's gesture operation experience.
  • the solution in the embodiment of the present application has a small amount of calculation, which can reduce the acquisition time of gesture data and improve the acquisition efficiency of gesture data.
  • FIG. 5 shows a schematic flowchart of a method for acquiring gesture data provided by an embodiment of the present application. The details are as follows:
  • the terminal may receive a user's frequency setting instruction before collecting gestures.
  • the terminal can set the sampling frequency based on the frequency setting instruction, which can improve the accuracy of gesture data acquisition.
  • the frequency setting instructions include, but are not limited to, voice frequency setting instructions, click frequency setting instructions, timing frequency setting instructions, and the like.
  • the frequency setting instruction may be, for example, a voice frequency setting instruction such as "set the sampling frequency of the sensor to 100 Hz"; the terminal can then set the sampling frequency of the sensor to 100 Hz, that is, the sensor of the terminal acquires a gesture signal every 10 ms.
  • the timing frequency setting instruction may be, for example, an instruction for setting different time periods or different time points corresponding to different frequencies.
  • the terminal can acquire the feature function of the gesture signal at the current moment, and obtain the feature value of the gesture signal at the current moment based on the feature function. Therefore, the terminal can obtain the feature values corresponding to all gesture signals collected by the sensor.
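As one illustration of such a feature function (an assumption; the application does not specify which function is used), the magnitude of a 3-axis accelerometer sample can serve as the feature value at the current moment:

```python
import math

def feature_value(ax, ay, az):
    """Feature function: magnitude of a 3-axis accelerometer sample.

    The choice of function is illustrative; any scalar derived from the
    gesture signal could serve as the feature value.
    """
    return math.sqrt(ax * ax + ay * ay + az * az)

print(feature_value(3.0, 4.0, 0.0))  # 5.0
```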
  • FIG. 6 shows a schematic scene diagram of a method for acquiring gesture data according to an embodiment of the present application.
  • when the sensor detects that the user inputs the target gesture, the sensor can collect the gesture signal.
  • the terminal can obtain the gesture signal collected by the sensor.
  • the target gesture is a spatial gesture.
  • FIG. 7 shows a schematic diagram of an example of gesture data in a method for acquiring gesture data according to an embodiment of the present application.
  • the noise stage refers to a stage that includes only noisy data.
  • the gesture data collected by the sensor may be composed of terminal noise and environmental noise, for example.
  • the noise stage in this step may refer to the stage, before the user inputs a gesture, in which none of the feature values corresponding to the gesture signals collected by the sensor is greater than or equal to the first threshold.
  • the terminal can detect whether the first feature value is greater than or equal to the first threshold.
  • the first feature value refers to the feature value, acquired by the terminal during the noise stage, that corresponds to the gesture signal collected by the sensor.
  • the first feature value does not specifically refer to a fixed feature value; it may be used to represent any feature value obtained in the noise stage.
  • the first threshold refers to a critical value for the feature value.
  • the first threshold does not specifically refer to a fixed threshold.
  • the terminal may change the first threshold based on a user's threshold changing instruction.
  • the threshold value modification instructions include but are not limited to voice threshold value modification instructions, click threshold value modification instructions, timing threshold value modification instructions, and the like.
  • the threshold value modification instruction may be determined by the user based on the acquisition results of historical gesture data. For example, when the user determines, based on the historical gesture data results, that the first threshold is too large, the user may input a voice threshold changing instruction of "adjust the first threshold to a third threshold". The terminal may change the first threshold based on the voice threshold changing instruction. Here, the third threshold is smaller than the first threshold.
  • when the terminal detects, in the noise stage, that the first feature value is greater than or equal to the first threshold, the terminal may determine to switch from the noise stage to the gesture data acquisition stage.
  • the terminal may determine the first feature point corresponding to the first feature value as the gesture start point in the gesture data acquisition stage.
  • the first threshold may be, for example, the a threshold.
  • the terminal may determine to switch from the noise stage to the gesture data acquisition stage.
  • the terminal may determine the A1 feature point corresponding to the A feature value as the gesture start point in the gesture data acquisition stage.
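  • The switching rule above (from the noise stage into the gesture data acquisition stage once a feature value reaches the first threshold) can be sketched as a simple scan. This is an illustrative sketch only; the function name `find_gesture_start`, the example threshold value, and the sample feature values are assumptions, not part of the embodiment.

```python
# Illustrative sketch: find the gesture start point as the first feature
# point whose feature value is greater than or equal to the first threshold
# (the "a threshold" in the examples). The threshold value is assumed.
A_THRESHOLD = 0.5

def find_gesture_start(feature_values, threshold=A_THRESHOLD):
    """Return the index of the first feature point at or above the
    threshold, or None while the signal is still in the noise stage."""
    for i, value in enumerate(feature_values):
        if value >= threshold:
            return i  # this feature point is the gesture start point
    return None

# Noise-stage values stay below the threshold; the first crossing
# marks the switch into the gesture data acquisition stage.
print(find_gesture_start([0.1, 0.2, 0.15, 0.7, 0.9]))  # 3
```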
  • when the terminal detects, in the initial stage, that the feature value is greater than or equal to the first threshold, the terminal may also switch from the initial stage to the gesture data acquisition stage.
  • the initial stage refers to the stage in which no gesture signal is acquired.
  • the gesture data acquisition stage includes a gesture start stage, a gesture stage, and a gesture end stage.
  • the gesture start point is the start point of the gesture start stage.
  • the gesture end point is the end point of the gesture end stage.
  • the gesture start stage is used to represent the initial input stage of the target gesture.
  • the gesture stage is used to represent the input stage of the target gesture.
  • the gesture end stage is used to represent the ending input stage of the target gesture.
  • the signal verification information stored in the terminal may be, for example, information of switching from the gesture start stage to the gesture stage, and then switching from the gesture stage to the gesture end stage.
  • the gesture start stage may refer to a signal data segment whose feature values are greater than the first threshold and less than the second threshold.
  • the gesture stage may, for example, refer to a signal data segment in which the feature values fluctuate sharply above and below the second threshold.
  • the gesture end stage may refer to a signal data segment, after the gesture stage, whose feature values are greater than the first threshold and less than the second threshold. It should be noted that any stage in the gesture data acquisition stage is not limited to feature values as defined above; when a feature value does not meet the feature value requirement of the current stage, the terminal needs to further detect the feature value.
  • the terminal may detect, for example, whether there is an X feature value greater than the first threshold within a preset time period after the V1 feature point. If the terminal detects that there is an X feature value greater than the first threshold, the terminal can determine that the V feature value belongs to the gesture start stage, and the terminal can add the V feature value and the signal state corresponding to the V1 feature point to the gesture signal set. If the terminal detects that there is no X feature value greater than the first threshold, the terminal may determine that the V feature value does not belong to the gesture start stage, and the terminal may clear the feature values and signal states in the gesture signal set.
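  • The confirmation logic above can be sketched as follows, assuming the preset time period is expressed as a number of samples; the names (`confirm_start_stage`, the window length, the threshold value) are illustrative, not part of the embodiment.

```python
def confirm_start_stage(feature_values, idx, threshold, window):
    """Check whether any feature value within `window` samples after the
    candidate feature point `idx` exceeds the threshold; if none does,
    the candidate does not belong to the gesture start stage."""
    segment = feature_values[idx + 1 : idx + 1 + window]
    return any(v > threshold for v in segment)

# Candidate start at index 0, but no later value exceeds the threshold,
# so the gesture signal set is cleared.
gesture_set = [(0.6, "gesture_start")]
values = [0.6, 0.2, 0.1, 0.1, 0.1]
if not confirm_start_stage(values, 0, 0.5, 3):
    gesture_set.clear()
print(gesture_set)  # []
```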
  • when switching from the noise stage to the gesture data acquisition stage: when the terminal detects that the first feature value is greater than or equal to the first threshold in the noise stage, it may determine to switch from the noise stage to the gesture start stage, and determine the first feature point corresponding to the first feature value as the starting point of the gesture start stage.
  • the start point of the gesture start phase is the gesture start point of the gesture data acquisition phase.
  • the first threshold may be, for example, the a threshold.
  • the terminal may determine to switch from the noise stage to the gesture start stage.
  • the terminal may determine the A1 feature point corresponding to the A feature value as the starting point of the gesture start stage.
  • in the gesture start stage, the terminal may detect whether the fourth feature value is greater than or equal to the second threshold.
  • the second threshold is greater than the first threshold, the first threshold is used to detect the start and end points of the gesture data acquisition stage, and the second threshold is used to detect whether there is a gesture action corresponding to the target gesture.
  • the second threshold does not specifically refer to a fixed threshold.
  • the second threshold may, for example, be changed based on a user's threshold setting instruction.
  • the start point of the gesture phase is the end point of the gesture start phase.
  • the fourth feature point may be the feature point corresponding to the first feature value greater than or equal to the second threshold that the terminal determines when the gesture signal corresponding to the target gesture is collected.
  • the second threshold may be, for example, the b threshold.
  • When in the gesture start stage, and the terminal detects that the fourth feature value, the S feature value, is greater than the b threshold, the terminal can switch from the gesture start stage to the gesture stage, and the terminal can set the S1 feature point corresponding to the S feature value as the starting point of the gesture stage; the S1 feature point is also the end point of the gesture start stage.
  • the second feature value refers to a feature value that is continuously smaller than the first threshold in the gesture data acquisition stage, and the second feature value does not specifically refer to a fixed feature value.
  • the second feature value may include a plurality of feature values.
  • the third feature value is the last feature value greater than or equal to the first threshold before the second feature point corresponding to the second feature value.
  • the third feature value does not specifically refer to a fixed feature value. For example, when the target gesture input by the user changes, the third feature value also changes accordingly.
  • the terminal can detect whether the second feature value is continuously smaller than the first threshold.
  • the terminal may determine to switch from the gesture data acquisition phase to the noise phase.
  • the terminal may determine the third feature point as the gesture termination point in the gesture data acquisition stage.
  • the third feature point is the feature point corresponding to the last feature value greater than or equal to the first threshold before the second feature point corresponding to the second feature value.
  • the terminal detects whether the second characteristic value is continuously smaller than the first threshold, which can reduce the inaccuracy of determining the gesture end point directly based on the threshold value, can improve the accuracy of determining the gesture start and end points, and can improve the accuracy of gesture data acquisition.
  • when the terminal determines the gesture start and end points, no manual marking or manual operation steps are required, which can reduce the steps of acquiring gesture data and improve the convenience of acquiring gesture data.
  • the first threshold may be, for example, the a threshold.
  • After switching from the noise stage to the gesture data acquisition stage, when the terminal detects that the B feature value is continuously smaller than the a threshold, the terminal may determine to switch from the gesture data acquisition stage to the noise stage.
  • the terminal may determine the W1 feature point corresponding to the W feature value as the gesture termination point in the gesture data acquisition stage.
  • the W feature value may be, for example, the last feature value greater than the a threshold before the B1 feature point corresponding to the B feature value.
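  • The termination rule above (the W1 feature point is the last feature point at or above the a threshold before the feature values fall continuously below it) can be sketched as follows. How many consecutive below-threshold values count as "continuously" is not specified here, so the run length is an assumed parameter, and all names are illustrative.

```python
def find_gesture_end(feature_values, threshold, run_length):
    """Return the index of the last feature value >= threshold before
    `run_length` consecutive values fall below the threshold, or None."""
    below = 0
    last_above = None
    for i, value in enumerate(feature_values):
        if value >= threshold:
            last_above = i
            below = 0
        else:
            below += 1
            if below >= run_length and last_above is not None:
                return last_above  # the gesture termination point
    return None

# Values drop below 0.5 for three consecutive samples after index 2,
# so index 2 is the gesture termination point.
print(find_gesture_end([0.7, 0.9, 0.8, 0.2, 0.1, 0.1], 0.5, 3))  # 2
```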
  • when the terminal switches from the noise stage to the gesture data acquisition stage, the terminal may switch from the noise stage to the gesture start stage.
  • the terminal may determine to switch from the gesture initiation phase to the gesture phase.
  • the terminal determines to switch from the gesture stage to the gesture end stage, and determines the fifth feature point corresponding to the fifth feature value as the end point of the gesture stage.
  • the end point of the gesture phase is the start point of the gesture end phase
  • the second threshold is greater than the first threshold.
  • the fifth feature point is the feature point corresponding to the fifth feature value
  • the fifth feature value is the last feature value greater than or equal to the second threshold before the sixth feature point corresponding to the sixth feature value, that is, the fifth feature point is the feature point corresponding to the last feature value greater than or equal to the second threshold before the sixth feature point corresponding to the sixth feature value.
  • the second threshold may be, for example, the b threshold.
  • When in the gesture start stage, and the terminal detects that the fourth feature value, the S feature value, is greater than the b threshold, the terminal may switch from the gesture start stage to the gesture stage.
  • In the gesture stage, when the terminal detects that the F feature value is continuously smaller than the b threshold, the terminal may switch from the gesture stage to the gesture end stage.
  • the terminal may determine the E1 feature point corresponding to the E feature value as the termination point of the gesture phase.
  • the E feature value may be, for example, the last feature value greater than the b threshold before the F1 feature point corresponding to the F feature value.
  • the terminal determines to switch from the gesture phase to the gesture end phase.
  • the terminal determines to switch from the gesture end stage to the noise stage, that is, the terminal determines to switch from the gesture data acquisition stage to the noise stage.
  • the terminal may determine the third feature point corresponding to the third feature value as the termination point of the gesture end stage; the third feature point is the feature point corresponding to the last feature value greater than or equal to the first threshold before the second feature point corresponding to the second feature value.
  • the stage in which the second feature value is located may be the stage of determining the end state of the gesture.
  • the gesture end state determination phase may be a partial phase of the noise phase.
  • the terminal may acquire the signal state corresponding to the feature point based on the feature value, and add the feature value and signal state to the gesture signal set, reducing the situation in which gesture data is discarded by detection methods based directly on multiple thresholds, which can improve the accuracy of gesture data acquisition.
  • FIG. 8 shows a schematic diagram of switching a gesture detection state according to an embodiment of the present application.
  • state 0 indicates the initial state, in which no gesture action is detected; state 1 indicates that gesture start segment data is detected; state 2 indicates that gesture segment data is detected; state 3 indicates that gesture end segment data is detected; state 4 indicates that gesture end state judgment segment data is detected; state 5 indicates that noise segment data is detected; state 6 indicates entering the detection freeze state; δ_lower represents the first threshold; δ_upper represents the second threshold; L_stable represents the duration of the gesture end state judgment segment data;
  • L_max_buf represents the duration of the gesture end segment data;
  • L_freeze represents the duration of the detection freeze state;
  • L_start represents the starting point adjustment value corresponding to the starting point;
  • x represents the feature value.
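  • The threshold-driven transitions of FIG. 8 can be sketched as a small state function. This is an interpretation only: the duration-based transitions governed by L_stable, L_max_buf, and L_freeze, and states 4 and 6, are omitted, and the exact transition conditions of the embodiment may differ.

```python
# State numbers follow FIG. 8: 0 initial, 1 gesture start segment,
# 2 gesture segment, 3 gesture end segment, 5 noise segment.
def next_state(state, x, lower, upper):
    """Return the next detection state for feature value x, where `lower`
    is the first threshold and `upper` is the second threshold."""
    if state in (0, 5):            # initial / noise
        return 1 if x >= lower else state
    if state == 1:                 # gesture start segment
        if x >= upper:
            return 2
        return 1 if x >= lower else 5
    if state == 2:                 # gesture segment
        return 2 if x >= upper else 3
    if state == 3:                 # gesture end segment
        if x >= upper:
            return 2               # back into the gesture segment
        return 3 if x >= lower else 5
    return state

state = 0
for x in [0.1, 0.6, 1.2, 1.1, 0.7, 0.2]:
    state = next_state(state, x, lower=0.5, upper=1.0)
print(state)  # 5: back in the noise state after the gesture ends
```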
  • the terminal may switch the signal states based on the signal states of two adjacent feature points.
  • the O feature point and the P feature point are two adjacent feature points.
  • the signal state corresponding to the O feature point may be, for example, a gesture start state.
  • the signal state corresponding to the P feature point may be, for example, a noise state. Therefore, when the terminal determines the signal state of the P feature point, the signal state may be switched from the gesture start state to the noise state. Therefore, at least one signal state may be included in any of the gesture data acquisition phases.
  • when the terminal detects that the first feature value is smaller than the first threshold, the terminal sets the signal state corresponding to the first feature point to the noise state.
  • In the gesture start stage, when it is detected that the first feature value is greater than or equal to the first threshold and smaller than the second threshold, the terminal sets the signal state corresponding to the first feature point to the gesture start state.
  • In the gesture start stage, when it is detected that the first feature value is greater than or equal to the second threshold, the terminal sets the signal state corresponding to the first feature point to the gesture state, and the first feature point corresponding to the first feature value is the starting point of the gesture stage.
  • the terminal may add the first feature value and the signal state corresponding to the first feature point to the gesture signal set.
  • the first threshold may be, for example, the a threshold.
  • the second threshold may be, for example, the b threshold.
  • In the gesture start stage, when the terminal detects that the first feature value A is smaller than the a threshold, the terminal can set the signal state corresponding to the A1 feature point to the noise state, and add the A feature value and the noise state corresponding to the A1 feature point to the gesture signal set.
  • when the terminal detects that the first feature value A is greater than the a threshold and smaller than the b threshold, the terminal can set the signal state corresponding to the A1 feature point to the gesture start state, and add the A feature value and the gesture start state corresponding to the A1 feature point to the gesture signal set.
  • when the terminal detects that the first feature value A is greater than the b threshold, the terminal can set the signal state corresponding to the A1 feature point to the gesture state, and add the A feature value and the gesture state corresponding to the A1 feature point to the gesture signal set.
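  • The three cases above amount to classifying each feature value against the two thresholds and appending the result to the gesture signal set; the sketch below is illustrative, with assumed threshold values and state names.

```python
def classify(value, lower, upper):
    """Map a feature value to a signal state using the a threshold
    (`lower`) and the b threshold (`upper`)."""
    if value < lower:
        return "noise"
    if value < upper:
        return "gesture_start"  # in the gesture end stage: "gesture_end"
    return "gesture"

# Each feature value is stored together with its signal state.
gesture_signal_set = [(v, classify(v, 0.5, 1.0)) for v in [0.2, 0.7, 1.4]]
print(gesture_signal_set)
# [(0.2, 'noise'), (0.7, 'gesture_start'), (1.4, 'gesture')]
```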
  • the terminal may set the signal state corresponding to the third feature point to the noise state.
  • when it is detected that the third feature value is greater than or equal to the first threshold and less than the second threshold, the terminal may set the signal state corresponding to the third feature point to the gesture end state.
  • In the gesture stage, when it is detected that the third feature value is greater than or equal to the second threshold, the terminal may set the signal state corresponding to the third feature point to the gesture state.
  • when the terminal acquires the signal state corresponding to the third feature point in the gesture stage, the terminal may add the third feature value and the signal state corresponding to the third feature point to the gesture signal set.
  • the first threshold may be, for example, the a threshold.
  • the second threshold may be, for example, the b threshold.
  • In the gesture stage, when the terminal detects that the third feature value W is smaller than the a threshold, the terminal can set the signal state corresponding to the W1 feature point to the noise state, and add the W feature value and the noise state corresponding to the W1 feature point to the gesture signal set.
  • In the gesture stage, when the terminal detects that the third feature value W is greater than the a threshold and less than the b threshold, the terminal can set the signal state corresponding to the W1 feature point to the gesture end state, and add the W feature value and the gesture end state corresponding to the W1 feature point to the gesture signal set.
  • when the terminal detects that the third feature value W is greater than the b threshold, the terminal can set the signal state corresponding to the W1 feature point to the gesture state, and add the W feature value and the gesture state corresponding to the W1 feature point to the gesture signal set.
  • the terminal may set the signal state corresponding to the sixth feature point to the noise state.
  • the terminal may set the signal state corresponding to the sixth feature point to the gesture end state.
  • the terminal may set the signal state corresponding to the sixth feature point to the gesture state, and the sixth feature point corresponding to the sixth feature value is the starting point of the gesture end stage.
  • the terminal may add the sixth feature value and the signal state corresponding to the sixth feature point to the gesture signal set.
  • the first threshold may be, for example, the a threshold.
  • the second threshold may be, for example, the b threshold.
  • In the gesture end stage, when the terminal detects that the sixth feature value F is smaller than the a threshold, the terminal can set the signal state corresponding to the F1 feature point to the noise state, and add the F feature value and the noise state corresponding to the F1 feature point to the gesture signal set.
  • In the gesture end stage, when the terminal detects that the sixth feature value F is greater than the a threshold and less than the b threshold, the terminal can set the signal state corresponding to the F1 feature point to the gesture end state, and add the F feature value and the gesture end state corresponding to the F1 feature point to the gesture signal set.
  • when the terminal detects that the sixth feature value F is greater than the b threshold, the terminal can set the signal state corresponding to the F1 feature point to the gesture state, and add the F feature value and the gesture state corresponding to the F1 feature point to the gesture signal set.
  • the abnormal signal state does not specifically refer to a fixed signal state
  • the abnormal signal state is a signal state not corresponding to any stage
  • the normal signal state is a signal state corresponding to any stage.
  • the abnormal signal state in the gesture start stage is any signal state that is not the gesture start state;
  • the normal signal state in the gesture start stage is the gesture start state.
  • the terminal can obtain the abnormal signal state and the normal signal state at any stage of the gesture data acquisition stage.
  • when the terminal obtains the signal state corresponding to the feature point based on the feature value and adds the feature value and the signal state to the gesture signal set, the terminal can, for example, obtain the abnormal signal state (the noise state) and the normal signal state (the gesture start state) in the gesture start stage.
  • when the terminal acquires the abnormal signal state and the normal signal state in any of the gesture data acquisition stages, the terminal may acquire the duration of the abnormal signal state and the duration of the normal signal state.
  • when the terminal determines that the duration of the abnormal signal state is greater than the first duration, or that the duration of the normal signal state is greater than the second duration, the terminal may clear all feature values and all signal states in the gesture signal set.
  • the terminal detects the durations of the abnormal signal state and the normal signal state at any stage in the gesture data acquisition stage, so that when it is determined that the gesture data is abnormal, the gesture data does not need to be segmented, which can improve the accuracy of gesture data acquisition.
  • the first duration is a duration corresponding to the duration of the abnormal signal state.
  • the second duration is a duration corresponding to the duration of the normal signal state.
  • the first duration and the second duration do not specifically refer to a fixed duration, and the first duration and the second duration may be set based on a user's duration setting instruction, for example.
  • the values of the first duration and the second duration in each stage may be different.
  • the first duration in the gesture start phase may be 0.2 seconds
  • the first duration in the gesture phase may be, for example, 0.6 seconds
  • the first duration in the gesture end phase may be, for example, 0.3 seconds.
  • the first duration of the gesture initiation phase may be 0.2 seconds, and the second duration may be, for example, 0.5 seconds.
  • the terminal can obtain, for example, the abnormal signal state (the noise state) and the normal signal state (the gesture start state) in the gesture start stage.
  • the terminal can obtain the duration of the noise state and the duration of the normal signal state.
  • the duration of the terminal acquiring the noise state may be, for example, 0.3 seconds, and the duration of the normal signal state may be, for example, 0.4 seconds.
  • when the terminal determines that the duration of the abnormal signal state (0.3 seconds) is greater than the first duration (0.2 seconds), the terminal may clear all feature values and all signal states in the gesture signal set.
  • the terminal can directly time the abnormal signal state, that is, acquire the duration of the abnormal signal state; when the duration of the abnormal signal state is longer than the first duration, the terminal can clear all feature values and all signal states in the gesture signal set;
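  • The duration checks above can be sketched as a single guard. The limit values follow the 0.2 s / 0.5 s example for the gesture start stage; the function name and return convention are illustrative assumptions.

```python
def check_durations(abnormal_s, normal_s, max_abnormal_s, max_normal_s,
                    gesture_set):
    """Clear the gesture signal set and abort acquisition when either the
    abnormal-state or the normal-state duration limit is exceeded."""
    if abnormal_s > max_abnormal_s or normal_s > max_normal_s:
        gesture_set.clear()
        return False
    return True

# The abnormal state lasted 0.3 s against the 0.2 s first duration, so
# all feature values and signal states are cleared.
s = [(0.7, "gesture_start")]
print(check_durations(0.3, 0.4, 0.2, 0.5, s), s)  # False []
```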
  • since, in actual applications, the gesture start point is slightly earlier than the start point detected from the gesture signal collected by the sensor, the terminal adjusts the gesture start point, which can improve the accuracy of gesture data acquisition.
  • the terminal may acquire the starting point adjustment value corresponding to the starting point of the gesture.
  • the terminal may adjust the starting point of the gesture based on the starting point adjustment value to obtain the adjusted starting point of the gesture.
  • the terminal may determine the state change information between the adjusted gesture start point and the gesture end point as the signal data segment of the target gesture.
  • the start point adjustment value corresponding to the gesture start point can be set based on a user's adjustment instruction. For example, when the terminal determines that the recognition accuracy of the last target gesture is lower than the preset accuracy, the terminal may adjust the start point adjustment value based on the user's adjustment instruction.
  • the gesture start point may be, for example, the 3rd second
  • the gesture end point may be, for example, the 4th second.
  • the terminal can obtain the start point adjustment value corresponding to the gesture start point, which may be, for example, 0.08 seconds.
  • the terminal can adjust the starting point of the gesture based on the starting point adjustment value, and the adjusted starting point of the gesture is 2.92 seconds.
  • the terminal may determine the state change information between the adjusted gesture start point 2.92 seconds and the gesture termination point 4 seconds as the signal data segment of the target gesture.
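  • The adjustment above is simple arithmetic on the segment bounds; the sketch below uses the 3 s / 4 s / 0.08 s example values, and the function name is illustrative.

```python
def segment_bounds(start_s, end_s, start_adjust_s):
    """Move the gesture start point earlier by the start point adjustment
    value and return the bounds of the signal data segment in seconds."""
    return start_s - start_adjust_s, end_s

# Start point 3 s, termination point 4 s, adjustment value 0.08 s.
print(segment_bounds(3.0, 4.0, 0.08))  # (2.92, 4.0)
```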
  • the terminal may further adjust the gesture termination point, so as to improve the accuracy of gesture data acquisition.
  • the gesture signal collected by the sensor is acquired, and the feature value corresponding to the gesture signal is acquired; when it is detected in the noise stage that the first feature value is greater than or equal to the first threshold, it is determined to switch from the noise stage to the gesture data acquisition stage, and the first feature point is determined as the gesture start point of the gesture data acquisition stage; when it is detected in the gesture data acquisition stage that the second feature value is continuously smaller than the first threshold, it is determined to switch from the gesture data acquisition stage to the noise stage, and the third feature point corresponding to the third feature value is determined as the gesture termination point of the gesture data acquisition stage. This can reduce the inaccuracy of determining the gesture termination point directly based on a threshold, improve the accuracy of determining the gesture start and end points, and improve the accuracy of gesture data acquisition.
  • when the terminal determines the gesture start and end points, no manual marking or manual operation steps are required, which can reduce the steps of acquiring gesture data and improve the convenience of acquiring gesture data.
  • when the signal state corresponding to the feature point is obtained based on the feature value, and the feature value and signal state are added to the gesture signal set, the signal state of the feature point in any stage of the gesture data acquisition stage can be set, which can reduce the situation in which the gesture data is incomplete when determined directly based on a threshold, and can improve the accuracy of gesture data acquisition.
  • the terminal detects the durations of the abnormal signal state and the normal signal state at any stage in the gesture data acquisition stage, so that when it is determined that the gesture data is abnormal, the gesture data does not need to be segmented, which can improve the accuracy of gesture data acquisition.
  • when the signal change information satisfies the signal verification information, it is determined that the signal change information is the signal data segment of the target gesture, which can reduce the inaccuracy of obtaining gesture data based only on a threshold, and can improve the accuracy of gesture data acquisition.
  • FIG. 9 shows a schematic flowchart of a method for acquiring gesture data provided by an embodiment of the present application. Specifically:
  • at least one sensor may be provided in the terminal.
  • at least one sensor may be the same type of sensor, and may also be different types of sensors.
  • the at least one sensor may, for example, be pressure sensors from different manufacturers.
  • At least one sensor may collect a gesture signal.
  • the terminal may acquire the gesture signal collected by at least one sensor.
  • at least one sensor may output the collected gesture signal to the processor of the terminal.
  • the processor of the terminal may acquire the gesture signal collected by the at least one sensor.
  • the terminal may acquire gesture signals collected by the i sensor, the y sensor, and the u sensor.
  • the gesture data acquired by the i sensor of the terminal at time t may be expressed as formula (1), for example.
  • D_i represents the data dimension of the i sensor
  • d_i(t) is the gesture data collected by the i sensor at time t.
  • S302: splice the gesture signals collected by the at least one sensor to obtain a spliced gesture signal.
  • when the terminal acquires the gesture signals collected by the at least one sensor, the terminal may splice the gesture signals collected by the at least one sensor to obtain the spliced gesture signal.
  • when the terminal obtains the gesture signals collected by the at least one sensor at time t, it splices them to obtain the spliced gesture signal at time t; that is, the spliced gesture signal at time t is given by formula (2).
  • C represents the number of sensors.
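  • Since the body of formula (2) is not reproduced here, the splicing can only be sketched under its conventional reading: the per-sensor data vectors d_i(t) of the C sensors are concatenated into one vector d(t). The sensor shapes in the example are assumptions.

```python
def splice(signals):
    """Concatenate the per-sensor data vectors d_i(t) into one spliced
    gesture signal d(t) at time t."""
    spliced = []
    for sensor_data in signals:
        spliced.extend(sensor_data)
    return spliced

# e.g. a 3-axis sensor and a 1-dimensional sensor sampled at time t
print(splice([[0.1, 0.2, 0.3], [9.8]]))  # [0.1, 0.2, 0.3, 9.8]
```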
  • when the terminal splices the gesture signals collected by the at least one sensor, the terminal can obtain the spliced gesture signal. Based on the window size of the smoothing window and the weight coefficients corresponding to the smoothing window, the terminal may perform smoothing filtering on the spliced gesture signal, and the terminal may obtain the feature value corresponding to the spliced gesture signal.
  • the terminal can obtain the feature of the gesture signal at time t, and the feature f(t) of the gesture signal that the terminal can obtain at time t is defined as formula (3).
  • the smoothing filtering of f(t) by the terminal can be defined as formula (4).
  • L is the size of the smoothing window
  • w is the corresponding weight of the smoothing window
  • the terminal performs smooth filtering processing on the spliced gesture signal, so as to obtain the feature value corresponding to the spliced gesture signal.
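  • Since formula (4) is not reproduced here, the smoothing can only be sketched under a common reading: a causal weighted moving average with window size L and weights w. The window alignment, the requirement that the weights sum to 1, and the handling of the first L-1 samples are all assumptions.

```python
def smooth(f, w):
    """Weighted moving-average smoothing of the feature sequence f with a
    window of size L = len(w); w[0] weights the oldest sample."""
    L = len(w)
    out = []
    for t in range(len(f)):
        window = f[max(0, t - L + 1) : t + 1]
        if len(window) < L:   # not enough history yet: pass through
            out.append(f[t])
        else:
            out.append(sum(wi * xi for wi, xi in zip(w, window)))
    return out

# The spike at index 2 is spread out by the window [0.25, 0.5, 0.25].
print(smooth([1.0, 1.0, 4.0, 1.0], [0.25, 0.5, 0.25]))
# [1.0, 1.0, 1.75, 2.5]
```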
  • the terminal can also calculate the feature value based on the signal energy, the signal amplitude, the signal zero-crossing rate, and the signal correlation coefficient.
  • the terminal may perform smooth filtering on signal energy, signal amplitude, signal zero-crossing rate, and signal correlation coefficient to obtain feature values corresponding to the gesture signal.
  • the energy of the gesture signal is defined as formula (5).
  • d i (t) represents the i-th dimension data of gesture data d(t) at time t
  • D represents the dimension of gesture data
  • the gesture signal amplitude is defined as formula (6).
  • the zero-crossing rate of the gesture signal is defined as formula (7).
  • N represents the length of zero-crossing rate statistics.
  • the gesture signal correlation coefficient is defined as formula (8).
  • N represents the data length for statistical calculation of the correlation coefficient
  • k represents the data delay length
  • ⁇ > represents the dot product
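The four feature definitions above might be sketched as follows. Formulas (5) to (8) are not reproduced in this text, so the exact forms are assumptions: energy as a sum of squared components, amplitude as a Euclidean norm, the zero-crossing rate as a fraction of sign changes over the last N samples, and the correlation coefficient as an averaged dot product with a copy delayed by k samples:

```python
import numpy as np

def signal_energy(d):
    """Energy of the gesture data vector d(t): sum of squared components
    over its D dimensions (one plausible reading of formula (5))."""
    return float(np.sum(np.square(d)))

def signal_amplitude(d):
    """Amplitude as the Euclidean norm of d(t) (assumed form of formula (6))."""
    return float(np.linalg.norm(d))

def zero_crossing_rate(x, N=None):
    """Fraction of sign changes over the last N samples of a 1-D sequence
    (N is the statistics length; assumed form of formula (7))."""
    x = np.asarray(x[-N:] if N else x, dtype=float)
    return float(np.mean(np.signbit(x[1:]) != np.signbit(x[:-1])))

def correlation_coefficient(x, k, N=None):
    """Averaged dot product of the sequence with a copy delayed by k samples
    (k is the delay length; assumed form of formula (8))."""
    x = np.asarray(x[-N:] if N else x, dtype=float)
    a, b = x[:-k], x[k:]
    return float(a @ b) / len(a)

print(signal_energy([3.0, 4.0]))     # 25.0
print(signal_amplitude([3.0, 4.0]))  # 5.0
```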
  • the signal variation information includes a signal variation trend and a signal data segment length.
  • the terminal may generate signal change information based on all eigenvalues and all signal states.
  • the signal change information acquired by the terminal may include, for example, the signal change trend and the length of the signal data segment.
  • when the terminal determines that the signal change trend satisfies the signal verification trend and the signal data segment length satisfies the data verification length, it determines that the signal change information is the signal data segment of the target gesture.
  • checking the length of the signal data segment allows the terminal to reject segments whose length does not conform to the data verification length, reducing inaccurate gesture data acquisition and improving its accuracy.
  • the data verification length may be, for example, 0.9 seconds to 1.1 seconds.
  • when the terminal determines that the signal change trend satisfies the signal verification trend and the 1-second signal data segment length falls within the 0.9-to-1.1-second data verification length, it determines that the signal change information is the signal data segment of the target gesture.
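The trend-plus-length check in the example above can be sketched as a simple predicate; the name `is_target_gesture` and the boolean `trend_ok` input are hypothetical stand-ins for the signal verification trend comparison:

```python
def is_target_gesture(segment_len_s, trend_ok,
                      min_len_s=0.9, max_len_s=1.1):
    """A candidate segment is the target gesture's signal data segment only
    if the change trend matches the verification trend AND the segment
    length falls within the data verification length (0.9 s to 1.1 s in
    the example above)."""
    return trend_ok and (min_len_s <= segment_len_s <= max_len_s)

print(is_target_gesture(1.0, True))   # True
print(is_target_gesture(1.5, True))   # False: segment too long
```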
  • the terminal may determine that the signal change information is the signal data segment of the target gesture.
  • the terminal can segment the gesture data corresponding to the target gesture from the gesture signal set, and the terminal can switch the gesture data acquisition phase to the detection freeze phase.
  • the purpose of the detection freeze stage is to filter out gesture signal fluctuations caused by hand shaking after the gesture ends.
  • otherwise, the sensor would collect a gesture signal caused by hand shaking, and the terminal would determine the hand shaking to be a new target gesture based on that signal.
  • switching the gesture data acquisition stage to the detection freeze stage can filter out the noise data of the gesture end stage and improve the accuracy of gesture data acquisition.
  • when the terminal switches the gesture data acquisition stage to the detection freeze stage, it may time the duration of the detection freeze stage.
  • when the terminal detects that the duration of the detection freeze stage reaches the third duration, it can release the detection freeze stage; this reduces cases where the detection freeze state lasts too long and causes inaccurate acquisition of the next gesture data, improving the accuracy of gesture data acquisition.
  • the third duration is the duration for which the detection freeze stage lasts.
  • the third duration does not specifically refer to a fixed duration, and the third duration may be determined based on a user's duration setting instruction.
  • the third duration may be, for example, 0.5 seconds.
  • the terminal may acquire the duration of the detection freezing phase.
  • when the terminal detects that the duration of the detection freeze phase reaches 0.5 seconds, it can release the detection freeze state.
  • the terminal may start timing the detection freeze phase.
  • the terminal may directly release the detection freezing state.
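One plausible way to implement the detection freeze stage and its timed release, with the third duration defaulting to the 0.5-second example above; the `FreezeGate` class name and the injectable clock are illustrative choices, not part of the described embodiment:

```python
import time

class FreezeGate:
    """After a gesture segment is emitted, ignore new signals for
    freeze_s seconds (the 'third duration', e.g. 0.5 s), so that
    post-gesture hand shaking is not detected as a new gesture."""

    def __init__(self, freeze_s=0.5, clock=time.monotonic):
        self.freeze_s = freeze_s
        self.clock = clock              # injectable for testing
        self._frozen_until = None

    def start_freeze(self):
        """Enter the detection freeze stage and start timing it."""
        self._frozen_until = self.clock() + self.freeze_s

    def accepting(self):
        """True when not frozen; releases the freeze once it expires."""
        if self._frozen_until is None:
            return True
        if self.clock() >= self._frozen_until:
            self._frozen_until = None   # third duration reached: release
            return True
        return False
```

Using a monotonic clock rather than wall-clock time avoids spurious freeze releases when the system clock is adjusted.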
  • a gesture signal collected by at least one sensor is acquired, and the gesture signal collected by at least one sensor is spliced to obtain a spliced gesture signal.
  • performing smooth filtering on the spliced gesture signal based on the window size of the smoothing window and the weight coefficient corresponding to the smoothing window, so as to obtain the feature values corresponding to the spliced gesture signal, can reduce inaccurate feature value acquisition caused by noise data, improving the accuracy of both feature value acquisition and gesture data acquisition.
  • when the signal change information satisfies the signal verification information, after the signal change information is determined to be the signal data segment of the target gesture, the gesture data acquisition stage is switched to the detection freeze stage; when the duration of the detection freeze stage reaches the third duration, the detection freeze stage is released. This filters out the noise data of the gesture end stage, reduces the influence of sensor fluctuation and environmental noise on the gesture data, and improves the accuracy of gesture data acquisition.
  • FIG. 10 shows a schematic structural diagram of an apparatus for acquiring gesture data provided by an exemplary embodiment of the present application.
  • the device for acquiring gesture data can be implemented as all or part of a terminal through software, hardware, or a combination of the two.
  • the gesture data acquisition device 1000 includes a signal acquisition unit 1001, a signal addition unit 1002, an information acquisition unit 1003 and a data segment acquisition unit 1004, wherein:
  • a signal acquisition unit 1001 configured to acquire a gesture signal collected by a sensor, and acquire a feature value corresponding to the gesture signal
  • the signal adding unit 1002 is used to obtain the signal state corresponding to the feature point based on the feature value, and add the feature value and the signal state to the gesture signal set;
  • the feature point is the feature point of the feature value in the gesture data acquisition stage;
  • the information acquisition unit 1003 is used to acquire, in the gesture signal set, all feature values and all signal states between the gesture start point and the gesture end point in the gesture data acquisition stage, and generate signal change information based on all the feature values and all signal states;
  • the data segment acquisition unit 1004 is configured to determine that the signal change information is the signal data segment of the target gesture when the signal change information satisfies the signal verification information.
  • FIG. 11 shows a schematic structural diagram of an apparatus for acquiring gesture data according to an embodiment of the present application.
  • the gesture data acquisition device 1000 further includes a stage switching unit 1005, configured to: before the signal state corresponding to the feature point is acquired based on the feature value and the feature value and the signal state are added to the gesture signal set,
  • when it is detected in the noise stage that the first feature value is greater than or equal to the first threshold, determine to switch from the noise stage to the gesture data acquisition stage, and determine the first feature point corresponding to the first feature value as the gesture start point of the gesture data acquisition stage;
  • when it is detected in the gesture data acquisition stage that the second feature value is less than the first threshold, determine to switch from the gesture data acquisition stage to the noise stage, and determine the third feature point corresponding to the third feature value as the gesture end point of the gesture data acquisition stage, where the third feature point is the feature point corresponding to the last feature value greater than or equal to the first threshold before the second feature point corresponding to the second feature value.
  • the gesture data acquisition phase includes a gesture start phase, a gesture phase and a gesture end phase
  • the gesture start point is the start point of the gesture start phase
  • the gesture end point is the end point of the gesture end phase
  • when the stage switching unit 1005 determines to switch from the noise stage to the gesture data acquisition stage upon detecting in the noise stage that the first feature value is greater than or equal to the first threshold, and determines the first feature point corresponding to the first feature value as the gesture start point of the gesture data acquisition stage, it is specifically configured to:
  • when it is detected in the noise stage that the first feature value is greater than or equal to the first threshold, determine to switch from the noise stage to the gesture start stage, and determine the first feature point corresponding to the first feature value as the start point of the gesture start stage;
  • when it is detected in the gesture start stage that the fourth feature value is greater than or equal to the second threshold, determine to switch from the gesture start stage to the gesture stage, and determine the fourth feature point corresponding to the fourth feature value as the start point of the gesture stage;
  • the start point of the gesture phase is the end point of the gesture start phase, and the second threshold is greater than the first threshold.
  • when the stage switching unit 1005 determines to switch from the gesture data acquisition stage to the noise stage upon detecting in the gesture data acquisition stage that the second feature value is less than the first threshold, and determines the third feature point corresponding to the third feature value as the gesture end point of the gesture data acquisition stage, it is specifically configured to:
  • when it is detected in the gesture stage that the sixth feature value is less than the second threshold, determine to switch from the gesture stage to the gesture end stage, and determine the fifth feature point corresponding to the fifth feature value as the end point of the gesture stage, where the fifth feature point is the feature point corresponding to the last feature value greater than or equal to the second threshold before the sixth feature point corresponding to the sixth feature value;
  • determine the third feature point corresponding to the third feature value as the end point of the gesture end stage, where the third feature point is the feature point corresponding to the last feature value greater than or equal to the first threshold before the second feature point corresponding to the second feature value;
  • the termination point of the gesture phase is the start point of the gesture end phase, and the second threshold is greater than the first threshold.
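The two-threshold stage transitions described above (noise → gesture start → gesture → gesture end → noise, with the second threshold greater than the first) can be sketched as a small state machine. The per-sample transition rules below are one plausible reading of the text, not its exact definition; in particular, treating a drop below the first threshold during the gesture start stage as a return to noise is an assumption:

```python
# Stages mirror the text: noise -> gesture start -> gesture -> gesture end
# -> noise, where th2 > th1 (the second threshold is greater than the first).
def step(state, f, th1, th2):
    """Advance the stage machine by one feature value f."""
    if state == "noise":
        # feature value >= first threshold: gesture start stage begins
        return "gesture_start" if f >= th1 else "noise"
    if state == "gesture_start":
        if f >= th2:          # fourth feature value >= second threshold
            return "gesture"
        # falling back below th1 is treated as a return to noise (assumption)
        return "gesture_start" if f >= th1 else "noise"
    if state == "gesture":
        # sixth feature value < second threshold: gesture end stage begins
        return "gesture" if f >= th2 else "gesture_end"
    if state == "gesture_end":
        # second feature value < first threshold: back to the noise stage
        return "gesture_end" if f >= th1 else "noise"
    raise ValueError(f"unknown stage: {state}")

state = "noise"
for f in [0.1, 0.6, 1.2, 1.2, 0.6, 0.1]:   # with th1=0.5, th2=1.0
    state = step(state, f, 0.5, 1.0)
print(state)  # back to "noise" after the gesture completes
```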
  • when the stage switching unit 1005 acquires the signal state corresponding to the feature point based on the feature value and adds the feature value and the signal state to the gesture signal set, it is specifically configured to:
  • when in the noise stage, set the signal state corresponding to the first feature point to the noise state;
  • when in the gesture data acquisition stage, set the signal state corresponding to the first feature point to the gesture state.
  • when the signal adding unit 1002 acquires the signal state corresponding to the feature point based on the feature value and adds the feature value and the signal state to the gesture signal set, it is specifically configured to:
  • when in the noise stage, set the signal state corresponding to the third feature point to the noise state;
  • when in the gesture data acquisition stage, set the signal state corresponding to the third feature point to the gesture state.
  • when the signal adding unit 1002 acquires the signal state corresponding to the feature point based on the feature value and adds the feature value and the signal state to the gesture signal set, it is specifically configured to:
  • when in the noise stage, set the signal state corresponding to the sixth feature point to the noise state;
  • when in the gesture data acquisition stage, set the signal state corresponding to the sixth feature point to the gesture state.
  • the gesture data acquisition device further includes a signal clearing unit, configured to acquire the signal state corresponding to the feature point based on the feature value, and after adding the feature value and the signal state to the gesture signal set, acquire any of the gesture data acquisition stages.
  • FIG. 12 shows a schematic structural diagram of an apparatus for acquiring gesture data according to an embodiment of the present application.
  • the data segment acquisition unit 1004 further includes an adjustment value acquisition subunit 1014, a start point adjustment subunit 1024, and a data segment acquisition subunit 1034.
  • when the data segment acquisition unit 1004 determines that the signal change information is the signal data segment of the target gesture upon the signal change information satisfying the signal verification information:
  • the adjustment value obtaining subunit 1014 is configured to obtain the starting point adjustment value corresponding to the starting point of the gesture when the signal change information satisfies the signal verification information;
  • the starting point adjustment subunit 1024 is used to adjust the starting point of the gesture based on the starting point adjustment value to obtain the adjusted starting point of the gesture;
  • the data segment acquisition subunit 1034 is configured to determine the signal change information between the adjusted gesture start point and the gesture end point as the signal data segment of the target gesture.
  • FIG. 13 shows a schematic structural diagram of an apparatus for acquiring gesture data according to an embodiment of the present application.
  • the signal acquisition unit 1001 includes a signal acquisition subunit 1011, a signal splicing subunit 1021, and a feature value acquisition subunit 1031.
  • the signal acquisition unit 1001 is configured to acquire the gesture signal collected by the sensor and acquire the feature value corresponding to the gesture signal, including:
  • a signal acquisition subunit 1011 configured to acquire a gesture signal collected by at least one sensor
  • the signal splicing subunit 1021 is used for splicing the gesture signals collected by the at least one sensor to obtain a spliced gesture signal;
  • the feature value acquisition subunit 1031 is configured to perform smooth filtering on the spliced gesture signal based on the window size of the smoothing window and the weight coefficient corresponding to the smoothing window, to obtain the feature value corresponding to the spliced gesture signal.
  • the signal change information includes the signal change trend and the signal data segment length; when the data segment acquisition unit 1004 determines that the signal change information is the signal data segment of the target gesture upon the signal change information satisfying the signal verification information, it is specifically configured to:
  • when the signal change trend satisfies the signal verification trend and the signal data segment length satisfies the data verification length, determine that the signal change information is the signal data segment of the target gesture.
  • FIG. 14 shows a schematic structural diagram of an apparatus for acquiring gesture data according to an embodiment of the present application.
  • the gesture data acquisition device 1000 further includes a stage release unit 1006, configured to: when the signal change information satisfies the signal verification information, after the signal change information is determined to be the signal data segment of the target gesture, switch the gesture data acquisition stage to the detection freeze stage;
  • when the duration of the detection freeze stage reaches the third duration, release the detection freeze stage.
  • when the gesture data acquisition device provided in the above embodiments executes the gesture data acquisition method, the division into the above functional modules is used only as an example. In practical applications, the above functions may be allocated to different functional modules as required; that is, the internal structure of the device may be divided into different functional modules to complete all or part of the functions described above.
  • the gesture data acquisition device and the gesture data acquisition method embodiments provided by the above embodiments belong to the same concept, and the implementation process thereof is described in the method embodiments, which will not be repeated here.
  • the feature value corresponding to the gesture signal is obtained, the signal state corresponding to the feature point is obtained based on the feature value, and the feature value and the signal state are added to the gesture signal set.
  • in the gesture signal set, all feature values and all signal states between the gesture start point and the gesture end point of the gesture data acquisition stage are acquired, and signal change information is generated based on all the feature values and all the signal states; when the signal change information satisfies the signal verification information, the signal change information is determined to be the signal data segment of the target gesture.
  • the signal data segment of the target gesture can be acquired based only on the signal change information corresponding to the target gesture collected by the sensor, which can improve not only the accuracy of gesture data acquisition but also its convenience, thereby improving the user's gesture operation experience.
  • An embodiment of the present application further provides a computer storage medium. The computer storage medium may store a plurality of instructions, and the instructions are suitable for being loaded by a processor to execute the gesture data acquisition method of the embodiments shown in FIG. 3 to FIG. 9; for the specific execution process, reference may be made to the descriptions of those embodiments, which will not be repeated here.
  • the present application also provides a computer program product storing at least one instruction, where the at least one instruction is loaded by the processor to execute the gesture data acquisition method of the embodiments shown in FIG. 3 to FIG. 9.
  • FIG. 15 shows a structural block diagram of a terminal provided by an exemplary embodiment of the present application.
  • the terminal in this application may include one or more of the following components: a processor 110 , a memory 120 , an input device 130 , an output device 140 and a bus 150 .
  • the processor 110 , the memory 120 , the input device 130 and the output device 140 may be connected through a bus 150 .
  • the processor loads and executes the gesture data acquisition method according to the embodiment shown in FIG. 3 to FIG. 9 .
  • For the specific execution process, refer to the specific description of the embodiments shown in FIG. 3 to FIG. 9, which will not be repeated here.
  • the processor 110 may include one or more processing cores.
  • the processor 110 uses various interfaces and lines to connect various parts of the entire terminal, and executes various functions of the terminal 100 and processes data by running or executing the instructions, programs, code sets, or instruction sets stored in the memory 120 and calling the data stored in the memory 120.
  • the processor 110 may be implemented in hardware using at least one of digital signal processing (DSP), a field-programmable gate array (FPGA), and a programmable logic array (PLA).
  • the processor 110 may integrate one or a combination of a central processing unit (CPU), a graphics processing unit (GPU), a modem, and the like.
  • the CPU mainly handles the operating system, user interface, application programs, and the like; the GPU is responsible for rendering and drawing the display content; the modem handles wireless communication. It can be understood that the modem may alternatively not be integrated into the processor 110 and instead be implemented by a separate communication chip.
  • the memory 120 may include random access memory (RAM), or may include read-only memory (ROM).
  • the memory 120 includes a non-transitory computer-readable storage medium.
  • Memory 120 may be used to store instructions, programs, codes, sets of codes, or sets of instructions.
  • the memory 120 may include a program storage area and a data storage area, where the program storage area may store instructions for implementing an operating system, instructions for implementing at least one function (such as a touch function, a sound playback function, or an image playback function), instructions for implementing the following method embodiments, and so on.
  • the operating system may be an Android system (including systems developed in depth based on the Android system), Apple's iOS system (including systems developed in depth based on the iOS system), or other systems.
  • the storage data area may also store data created by the terminal during use, such as phone book, audio and video data, chat record data, and the like.
  • the memory 120 can be divided into an operating system space and a user space.
  • the operating system runs in the operating system space, and native and third-party applications run in the user space.
  • the operating system allocates corresponding system resources to different third-party applications.
  • different application scenarios in the same third-party application also have different requirements for system resources.
  • for example, a third-party application may have higher requirements on disk read speed in one scenario, while in an animation rendering scenario it has higher requirements on GPU performance.
  • the operating system and the third-party application are independent of each other, and the operating system often cannot perceive the current application scenario of the third-party application in time, so that the operating system cannot perform targeted system resource adaptation according to the specific application scenario of the third-party application.
  • the programs and data stored in the memory 120 are shown in FIG. 17: the memory 120 may store a Linux kernel layer 320, a system runtime library layer 340, an application framework layer 360, and an application layer 380, where the Linux kernel layer 320, the system runtime library layer 340, and the application framework layer 360 belong to the operating system space, and the application layer 380 belongs to the user space.
  • the Linux kernel layer 320 provides underlying drivers for various hardware of the terminal, such as display drivers, audio drivers, camera drivers, Bluetooth drivers, Wi-Fi drivers, and power management.
  • the system runtime layer 340 provides main feature support for the Android system through some C/C++ libraries.
  • the SQLite library provides database support
  • the OpenGL/ES library provides 3D drawing support
  • the Webkit library provides browser kernel support.
  • An Android runtime library (Android runtime) is also provided in the system runtime library layer 340, which mainly provides some core libraries, which can allow developers to use the Java language to write Android applications.
  • the application framework layer 360 provides various APIs that may be used when building applications, and developers can also build their own applications by using these APIs, such as activity management, window management, view management, notification management, content providers, package management, call management, resource management, and location management.
  • the iOS system includes: a core operating system layer 420 (Core OS layer), a core services layer 440 (Core Services layer), a media layer 460 (Media layer), and a touchable layer 480 (Cocoa Touch layer).
  • the core operating system layer 420 includes the operating system kernel, drivers, and low-level program frameworks, which provide functions closer to hardware for use by the program frameworks located in the core service layer 440 .
  • the core service layer 440 provides system services and/or program frameworks required by the application, such as a Foundation framework, an account framework, an advertisement framework, a data storage framework, a network connection framework, a geographic location framework, a motion framework, and the like.
  • the media layer 460 provides audiovisual interfaces for applications, such as graphics and image related interfaces, audio technology related interfaces, video technology related interfaces, and audio and video transmission technology wireless playback (AirPlay) interfaces.
  • the touchable layer 480 provides various common interface-related frameworks for application development and is responsible for the user's touch interaction with the terminal, such as the local notification service, remote push service, advertising framework, game tool framework, message user interface (UI) framework, UIKit framework, map framework, and so on.
  • the frameworks related to most applications include but are not limited to: the basic framework in the core service layer 440 and the UIKit framework in the touchable layer 480 .
  • the basic framework provides many basic object classes and data types, and provides the most basic system services for all applications, regardless of UI.
  • the classes provided by the UIKit framework are the basic UI class libraries for creating touch-based user interfaces. iOS applications can provide their UI based on the UIKit framework, so it provides the application's infrastructure for building user interfaces, drawing, handling user interaction events, responding to gestures, and more.
  • for the method and principle of implementing data communication between a third-party application and the operating system in the iOS system, reference may be made to the Android system, which will not be repeated in this application.
  • the input device 130 is used for receiving input instructions or data, and the input device 130 includes but is not limited to a keyboard, a mouse, a camera, a microphone or a touch device.
  • the output device 140 is used for outputting instructions or data, and the output device 140 includes, but is not limited to, a display device, a speaker, and the like.
  • the input device 130 and the output device 140 can be combined: the input device 130 and the output device 140 may be a touch display screen, which is used to receive touch actions on or near it by any suitable object such as a user's finger or a stylus, and to display the user interface of each application.
  • the touch display is usually located on the front panel of the terminal.
  • the touch screen can be designed as a full screen, a curved screen or a special-shaped screen.
  • the touch display screen can also be designed to be a combination of a full screen and a curved screen, or a combination of a special-shaped screen and a curved screen, which is not limited in this embodiment of the present application.
  • the structure of the terminal shown in the above drawings does not constitute a limitation on the terminal; the terminal may include more or fewer components than shown, combine certain components, or use a different component arrangement.
  • the terminal also includes components such as a radio frequency circuit, an input unit, a sensor, an audio circuit, a wireless fidelity (WiFi) module, a power supply, and a Bluetooth module, which will not be repeated here.
  • the execution body of each step may be the terminal described above.
  • the execution body of each step is the operating system of the terminal.
  • the operating system may be an Android system, an iOS system, or another operating system, which is not limited in this embodiment of the present application.
  • the terminal according to the embodiments of the present application may also have a display device installed thereon, and the display device may be any device that can implement a display function, such as a cathode ray tube (CRT) display, a light-emitting diode (LED) display, an electronic ink screen, a liquid crystal display (LCD), a plasma display panel (PDP), and so on.
  • the user can use the display device on the terminal 100 to view the displayed text, image, video and other information.
  • the terminal may be a smartphone, a tablet computer, a game device, an AR (Augmented Reality) device, a car, a data storage device, an audio playback device, a video playback device, a notebook or desktop computing device, or a wearable device such as an electronic watch, electronic glasses, an electronic helmet, an electronic bracelet, an electronic necklace, or electronic clothing.
  • the “unit” and “module” in this specification refer to software and/or hardware that can perform a specific function independently or in cooperation with other components, where the hardware may be, for example, a field-programmable gate array (FPGA), an integrated circuit (IC), or the like.
  • the disclosed apparatus may be implemented in other manners.
  • the apparatus embodiments described above are only illustrative; for example, the division of the units is only a logical function division, and there may be other division methods in actual implementation: multiple units or components may be combined or integrated into another system, or some features may be ignored or not implemented.
  • the shown or discussed mutual coupling or direct coupling or communication connection may be through some service interfaces, indirect coupling or communication connection of devices or units, and may be in electrical or other forms.
  • the units described as separate components may or may not be physically separated, and components displayed as units may or may not be physical units, that is, may be located in one place, or may be distributed to multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution in this embodiment.
  • each functional unit in each embodiment of the present application may be integrated into one processing unit, or each unit may exist physically alone, or two or more units may be integrated into one unit.
  • the above-mentioned integrated units may be implemented in the form of hardware, or may be implemented in the form of software functional units.
  • the integrated unit if implemented as a software functional unit and sold or used as a stand-alone product, may be stored in a computer-readable memory.
  • the technical solution of the present application can be embodied in the form of a software product in essence, or the part that contributes to the prior art, or all or part of the technical solution, and the computer software product is stored in a memory.
  • a computer device which may be a personal computer, a server, or a network device, etc.
  • the aforementioned memory includes: U disk, read-only memory (Read-Only Memory, ROM), random access memory (Random Access Memory, RAM), mobile hard disk, magnetic disk or optical disk and other media that can store program codes.

Abstract

本申请涉及计算机技术领域,尤其涉及一种手势数据获取方法、装置、终端及存储介质。其中,一种手势数据获取方法,包括:获取传感器采集的手势信号,获取手势信号对应的特征值;基于特征值获取特征点对应的信号状态,将特征值和信号状态添加至手势信号集合中;特征点为特征值在手势数据获取阶段中的特征点;在手势信号集合中,获取手势数据获取阶段的手势起始点和手势终止点之间的所有特征值和所有信号状态,基于所有特征值和所有信号状态生成信号变化信息;当信号变化信息满足信号验证信息时,确定信号变化信息为目标手势的信号数据段。采用本申请可以在提高手势数据获取准确性的同时提高手势数据获取的便利性。

Description

手势数据获取方法、装置、终端及存储介质 技术领域
本申请涉及计算机技术领域,尤其涉及一种手势数据获取方法、装置、终端及存储介质。
背景技术
随着科学技术的发展,终端的发展也越来越迅速,因此提高用户使用终端的便利性成为用户关注的焦点。其中,手势是一种重要的人机交互方式,手势不仅符合人的交互习惯,还具有简洁、高效、直接的特点。用户通常会在终端中输入不同的手势,终端中的传感器可以采集手势操作过程中的手势数据,从而终端可以对手势进行识别并执行与手势对应的操作。
发明内容
本申请实施例提供了一种手势数据获取方法、装置、终端及存储介质,可以在提高手势数据获取准确性的同时提高手势数据获取的便利性,进而可以提高用户的手势操作体验。本申请实施例的技术方案如下:
第一方面,本申请实施例提供了一种手势数据获取方法,所述方法包括:
获取传感器采集的手势信号,获取所述手势信号对应的特征值;
基于所述特征值获取特征点对应的信号状态,将所述特征值和所述信号状态添加至手势信号集合中;所述特征点为所述特征值在手势数据获取阶段中的特征点;
在所述手势信号集合中,获取所述手势数据获取阶段的手势起始点和手势终止点之间的所有特征值和所有信号状态,基于所述所有特征值和所述所有信号状态生成信号变化信息;
当所述信号变化信息满足信号验证信息时,确定所述信号变化信息为目标手势的信号数据段。
第二方面,本申请实施例提供了一种手势数据获取装置,所述装置包括:
信号获取单元,用于获取传感器采集的手势信号,获取所述手势信号对应的特征值;
信号添加单元,用于基于所述特征值获取特征点对应的信号状态,将所述特征值和所述信号状态添加至手势信号集合中;所述特征点为所述特征值在手势数据获取阶段中的特征点;
信息获取单元,用于在所述手势信号集合中,获取所述手势数据获取阶段的手势起始点和手势终止点之间的所有特征值和所有信号状态,基于所述所有特征值和所述所有信号状态生成信号变化信息;
数据段获取单元,用于当所述信号变化信息满足信号验证信息时,确定所述信号变化信息为目标手势的信号数据段。
第三方面,本申请实施例提供一种终端,可包括:处理器和存储器;其中,所述存储器存储有计算机程序,所述计算机程序适于由所述处理器加载并执行上述的方法步骤。
第四方面,本申请实施例提供一种计算机存储介质,所述计算机存储介质存储有多条指令,所述指令适于由处理器加载并执行上述的方法步骤。
本申请一些实施例提供的技术方案带来的有益效果至少包括:
在本申请一个或多个实施例中,通过获取传感器采集的手势信号,获取手势信号对应的特征值,并基于特征值获取特征点对应的信号状态,将特征值和信号状态添加至手势信号集合中,可以在手势信号集合中,获取手势数据获取阶段的手势起始点和手势终止点之间的所有特征值和所有信号状态,基于所有特征值和所有信号状态生成信号变化信息,在信号变化信息满足信号验证信息时,确定信号变化信息为目标手势的信号数据段。因此在获取手势数据时,无需人为控制,减少仅基于门限阈值获取手势数据不准确的情况,无需对模型手动标注手势数据,仅基于传感器采集的目标手势对应的信号变化信息即可获取目标手势的信号数据段,可以在提高手势数据获取准确性的同时提高手势数据获取的便利性,进而可以提高用户的手势操作体验。
附图说明
为了更清楚地说明本申请实施例或现有技术中的技术方案,下面将对实施例或现有技术描述中所需要使用的附图作简单地介绍,显而易见地,下面描述中的附图仅仅是本申请的一些实施例,对于本领域普通技术人员来讲,在不付出创造性劳动的前提下,还可以根据这些附图获得其他的附图。
图1示出本申请实施例的一种手势数据获取方法的背景示意图;
图2示出本申请实施例的一种手势数据获取方法的背景示意图;
图3示出本申请实施例提供的一种手势数据获取方法的流程示意图;
图4示出本申请实施例的一种手势数据获取方法的系统架构图;
图5示出本申请实施例提供的一种手势数据获取方法的流程示意图;
图6示出本申请实施例的一种手势数据获取方法的场景示意图;
图7示出本申请实施例的一种手势数据获取方法中手势数据的举例示意图;
图8示出本申请实施例的一种手势检测状态的切换示意图;
图9示出本申请实施例提供的一种手势数据获取方法的流程示意图;
图10示出本申请实施例提供的一种手势数据获取装置的结构示意图;
图11示出本申请实施例提供的一种手势数据获取装置的结构示意图;
图12示出本申请实施例提供的一种手势数据获取装置的结构示意图;
图13示出本申请实施例提供的一种手势数据获取装置的结构示意图;
图14示出本申请实施例提供的一种手势数据获取装置的结构示意图;
图15是本申请实施例提供的一种终端的结构示意图;
图16是本申请实施例提供的操作系统和用户空间的结构示意图;
图17是图15中安卓操作系统的架构图;
图18是图15中IOS操作系统的架构图。
具体实施方式
下面将结合本申请实施例中的附图,对本申请实施例中的技术方案进行清楚、完整地描述,显然,所描述的实施例仅仅是本申请一部分实施例,而不是全部的实施例。基于本申请中的实施例,本领域普通技术人员在没有做出创造性劳动前提下所获得的所有其他实施例,都属于本申请保护的范围。
在本申请的描述中,需要理解的是,术语“第一”、“第二”等仅用于描述目的,而不能理解为指示或暗示相对重要性。在本申请的描述中,需要说明的是,除非另有明确的规定和限定,“包括”和“具有”以及它们任何变形,意图在于覆盖不排他的包含。例如包含了一系列步骤或单元的过程、方法、系统、产品或设备没有限定于已列出的步骤或单元,而是可选地还包括没有列出的步骤或单元,或可选地还包括对于这些过程、方法、产品或设备固有的其他步骤或单元。对于本领域的普通技术人员而言,可以具体情况理解上述术语在本申请中的具体含义。此外,在本申请的描述中,除非另有说明,“多个”是指两个或两个以上。“和/或”,描述关联对象的关联关系,表示可以存在三种关系,例如,A和/或B,可以表示:单独存在A,同时存在A和B,单独存在B这三种情况。字符“/”一般表示前后关联对象是一种“或”的关系。
随着物联网技术的日益成熟,人机交互对于终端之间的交互和控制将尤为重要。手势交互是一种重要的人机交互方式。基于传感器的手势交互中,终端可以通过本身的传感器采集手势操作过程中的手势数据,以便用户可以对终端进行手势控制。手势交互的功耗和成本比较低,不受光照条件的影响,可以适用于不同的终端。基于传感器采集数据的精度以及终端本身的计算能力,可以确定终端对手势识别的准确性。基于传感器的手势交互过程主要包括手势检测与分割、手势识别两个阶段。手势检测和分割主要用于如何将手势操作过程中的完整传感器数据从连续的数据流中分割出来。
进一步,终端可以获取手势数据的方法包括手动控制与分割方法、单门限阈值检测方法、多门限阈值检测方法、基于上升下降沿的检测方法、基于模型的检测方法、以及自下而上检测分割法等。其中,手动控制与分割方式是指通过用户的操作来截取和分割手势数据。例如用户通过按压控件可以实现手势起始点的标记,并且在手势操作前后的预设时长内禁止使用终端。终端可以将手势起止点之间的数据作为手势数据。因此手动控制与分割方式需要人为控制,增加了手势数据获取步骤,降低了用户的手势操作体验。
进一步,图1示出本申请实施例的一种手势数据获取方法的背景示意图。如图1所示,终端可以设置起始阈值和结束阈值,采用单门限阈值检测方法获取手势数据。但是阈值设定过小,终端获取的手势数据中会存在噪声数据,阈值设定过大,不能准确确定手势起止点,使得终端获取到手势数据不准确,降低用户的手势操作体验。
进一步,图2示出本申请实施例的一种手势数据获取方法的背景示意图。如图2所示,终端可以采用多门限阈值检测方法获取手势数据。例如终端可以设置起始阈值、结束阈值、手势数据波峰峰值门限阈值和手势数据波谷门限阈值。其中,起始阈值可以和结束阈值相同,也可以不同。图2所示起始阈值和结束阈值相同。当手势起始缓慢时,终端会将手势起始阶段的数据丢弃,不能准确检测到手势起止点。当手势数据波动较为剧烈时,手势数据会被分割为多段数据,不能准确确定手势终止点,使得手势数据获取不准确,降低用户的手势操作体验。
进一步,终端还可以采用上升下降沿检测方法获取手势数据。例如终端可以计算传感器上升沿发生时刻T1和下降沿发生时刻T2,分别以上升沿发生时刻T1和下降沿发生时刻T2作为手势数据的起止点,终端可以获取到手势数据。但是在手势变化幅度较小时,终端采用上升下降沿检测方法无法获取到手势数据的起止点,使得手势数据获取不准确,降低用户的手势操作体验。
进一步,终端采用模型检测方法获取手势数据时,需要手动标注手势数据,并且该方法的计算量较大,降低手势数据获取的便利性,降低用户的手势操作体验。终端还可以采用自下而上的分割法,通过自下而上对手势数据进行合并,可以获取到手势数据。终端采用自下而上的分割法时,需要获取当前手势操作前后的手势数据,得到手势数据和非手势数据,会增加手势数据获取的步骤,降低手势数据获取的便利性,降低用户的手势操作体验。本申请提供一种手势数据获取方法,可以在提高手势数据获取准确性的同时提高手势数据获取的便利性。
下面结合具体的实施例对本申请进行详细说明。
在一个实施例中,如图3所示,提出了一种手势数据获取方法,该方法可依赖于计算机程序实现,可运行于包括传感器的手势数据获取装置上。该计算机程序可集成在应用中,也可作为独立的工具类应用运行。
其中,所述手势数据获取装置可以是具有传感器功能的终端,该终端包括但不限于:可穿戴设备、手持设备、个人电脑、平板电脑、车载设备、智能手机、计算设备或连接到无线调制解调器的其它处理设备等。在不同的网络中终端设备可以叫做不同的名称,例如:用户设备、接入终端、用户单元、用户站、移动站、移动台、远方站、远程终端、移动设备、用户终端、终端、无线通信设备、用户代理或用户装置、蜂窝电话、无绳电话、个人数字助理(personal digital assistant,PDA)、5G网络或未来演进网络中的终端设备等。
具体的,该手势数据获取方法包括:
S101,获取传感器采集的手势信号,获取手势信号对应的特征值;
根据一些实施例,传感器(transducer)是一种检测装置,能感受到被测量的信息,并能将感受到的信息,按一定规律变换成为电信号或其他所需形式的信息输出,以满足信息的传输、处理、存储、显示、记录和控制等要求。终端通常可内置的多种传感器,如陀螺传感器、磁力传感器、加速度传感器、方向传感器、压力传感器、温敏传感器、角速度传感器等。在实际应用中,传感器可以监测终端显示屏上是否存在手势输入。当传感器检测到存在手势输入时,传感器可以采集手势操作过程中的手势信号。
可选的,在实际应用中,手势信号的变化会随着用户输入手势的变化而变化。用户在三维空间执行手势操作的手势可以与终端中预先存储的手势一致,以便终端可以在识别到用户输入的手势时执行相应的操作。终端中存储的手势可以是终端出厂时设置的,还可以是终端基于用户的手势设置指令设置的,还可以是终端基于用户的手势修改指令进行修改的。
易于理解的是,手势信号是指用户在三维空间中执行空间手势操作时传感器采集到的手势信号,该空间手势是指用户执行手势操作时在三维空间中所执行的手势动作。该手势信号与空间手势相对应。该手势信号并不特指某一固定手势信号。当用户输入的空间手势发生变化时,该手势信号也会相应变化。当传感器的类型发生变化时,该手势信号也会相应变化。
可选的,特征值是指与手势信号对应的特征值。特征值是用于表示手势信号的数值。该特征值与手势信号相对应,因此当手势信号发生变化时,该特征值也会相应变化。
根据一些实施例,当用户执行空间手势操作时,传感器可以采集手势操作过程中的手势信号。其中,传感器采集的手势信号的数量至少一个。图4示出本申请实施例的一种手势数据获取方法的系统架构图。如图4所示,终端可以获取传感器采集的手势信号,即终端的处理器可以接收传感器采集的手势信号。当终端获取到传感器采集的手势信号时,终端可以获取该手势信号的特征值。终端例如还可以获取到多个传感器采集的手势信号时,终端可以获取到该手势信号对应的特征值。
S102,基于特征值获取特征点对应的信号状态,将特征值和信号状态添加至手势信号集合中;
根据一些实施例,信号状态是指与特征点对应的状态,该信号状态用于表示特征点在手势输入过程中的状态。该信号状态包括但不限于初始状态、手势输入状态、手势状态、手势结束状态、噪声状态、手势结束判定状态等。其中,一个特征点对应一种信号状态。
易于理解的是,特征点为特征值在手势数据获取阶段中的特征点。一个特征值对应一个特征点,该特征点包括但不限于时间点、时刻点等。该特征点可以是基于终端自身的时间记录获取的,还可以是终端基于当前绝对时间确定特征值对应的特征点。
可选的,手势信号集合是指特征值和与特征点对应的信号状态汇总而成的集合。该手势信号集合中包括至少一个特征值和特征点对应的信号状态。该手势信号集合并不特指某一固定集合。当手势信号发生变化时,手势信号的特征值也会相应变化,基于特征值获取到的特征点对应的信号状态也会相应变化,因此手势信号集合也会相应变化。
根据一些实施例,当终端获取到传感器采集的手势信号时,终端可以获取手势信号的特征值。当终端获取到手势信号的特征值时,终端可以获取该特征值对应的特征点。当终端获取到该特征点时,终端可以基于特征值获取该特征点对应的信号状态。终端获取到特征点对应的信号状态时,终端可以将特征值和信号状态添加至手势信号集合中。
S103,在手势信号集合中,获取手势数据获取阶段的手势起始点和手势终止点之间的所有特征值和所有信号状态,基于所有特征值和所有信号状态生成信号变化信息;
根据一些实施例,手势数据获取阶段是指终端获取手势数据的阶段。手势信号集合中包括的特征值和信号状态是与传感器采集到的所有手势信号对应的。由于所有的手势信号并不都是目标手势对应的手势信号,因此终端需要获取手势数据获取阶段的手势起始点和手势终止点。
可选的,手势起始点用于表示手势数据获取阶段的起始点,也就是说手势起始点用于表示手势开始输入的点。该手势起始点基于用户的手势输入操作确定的。终端确定手势起始点可以基于手势信号的特征值和特征阈值之间的大小关系确定。
根据一些实施例,手势终止点用于表示手势数据获取阶段的结束点,也就是说手势终止点用于表示手势结束输入的点。该手势终止点并不特指某一固定结束点,该手势终止点是基于用户的手势输入操作确定的。
易于理解的是,信号变化信息用于表示手势数据获取阶段的整体的信号变化信息。该信号变化信息与手势数据获取阶段的手势起始点和手势终止点之间的所有特征值和所有信号状态对应的。该信号变化信息并不特指某一固定信号变化信息。当手势起始点或手势终止点发生变化时,该信号变化信息也会相应变化。
根据一些实施例,当终端获取到传感器采集的手势信号时,终端可以获取手势信号的特征值。当终端获取到手势信号的特征值时,终端可以获取该特征值对应的特征点。当终端获取到该特征点时,终端可以基于特征值获取该特征点对应的信号状态。终端获取到特征点对应的信号状态时,终端可以将特征值和信号状态添加至手势信号集合中。终端可以在手势信号集合中获取手势获取阶段的手势起始点和手势终止点,并获取手势起始点和手势终止点之间的所有特征值和所有信号状态。终端可以基于手势起始点和手势终止点之间的所有特征值和所有信号状态,生成信号变化信息,即终端可以获取到手势数据获取阶段的信号变化信息。
S104,当信号变化信息满足信号验证信息时,确定信号变化信息为目标手势的信号数据段。
根据一些实施例,信号验证信息是指用于验证手势数据获取阶段的信号变化信息是否与目标手势的信号状态变化一致。该信号验证信息为终端中存储的用于验证信号变化信息的信息。该信号验证信息可以是终端出厂时设置的,还可以是终端基于用户的设置指令设置的,或者是终端基于服务器推送的更新信息设置的。
可选的,终端基于信号变化信息确定用户输入的空间手势为完整手势时,终端可以确定该完整手势为目标手势。该目标手势例如可以是用户肢体按照移动轨迹在三维空间中所形成的手势动作。目标手势并不特指某一固定手势,该目标手势可以基于用户输入的手势确定。
易于理解的是,信号数据段是指手势数据获取的阶段,该信号数据段包括的手势信号即为目标手势对应的手势信号。
根据一些实施例,当终端获取到手势数据获取阶段的信号状态变化信息时,终端可以检测该信号变化信息是否满足信号验证信息。当终端检测到该信号变化信息满足信号验证信息时,终端可以确定信号变化信息为目标手势的信号数据段,即终端可以确定手势数据获取阶段的手势起始点和手势终止点之间的数据为目标手势对应的手势数据。
在本申请一个或多个实施例中,通过获取传感器采集的手势信号,获取手势信号对应的特征值,并基于特征值获取特征点对应的信号状态,将特征值和信号状态添加至手势信号集合中,可以在手势信号集合中,获取手势数据获取阶段的手势起始点和手势终止点之间的所有特征值和所有信号状态,基于所有特征值和所有信号状态生成信号变化信息,在信号变化信息满足信号验证信息时,确定信号变化信息为目标手势的信号数据段。因此在获取手势数据时,无需人为控制,减少仅基于门限阈值获取手势数据不准确的情况,无需对模型手动标注手势数据,仅基于传感器采集的目标手势对应的信号变化信息即可获取目标手势的信号数据段,可以在提高手势数据获取准确性的同时提高手势数据获取的便利性,进而可以提高用户的手势操作体验。另外,本申请实施例的方案计算量小,可以减少手势数据获取时长,提高手势数据的获取效率。
请参见图5,图5示出本申请实施例提供的一种手势数据获取方法的流程示意图。具体的:
S201,获取传感器采集的手势信号,获取手势信号对应的特征值;
具体过程如上所述,此处不再赘述。
根据一些实施例,由于不同用户输入手势的差异,因此终端采集手势之前可以接收用户的频率设置指令。终端可以基于该频率设置指令设置采样频率,可以提高手势数据获取的准确性。该频率设置指令包括但不限于语音频率设置指令、点击频率设置指令、定时频率设置指令等等。该频率设置指令例如可以是语音频率设置指令,该语音频率设置指令例如可以是“将传感器的采样频率设置为100Hz”,则终端可以将传感器的采样频率设置为100Hz,即终端的传感器可以每10ms采集一个手势信号。定时频率设置指令例如可以是设置不同的时间段或者不同的时间点对应不同的频率的指令。
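上述采样频率与采样周期之间的换算可以用如下示意代码表示(其中每10ms采集一次、即100Hz为示例取值,并非对本申请的限定):

```python
def sample_period_ms(freq_hz):
    """由采样频率(Hz)换算采样周期(毫秒)。"""
    return 1000.0 / freq_hz

# 示例:采样频率为 100Hz 时,传感器每 10ms 采集一个手势信号
period = sample_period_ms(100)
```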
易于理解的是,当终端获取手势信号的特征值时,终端可以获取当前时刻手势信号的特征函数,并基于该特征函数获取到当前时刻的手势信号的特征值。因此终端可以获取到传感器采集的所有手势信号对应的特征值。
可选的,图6示出本申请实施例的一种手势数据获取方法的场景示意图。如图6所示,当传感器检测到用户输入目标手势时,传感器可以采集手势信号。终端可以获取到传感器采集的手势信号。该目标手势为空间手势。
S202,当在噪声阶段中检测到存在第一特征值大于或等于第一阈值时,确定由噪声阶段切换至手势数据获取阶段,并将第一特征值对应的第一特征点确定为手势数据获取阶段的手势起始点;
根据一些实施例,图7示出本申请实施例的一种手势数据获取方法中手势数据的举例示意图。噪声阶段是指仅包括噪声数据的阶段。在噪声阶段,传感器采集得到的手势数据例如可以由终端噪声和环境噪声构成。本步骤的噪声阶段可以是指用户输入手势前,传感器采集到的手势信号对应的特征值中仅存在一个第一特征值大于或者等于第一阈值的阶段。
易于理解的是,当在噪声阶段时,终端可以检测第一特征值是否大于或等于第一阈值。第一特征值是指在噪声阶段终端获取到的传感器采集到的手势信号对应的特征值。该第一特征值并不特指某一固定特征值,该第一特征值可以用于表示在噪声阶段获取到的特征值。第一阈值是指与特征值对应的临界值。该第一阈值并不特指某一固定阈值。例如终端可以基于用户的阈值更改指令更改该第一阈值。该阈值更改指令包括但不限于语音阈值更改指令、点击阈值更改指令和定时阈值更改指令等。该阈值更改指令例如可以是用户基于历史手势数据获取结果确定的。例如,当用户基于历史手势数据结果确定第一阈值较大时,用户可以输入“将第一阈值调小至第三阈值”的语音阈值更改指令。终端可以基于该语音阈值更改指令更改第一阈值。其中,第三阈值小于第一阈值。
根据一些实施例,当终端在噪声阶段检测到存在第一特征值大于或者等于第一阈值时,终端可以确定由噪声阶段切换至手势数据获取阶段。终端可以将该第一特征值对应的第一特征点确定为手势数据获取阶段的手势起始点。
易于理解的是,第一阈值例如可以是a阈值。当在噪声阶段终端检测到存在A特征值大于a阈值时,终端可以确定由噪声阶段切换至手势数据获取阶段。终端可以将A特征值对应的A1特征点确定为手势数据获取阶段的手势起始点。
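噪声阶段到手势数据获取阶段的阈值切换逻辑可以示意如下(其中特征序列与阈值取值均为假设,仅用于说明):

```python
def detect_gesture_start(features, first_threshold):
    """返回第一个大于或等于第一阈值的特征点下标,作为手势起始点;未检测到时返回 None。"""
    for idx, value in enumerate(features):
        if value >= first_threshold:
            return idx  # 检测到第一特征值达到第一阈值,由噪声阶段切换至手势数据获取阶段
    return None

# 示例:下标 0~2 为噪声段特征值,下标 3 处首次达到阈值,即为手势起始点
start_point = detect_gesture_start([0.1, 0.2, 0.15, 0.9, 1.2], 0.5)
```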
根据一些实施例,当终端在初始阶段检测到存在特征值大于或者等于第一阈值时,终端也可以由初始阶段切换至手势数据获取阶段。初始阶段是指未获取到手势信号的阶段。
根据一些实施例,手势数据获取阶段包括手势起始阶段、手势阶段和手势结束阶段,手势起始点为手势起始阶段的起始点,手势终止点为手势结束阶段的终止点。手势起始阶段用于表示目标手势的起始输入阶段,手势阶段用于表示目标手势的输入阶段,手势结束阶段用于表示目标手势的结束输入阶段。终端中存储的信号验证信息例如可以是由手势起始阶段切换至手势阶段,再由手势阶段切换至手势结束阶段的信息。
易于理解的是,手势起始阶段例如可以是指特征值大于第一阈值,且小于第二阈值的信号数据段。手势阶段例如可以是指特征值在第二阈值上下剧烈波动的信号数据段。手势结束阶段例如可以是指在手势阶段之后,特征值大于第一阈值,且小于第二阈值的信号数据段。需要说明的是,手势数据获取阶段中的任一阶段并不仅仅包括上述限定的特征值,在特征值不满足当前阶段的特征值要求时,终端需要对该特征值进行检测。例如,在手势起始阶段V特征值小于第一阈值时,终端例如可以检测在V1特征点后的预设时长内是否存在X特征值大于第一阈值。若终端检测到存在X特征值大于第一阈值时,终端可以确定V特征值属于手势起始阶段,终端可以将V特征值和V1特征点对应的信号状态添加至手势信号集合。若终端检测到不存在X特征值大于第一阈值时,终端可以确定V特征值不属于手势起始阶段,终端可以清除手势数据集合中的特征值和信号状态。
根据一些实施例,噪声阶段切换至手势数据获取阶段时,当终端在噪声阶段中检测到存在第一特征值大于或等于第一阈值时,可以确定由噪声阶段切换至手势起始阶段,并将第一特征值对应的第一特征点确定为手势起始阶段的起始点。该手势起始阶段的起始点即为手势数据获取阶段的手势起始点。
易于理解的是,第一阈值例如可以是a阈值。当在噪声阶段终端检测到存在A特征值大于a阈值时,终端可以确定由噪声阶段切换至手势起始阶段。终端可以将A特征值对应的A1特征点确定为手势起始阶段的起始点。
根据一些实施例,当在手势起始阶段时,终端可以检测是否存在第四特征值大于或者等于第二阈值。其中,第二阈值大于第一阈值,第一阈值用于检测手势数据获取阶段的起止点,第二阈值用于检测是否有与目标手势对应的手势动作的发生。第二阈值并不特指某一固定阈值。该第二阈值例如可以是基于用户的阈值设置指令变化。当在手势起始阶段中检测到存在第四特征值大于或等于第二阈值时,终端可以确定由手势起始阶段切换至手势阶段,并将第四特征值对应的第四特征点确定为手势阶段的起始点。其中,手势阶段的起始点为手势起始阶段的终止点。第四特征点例如可以是目标手势对应的手势信号采集时,终端基 于特征值确定的第一个大于或者等于第二阈值的特征值对应的特征点。
易于理解的是,第二阈值例如可以是b阈值。当在手势起始阶段,且终端检测第四特征值S特征值大于b阈值时,终端可以将手势起始阶段切换至手势阶段,终端可以将S特征值对应的S1特征点设置为手势阶段的起始点,S1特征点还是手势起始阶段的终止点。
S203,当在手势数据获取阶段中检测到持续存在第二特征值小于第一阈值时,确定由手势数据获取阶段切换至噪声阶段,将第三特征值对应的第三特征点确定为手势数据获取阶段的手势终止点;
根据一些实施例,第二特征值是指手势数据获取阶段中持续小于第一阈值的特征值,该第二特征值并不特指某一固定特征值,该第二特征值例如可以包括多个特征值。第三特征值为第二特征值对应的第二特征点之前最后一个大于或等于第一阈值的特征值。该第三特征值并不特指某一固定特征值。例如当用户输入的目标手势发生变化时,该第三特征值也会相应变化。
易于理解的是,当在手势数据获取阶段时,终端可以检测是否持续存在第二特征值小于第一阈值。当终端检测到持续存在第二特征值小于第一阈值时,终端可以确定由手势数据获取阶段切换至噪声阶段。终端可以将第三特征点确定为手势数据获取阶段的手势终止点。其中,第三特征点为第二特征值对应的第二特征点之前,最后一个大于或等于第一阈值的特征值对应的特征点。终端检测是否持续存在第二特征值小于第一阈值,可以减少直接基于门限阈值确定手势终止点不准确的情况,可以提高手势起止点确定的准确性,可以提高手势数据获取的准确性。另外,终端确定手势起止点时无需手动标注,无需人工操作步骤,可以减少手势数据获取步骤,可以提高手势数据获取的便利性。
可选的,第一阈值例如可以是a阈值。在噪声阶段切换至手势数据获取阶段之后,当终端检测到持续存在B特征值小于a阈值时,终端可以确定由手势数据获取阶段切换至噪声阶段。终端可以将W特征值对应的W1特征点确定为手势数据获取阶段的手势终止点。其中,W特征值例如可以为B特征值对应的B1特征点之前最后一个大于a阈值的特征值。
根据一些实施例,当终端由噪声阶段切换至手势数据获取阶段时,终端可以由噪声阶段切换至手势起始阶段。当在手势起始阶段中检测到存在第四特征值大于或等于第二阈值时,终端可以确定由手势起始阶段切换至手势阶段。当在手势阶段中检测到持续存在第六特征值小于第二阈值时,终端确定由手势阶段切换至手势结束阶段,将第五特征值对应的第五特征点确定为手势阶段的终止点。其中,该手势阶段的终止点为手势结束阶段的起始点,第二阈值大于第一阈值。其中,第五特征点为第五特征值对应的特征点,第五特征值为第六特征值对应的第六特征点之前,最后一个大于或等于第二阈值的特征值,即第五特征点为第六特征值对应的第六特征点之前,最后一个大于或等于第二阈值的特征值对应的特征点。
易于理解的是,第二阈值例如可以是b阈值。当在手势起始阶段,且终端检测第四特征值S特征值大于b阈值时,终端可以将手势起始阶段切换至手势阶段。在手势阶段,当终端检测到F特征值持续小于b阈值时,终端可以将手势阶段切换至手势结束阶段。终端可以将E特征值对应的E1特征点确定为手势阶段的终止点。其中,E特征值例如可以为F特征值对应的F1特征点之前最后一个大于b阈值的特征值。
根据一些实施例,当在手势阶段中检测到持续存在第六特征值小于第二阈值时,终端确定由手势阶段切换至手势结束阶段。当在手势结束阶段中检测到持续存在第二特征值小于第一阈值时,终端确定由手势结束阶段切换至噪声阶段,即终端确定由手势数据获取阶段切换至噪声阶段。此时,终端可以将第三特征值对应的第三特征点确定为手势结束阶段的终止点,第三特征点为第二特征值对应的第二特征点之前,最后一个大于或等于第一阈值的特征值对应的特征点。其中,第二特征值所在阶段可以为手势结束状态判定阶段。手势结束状态判定阶段可以是噪声阶段的部分阶段。
S204,基于特征值获取特征点对应的信号状态,将特征值和信号状态添加至手势信号集合中;
具体过程如上所述,此处不再赘述。
根据一些实施例,在手势数据获取阶段中任一阶段,终端可以基于特征值获取特征点对应的信号状态,并将特征值和信号状态添加至手势信号集合中,减少基于多门限阈值检测方法对手势数据进行丢弃的情况,可以提高手势数据获取的准确性。
可选的,当终端基于特征值获取到特征点对应的信号状态时,终端可以仅对特征点对应的信号状态进行设置,终端还可以基于相邻两个特征点的信号状态对信号状态进行切换。图8示出本申请实施例的一种手势检测状态的切换示意图。如图8所示,状态0表示初始状态,未检测到手势动作,状态1表示检测到手势起始段数据;状态2表示检测到手势段数据;状态3表示检测到手势结束段数据;状态4表示检测到手势结束状态判定段数据;状态5表示检测到噪声段数据;状态6表示进入检测冻结状态;ε_lower表示第一阈值;ε_upper表示第二阈值;L_stable表示手势结束状态判定段数据的持续时长,L_max_buf表示手势结束段数据的持续时长,L_freeze表示检测冻结状态的持续时长;L_start表示起始点对应的起始点调整值;x表示特征值。
根据一些实施例,终端可以基于相邻两个特征点的信号状态对信号状态进行切换。例如O特征点和P特征点为相邻的两个特征点。O特征点对应的信号状态例如可以是手势起始状态。P特征点对应的信号状态例如可以是噪声状态,因此当终端确定P特征点的信号状态时,可以将信号状态由手势起始状态切换为噪声状态。因此在手势数据获取阶段的任一阶段中可以包括至少一种信号状态。
易于理解的是,在处于手势起始阶段中,当终端检测到第一特征值小于第一阈值时,终端将第一特征点对应的信号状态设置为噪声状态。在处于手势起始阶段中,当检测到第一特征值大于或等于第一阈值,且小于第二阈值时,终端将第一特征点对应的信号状态设置为手势起始状态。在处于手势起始阶段中,当检测到第一特征值大于或等于第二阈值时,终端将第一特征点对应的信号状态设置为手势状态,该第一特征值对应的第一特征点即为手势阶段的起始点。当终端获取到手势起始阶段的第一特征点对应的信号状态时,终端可以将第一特征值和第一特征点对应的信号状态添加至手势信号集合中。
可选的,第一阈值例如可以是a阈值。第二阈值例如可以是b阈值。当处于手势起始阶段中,终端检测到第一特征值A特征值小于a阈值时,终端可以将A1特征点对应的信号状态设置为噪声状态,并将A特征值和A1特征点对应的噪声状态添加至手势信号集合中。当处于手势起始阶段中,终端检测到第一特征值A特征值大于a阈值且小于b阈值时,终端可以将A1特征点对应的信号状态设置为手势起始状态,并将A特征值和A1特征点对应的手势起始状态添加至手势信号集合中。当处于手势起始阶段中,终端检测到第一特征值A特征值大于b阈值时,终端可以将A1特征点对应的信号状态设置为手势状态,并将A特征值和A1特征点对应的手势状态添加至手势信号集合中。
易于理解的是,在处于手势阶段中,当检测到第三特征值小于第一阈值时,终端可以将第三特征点对应的信号状态设置为噪声状态。在处于手势阶段中,当检测到第三特征值大于或等于第一阈值,且小于第二阈值时,终端可以将第三特征点对应的信号状态设置为手势结束状态。在处于手势阶段中,当检测到第三特征值大于或等于第二阈值时,终端可以将第三特征点对应的信号状态设置为手势状态。当终端获取到手势阶段的第三特征点对应的信号状态时,终端可以将第三特征值和第三特征点对应的信号状态添加至手势信号集合中。
可选的,第一阈值例如可以是a阈值。第二阈值例如可以是b阈值。当处于手势阶段中,终端检测到第三特征值W特征值小于a阈值时,终端可以将W1特征点对应的信号状态设置为噪声状态,并将W特征值和W1特征点对应的噪声状态添加至手势信号集合中。当处于手势阶段中,终端检测到第三特征值W特征值大于a阈值且小于b阈值时,终端可以将W1特征点对应的信号状态设置为手势结束状态,并将W特征值和W1特征点对应的手势起始状态添加至手势信号集合中。当处于手势阶段中,终端检测到第三特征值A特征值大于b阈值时,终端可以将W1特征点对应的信号状态设置为手势状态,并将W特征值和W1特征点对应的手势状态添加至手势信号集合中。
易于理解的是,在处于手势结束阶段中,当检测到第六特征值小于第一阈值时,终端可以将第六特征点对应的信号状态设置为噪声状态。在处于手势结束阶段中,当检测到第六特征值大于或等于第一阈值,且小于第二阈值时,终端可以将第六特征点对应的信号状态设置为手势结束状态。在处于手势结束阶段中,当检测到第六特征值大于或等于第二阈值时,终端可以将第六特征点对应的信号状态设置为手势状态,该第六特征值对应的第六特征点即为手势结束阶段的起始点。当终端获取到手势结束阶段的第六特征点对应的信号状态时,终端可以将第六特征值和第六特征点对应的信号状态添加至手势信号集合中。
可选的,第一阈值例如可以是a阈值。第二阈值例如可以是b阈值。当处于手势结束阶段中,终端检测到第六特征值F特征值小于a阈值时,终端可以将F1特征点对应的信号状态设置为噪声状态,并将F特征值和F1特征点对应的噪声状态添加至手势信号集合中。当处于手势结束阶段中,终端检测到第六特征值F特征值大于a阈值且小于b阈值时,终端可以将F1特征点对应的信号状态设置为手势结束状态,并将F特征值和F1特征点对应的手势结束状态添加至手势信号集合中。当处于手势结束阶段中,终端检测到第六特征值A特征值大于b阈值时,终端可以将F1特征点对应的信号状态设置为手势状态,并将F特征值和F1特征点对应的手势状态添加至手势信号集合中。
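上述各阶段中特征点信号状态的设置规则可以用如下示意代码概括(状态名称与阈值关系取自上文描述,函数与参数命名为假设性写法):

```python
def signal_state(stage, x, eps_lower, eps_upper):
    """依据特征值 x 与第一阈值 eps_lower、第二阈值 eps_upper 的大小关系,
    返回手势起始阶段/手势阶段/手势结束阶段中特征点对应的信号状态(示意实现)。"""
    if x < eps_lower:
        return "噪声状态"
    if x >= eps_upper:
        return "手势状态"
    # eps_lower <= x < eps_upper:手势起始阶段记为手势起始状态,其余阶段记为手势结束状态
    return "手势起始状态" if stage == "手势起始阶段" else "手势结束状态"
```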
S205,获取手势数据获取阶段中任一阶段的异常信号状态和正常信号状态;
根据一些实施例,异常信号状态并不特指某一固定信号状态,该异常信号状态为与任一阶段不对应的信号状态,正常信号状态则为与任一阶段对应的信号状态。例如当手势数据获取阶段中任一阶段为手势起始阶段时,手势起始阶段的异常信号状态即为信号状态不为手势起始状态的信号状态,手势起始阶段的正常信号状态即为信号状态为手势起始状态的信号状态。
易于理解的是,当终端基于特征值获取特征点对应的信号状态,将特征值和信号状态添加至手势信号集合中之后,终端可以获取手势数据获取阶段中任一阶段的异常信号状态和正常信号状态。
可选的,当终端基于特征值获取特征点对应的信号状态,将特征值和信号状态添加至手势信号集合中之后,终端例如可以获取手势起始阶段的异常信号状态噪声状态和正常信号状态手势起始状态。
S206,在异常信号状态的持续时长大于第一时长,或正常信号状态的持续时长大于第二时长时,清除手势信号集合中的所有特征值和所有信号状态;
根据一些实施例,当终端获取到手势数据获取阶段中任一阶段的异常信号状态和正常信号状态时,终端可以获取异常信号状态的持续时长和正常信号状态的持续时长。当终端确定异常信号状态的持续时长大于第一时长,或正常信号状态的持续时长大于第二时长时,终端可以清除手势信号集合中的所有特征值和所有信号状态。终端对手势数据获取阶段中任一阶段的异常信号状态和正常信号状态的持续时长的检测,可以在确定手势数据异常时无需对手势数据进行分割,可以提高手势数据获取的准确性。
易于理解的是,第一时长是与异常信号状态的持续时长对应的时长。第二时长是与正常信号状态的持续时长对应的时长。第一时长和第二时长并不特指某一固定时长,该第一时长和第二时长例如可以基于用户的时长设置指令进行设置。
可选的,各个阶段中第一时长和第二时长的取值可以不一样。例如,手势起始阶段中第一时长可以是0.2秒,手势阶段中第一时长例如可以是0.6秒,手势结束阶段中第一时长例如可以是0.3秒。
根据一些实施例,手势起始阶段中第一时长可以是0.2秒,第二时长例如可以是0.5秒。当终端基于特征值获取特征点对应的信号状态,将特征值和信号状态添加至手势信号集合中之后,终端例如可以获取手势起始阶段的异常信号状态噪声状态和正常信号状态手势起始状态。当终端获取到手势起始阶段的异常信号状态噪声状态和正常信号状态手势起始状态时,终端可以获取噪声状态的持续时长和正常信号状态的持续时长。终端获取到的噪声状态的持续时长例如可以是0.3秒,正常信号状态的持续时长例如可以是0.4秒。当终端确定异常信号状态的持续时长大于第一时长0.2秒时,终端可以清除手势信号集合中的所有特征值和所有信号状态。
易于理解的是,当终端获取手势信号,且确定手势数据获取阶段中任一阶段的特征点对应的信号状态为异常信号状态时,终端可以直接对该异常信号状态进行计时,即可以获取该异常信号状态的持续时长。在异常信号状态的持续时长大于第一时长,终端可以清除手势信号集合中的所有特征值和所有信号状态;
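异常信号状态与正常信号状态持续时长的判定可以示意如下(第一时长、第二时长取上文示例的0.2秒、0.5秒,仅作说明):

```python
def should_clear(abnormal_duration, normal_duration, first_limit=0.2, second_limit=0.5):
    """异常信号状态持续时长大于第一时长,或正常信号状态持续时长大于第二时长时,
    返回 True,表示需要清除手势信号集合中的所有特征值和所有信号状态(示意)。"""
    return abnormal_duration > first_limit or normal_duration > second_limit

# 示例:噪声状态持续 0.3 秒,超过第一时长 0.2 秒,应清除手势信号集合
clear = should_clear(abnormal_duration=0.3, normal_duration=0.4)
```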
S207,当信号变化信息满足信号验证信息时,确定信号变化信息为目标手势的信号数据段;
具体过程如上所述,此处不再赘述。
根据一些实施例,由于实际应用中的手势起始点会比传感器采集到的手势信号早一点,因此终端对手势起始点进行调整,可以提高手势数据获取的准确性。当信号变化信息满足信号验证信息时,终端可以获取手势起始点对应的起始点调整值。当终端获取到手势起始点对应的起始点调整值时,终端可以基于起始点调整值,对手势起始点进行调整,得到调整后的手势起始点。终端可以将调整后的手势起始点和手势终止点之间的状态变化信息确定为目标手势的信号数据段。
易于理解的是,手势起始点对应的起始点调整值可以基于用户的调整指令进行设置。例如,在终端确定上一个目标手势的识别准确度低于预设准确度时,终端可以基于用户的调整指令对起始点调整值进行调整。
根据一些实施例,手势起始点例如可以是第3秒,手势终止点例如可以是第4秒。例如当信号变化信息满足由手势起始阶段切换至手势阶段,再由手势阶段切换至手势结束阶段的信息时,终端可以获取手势起始点对应的起始点调整值例如可以是0.08秒。当终端获取到手势起始点对应的起始点调整值时,终端可以基于起始点调整值,对手势起始点进行调整,得到调整后的手势起始点为第2.92秒。终端可以将调整后的手势起始点第2.92秒和手势终止点第4秒之间的状态变化信息确定为目标手势的信号数据段。
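手势起始点的调整即为一次简单的减法运算,按上文示例(起始点第3秒、调整值0.08秒)可示意为:

```python
def adjust_start_point(start_point_s, adjustment_s):
    """基于起始点调整值将手势起始点向前调整(示意实现,单位为秒)。"""
    return start_point_s - adjustment_s

adjusted = adjust_start_point(3.0, 0.08)  # 调整后的手势起始点为第 2.92 秒
```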
根据一些实施例,当信号变化信息满足信号验证信息时,终端还可以对手势终止点进行调整,以便提高手势数据获取的准确性。
在本申请一个或多个实施例中,获取传感器采集的手势信号,获取手势信号对应的特征值,可以在噪声阶段中检测到存在第一特征值大于或等于第一阈值时,确定由噪声阶段切换至手势数据获取阶段,并将第一特征点确定为手势数据获取阶段的起始点,当在手势数据获取阶段中检测到持续存在第二特征值小于第一阈值时,确定由手势数据获取阶段切换至噪声阶段,将第三特征值对应的第三特征点确定为手势数据获取阶段的手势终止点,可以减少直接基于门限阈值确定手势终止点不准确的情况,可以提高手势起止点确定的准确性,可以提高手势数据获取的准确性。另外,终端确定手势起止点时无需手动标注,无需人工操作步骤,可以减少手势数据获取步骤,可以提高手势数据获取的便利性。其次,基于特征值获取特征点对应的信号状态,将特征值和信号状态添加至手势信号集合中时,可以对手势数据获取阶段中任一阶段中的特征点的信号状态进行设置,可以减少直接基于门限阈值导致手势数据不完整的情况,可以提高手势数据获取的准确性。另外,终端对手势数据获取阶段中任一阶段的异常信号状态和正常信号状态的持续时长的检测,可以在确定手势数据异常时无需对手势数据进行分割,可以提高手势数据获取的准确性。最后,当信号变化信息满足信号验证信息时,确定信号变化信息为目标手势的信号数据段,可以减少仅基于门限阈值获取手势数据不准确的情况,可以提高手势数据获取的准确性。
请参见图9,图9示出本申请实施例提供的一种手势数据获取方法的流程示意图。具体的:
S301,获取至少一个传感器采集的手势信号;
根据一些实施例,终端中设置的传感器可以是至少一个。其中,至少一个传感器可以是同一类型的传感器,还可以是不同类型的传感器。例如至少一个传感器可以是不同生产商生产的压力传感器。
根据一些实施例,终端的传感器检测到存在空间手势输入时,至少一个传感器可以采集手势信号。终端可以获取至少一个传感器采集的手势信号。具体可以是,至少一个传感器可以将采集的手势信号输出至终端的处理器。终端的处理器可以获取至少一个传感器采集的手势信号。
可选的,终端中的至少一个传感器为i传感器、y传感器和u传感器时,终端可以获取i传感器、y传感器和u传感器采集的手势信号。终端获取到i传感器在t时刻采集到的手势数据例如可以是公式(1)。
d_i(t) = \left[ d_i^{1}(t),\; d_i^{2}(t),\; \cdots,\; d_i^{D_i}(t) \right]    (1)
其中,D_i表示i传感器的数据维度;d_i(t)为i传感器在t时刻采集到的手势数据。
S302,对至少一个传感器采集的手势信号进行拼接,得到拼接手势信号;
根据一些实施例,终端获取到至少一个传感器采集的手势信号时,终端可以对至少一个传感器采集的手势信号进行拼接,得到拼接手势信号。
易于理解的是,当终端获取到t时刻至少一个传感器采集的手势信号时,对至少一个传感器采集的手势信号进行拼接,得到t时刻的拼接手势信号,即t时刻的拼接手势信号为公式(2)。
d(t) = \left[ d_1(t),\; d_2(t),\; \cdots,\; d_C(t) \right]    (2)
其中,C表示传感器的个数。
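公式(2)所述的多传感器信号拼接可以示意如下(各传感器的维度与数值均为假设取值):

```python
def concat_signals(sensor_signals):
    """将 C 个传感器在同一时刻采集的多维手势数据按顺序拼接为一个向量(示意实现)。"""
    merged = []
    for signal in sensor_signals:
        merged.extend(signal)
    return merged

# 示例:三个传感器的维度分别为 2、1、3,拼接后得到 6 维向量
d_t = concat_signals([[0.1, 0.2], [0.3], [0.4, 0.5, 0.6]])
```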
S303,基于平滑窗口的窗口尺寸和平滑窗口对应的权重系数,对拼接手势信号进行平滑滤波处理,得到拼接手势信号对应的特征值;
根据一些实施例,当终端对至少一个传感器采集的手势信号进行拼接,终端可以得到拼接手势信号。基于平滑窗口的窗口尺寸和平滑窗口对应的权重系数,终端可以对拼接手势信号进行平滑滤波处理,终端可以得到拼接手势信号对应的特征值。
易于理解的是,终端可以获取t时刻手势信号的特征,t时刻手势信号的特征f(t)定义为公式(3)。
f(t) = \mathcal{F}\left( d(t) \right)    (3)
可选的,终端对f(t)的平滑滤波可以定义为公式(4)。
\bar{f}(t) = \sum_{l=0}^{L-1} w_l \, f(t-l)    (4)
\sum_{l=0}^{L-1} w_l = 1
其中,L为平滑窗口大小,w为平滑窗口对应权重。
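公式(4)所述的加权平滑滤波可以示意如下(此处假设权重之和为1,窗口大小 L 等于权重个数):

```python
def smooth_feature(f, t, weights):
    """对特征序列 f 在 t 时刻做窗口加权平滑:sum(w[l] * f[t-l]),l = 0..L-1(示意实现)。"""
    return sum(w * f[t - l] for l, w in enumerate(weights))

# 示例:常数序列经归一化权重平滑后数值不变
smoothed = smooth_feature([1.0, 1.0, 1.0, 1.0], 3, [0.5, 0.3, 0.2])
```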
根据一些实施例,终端对拼接手势信号进行平滑滤波处理,可以得到拼接手势信号对应的特征值。
易于理解的是,终端获取手势信号对应的特征值时,终端还可以基于信号能量、信号幅度、信号过零率、信号相关系数计算得到特征值。其中,终端可以对信号能量、信号幅度、信号过零率、信号相关系数进行平滑滤波得到手势信号对应的特征值。
其中,手势信号的能量定义为公式(5)。
E(t) = \sum_{i=1}^{D} d_i(t)^2    (5)
其中,d_i(t)表示t时刻手势数据d(t)的第i维数据,D表示手势数据维度。
手势信号幅度定义为公式(6)。
A(t) = \sum_{i=1}^{D} \left| d_i(t) \right|    (6)
手势信号过零率定义为公式(7)。
Z(t) = \frac{1}{2N} \sum_{n=0}^{N-2} \left| \operatorname{sgn}\left( d(t-n) \right) - \operatorname{sgn}\left( d(t-n-1) \right) \right|    (7)
其中,N表示过零率统计数据长度。
手势信号相关系数定义为公式(8)。
\rho(t, k) = \frac{\left\langle d_{t-N+1:t},\; d_{t-N+1-k:t-k} \right\rangle}{\left\| d_{t-N+1:t} \right\| \cdot \left\| d_{t-N+1-k:t-k} \right\|}    (8)
其中,N表示相关系数统计计算数据长度,k表示数据延迟长度,<·>表示点积。
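上述信号能量、信号幅度、信号过零率与信号相关系数可按其常见定义示意如下(具体表达式以公式(5)~(8)为准,此处实现为一种假设性写法):

```python
import math

def energy(d):
    """信号能量的一种常见定义:各维数据平方和。"""
    return sum(x * x for x in d)

def amplitude(d):
    """信号幅度的一种常见定义:各维数据绝对值之和。"""
    return sum(abs(x) for x in d)

def zero_crossing_rate(seq):
    """信号过零率的一种常见定义:相邻样本符号变化次数除以统计数据长度。"""
    sgn = lambda v: 1 if v >= 0 else -1
    crossings = sum(abs(sgn(seq[n]) - sgn(seq[n + 1])) // 2 for n in range(len(seq) - 1))
    return crossings / len(seq)

def correlation(seq, k):
    """延迟 k 下归一化相关系数的一种常见定义:点积除以两段数据范数之积。"""
    a, b = seq[k:], seq[:len(seq) - k]
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b)))
```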
S304,基于特征值获取特征点对应的信号状态,将特征值和信号状态添加至手势信号集合中;
具体过程如上所述,此处不再赘述。
S305,在手势信号集合中,获取手势数据获取阶段的手势起始点和手势终止点之间的所有特征值和所有信号状态,基于所有特征值和所有信号状态生成信号变化信息;
具体过程如上所述,此处不再赘述。
S306,当信号变化信息满足信号验证信息时,确定信号变化信息为目标手势的信号数据段;
具体过程如上所述,此处不再赘述。
根据一些实施例,信号变化信息包括信号变化趋势和信号数据段长度。当终端在手势信号集合中获取手势数据获取阶段的手势起始点和手势终止点之间的所有特征值和所有信号状态时,终端可以基于所有特征值和所有信号状态生成信号变化信息。终端获取到的信号变化信息例如可以包括信号变化趋势和信号数据段长度。当终端确定信号变化趋势满足信号验证趋势,且信号数据段长度满足数据验证长度时,确定信号变化信息为目标手势的信号数据段。终端对信号数据段长度的检测可以减少数据段长度不符合数据验证长度的情况,减少手势数据获取不准确的情况,可以提高手势数据获取的准确性。
易于理解的是,数据验证长度例如可以是0.9秒到1.1秒。当终端获取到的信号数据段长度为1秒时,终端确定信号变化趋势满足信号验证趋势,且信号数据段长度1秒满足数据验证长度0.9秒到1.1秒时,终端确定信号变化信息为目标手势的信号数据段。
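信号变化趋势与信号数据段长度的联合验证可以示意如下(数据验证长度取上文示例的0.9秒到1.1秒,仅作说明):

```python
def is_target_segment(trend_ok, segment_len_s, min_len_s=0.9, max_len_s=1.1):
    """信号变化趋势满足信号验证趋势,且信号数据段长度落在数据验证长度范围内时,
    判定该信号变化信息为目标手势的信号数据段(示意实现)。"""
    return trend_ok and (min_len_s <= segment_len_s <= max_len_s)

# 示例:趋势满足且数据段长度为 1 秒,判定为目标手势的信号数据段
ok = is_target_segment(True, 1.0)
```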
S307,将手势数据获取阶段切换为检测冻结阶段;
根据一些实施例,当信号变化信息满足信号验证信息时,终端可以确定信号变化信息为目标手势的信号数据段。当终端确定信号变化信息为目标手势的信号数据段时,终端可以从手势信号集合将目标手势对应的手势数据分割出来,终端可以将手势数据获取阶段切换为检测冻结阶段。检测冻结阶段是为了过滤手势结束后,由于手部的抖动引起的手势信号波动。在一些情况下,在手势输入结束后,由于手部的抖动,传感器会采集到手势信号,终端会基于该手势信号将手部的抖动确定为一个新的目标手势。在实际应用中,每一个目标手势结束后,会有一个短暂的停顿,终端不会立刻执行新的手势动作。因此终端将手势数据获取阶段切换为检测冻结阶段,可以过滤掉手势结束阶段的噪声数据,提高手势数据获取的准确性。
S308,当检测冻结阶段的持续时长达到第三时长时,解除检测冻结阶段。
根据一些实施例,当终端将手势数据获取阶段切换至检测冻结阶段时,终端可以获取检测冻结阶段的持续时长。当终端检测到检测冻结阶段的持续时长达到第三时长时,终端可以解除检测冻结阶段,可以减少检测冻结状态持续时长过长导致下一个手势数据获取不准确的情况,可以提高手势数据获取的准确性。其中,第三时长是指与检测冻结阶段的持续时长对应的时长。该第三时长并不特指某一固定时长,该第三时长可以基于用户的时长设置指令确定。
易于理解的是,第三时长例如可以是0.5秒。当终端将手势数据获取阶段切换至检测冻结阶段时,终端可以获取检测冻结阶段的持续时长。当终端检测到检测冻结阶段的持续时长达到0.5秒时,终端可以解除检测冻结状态。
可选的,当终端将手势数据获取阶段切换至检测冻结阶段时,终端可以开始对检测冻结阶段进行计时。当终端对检测冻结阶段的计时时长达到第三时长时,终端可以直接解除检测冻结状态。
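检测冻结阶段的计时与解除逻辑可以示意如下(第三时长取上文示例的0.5秒,仅作说明):

```python
def freeze_released(elapsed_s, third_duration_s=0.5):
    """检测冻结阶段的持续时长达到第三时长时,返回 True 表示解除检测冻结阶段(示意实现)。"""
    return elapsed_s >= third_duration_s

released = freeze_released(0.5)  # 持续时长达到 0.5 秒,解除检测冻结阶段
```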
在本申请一个或多个实施例中,获取至少一个传感器采集的手势信号,对至少一个传感器采集的手势信号进行拼接,得到拼接手势信号,基于平滑窗口的窗口尺寸和平滑窗口对应的权重系数,对拼接手势信号进行平滑滤波处理,得到拼接手势信号对应的特征值,可以减少噪声数据导致特征值获取不准确的情况,可以提高特征值获取的准确性,可以提高手势数据获取的准确性。其次,当信号变化信息满足信号验证信息时,确定信号变化信息为目标手势的信号数据段之后,将手势数据获取阶段切换为检测冻结阶段,当检测冻结阶段的持续时长大于第三时长时,解除检测冻结阶段,可以过滤掉手势结束阶段的噪声数据,减少传感器本身波动和环境噪声对手势数据的影响,可以提高手势数据获取的准确性。
下述为本申请装置实施例,可以用于执行本申请方法实施例。对于本申请装置实施例中未披露的细节,请参照本申请方法实施例。
请参见图10,其示出了本申请一个示例性实施例提供的手势数据获取装置的结构示意图。该手势数据获取装置可以通过软件、硬件或者两者的结合实现成为装置的全部或一部分。该手势数据获取装置1000包括信号获取单元1001、信号添加单元1002、信息获取单元1003和数据段获取单元1004,其中:
信号获取单元1001,用于获取传感器采集的手势信号,获取手势信号对应的特征值;
信号添加单元1002,用于基于特征值获取特征点对应的信号状态,将特征值和信号状态添加至手势信号集合中;特征点为特征值在手势数据获取阶段中的特征点;
信息获取单元1003,用于在手势信号集合中,获取手势数据获取阶段的手势起始点和手势终止点之间的所有特征值和所有信号状态,基于所有特征值和所有信号状态生成信号变化信息;
数据段获取单元1004,用于当信号变化信息满足信号验证信息时,确定信号变化信息为目标手势的信号数据段。
根据一些实施例,图11示出本申请实施例的一种手势数据获取装置的结构示意图。如图11所示,该手势数据获取装置1000还包括阶段切换单元1005,用于基于特征值获取特征点对应的信号状态,将特征值和信号状态添加至手势信号集合中之前,当在噪声阶段中检测到存在第一特征值大于或等于第一阈值时,确定由噪声阶段切换至手势数据获取阶段,并将第一特征值对应的第一特征点确定为手势数据获取阶段的手势起始点;
当在手势数据获取阶段中检测到持续存在第二特征值小于第一阈值时,确定由手势数据获取阶段切换至噪声阶段,将第三特征值对应的第三特征点确定为手势数据获取阶段的手势终止点,第三特征点为第二特征值对应的第二特征点之前,最后一个大于或等于第一阈值的特征值对应的特征点。
根据一些实施例,手势数据获取阶段包括手势起始阶段、手势阶段和手势结束阶段,手势起始点为手势起始阶段的起始点,手势终止点为手势结束阶段的终止点。
根据一些实施例,阶段切换单元1005,用于当在噪声阶段中检测到存在第一特征值大于或等于第一阈值时,确定由噪声阶段切换至手势数据获取阶段,并将第一特征值对应的第一特征点确定为手势数据获取阶段的手势起始点时,具体用于:
当在噪声阶段中检测到存在第一特征值大于或等于第一阈值时,确定由噪声阶段切换至手势起始阶段,并将第一特征值对应的第一特征点确定为手势起始阶段的起始点;
当在手势起始阶段中检测到存在第四特征值大于或等于第二阈值时,确定由手势起始阶段切换至手势阶段,并将第四特征值对应的第四特征点确定为手势阶段的起始点;
其中,手势阶段的起始点为手势起始阶段的终止点,第二阈值大于第一阈值。
根据一些实施例,阶段切换单元1005,用于当在手势数据获取阶段中检测到持续存在第二特征值小于第一阈值时,确定由手势数据获取阶段切换至噪声阶段,将第三特征值对应的第三特征点确定为手势数据获取阶段的手势终止点时,具体用于:
当在手势阶段中检测到持续存在第六特征值小于第二阈值时,确定由手势阶段切换至手势结束阶段,将第五特征值对应的第五特征点确定为手势阶段的终止点,第五特征点为第六特征值对应的第六特征点之前,最后一个大于或等于第二阈值的特征值对应的特征点;
当在手势结束阶段中检测到持续存在第二特征值小于第一阈值时,确定由手势结束阶段切换至噪声阶段,将第三特征值对应的第三特征点确定为手势结束阶段的终止点,第三特征点为第二特征值对应的第二特征点之前,最后一个大于或等于第一阈值的特征值对应的特征点;
其中,手势阶段的终止点为手势结束阶段的起始点,第二阈值大于第一阈值。
根据一些实施例,阶段切换单元1005,用于基于特征值获取特征点对应的信号状态,将特征值和信号状态添加至手势信号集合中时,具体用于:
在处于手势起始阶段中,当检测到第一特征值小于第一阈值时,将第一特征点对应的信号状态设置为噪声状态;
当检测到第一特征值大于或等于第一阈值,且小于第二阈值时,将第一特征点对应的信号状态设置为手势起始状态;
当检测到第一特征值大于或等于第二阈值时,将第一特征点对应的信号状态设置为手势状态。
根据一些实施例,信号添加单元1002,用于基于特征值获取特征点对应的信号状态,将特征值和信号状态添加至手势信号集合中时,具体用于:
在处于手势阶段中,当检测到第三特征值小于第一阈值时,将第三特征点对应的信号状态设置为噪声状态;
当检测到第三特征值大于或等于第一阈值,且小于第二阈值时,将第三特征点对应的信号状态设置为手势结束状态;
当检测到第三特征值大于或等于第二阈值时,将第三特征点对应的信号状态设置为手势状态。
根据一些实施例,信号添加单元1002,用于基于特征值获取特征点对应的信号状态,将特征值和信号 状态添加至手势信号集合中时,具体用于:
在处于手势结束阶段中,当检测到第六特征值小于第一阈值时,将第六特征点对应的信号状态设置为噪声状态;
当检测到第六特征值大于或等于第一阈值,且小于第二阈值时,将第六特征点对应的信号状态设置为手势结束状态;
当检测到第六特征值大于或等于第二阈值时,将第六特征点对应的信号状态设置为手势状态。
根据一些实施例,该手势数据获取装置还包括信号清除单元,用于基于特征值获取特征点对应的信号状态,将特征值和信号状态添加至手势信号集合中之后,获取手势数据获取阶段中任一阶段的异常信号状态和正常信号状态;
在异常信号状态的持续时长大于第一时长,或正常信号状态的持续时长大于第二时长时,清除手势信号集合中的所有特征值和所有信号状态。
根据一些实施例,图12示出本申请实施例的一种手势数据获取装置的结构示意图。如图12所示数据段获取单元1004还包括调整值获取子单元1014、起始点调整子单元1024和数据段获取子单元1034,数据段获取单元1004用于当信号变化信息满足信号验证信息时,确定信号变化信息为目标手势的信号数据段时:
调整值获取子单元1014,用于当信号变化信息满足信号验证信息时,获取手势起始点对应的起始点调整值;
起始点调整子单元1024,用于基于起始点调整值,对手势起始点进行调整,得到调整后的手势起始点;
数据段获取子单元1034,用于将调整后的手势起始点和手势终止点之间的状态变化信息确定为目标手势的信号数据段。
根据一些实施例,图13示出本申请实施例的一种手势数据获取装置的结构示意图。如图13所示,信号获取单元1001包括信号获取子单元1011、信号拼接子单元1021和特征值获取子单元1031,信号获取单元用于获取传感器采集的手势信号,获取手势信号对应的特征值,包括:
信号获取子单元1011,用于获取至少一个传感器采集的手势信号;
信号拼接子单元1021,用于对至少一个传感器采集的手势信号进行拼接,得到拼接手势信号;
特征值获取子单元1031,用于基于平滑窗口的窗口尺寸和平滑窗口对应的权重系数,对拼接手势信号进行平滑滤波处理,得到拼接手势信号对应的特征值。
根据一些实施例,数据段获取单元1004,用于信号变化信息包括信号变化趋势和信号数据段长度;当信号变化信息满足信号验证信息时,确定信号变化信息为目标手势的信号数据段时,具体用于:
当信号变化趋势满足信号验证趋势,且信号数据段长度满足数据验证长度时,确定信号变化信息为目标手势的信号数据段。
根据一些实施例,图14示出本申请实施例的一种手势数据获取装置的结构示意图。如图14所示,该手势数据获取装置1000还包括阶段解除单元1006,用于当信号变化信息满足信号验证信息时,确定信号变化信息为目标手势的信号数据段之后,将手势数据获取阶段切换为检测冻结阶段;
当检测冻结阶段的持续时长达到第三时长时,解除检测冻结阶段。
需要说明的是,上述实施例提供的手势数据获取装置在执行手势数据获取方法时,仅以上述各功能模块的划分进行举例说明,实际应用中,可以根据需要而将上述功能分配由不同的功能模块完成,即将设备的内部结构划分成不同的功能模块,以完成以上描述的全部或者部分功能。另外,上述实施例提供的手势数据获取装置与手势数据获取方法实施例属于同一构思,其具体实现过程详见方法实施例,这里不再赘述。
上述本申请实施例序号仅仅为了描述,不代表实施例的优劣。
在本申请实施例中,通过获取传感器采集的手势信号,获取手势信号对应的特征值,并基于特征值获取特征点对应的信号状态,将特征值和信号状态添加至手势信号集合中,可以在手势信号集合中,获取手势数据获取阶段的手势起始点和手势终止点之间的所有特征值和所有信号状态,基于所有特征值和所有信号状态生成信号变化信息,在信号变化信息满足信号验证信息时,确定信号变化信息为目标手势的信号数据段。因此在获取手势数据时,无需人为控制,减少仅基于门限阈值获取手势数据不准确的情况,无需对模型手动标注手势数据,仅基于传感器采集的目标手势对应的信号变化信息即可获取目标手势的信号数据段,可以在提高手势数据获取准确性的同时提高手势数据获取的便利性,进而可以提高用户的手势操作体验。
本申请实施例还提供了一种计算机存储介质,所述计算机存储介质可以存储有多条指令,所述指令适于由处理器加载并执行如上述图3-图9所示实施例的所述手势数据获取方法,具体执行过程可以参见图3-图9所示实施例的具体说明,在此不进行赘述。
本申请还提供了一种计算机程序产品,该计算机程序产品存储有至少一条指令,所述至少一条指令由所述处理器加载并执行如上述图3-图9所示实施例的所述手势数据获取方法,具体执行过程可以参见图3-图9所示实施例的具体说明,在此不进行赘述。
请参考图15,其示出了本申请一个示例性实施例提供的终端的结构方框图。本申请中的终端可以包括一个或多个如下部件:处理器110、存储器120、输入装置130、输出装置140和总线150。处理器110、存储器120、输入装置130和输出装置140之间可以通过总线150连接。所述处理器加载并执行如上述图3-图9所示实施例的所述手势数据获取方法,具体执行过程可以参见图3-图9所示实施例的具体说明,在此不进行赘述。
处理器110可以包括一个或者多个处理核心。处理器110利用各种接口和线路连接整个终端内的各个部分,通过运行或执行存储在存储器120内的指令、程序、代码集或指令集,以及调用存储在存储器120内的数据,执行终端100的各种功能和处理数据。可选地,处理器110可以采用数字信号处理(digital signal processing,DSP)、现场可编程门阵列(field-programmable gate array,FPGA)、可编程逻辑阵列(programmable logic Array,PLA)中的至少一种硬件形式来实现。处理器110可集成中央处理器(central processing unit,CPU)、图像处理器(graphics processing unit,GPU)和调制解调器等中的一种或几种的组合。其中,CPU主要处理操作系统、用户界面和应用程序等;GPU用于负责显示内容的渲染和绘制;调制解调器用于处理无线通信。可以理解的是,上述调制解调器也可以不集成到处理器110中,单独通过一块通信芯片进行实现。
存储器120可以包括随机存储器(random Access Memory,RAM),也可以包括只读存储器(read-only memory,ROM)。可选地,该存储器120包括非瞬时性计算机可读介质(non-transitory computer-readable storage medium)。存储器120可用于存储指令、程序、代码、代码集或指令集。存储器120可包括存储程序区和存储数据区,其中,存储程序区可存储用于实现操作系统的指令、用于实现至少一个功能的指令(比如触控功能、声音播放功能、图像播放功能等)、用于实现下述各个方法实施例的指令等,该操作系统可以是安卓(Android)系统,包括基于Android系统深度开发的系统、苹果公司开发的IOS系统,包括基于IOS系统深度开发的系统或其它系统。存储数据区还可以存储终端在使用中所创建的数据比如电话本、音视频数据、聊天记录数据,等。
参见图16所示,存储器120可分为操作系统空间和用户空间,操作系统即运行于操作系统空间,原生及第三方应用程序即运行于用户空间。为了保证不同第三方应用程序均能够达到较好的运行效果,操作系统针对不同第三方应用程序为其分配相应的系统资源。然而,同一第三方应用程序中不同应用场景对系统资源的需求也存在差异,比如,在本地资源加载场景下,第三方应用程序对磁盘读取速度的要求较高;在动画渲染场景下,第三方应用程序则对GPU性能的要求较高。而操作系统与第三方应用程序之间相互独立,操作系统往往不能及时感知第三方应用程序当前的应用场景,导致操作系统无法根据第三方应用程序的具体应用场景进行针对性的系统资源适配。
为了使操作系统能够区分第三方应用程序的具体应用场景,需要打通第三方应用程序与操作系统之间的数据通信,使得操作系统能够随时获取第三方应用程序当前的场景信息,进而基于当前场景进行针对性的系统资源适配。
以操作系统为Android系统为例,存储器120中存储的程序和数据如图17所示,存储器120中可存储有Linux内核层320、系统运行时库层340、应用框架层360和应用层380,其中,Linux内核层320、系统运行库层340和应用框架层360属于操作系统空间,应用层380属于用户空间。Linux内核层320为终端的各种硬件提供了底层的驱动,如显示驱动、音频驱动、摄像头驱动、蓝牙驱动、Wi-Fi驱动、电源管理等。系统运行库层340通过一些C/C++库来为Android系统提供了主要的特性支持。如SQLite库提供了数据库的支持,OpenGL/ES库提供了3D绘图的支持,Webkit库提供了浏览器内核的支持等。在系统运行时库层340中还提供有安卓运行时库(Android runtime),它主要提供了一些核心库,能够允许开发者使用Java语言来编写Android应用。应用框架层360提供了构建应用程序时可能用到的各种API,开发者也可以通过使用这些API来构建自己的应用程序,比如活动管理、窗口管理、视图管理、通知管理、内容提供者、包管理、通话管理、资源管理、定位管理。应用层380中运行有至少一个应用程序,这些应用程序可以是操作系统自带的原生应用程序,比如联系人程序、短信程序、时钟程序、相机应用等;也可以是第三方开发者所开发的第三方应用程序,比如游戏类应用程序、即时通信程序、相片美化程序、手势数据获取程序等。
以操作系统为IOS系统为例,存储器120中存储的程序和数据如图18所示,IOS系统包括:核心操作系统层420(Core OS layer)、核心服务层440(Core Services layer)、媒体层460(Media layer)、可触摸层480(Cocoa Touch Layer)。核心操作系统层420包括了操作系统内核、驱动程序以及底层程序框架,这些底层程序框架提供更接近硬件的功能,以供位于核心服务层440的程序框架所使用。核心服务层440提供给应用程序所需要的系统服务和/或程序框架,比如基础(Foundation)框架、账户框架、广告框架、数据存储框架、网络连接框架、地理位置框架、运动框架等等。媒体层460为应用程序提供有关视听方面的接口,如图形图像相关的接口、音频技术相关的接口、视频技术相关的接口、音视频传输技术的无线播放(AirPlay)接口等。可触摸层480为应用程序开发提供了各种常用的界面相关的框架,可触摸层480负责用户在终端上的触摸交互操作。比如本地通知服务、远程推送服务、广告框架、游戏工具框架、消息用户界面接口(User Interface,UI)框架、用户界面UIKit框架、地图框架等等。
在图16所示出的框架中,与大部分应用程序有关的框架包括但不限于:核心服务层440中的基础框架和可触摸层480中的UIKit框架。基础框架提供许多基本的对象类和数据类型,为所有应用程序提供最基本的系统服务,和UI无关。而UIKit框架提供的类是基础的UI类库,用于创建基于触摸的用户界面,iOS应用程序可以基于UIKit框架来提供UI,所以它提供了应用程序的基础架构,用于构建用户界面,绘图、处理和用户交互事件,响应手势等等。
其中,在IOS系统中实现第三方应用程序与操作系统数据通信的方式以及原理可参考Android系统,本申请在此不再赘述。
其中,输入装置130用于接收输入的指令或数据,输入装置130包括但不限于键盘、鼠标、摄像头、麦克风或触控设备。输出装置140用于输出指令或数据,输出装置140包括但不限于显示设备和扬声器等。在一个示例中,输入装置130和输出装置140可以合设,输入装置130和输出装置140为触摸显示屏,该触摸显示屏用于接收用户使用手指、触摸笔等任何适合的物体在其上或附近的触摸操作,以及显示各个应用程序的用户界面。触摸显示屏通常设置在终端的前面板。触摸显示屏可被设计成为全面屏、曲面屏或异型屏。触摸显示屏还可被设计成为全面屏与曲面屏的结合,异型屏与曲面屏的结合,本申请实施例对此不加以限定。
除此之外,本领域技术人员可以理解,上述附图所示出的终端的结构并不构成对终端的限定,终端可以包括比图示更多或更少的部件,或者组合某些部件,或者不同的部件布置。比如,终端中还包括射频电路、输入单元、传感器、音频电路、无线保真(wireless fidelity,WiFi)模块、电源、蓝牙模块等部件,在此不再赘述。
在本申请实施例中,各步骤的执行主体可以是上文介绍的终端。可选地,各步骤的执行主体为终端的操作系统。操作系统可以是安卓系统,也可以是IOS系统,或者其它操作系统,本申请实施例对此不作限定。
本申请实施例的终端,其上还可以安装有显示设备,显示设备可以是各种能实现显示功能的设备,例如:阴极射线管显示器(cathode ray tube display,简称CRT)、发光二极管显示器(light-emitting diode display,简称LED)、电子墨水屏、液晶显示屏(liquid crystal display,简称LCD)、等离子显示面板(plasma display panel,简称PDP)等。用户可以利用终端100上的显示设备,来查看显示的文字、图像、视频等信息。所述终端可以是智能手机、平板电脑、游戏设备、AR(Augmented Reality,增强现实)设备、汽车、数据存储装置、音频播放装置、视频播放装置、笔记本、桌面计算设备、可穿戴设备诸如电子手表、电子眼镜、电子头盔、电子手链、电子项链、电子衣物等设备。
本领域的技术人员可以清楚地了解到本申请的技术方案可借助软件和/或硬件来实现。本说明书中的“单元”和“模块”是指能够独立完成或与其他部件配合完成特定功能的软件和/或硬件,其中硬件例如可以是现场可编程门阵列(Field-Programmable Gate Array,FPGA)、集成电路(Integrated Circuit,IC)等。
需要说明的是,对于前述的各方法实施例,为了简单描述,故将其都表述为一系列的动作组合,但是本领域技术人员应该知悉,本申请并不受所描述的动作顺序的限制,因为依据本申请,某些步骤可以采用其他顺序或者同时进行。其次,本领域技术人员也应该知悉,说明书中所描述的实施例均属于优选实施例,所涉及的动作和模块并不一定是本申请所必须的。
在上述实施例中,对各个实施例的描述都各有侧重,某个实施例中没有详述的部分,可以参见其他实施例的相关描述。
在本申请所提供的几个实施例中,应该理解到,所揭露的装置,可通过其它的方式实现。例如,以上所描述的装置实施例仅仅是示意性的,例如所述单元的划分,仅仅为一种逻辑功能划分,实际实现时可以有另外的划分方式,例如多个单元或组件可以结合或者可以集成到另一个系统,或一些特征可以忽略,或不执行。另一点,所显示或讨论的相互之间的耦合或直接耦合或通信连接可以是通过一些服务接口,装置或单元的间接耦合或通信连接,可以是电性或其它的形式。
所述作为分离部件说明的单元可以是或者也可以不是物理上分开的,作为单元显示的部件可以是或者也可以不是物理单元,即可以位于一个地方,或者也可以分布到多个网络单元上。可以根据实际的需要选择其中的部分或者全部单元来实现本实施例方案的目的。
另外,在本申请各个实施例中的各功能单元可以集成在一个处理单元中,也可以是各个单元单独物理存在,也可以两个或两个以上单元集成在一个单元中。上述集成的单元既可以采用硬件的形式实现,也可以采用软件功能单元的形式实现。
所述集成的单元如果以软件功能单元的形式实现并作为独立的产品销售或使用时,可以存储在一个计算机可读取存储器中。基于这样的理解,本申请的技术方案本质上或者说对现有技术做出贡献的部分或者该技术方案的全部或部分可以以软件产品的形式体现出来,该计算机软件产品存储在一个存储器中,包括若干指令用以使得一台计算机设备(可为个人计算机、服务器或者网络设备等)执行本申请各个实施例所述方法的全部或部分步骤。而前述的存储器包括:U盘、只读存储器(Read-Only Memory,ROM)、随机存取存储器(Random Access Memory,RAM)、移动硬盘、磁碟或者光盘等各种可以存储程序代码的介质。
本领域普通技术人员可以理解上述实施例的各种方法中的全部或部分步骤是可以通过程序来指令相关的硬件来完成,该程序可以存储于一计算机可读存储器中,存储器可以包括:闪存盘、只读存储器(Read-Only Memory,ROM)、随机存取存储器(Random Access Memory,RAM)、磁盘或光盘等。
以上所述者,仅为本公开的示例性实施例,不能以此限定本公开的范围。即但凡依本公开教导所作的等效变化与修饰,皆仍属本公开涵盖的范围内。本领域技术人员在考虑说明书及实践这里的公开后,将容易想到本公开的其它实施方案。本申请旨在涵盖本公开的任何变型、用途或者适应性变化,这些变型、用途或者适应性变化遵循本公开的一般性原理并包括本公开未记载的本技术领域中的公知常识或惯用技术手段。说明书和实施例仅被视为示例性的,本公开的范围和精神由权利要求限定。

Claims (20)

  1. 一种手势数据获取方法,其特征在于,所述方法包括:
    获取传感器采集的手势信号,获取所述手势信号对应的特征值;
    基于所述特征值获取特征点对应的信号状态,将所述特征值和所述信号状态添加至手势信号集合中;所述特征点为所述特征值在手势数据获取阶段中的特征点;
    在所述手势信号集合中,获取所述手势数据获取阶段的手势起始点和手势终止点之间的所有特征值和所有信号状态,基于所述所有特征值和所述所有信号状态生成信号变化信息;
    当所述信号变化信息满足信号验证信息时,确定所述信号变化信息为目标手势的信号数据段。
  2. 根据权利要求1所述的方法,其特征在于,所述基于所述特征值获取所述特征点对应的信号状态,将所述特征值和所述信号状态添加至手势信号集合中之前,还包括:
    当在噪声阶段中检测到存在第一特征值大于或等于第一阈值时,确定由所述噪声阶段切换至手势数据获取阶段,并将所述第一特征值对应的第一特征点确定为所述手势数据获取阶段的手势起始点;
    当在所述手势数据获取阶段中检测到持续存在第二特征值小于所述第一阈值时,确定由所述手势数据获取阶段切换至所述噪声阶段,将第三特征值对应的第三特征点确定为所述手势数据获取阶段的手势终止点,所述第三特征点为所述第二特征值对应的第二特征点之前,最后一个大于或等于所述第一阈值的特征值对应的特征点。
  3. The method according to claim 2, wherein the gesture data acquisition stage comprises a gesture start stage, a gesture stage, and a gesture end stage; the gesture start point is a start point of the gesture start stage, and the gesture end point is an end point of the gesture end stage.
  4. The method according to claim 3, wherein the determining to switch from the noise stage to the gesture data acquisition stage when the first feature value greater than or equal to the first threshold is detected in the noise stage, and determining the first feature point corresponding to the first feature value as the gesture start point of the gesture data acquisition stage comprises:
    when the first feature value greater than or equal to the first threshold is detected in the noise stage, determining to switch from the noise stage to the gesture start stage, and determining the first feature point corresponding to the first feature value as the start point of the gesture start stage; and
    when a fourth feature value greater than or equal to a second threshold is detected in the gesture start stage, determining to switch from the gesture start stage to the gesture stage, and determining a fourth feature point corresponding to the fourth feature value as a start point of the gesture stage;
    wherein the start point of the gesture stage is an end point of the gesture start stage, and the second threshold is greater than the first threshold.
  5. The method according to claim 3, wherein the determining to switch from the gesture data acquisition stage to the noise stage when the second feature values smaller than the first threshold are continuously detected in the gesture data acquisition stage, and determining the third feature point corresponding to the third feature value as the gesture end point of the gesture data acquisition stage comprises:
    when sixth feature values smaller than the second threshold are continuously detected in the gesture stage, determining to switch from the gesture stage to the gesture end stage, and determining a fifth feature point corresponding to a fifth feature value as an end point of the gesture stage, the fifth feature point being a feature point corresponding to the last feature value greater than or equal to the second threshold before a sixth feature point corresponding to the sixth feature value; and
    when the second feature values smaller than the first threshold are continuously detected in the gesture end stage, determining to switch from the gesture end stage to the noise stage, and determining the third feature point corresponding to the third feature value as an end point of the gesture end stage, the third feature point being a feature point corresponding to the last feature value greater than or equal to the first threshold before the second feature point corresponding to the second feature value;
    wherein the end point of the gesture stage is a start point of the gesture end stage, and the second threshold is greater than the first threshold.
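Claims 3 to 5 describe a four-stage machine driven by two thresholds. A minimal sketch of one transition step follows; the state and parameter names are illustrative, and the "continuously detected" conditions would additionally need run-length counters, which are omitted here:

```python
NOISE, GESTURE_START, GESTURE, GESTURE_END = range(4)

def next_state(state, value, low, high):
    """One step of the two-threshold stage machine; `low` and `high`
    stand for the claimed first and second thresholds, with high > low."""
    if state == NOISE:
        return GESTURE_START if value >= low else NOISE
    if state == GESTURE_START:
        if value >= high:
            return GESTURE            # fourth feature point: gesture stage begins
        return GESTURE_START if value >= low else NOISE
    if state == GESTURE:
        return GESTURE if value >= high else GESTURE_END
    # GESTURE_END: a value back above `high` resumes the gesture stage
    if value >= high:
        return GESTURE
    return GESTURE_END if value >= low else NOISE
```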
  6. The method according to claim 4, wherein the acquiring, based on the feature value, the signal state corresponding to the feature point, and adding the feature value and the signal state to the gesture signal set comprises:
    in the gesture start stage, when it is detected that the first feature value is smaller than the first threshold, setting the signal state corresponding to the first feature point to a noise state;
    when it is detected that the first feature value is greater than or equal to the first threshold and smaller than the second threshold, setting the signal state corresponding to the first feature point to a gesture start state; and
    when it is detected that the first feature value is greater than or equal to the second threshold, setting the signal state corresponding to the first feature point to a gesture state.
  7. The method according to claim 4, wherein the acquiring, based on the feature value, the signal state corresponding to the feature point, and adding the feature value and the signal state to the gesture signal set comprises:
    in the gesture stage, when it is detected that the third feature value is smaller than the first threshold, setting the signal state corresponding to the third feature point to a noise state;
    when it is detected that the third feature value is greater than or equal to the first threshold and smaller than the second threshold, setting the signal state corresponding to the third feature point to a gesture end state; and
    when it is detected that the third feature value is greater than or equal to the second threshold, setting the signal state corresponding to the third feature point to a gesture state.
  8. The method according to claim 5, wherein the acquiring, based on the feature value, the signal state corresponding to the feature point, and adding the feature value and the signal state to the gesture signal set comprises:
    in the gesture end stage, when it is detected that the sixth feature value is smaller than the first threshold, setting the signal state corresponding to the sixth feature point to a noise state;
    when it is detected that the sixth feature value is greater than or equal to the first threshold and smaller than the second threshold, setting the signal state corresponding to the sixth feature point to a gesture end state; and
    when it is detected that the sixth feature value is greater than or equal to the second threshold, setting the signal state corresponding to the sixth feature point to a gesture state.
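The per-point state labels of claims 6 to 8 reduce to a three-way comparison against the two thresholds. A sketch with illustrative label names; the middle band maps to the gesture start state or the gesture end state depending on the current stage, collapsed here into one `"transition"` label:

```python
def label_point(value, low, high):
    """Per-point signal state of claims 6-8: below the first threshold
    -> noise state; between the two thresholds -> gesture start/end
    state (stage-dependent); at or above the second threshold ->
    gesture state.  Label strings are assumptions for illustration."""
    if value < low:
        return "noise"
    if value < high:
        return "transition"   # gesture start state or gesture end state
    return "gesture"
```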
  9. The method according to claim 3, wherein after the acquiring, based on the feature value, the signal state corresponding to the feature point, and adding the feature value and the signal state to the gesture signal set, the method further comprises:
    acquiring an abnormal signal state and a normal signal state of any stage in the gesture data acquisition stage; and
    clearing all feature values and all signal states in the gesture signal set when a duration of the abnormal signal state is longer than a first duration, or a duration of the normal signal state is longer than a second duration.
  10. The method according to claim 1, wherein the determining, when the signal change information satisfies the signal verification information, the signal change information as the signal data segment of the target gesture comprises:
    acquiring, when the signal change information satisfies the signal verification information, a start-point adjustment value corresponding to the gesture start point;
    adjusting the gesture start point based on the start-point adjustment value to obtain an adjusted gesture start point; and
    determining state change information between the adjusted gesture start point and the gesture end point as the signal data segment of the target gesture.
  11. The method according to claim 1, wherein the acquiring the gesture signal collected by the sensor and acquiring the feature value corresponding to the gesture signal comprises:
    acquiring gesture signals collected by at least one sensor;
    concatenating the gesture signals collected by the at least one sensor to obtain a concatenated gesture signal; and
    performing smoothing filtering on the concatenated gesture signal based on a window size of a smoothing window and weight coefficients corresponding to the smoothing window, to obtain a feature value corresponding to the concatenated gesture signal.
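One plausible reading of claim 11's smoothing step is an edge-clamped weighted moving average over the concatenated signal. The window weights, the simple list concatenation, and the function names are assumptions, not details fixed by the claim:

```python
def concatenate(channels):
    """Concatenation step of claim 11, sketched as flattening the
    per-sensor channels into one sequence."""
    return [v for ch in channels for v in ch]

def smooth(signal, weights):
    """Weighted moving average: len(weights) plays the role of the
    claimed window size, and `weights` the claimed weight coefficients
    (assumed here to sum to 1).  Edges are handled by clamping the
    index, one of several reasonable boundary policies."""
    half = len(weights) // 2
    out = []
    for i in range(len(signal)):
        acc = 0.0
        for j, w in enumerate(weights):
            k = min(max(i + j - half, 0), len(signal) - 1)  # clamp at edges
            acc += w * signal[k]
        out.append(acc)
    return out
```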
  12. The method according to claim 1, wherein the signal change information comprises a signal change trend and a signal data segment length; and the determining, when the signal change information satisfies the signal verification information, the signal change information as the signal data segment of the target gesture comprises:
    determining the signal change information as the signal data segment of the target gesture when the signal change trend satisfies a signal verification trend and the signal data segment length satisfies a data verification length.
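Claim 12 leaves the concrete verification trend and length criteria unspecified. The sketch below assumes one possible pair: a rise-then-fall (single-peak) trend and a closed length interval; both assumptions are marked in the comments:

```python
def validate_segment(values, min_len, max_len):
    """Claim 12 validation, under two assumptions: the verification
    trend is 'rise to a single peak, then fall', and the data
    verification length is the interval [min_len, max_len]."""
    if not (min_len <= len(values) <= max_len):
        return False                      # length check fails
    peak = values.index(max(values))
    rising = all(values[i] <= values[i + 1] for i in range(peak))
    falling = all(values[i] >= values[i + 1] for i in range(peak, len(values) - 1))
    return rising and falling             # trend check
```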
  13. The method according to claim 1, wherein after the determining, when the signal change information satisfies the signal verification information, the signal change information as the signal data segment of the target gesture, the method further comprises:
    switching the gesture data acquisition stage to a detection freeze stage; and
    releasing the detection freeze stage when a duration of the detection freeze stage reaches a third duration.
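The detection freeze stage of claim 13 is effectively a refractory period after each accepted segment. A counter-based sketch; the `FreezeGate` name and the discrete time units are illustrative assumptions:

```python
class FreezeGate:
    """Detection freeze stage of claim 13: after a segment is emitted,
    suppress detection until `freeze` time units (the claimed 'third
    duration') have elapsed.  Time is modelled as an explicit counter
    advanced by tick()."""

    def __init__(self, freeze):
        self.freeze = freeze
        self.remaining = 0

    def emit(self):
        """Called when a gesture data segment is accepted; enters the
        detection freeze stage."""
        self.remaining = self.freeze

    def tick(self):
        """Advance one time unit; returns True when detection is allowed
        (freeze stage released), False while still frozen."""
        if self.remaining > 0:
            self.remaining -= 1
        return self.remaining == 0
```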
  14. An apparatus for acquiring gesture data, the apparatus comprising:
    a signal acquisition unit, configured to acquire a gesture signal collected by a sensor and acquire a feature value corresponding to the gesture signal;
    a signal adding unit, configured to acquire, based on the feature value, a signal state corresponding to a feature point, and add the feature value and the signal state to a gesture signal set, the feature point being a feature point of the feature value in a gesture data acquisition stage;
    an information acquisition unit, configured to acquire, from the gesture signal set, all feature values and all signal states between a gesture start point and a gesture end point of the gesture data acquisition stage, and generate signal change information based on the all feature values and the all signal states; and
    a data segment acquisition unit, configured to determine, when the signal change information satisfies signal verification information, the signal change information as a signal data segment of a target gesture.
  15. The apparatus according to claim 14, further comprising a stage switching unit, the stage switching unit being configured to:
    when a first feature value greater than or equal to a first threshold is detected in a noise stage, determine to switch from the noise stage to the gesture data acquisition stage, and determine a first feature point corresponding to the first feature value as the gesture start point of the gesture data acquisition stage; and
    when second feature values smaller than the first threshold are continuously detected in the gesture data acquisition stage, determine to switch from the gesture data acquisition stage to the noise stage, and determine a third feature point corresponding to a third feature value as the gesture end point of the gesture data acquisition stage, the third feature point being a feature point corresponding to the last feature value greater than or equal to the first threshold before a second feature point corresponding to the second feature value.
  16. The apparatus according to claim 14, wherein the data segment acquisition unit comprises:
    an adjustment value acquisition subunit, configured to acquire, when the signal change information satisfies the signal verification information, a start-point adjustment value corresponding to the gesture start point;
    a start point adjustment subunit, configured to adjust the gesture start point based on the start-point adjustment value to obtain an adjusted gesture start point; and
    a data segment acquisition subunit, configured to determine state change information between the adjusted gesture start point and the gesture end point as the signal data segment of the target gesture.
  17. The apparatus according to claim 14, wherein the signal acquisition unit comprises:
    a signal acquisition subunit, configured to acquire gesture signals collected by at least one sensor;
    a signal concatenation subunit, configured to concatenate the gesture signals collected by the at least one sensor to obtain a concatenated gesture signal; and
    a feature value acquisition subunit, configured to perform smoothing filtering on the concatenated gesture signal based on a window size of a smoothing window and weight coefficients corresponding to the smoothing window, to obtain a feature value corresponding to the concatenated gesture signal.
  18. The apparatus according to claim 14, further comprising a stage release unit, the stage release unit being configured to:
    switch the gesture data acquisition stage to a detection freeze stage; and
    release the detection freeze stage when a duration of the detection freeze stage reaches a third duration.
  19. A terminal, comprising a processor and a memory, wherein the memory stores a computer program, and the computer program is adapted to be loaded by the processor to perform the method steps according to any one of claims 1 to 13.
  20. A computer storage medium storing a plurality of instructions, wherein the instructions are adapted to be loaded by a processor to perform the method steps according to any one of claims 1 to 13.
PCT/CN2022/077603 2021-03-24 2022-02-24 Gesture data acquisition method and apparatus, terminal, and storage medium WO2022199312A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP22773974.5A EP4270156A4 (en) 2021-03-24 2022-02-24 GESTURE DATA ACQUISITION METHOD AND APPARATUS, TERMINAL, AND STORAGE MEDIUM
US18/231,965 US20230384925A1 (en) 2021-03-24 2023-08-09 Method, terminal for acquiring gesture data, and storage medium

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110316214.X 2021-03-24
CN202110316214.XA CN113031775B (zh) 2021-03-24 2021-03-24 Gesture data acquisition method and apparatus, terminal, and storage medium

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US18/231,965 Continuation US20230384925A1 (en) 2021-03-24 2023-08-09 Method, terminal for acquiring gesture data, and storage medium

Publications (1)

Publication Number Publication Date
WO2022199312A1 true WO2022199312A1 (zh) 2022-09-29

Family

ID=76473508

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/077603 WO2022199312A1 (zh) 2021-03-24 2022-02-24 手势数据获取方法、装置、终端及存储介质

Country Status (4)

Country Link
US (1) US20230384925A1 (zh)
EP (1) EP4270156A4 (zh)
CN (1) CN113031775B (zh)
WO (1) WO2022199312A1 (zh)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113031775B (zh) * 2021-03-24 2023-02-03 Oppo广东移动通信有限公司 手势数据获取方法、装置、终端及存储介质

Citations (7)

Publication number Priority date Publication date Assignee Title
US20060136846A1 (en) * 2004-12-20 2006-06-22 Sung-Ho Im User interface apparatus using hand gesture recognition and method thereof
CN103941866A (zh) * 2014-04-08 2014-07-23 河海大学常州校区 Three-dimensional gesture recognition method based on Kinect depth images
US20150109202A1 (en) * 2013-10-22 2015-04-23 Thalmic Labs Inc. Systems, articles, and methods for gesture identification in wearable electromyography devices
CN105573545A (zh) * 2015-11-27 2016-05-11 努比亚技术有限公司 Gesture calibration method and apparatus, and gesture input processing method
CN105807903A (zh) * 2014-12-30 2016-07-27 Tcl集团股份有限公司 Control method and apparatus for a smart device
CN108196668A (zh) * 2017-12-05 2018-06-22 重庆中电大宇卫星应用技术研究所 Portable gesture recognition system and method
CN113031775A (zh) * 2021-03-24 2021-06-25 Oppo广东移动通信有限公司 Gesture data acquisition method and apparatus, terminal, and storage medium

Family Cites Families (7)

Publication number Priority date Publication date Assignee Title
FR3002338A1 (fr) * 2013-02-15 2014-08-22 France Telecom Method for temporal segmentation of an instrumented gesture, and associated device and terminal
CN105094298B (zh) * 2014-05-13 2018-06-26 华为技术有限公司 Terminal and gesture recognition method based on the terminal
US9354709B1 (en) * 2014-06-17 2016-05-31 Amazon Technologies, Inc. Tilt gesture detection
US9880632B2 (en) * 2014-06-19 2018-01-30 Thalmic Labs Inc. Systems, devices, and methods for gesture identification
US10739863B2 (en) * 2015-12-31 2020-08-11 Huawei Technologies Co., Ltd. Method for responding to gesture acting on touchscreen and terminal
CN110266876B (zh) * 2019-04-29 2020-12-29 努比亚技术有限公司 Pressure threshold determination method and apparatus, and computer-readable storage medium
CN112363622A (zh) * 2020-11-13 2021-02-12 深圳振科智能科技有限公司 Character input method and apparatus, electronic device, and storage medium


Non-Patent Citations (1)

Title
See also references of EP4270156A4 *

Also Published As

Publication number Publication date
EP4270156A4 (en) 2024-04-24
US20230384925A1 (en) 2023-11-30
EP4270156A1 (en) 2023-11-01
CN113031775B (zh) 2023-02-03
CN113031775A (zh) 2021-06-25


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
    Ref document number: 22773974
    Country of ref document: EP
    Kind code of ref document: A1
ENP Entry into the national phase
    Ref document number: 2022773974
    Country of ref document: EP
    Effective date: 20230728
NENP Non-entry into the national phase
    Ref country code: DE