CN111752372A - Action recognition method and device and computer readable storage medium - Google Patents


Info

Publication number
CN111752372A
Authority
CN
China
Prior art keywords
motion
earphone
data
motion path
user
Prior art date
2019-03-28
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910241771.2A
Other languages
Chinese (zh)
Inventor
孙维国
吴海全
顾卫锋
张恩勤
曹磊
何桂晓
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Grandsun Electronics Co Ltd
Original Assignee
Shenzhen Grandsun Electronics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
2019-03-28
Filing date
2019-03-28
Publication date
2020-10-09
Application filed by Shenzhen Grandsun Electronics Co Ltd
Priority to CN201910241771.2A
Publication of CN111752372A

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/012Head tracking input arrangements

Abstract

The present invention relates to the field of communications technologies, and in particular, to a method and an apparatus for motion recognition, and a computer-readable storage medium. The motion recognition method comprises the following steps: collecting first motion data of a user through a sensor of a first earphone, and collecting second motion data of the user through a sensor of a second earphone; determining a first motion path according to the first motion data, and determining a second motion path according to the second motion data; and comparing and matching the first motion path and the second motion path, and determining, according to the comparison and matching result, whether the first earphone and the second earphone are worn on the same user. In this way, it can be judged whether the two earphones are worn on the same user.

Description

Action recognition method and device and computer readable storage medium
Technical Field
The present invention relates to the field of communications technologies, and in particular, to a method and an apparatus for motion recognition, and a computer-readable storage medium.
Background
Because Bluetooth earphones are convenient to wear and small in size, they are widely used, for example in cooperation with external devices such as speakers and mobile phones.
To control the external device more conveniently, head movements can be recognized through the Bluetooth earphones. However, when recognizing a head movement through earphones at present, it is impossible to determine whether the two earphones (the left earphone and the right earphone) are worn on the same user.
Therefore, a new technical solution is needed to solve the above problems.
Disclosure of Invention
In order to solve the above technical problem, embodiments of the present invention provide a method and an apparatus for motion recognition, and a computer-readable storage medium, which can determine whether two earphones are worn on the same user.
A first aspect of an embodiment of the present invention provides an action recognition method, including:
acquiring first motion data of a user through a sensor of a first earphone, and acquiring second motion data of the user through a sensor of a second earphone; determining a first motion path according to the first motion data, and determining a second motion path according to the second motion data; and comparing and matching the first motion path and the second motion path, and determining whether the first earphone and the second earphone are worn on the same user according to a comparison and matching result.
In a first possible implementation manner of the first aspect of the embodiment of the present invention, the determining of a first motion path according to the first motion data and of a second motion path according to the second motion data includes: the first motion data include first acceleration data and first angle data, and the first motion path is generated by calculation according to the first acceleration data and the first angle data; the second motion data include second acceleration data and second angle data, and the second motion path is generated by calculation according to the second acceleration data and the second angle data.
In a second possible implementation manner of the first aspect of the embodiment of the present invention, after the comparing and matching of the first motion path and the second motion path and the determining of whether the first earphone and the second earphone are worn on the same user according to the comparison and matching result, the method further includes recognizing a head movement according to the first motion path and the second motion path.
With reference to the second possible implementation manner of the first aspect of the embodiment of the present invention, in a third possible implementation manner, after identifying a head action according to the first motion path and/or the second motion path, the method further includes executing an operation instruction corresponding to the identified head action.
In a fourth possible implementation manner of the first aspect of the embodiment of the present invention, before the collecting of the first motion data of the user through the sensor of the first earphone and of the second motion data of the user through the sensor of the second earphone, the method further includes: judging whether the user wears the first earphone and the second earphone; detecting a first contact distance between the first earphone and the user, and judging whether the first contact distance is smaller than a preset first distance threshold; detecting a second contact distance between the second earphone and the user, and judging whether the second contact distance is smaller than a preset second distance threshold; and if the first contact distance is smaller than the preset first distance threshold and the second contact distance is smaller than the preset second distance threshold, determining that the user wears the first earphone and the second earphone.
In a fifth possible implementation manner of the first aspect of the embodiment of the present invention, the comparing and matching of the first motion path and the second motion path and the determining of whether the first earphone and the second earphone are worn on the same user according to the comparison and matching result include: transmitting the first motion path to the second earphone through near-field magnetic induction technology, and comparing and matching the first motion path and the second motion path in the second earphone; and determining whether the first earphone and the second earphone are worn on the same user according to the comparison and matching result.
In a sixth possible implementation manner of the first aspect of the embodiment of the present invention, the head movement includes nodding, shaking, raising or lowering the head.
A second aspect of embodiments of the present invention provides a motion recognition apparatus, including,
the acquisition module is used for acquiring first motion data of a user through a sensor of the first earphone and acquiring second motion data of the user through a sensor of the second earphone;
the determining module is used for determining a first motion path according to the first motion data and determining a second motion path according to the second motion data;
and the comparison module is used for comparing and matching the first motion path and the second motion path and determining whether the first earphone and the second earphone are worn on the same user according to the comparison and matching result.
In a first possible implementation manner of the second aspect of the embodiment of the present invention, the determining module includes a first determining sub-module, configured to calculate and generate a first motion path according to first acceleration data and first angle data, where the first motion data includes the first acceleration data and the first angle data; and the second determining submodule is used for calculating and generating a second motion path according to the second acceleration data and the second angle data, wherein the second motion data comprises second acceleration data and second angle data.
In a second possible implementation manner of the second aspect of the embodiment of the present invention, the apparatus further includes an identification module, configured to identify a head motion according to the first motion path and the second motion path.
A third aspect of the embodiments of the present invention provides an action recognition apparatus, including a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor implements the steps of the above method when executing the computer program.
A fourth aspect of the embodiments of the present invention provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the steps of the above-described action recognition method.
Compared with the prior art, the embodiment of the invention has the following beneficial effects:
in the embodiment of the invention, the first motion path and the second motion path are compared and matched to judge whether the motions of the two earphones are synchronous, and further whether the two earphones are worn on the same user.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present invention, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description are only some embodiments of the present invention, and those skilled in the art can obtain other drawings based on these drawings without inventive effort.
FIG. 1 is a flowchart illustrating a first embodiment of a method for recognizing actions according to the present invention;
FIG. 2 is a flowchart illustrating a second embodiment of a motion recognition method according to the present invention;
FIG. 3 is a flowchart illustrating a third embodiment of a motion recognition method according to the present invention;
FIG. 4 is a flowchart illustrating a fourth embodiment of a motion recognition method according to the present invention;
FIG. 5 is a schematic structural diagram of a first embodiment of the motion recognition device provided by the present invention;
FIG. 6 is a schematic structural diagram of a second embodiment of the motion recognition device provided by the present invention;
FIG. 7 is a schematic structural diagram of a third embodiment of the motion recognition device provided by the present invention;
FIG. 8 is a schematic structural diagram of a fourth embodiment of the motion recognition device provided by the present invention;
FIG. 9 is a schematic structural diagram of a fifth embodiment of the motion recognition device provided by the present invention;
FIG. 10 is a schematic structural diagram of the motion recognition device provided by the present invention.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the invention. It will be apparent, however, to one skilled in the art that the present invention may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present invention with unnecessary detail.
In order to explain the technical means of the present invention, the following description will be given by way of specific examples.
The embodiment of the invention discloses a method and a device for identifying actions and a computer readable storage medium.
Referring to fig. 1, fig. 1 is a schematic flow chart illustrating a motion recognition method according to a first embodiment of the present invention, specifically:
s101, acquiring first motion data of a user through a sensor of a first earphone, and acquiring second motion data of the user through a sensor of a second earphone;
the first earphone can be a left earphone, and the second earphone can be a right earphone correspondingly; or the first earpiece may be the right earpiece and the second earpiece may correspond to the left earpiece.
The first motion data comprises first acceleration data and first angle data, and the second motion data comprises second acceleration data and second angle data;
a three-axis acceleration sensor and an angle sensor may be provided in the first earphone, first acceleration data may be sensed by the three-axis acceleration sensor in the first earphone, and first angle data may be sensed by the angle sensor in the first earphone;
more specifically, when the first angle data is sensed by the angle sensor in the first earphone, whether the first angle data is larger than a preset first angle threshold value is judged;
when the first angle data are larger than a preset first angle threshold, recording the first angle data;
when the first angle data is smaller than the preset first angle threshold, the first angle data does not need to be recorded; in this case, the head movement may be caused by an unintentional movement of the user and is not a head movement that needs to be recognized. Setting the preset first angle threshold avoids redundant action recognition operations caused by unrelated movements of the user.
When sensing first acceleration data through a triaxial acceleration sensor in a first earphone, judging whether the first acceleration data is larger than a preset first acceleration threshold value;
when the first acceleration data is larger than a preset first acceleration threshold, recording the first acceleration data;
when the first acceleration data is smaller than a preset first acceleration threshold, the first acceleration data does not need to be recorded; the preset first acceleration threshold value is set, so that redundant action identification operation caused by irrelevant actions of a user can be avoided.
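As an illustration of this threshold gating, the following minimal sketch records a sample only when it exceeds the preset thresholds. The helper structure and the concrete threshold values are assumptions for illustration; the patent does not specify them.

```python
# Minimal sketch of threshold-gated recording; the threshold values and
# the list used as the recording buffer are assumptions for illustration.
ANGLE_THRESHOLD_DEG = 10.0   # preset first angle threshold (assumed value)
ACCEL_THRESHOLD_MS2 = 1.5    # preset first acceleration threshold (assumed value)

def record_if_significant(angle_deg: float, accel_ms2: float,
                          recorded: list) -> None:
    """Record angle/acceleration data only above the preset thresholds,
    filtering out small, unintentional head movements."""
    sample = {}
    if angle_deg > ANGLE_THRESHOLD_DEG:
        sample["angle_deg"] = angle_deg
    if accel_ms2 > ACCEL_THRESHOLD_MS2:
        sample["accel_ms2"] = accel_ms2
    if sample:  # below-threshold data is simply not recorded
        recorded.append(sample)
```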
A three-axis acceleration sensor and an angle sensor may be provided in the second earphone, and second acceleration data of the second earphone may be sensed by the three-axis acceleration sensor in the second earphone, and second angle data of the second earphone may be sensed by the angle sensor in the second earphone.
The operation of sensing the second angle data by the angle sensor in the second earphone and the operation of sensing the second acceleration data by the three-axis acceleration sensor in the second earphone are similar to the corresponding operations of the first earphone described above, and are not repeated here.
S102, determining a first motion path according to the first motion data, and determining a second motion path according to the second motion data.
Calculating and generating a first motion path according to the first acceleration data and the first angle data;
and calculating and generating a second motion path according to the second acceleration data and the second angle data.
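The patent does not spell out how the path is computed from acceleration and angle data; one conventional approach is dead reckoning, sketched below. The axis conventions, the scalar-acceleration simplification, and the Euler integration scheme are all assumptions for illustration.

```python
import numpy as np

def motion_path(accel_ms2: np.ndarray, yaw_deg: np.ndarray,
                dt: float) -> np.ndarray:
    """Dead-reckon a 2-D motion path from N forward-acceleration samples
    and N yaw angles sampled every dt seconds. Returns an (N, 2) array
    of positions."""
    yaw = np.radians(yaw_deg)
    # project each acceleration sample onto the horizontal plane
    ax = accel_ms2 * np.cos(yaw)
    ay = accel_ms2 * np.sin(yaw)
    # integrate acceleration -> velocity -> position (simple Euler steps)
    vx, vy = np.cumsum(ax) * dt, np.cumsum(ay) * dt
    px, py = np.cumsum(vx) * dt, np.cumsum(vy) * dt
    return np.stack([px, py], axis=1)
```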
S103, comparing and matching the first motion path and the second motion path, and determining whether the first earphone and the second earphone are worn on the same user according to a comparison and matching result.
The first motion path may be transmitted to the second earphone, and the first motion path and the second motion path are compared and matched in the second earphone.
The first motion path and the second motion path are compared and matched, and whether the motion paths of the two earphones are the same can be judged according to the comparison and matching result, so as to judge whether the two earphones are worn on the same user. If it is judged that the two earphones are worn on different users, the user wearing the second earphone can obtain the head action of the user wearing the first earphone according to the received first motion path.
In particular, the first motion path may be transmitted to the second earphone by Near Field Magnetic Induction (NFMI) technology, and the first motion path and the second motion path are compared and matched in the second earphone. Wireless data transmission between the first earphone and the second earphone through NFMI works even in environments such as underwater, has a wide application range, and achieves ultra-low power consumption.
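A plausible comparison-and-matching rule is an average distance between the two paths with a synchrony tolerance, as in the sketch below; the metric and the tolerance value are assumptions, not the patent's specification. This would run in the second earphone after the first motion path arrives over NFMI.

```python
import numpy as np

def paths_match(path1: np.ndarray, path2: np.ndarray,
                tolerance: float = 0.05) -> bool:
    """Return True when two (N, 2) motion paths stay close enough on
    average to be considered synchronous, i.e. worn by the same user."""
    n = min(len(path1), len(path2))
    if n == 0:
        return False
    mean_error = np.linalg.norm(path1[:n] - path2[:n], axis=1).mean()
    return mean_error < tolerance

# Usage in the second earphone, after receiving path1 over NFMI:
# same_user = paths_match(received_first_path, local_second_path)
```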
In the embodiment of the invention, the first motion path and the second motion path are compared and matched to judge whether the motions of the two earphones are synchronous, and thus whether the two earphones are worn on the same user.
With reference to fig. 2, fig. 2 shows a flowchart of a motion recognition method according to a second embodiment of the present invention, and after S103, the method further includes S104, recognizing a head motion according to a first motion path and a second motion path.
According to the first motion path, the head action corresponding to the first earphone can be identified; according to the second motion path, the head motion corresponding to the second earphone can be identified.
When it is determined that the first earphone and the second earphone are worn on the same user, the first motion path and the second motion path are the same, and the head action corresponding to the first earphone is the same as the head action corresponding to the second earphone. The head action can therefore be identified through the first motion path alone, through the second motion path alone, or through both.
When the first earphone and the second earphone are worn on different users, the first motion path and the second motion path are different, the head action corresponding to the first earphone is identified through the first motion path, and the head action corresponding to the second earphone is identified through the second motion path.
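As a rough illustration of how the head actions named below (nodding, shaking, raising, lowering) might be told apart from a motion path's angle traces; the features, the sign convention (positive pitch = head up), and the thresholds are assumptions rather than the patent's classifier.

```python
def classify_head_action(pitch_deg: list, yaw_deg: list) -> str:
    """Classify a head action from pitch (up/down) and yaw (left/right)
    angle traces over one gesture window. Rule-of-thumb features only."""
    pitch_range = max(pitch_deg) - min(pitch_deg)
    yaw_range = max(yaw_deg) - min(yaw_deg)
    if max(pitch_range, yaw_range) < 5.0:
        return "none"        # too small to be an intentional gesture
    if yaw_range > pitch_range:
        return "shake"       # dominant left/right motion
    if pitch_deg[-1] > 5.0:
        return "raise"       # head ends tilted up
    if pitch_deg[-1] < -5.0:
        return "lower"       # head ends tilted down
    return "nod"             # down-and-back pitch excursion
```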
With reference to fig. 3, fig. 3 is a flowchart illustrating a motion recognition method according to a third embodiment of the present invention, and after S104, the method further includes S105, executing an operation instruction corresponding to the recognized head motion.
The head movements can be nodding, shaking, raising, lowering and the like, and each head movement can correspond to an operation instruction.
For example, when the main body for processing the head action is a Bluetooth headset used to control a speaker, nodding may correspond to a play-music instruction, shaking the head to a pause instruction, raising the head to a next-track instruction, and lowering the head to a previous-track instruction.
For another example, when the main body for processing the head action is a Bluetooth headset used to control a mobile terminal such as a mobile phone, nodding may correspond to answering an incoming call, shaking the head to rejecting the incoming call, raising the head to opening the camera, and lowering the head to playing music.
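The action-to-instruction mapping lends itself to a simple lookup table. The sketch below encodes the phone-control example above; the instruction names and the dispatch callback are hypothetical.

```python
# Hypothetical mapping for the phone-control example above.
PHONE_COMMANDS = {
    "nod":   "answer_incoming_call",
    "shake": "reject_incoming_call",
    "raise": "open_camera",
    "lower": "play_music",
}

def execute_head_action(action: str, dispatch) -> None:
    """Look up the recognized head action and execute the corresponding
    operation instruction; unknown actions are ignored."""
    instruction = PHONE_COMMANDS.get(action)
    if instruction is not None:
        dispatch(instruction)
```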
With reference to fig. 4, fig. 4 is a schematic flow chart of a motion recognition method according to a fourth embodiment of the present invention. Before S101, the method further includes S106, determining whether the user wears the first earphone and the second earphone;
if the first earphone and the second earphone are worn, first motion data of a user are collected through a sensor of the first earphone, and second motion data of the user are collected through a sensor of the second earphone.
Specifically, a first contact distance between a first earphone and a user is detected, and whether the first contact distance is smaller than a preset first distance threshold value is judged;
detecting a second contact distance between a second earphone and a user, and judging whether the second contact distance is smaller than a preset second distance threshold value;
if the first contact distance is smaller than the preset first distance threshold and the second contact distance is smaller than the preset second distance threshold, it is determined that the user wears the first earphone and the second earphone.
The first earphone and the second earphone may each be provided with a touch sensor, and the contact distance between the first earphone and the user and the contact distance between the second earphone and the user can be sensed through the respective touch sensors.
This avoids invalid action recognition by the first earphone or the second earphone when the earphones are not worn. Only when the first earphone and the second earphone are both worn are the first motion data of the user collected through the sensor of the first earphone and the second motion data of the user collected through the sensor of the second earphone.
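A minimal sketch of this wear check follows, assuming illustrative threshold values; the patent only requires that both contact distances fall below their preset thresholds.

```python
FIRST_DISTANCE_THRESHOLD_MM = 2.0    # preset thresholds; values assumed
SECOND_DISTANCE_THRESHOLD_MM = 2.0

def both_earphones_worn(first_contact_mm: float,
                        second_contact_mm: float) -> bool:
    """True only when both contact distances are below their preset
    thresholds, i.e. both earphones are actually being worn."""
    return (first_contact_mm < FIRST_DISTANCE_THRESHOLD_MM
            and second_contact_mm < SECOND_DISTANCE_THRESHOLD_MM)
```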
In the above embodiment of the present invention, a method of cascading multi-stage classifiers is adopted. The first-stage classifier determines whether the first earphone and the second earphone are worn, which avoids invalid recognition when the first earphone or the second earphone is not worn.
The second-stage classifier is used for acquiring first acceleration data and first angle data of the first earphone and calculating to generate a first motion path; acquiring second acceleration data and second angle data of a second earphone, and calculating to generate a second motion path;
and the third-stage classifier is used for comparing and matching the first motion path and the second motion path and determining whether the first earphone and the second earphone are worn on the same user or not according to a comparison and matching result.
In the embodiment of the invention, the output result of the first-stage classifier is the input sample of the second-stage classifier, and the output result of the second-stage classifier is the input sample of the third-stage classifier, so that repeated identification of invalid actions can be avoided and the action recognition efficiency is improved. For example, the output result of the first-stage classifier is that the user wears the first earphone and the second earphone; this result serves as the input sample of the second-stage classifier, whose output is the first motion path and the second motion path; this in turn serves as the input sample of the third-stage classifier, which, after comparison and matching, judges whether the first earphone and the second earphone are worn on the same user.
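Put together, the cascade is an early-exit pipeline in which each stage's output feeds the next. The sketch below reuses the helper sketches above; the sensor accessor methods (contact_mm, accel, yaw) and the sample period are hypothetical.

```python
from typing import Optional

def recognize_same_user(first_sensor, second_sensor) -> Optional[str]:
    """Three-stage cascade: wear check -> path generation -> path matching.
    Failing an early stage exits immediately, avoiding redundant work."""
    # Stage 1: are both earphones worn?
    if not both_earphones_worn(first_sensor.contact_mm(),
                               second_sensor.contact_mm()):
        return None
    # Stage 2: generate the two motion paths from sensor data.
    path1 = motion_path(first_sensor.accel(), first_sensor.yaw(), dt=0.01)
    path2 = motion_path(second_sensor.accel(), second_sensor.yaw(), dt=0.01)
    # Stage 3: compare and match the paths.
    return "same_user" if paths_match(path1, path2) else "different_users"
```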
It should be understood that, the sequence numbers of the steps in the foregoing embodiments do not imply an execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present invention.
In the embodiment of the present invention, an action recognition apparatus is further provided; the action recognition apparatus includes modules for executing the steps in the embodiment corresponding to fig. 1. Please refer to the related description of that embodiment.
Fig. 5 is a schematic structural diagram of a motion recognition device according to a first embodiment of the present invention. As shown in fig. 5, the motion recognition apparatus 2 of this embodiment includes,
the acquisition module 21 is configured to acquire first motion data of a user through a sensor of a first earphone and acquire second motion data of the user through a sensor of a second earphone;
a determining module 22, configured to determine a first motion path according to the first motion data, and determine a second motion path according to the second motion data;
and the comparison module 23 is configured to compare and match the first motion path and the second motion path, and determine whether the first earphone and the second earphone are worn on the same user according to a comparison and matching result.
Fig. 6 is a schematic structural diagram of a motion recognition device according to a second embodiment of the present invention. Based on fig. 5, the determining module 22 includes,
the first determining submodule 221 is configured to calculate and generate a first motion path according to the first acceleration data and the first angle data; (ii) a
And a second determining submodule 222, configured to calculate and generate a second motion path according to the second acceleration data and the second angle data.
Fig. 7 is a schematic structural diagram of a motion recognition device according to a third embodiment of the present invention. Based on fig. 5, an identification module 24 is also included,
the identifying module 24 is configured to identify a head motion according to the first motion path and the second motion path.
Fig. 8 is a schematic structural diagram of a fourth embodiment of the motion recognition device according to the present invention. Based on fig. 7, further comprises an execution module 25,
the executing module 25 is configured to execute an operation instruction corresponding to the identified head action.
Fig. 9 is a schematic structural diagram of a fifth embodiment of the motion recognition device according to the present invention. Based on fig. 4, further comprises a judgment wearing module 26,
the wearing judgment module 26 is used for judging whether a user wears the first earphone and the second earphone;
the judgment wear module 26 further includes a first connector,
the first detection submodule 261 is configured to detect a first contact distance between the first earphone and the user, and determine whether the first contact distance is smaller than a preset first distance threshold;
the second detection submodule 262 is configured to detect a second contact distance between the second earphone and the user, and determine whether the second contact distance is smaller than a preset second distance threshold;
the determining submodule 263 is configured to determine that the user wears the first earphone and the second earphone if the first contact distance is smaller than the preset first distance threshold and the second contact distance is smaller than the preset second distance threshold.
Fig. 10 is a schematic diagram of a motion recognition device according to an embodiment of the present invention. As shown in fig. 10, the motion recognition device 6 includes a processor 60, a memory 61, and a computer program 62, such as a motion recognition implementation program, stored in the memory 61 and executable on the processor 60. The processor 60, when executing the computer program 62, implements the steps in each of the above-described embodiments of the motion recognition method, such as S101 to S103 shown in fig. 1. Alternatively, the processor 60, when executing the computer program 62, implements the functions of the modules/units in the above-mentioned device embodiments, such as the functions of the modules 21 to 23 shown in fig. 5.
Illustratively, the computer program 62 may be partitioned into one or more modules/units that are stored in the memory 61 and executed by the processor 60 to implement the present invention. The one or more modules/units may be a series of computer program instruction segments capable of performing specific functions, which are used to describe the execution process of the computer program 62 in the action recognition device 6. For example, the computer program 62 may be divided into an acquisition module, a determining module, and a comparison module (modules in a virtual device), and the specific functions of each module are as follows:
the acquisition module is used for acquiring first motion data of a user through a sensor of the first earphone and acquiring second motion data of the user through a sensor of the second earphone;
the determining module is used for determining a first motion path according to the first motion data and determining a second motion path according to the second motion data;
and the comparison module is used for comparing and matching the first motion path and the second motion path and determining whether the first earphone and the second earphone are worn on the same user according to the comparison and matching result.
The motion recognition device 6 may be a desktop computer, a notebook, a palm computer, a cloud server, or other computing device. The motion recognition device 6 may include, but is not limited to, a processor 60 and a memory 61. It will be understood by those skilled in the art that fig. 10 is only an example of the motion recognition device 6 and does not constitute a limitation on the motion recognition device 6, which may include more or fewer components than those shown, combine some components, or use different components; for example, the motion recognition device 6 may further include input-output devices, network access devices, a bus, etc.
The Processor 60 may be a Central Processing Unit (CPU), other general purpose Processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), an off-the-shelf Programmable Gate Array (FPGA) or other Programmable logic device, discrete Gate or transistor logic, discrete hardware components, etc. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
The memory 61 may be an internal storage unit of the motion recognition device 6, such as a hard disk or a memory of the motion recognition device 6. The memory 61 may also be an external storage device of the motion recognition device 6, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), and the like provided on the terminal device. Further, the memory 61 may include both an internal storage unit and an external storage device of the motion recognition apparatus 6. The memory 61 is used for storing the computer program and other programs and data required by the terminal device. The memory 61 may also be used to temporarily store data that has been output or is to be output.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned function distribution may be performed by different functional units and modules according to needs, that is, the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-mentioned functions. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working processes of the units and modules in the system may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
In the embodiments provided in the present invention, it should be understood that the disclosed apparatus/terminal device and method may be implemented in other ways. For example, the above-described embodiments of the apparatus/terminal device are merely illustrative, and for example, the division of the modules or units is only one logical division, and there may be other divisions when actually implemented, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated modules/units, if implemented in the form of software functional units and sold or used as separate products, may be stored in a computer readable storage medium. Based on such understanding, all or part of the flow of the method according to the embodiments of the present invention may also be implemented by a computer program, which may be stored in a computer-readable storage medium, and when the computer program is executed by a processor, the steps of the method embodiments may be implemented. Wherein the computer program comprises computer program code, which may be in the form of source code, object code, an executable file or some intermediate form, etc. The computer readable medium may comprise any entity or device capable of carrying the computer program code, a recording medium, a U disk, a removable hard disk, a magnetic disk, an optical disk, a computer Memory, a Read-Only Memory (ROM), a Random Access Memory (RAM), an electrical carrier signal, a telecommunications signal, a software distribution medium, etc. It should be noted that the computer readable medium may contain content that is subject to appropriate increase or decrease as required by legislation and patent practice in jurisdictions; for example, in some jurisdictions, computer readable media does not include electrical carrier signals and telecommunications signals.
The above-mentioned embodiments are only used for illustrating the technical solutions of the present invention, and not for limiting the same; although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present invention, and are intended to be included within the scope of the present invention.

Claims (12)

1. An action recognition method is characterized by comprising the following steps:
acquiring first motion data of a user through a sensor of a first earphone, and acquiring second motion data of the user through a sensor of a second earphone;
determining a first motion path according to the first motion data, and determining a second motion path according to the second motion data;
and comparing and matching the first motion path and the second motion path, and determining whether the first earphone and the second earphone are worn on the same user according to a comparison and matching result.
2. The motion recognition method of claim 1, wherein determining a first motion path based on the first motion data and a second motion path based on the second motion data comprises,
the first motion data comprise first acceleration data and first angle data, and a first motion path is generated through calculation according to the first acceleration data and the first angle data;
the second motion data comprises second acceleration data and second angle data, and a second motion path is generated according to the second acceleration data and the second angle data in a calculation mode.
3. The motion recognition method according to claim 1, wherein after the comparing and matching of the first motion path and the second motion path and the determining of whether the first earphone and the second earphone are worn on the same user according to the comparison and matching result, the method further comprises,
and recognizing the head action according to the first motion path and the second motion path.
4. The motion recognition method according to claim 3, wherein after recognizing the head motion based on the first motion path and/or the second motion path, further comprising,
and executing the operation instruction corresponding to the identified head action.
5. The motion recognition method of claim 1, wherein before the collecting of the first motion data of the user by the sensor of the first headset and the collecting of the second motion data of the user by the sensor of the second headset, further comprising,
judging whether a user wears a first earphone and a second earphone;
detecting a first contact distance between a first earphone and a user, and judging whether the first contact distance is smaller than a preset first distance threshold value;
detecting a second contact distance between a second earphone and a user, and judging whether the second contact distance is smaller than a preset second distance threshold value;
if the first contact distance is smaller than the preset first distance threshold value and the second contact distance is smaller than the preset second distance threshold value, the user wears the first earphone and the second earphone.
6. The motion recognition method according to claim 1, wherein the comparing and matching the first motion path and the second motion path, and determining whether the first earphone and the second earphone are worn on the same user according to the comparing and matching result comprises,
transmitting the first motion path to the second earphone by near-field magnetic induction technology, and comparing and matching the first motion path and the second motion path in the second earphone;
and determining whether the first earphone and the second earphone are worn on the same user or not according to the comparison and matching result.
7. The motion recognition method according to any one of claims 1 to 6, wherein the head motion comprises nodding, shaking, raising or lowering the head.
8. A motion recognition device is characterized by comprising,
the acquisition module is used for acquiring first motion data of a user through a sensor of the first earphone and acquiring second motion data of the user through a sensor of the second earphone;
the determining module is used for determining a first motion path according to the first motion data and determining a second motion path according to the second motion data;
and the comparison module is used for comparing and matching the first motion path and the second motion path and determining whether the first earphone and the second earphone are worn on the same user according to the comparison and matching result.
9. The motion recognition apparatus of claim 8, wherein the determination module comprises,
the first determining submodule is used for calculating and generating a first motion path according to the first acceleration data and the first angle data, wherein the first motion data comprises first acceleration data and first angle data;
and the second determining submodule is used for calculating and generating a second motion path according to the second acceleration data and the second angle data, wherein the second motion data comprises second acceleration data and second angle data.
10. The motion recognition apparatus according to claim 8, further comprising,
and the identification module is used for identifying the head action according to the first motion path and the second motion path.
11. An action recognition device comprising a memory, a processor and a computer program stored in the memory and executable on the processor, characterized in that the steps of the method according to any of claims 1 to 7 are implemented when the computer program is executed by the processor.
12. Computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out the steps of the method according to any one of claims 1 to 7.
CN201910241771.2A 2019-03-28 2019-03-28 Action recognition method and device and computer readable storage medium Pending CN111752372A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910241771.2A CN111752372A (en) 2019-03-28 2019-03-28 Action recognition method and device and computer readable storage medium


Publications (1)

Publication Number Publication Date
CN111752372A (en) 2020-10-09

Family

ID=72672092

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910241771.2A Pending CN111752372A (en) 2019-03-28 2019-03-28 Action recognition method and device and computer readable storage medium

Country Status (1)

Country Link
CN (1) CN111752372A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113257250A (en) * 2021-05-11 2021-08-13 歌尔股份有限公司 Fraud behavior detection method, device and storage medium

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140086438A1 (en) * 2012-09-26 2014-03-27 Sony Mobile Communications Inc. Control method of mobile terminal apparatus
CN107306369A (en) * 2016-04-20 2017-10-31 Lg电子株式会社 Portable sound devices
CN107331129A (en) * 2017-06-14 2017-11-07 前海随身宝(深圳)科技有限公司 A kind of loss determination methods and device based on motion sensor
US20180103321A1 (en) * 2016-10-10 2018-04-12 Samsung Electronics Co., Ltd. Output device outputting audio signal and control method thereof
CN108737922A (en) * 2018-05-21 2018-11-02 深圳市沃特沃德股份有限公司 Bluetooth headset play control method and Bluetooth headset
CN208540114U (en) * 2018-07-20 2019-02-22 深圳市冠旭电子股份有限公司 A kind of bluetooth headset based on NFMI technology
CN109379653A (en) * 2018-09-30 2019-02-22 Oppo广东移动通信有限公司 Audio frequency transmission method, device, electronic equipment and storage medium



Legal Events

Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (Application publication date: 20201009)