CN106371560B - Method and apparatus for determining blowing and inhaling actions - Google Patents

Method and apparatus for determining blowing and inhaling actions

Info

Publication number
CN106371560B
Authority
CN
China
Prior art keywords
action
information
target
blowing
amplitude value
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201510512499.9A
Other languages
Chinese (zh)
Other versions
CN106371560A (en)
Inventor
刘浩
周梁
于魁飞
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Zhigu Ruituo Technology Services Co Ltd
Original Assignee
Beijing Zhigu Ruituo Technology Services Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Zhigu Ruituo Technology Services Co Ltd filed Critical Beijing Zhigu Ruituo Technology Services Co Ltd
Priority to CN201510512499.9A priority Critical patent/CN106371560B/en
Publication of CN106371560A publication Critical patent/CN106371560A/en
Application granted granted Critical
Publication of CN106371560B publication Critical patent/CN106371560B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Abstract

The application provides a method and device for determining blowing and inhaling actions, and relates to the field of wearable devices. The method comprises: in response to a target action performed by a user, acquiring somatosensory information of the user, wherein the target action is a blowing action or an inhaling action; and determining the action type of the target action according to the somatosensory information and reference information. A method and device are thus provided for determining blowing and inhaling actions from somatosensory information, which helps free the user's hands during human-machine interaction and improves the interaction capability of electronic devices such as wearable devices.

Description

Method and apparatus for determining blowing and inhaling actions
Technical Field
The application relates to the field of wearable devices, and in particular to a method and device for determining blowing and inhaling actions.
Background
With the popularization and development of electronic devices, electronic devices have become increasingly intelligent. Continuously enhancing the interaction capability of electronic devices so that users can control them more conveniently and quickly has always been a goal of electronic device manufacturers.
Most existing electronic devices require the user to interact with them using the hands. In some cases, however, the user's hands are not free to interact, for example because they are occupied; in other cases, the user can see content displayed in a mixed mode but cannot touch it, making two-handed interaction inconvenient.
Therefore, it is necessary to provide more diversified interaction modes.
Disclosure of Invention
The purpose of this application is to provide a method and apparatus for determining blowing and inhaling actions.
According to a first aspect of at least one embodiment of the present application, there is provided a method for determining blowing and inhaling actions, the method including:
in response to a target action performed by a user, acquiring electromyographic information of an eye of the user, wherein the target action is a blowing action or an inhaling action;
and determining the action type of the target action according to the electromyographic information and electromyographic reference information.
With reference to any one of the possible implementation manners of the first aspect, in a second possible implementation manner, the determining, according to the electromyographic information and electromyographic reference information, an action type of the target action includes:
and determining the action type of the target action according to the electromyographic information and a reference amplitude value.
With reference to any one of the possible implementation manners of the first aspect, in a third possible implementation manner, the determining, according to the electromyographic information and a reference amplitude value, an action type of the target action includes:
and determining the action type of the target action as the blowing action in response to an amplitude value of the electromyographic information being greater than a first amplitude value.
With reference to any one of the possible implementation manners of the first aspect, in a fourth possible implementation manner, the determining, according to the electromyographic information and a reference amplitude value, an action type of the target action includes:
and determining the action type of the target action as an inhaling action in response to an amplitude value of the electromyographic information being smaller than a second amplitude value.
With reference to any one of the possible implementation manners of the first aspect, in a fifth possible implementation manner, the determining, according to the electromyographic information and electromyographic reference information, an action type of the target action includes:
and determining the action type of the target action according to the electromyographic information and a reference waveform.
With reference to any one of the possible implementation manners of the first aspect, in a sixth possible implementation manner, the determining, according to the electromyographic information and electromyographic reference information, an action type of the target action includes:
and determining the action type of the target action according to the electromyographic information and a reference signal characteristic.
With reference to any one of the possible implementation manners of the first aspect, in a seventh possible implementation manner, the method further includes:
and determining the intensity information of the target action according to at least one amplitude value of the electromyographic information.
With reference to any one of the possible implementation manners of the first aspect, in an eighth possible implementation manner, the determining the intensity information of the target action according to the at least one amplitude value of the electromyographic information includes:
in response to the action type of the target action being a blowing action, determining the maximum intensity value of the blowing action according to the maximum amplitude value of the electromyographic information;
and in response to the action type of the target action being an inhaling action, determining the maximum intensity value of the inhaling action according to the minimum amplitude value of the electromyographic information.
With reference to any one of the possible implementation manners of the first aspect, in a ninth possible implementation manner, the method further includes:
and determining first input information corresponding to the intensity information.
With reference to any one of the possible implementation manners of the first aspect, in a tenth possible implementation manner, the method further includes:
and determining the number of times of the target action according to the number of reference waveforms corresponding to the action type that are included in the electromyographic information.
With reference to any one of the possible implementation manners of the first aspect, in an eleventh possible implementation manner, the method further includes:
and determining the number of times of the target action according to the number of reference signal characteristics corresponding to the action type that are included in the electromyographic information.
With reference to any one of the possible implementation manners of the first aspect, in a twelfth possible implementation manner, the method further includes:
and determining second input information corresponding to the number of times of the target action.
With reference to any one of the possible implementation manners of the first aspect, in a thirteenth possible implementation manner, the method further includes:
and determining third input information corresponding to the action type of the target action.
According to a second aspect of at least one embodiment of the present application, there is provided a method for determining blowing and inhaling actions, including:
in response to a target action performed by a user, acquiring somatosensory information of the user, wherein the target action is a blowing action or an inhaling action;
and determining the action type of the target action according to the somatosensory information and reference information.
According to a third aspect of at least one embodiment of the present application, there is provided an apparatus for determining blowing and inhaling actions, including:
an acquisition module, configured to acquire, in response to a target action performed by a user, electromyographic information of an eye of the user, wherein the target action is a blowing action or an inhaling action;
and the action type determining module is used for determining the action type of the target action according to the electromyographic information and electromyographic reference information.
With reference to any one of the possible implementation manners of the third aspect, in a second possible implementation manner, the action type determining module is configured to determine the action type of the target action according to the electromyographic information and a reference amplitude value.
With reference to any one of the possible implementation manners of the third aspect, in a third possible implementation manner, the action type determining module is configured to determine that the action type of the target action is a blowing action in response to an amplitude value of the electromyographic information being greater than a first amplitude value.
With reference to any one of the possible implementation manners of the third aspect, in a fourth possible implementation manner, the action type determining module is configured to determine that the action type of the target action is an inhaling action in response to an amplitude value of the electromyographic information being smaller than a second amplitude value.
With reference to any one of the possible implementation manners of the third aspect, in a fifth possible implementation manner, the action type determining module is configured to determine an action type of the target action according to the electromyographic information and a reference waveform.
With reference to any one of the possible implementation manners of the third aspect, in a sixth possible implementation manner, the action type determining module is configured to determine an action type of the target action according to the electromyographic information and a reference signal characteristic.
With reference to any one possible implementation manner of the third aspect, in a seventh possible implementation manner, the apparatus further includes:
and the intensity information determining module is used for determining the intensity information of the target action according to at least one amplitude value of the electromyographic information.
With reference to any one of the possible implementation manners of the third aspect, in an eighth possible implementation manner, the intensity information determining module includes:
a first determining unit, configured to determine, in response to the action type of the target action being a blowing action, the maximum intensity value of the blowing action according to the maximum amplitude value of the electromyographic information;
and a second determining unit, configured to determine, in response to the action type of the target action being an inhaling action, the maximum intensity value of the inhaling action according to the minimum amplitude value of the electromyographic information.
With reference to any one possible implementation manner of the third aspect, in a ninth possible implementation manner, the apparatus further includes:
and the first input information determining module is used for determining first input information corresponding to the intensity information.
With reference to any one possible implementation manner of the third aspect, in a tenth possible implementation manner, the apparatus further includes:
and the times determining module is used for determining the number of times of the target action according to the number of reference waveforms corresponding to the action type that are included in the electromyographic information.
With reference to any one possible implementation manner of the third aspect, in an eleventh possible implementation manner, the apparatus further includes:
and the times determining module is used for determining the number of times of the target action according to the number of reference signal characteristics corresponding to the action type that are included in the electromyographic information.
With reference to any one possible implementation manner of the third aspect, in a twelfth possible implementation manner, the apparatus further includes:
and the second input information determining module is used for determining second input information corresponding to the number of times of the target action.
With reference to any one possible implementation manner of the third aspect, in a thirteenth possible implementation manner, the apparatus further includes:
and the third input information determining module is used for determining third input information corresponding to the action type of the target action.
According to a fourth aspect of at least one embodiment of the present application, there is provided an apparatus for determining blowing and inhaling actions, including:
an acquisition module, configured to acquire, in response to a target action performed by a user, somatosensory information of the user, wherein the target action is a blowing action or an inhaling action;
and the action type determining module is used for determining the action type of the target action according to the somatosensory information and reference information.
According to a fifth aspect of at least one embodiment of the present application, there is provided a user equipment, including:
an electromyographic sensor;
a memory for storing instructions;
a processor, configured to execute the instructions stored in the memory, the instructions causing the processor to perform:
in response to a target action performed by a user, acquiring electromyographic information of an eye of the user, wherein the target action is a blowing action or an inhaling action;
and determining the action type of the target action according to the electromyographic information and electromyographic reference information.
According to a sixth aspect of at least one embodiment of the present application, there is provided a user equipment, including:
a somatosensory sensor;
a memory for storing instructions;
a processor, configured to execute the instructions stored in the memory, the instructions causing the processor to perform:
in response to a target action performed by a user, acquiring somatosensory information of the user, wherein the target action is a blowing action or an inhaling action;
and determining the action type of the target action according to the somatosensory information and reference information.
According to the method and device for determining blowing and inhaling actions of the embodiments of the present application, in response to a target action performed by a user, somatosensory information of the user is acquired, wherein the target action is a blowing action or an inhaling action, and the action type of the target action is then determined according to the somatosensory information and reference information. A method is thus provided for recognizing the blowing and inhaling actions of the user from somatosensory information, which frees the hands during human-machine interaction and improves the interaction capability of electronic devices.
Drawings
Fig. 1 is a flowchart of a method for determining blowing and inhaling actions according to an embodiment of the present application;
Fig. 2 is a schematic diagram of electrode positions for acquiring electroencephalogram information according to the 10/20 system;
Fig. 3 is a waveform diagram of electroencephalogram information when a blowing action and an inhaling action are each performed with a small force, according to an embodiment of the present application;
Fig. 4 is a waveform diagram of electroencephalogram information when a blowing action and an inhaling action are each performed with a greater force, according to an embodiment of the present application;
Fig. 5 is a schematic diagram of an acquisition position of electro-oculogram information in another embodiment of the present application;
Fig. 6 is a waveform diagram of electro-oculogram information when an inhaling action and a blowing action are each performed with a small force, according to an embodiment of the present application;
Fig. 7 is a waveform diagram of electro-oculogram information when an inhaling action and a blowing action are each performed with a greater force, according to an embodiment of the present application;
Fig. 8 is a schematic diagram of an acquisition position of eye electromyographic information in another embodiment of the present application;
Fig. 9 is a waveform diagram of eye electromyographic information when an inhaling action and a blowing action are each performed with a small force, according to an embodiment of the present application;
Fig. 10 is a waveform diagram of eye electromyographic information when an inhaling action and a blowing action are each performed with a greater force, according to an embodiment of the present application;
Fig. 11 is a block diagram of an apparatus for determining blowing and inhaling actions according to an embodiment of the present application;
Fig. 12 is a block diagram of an apparatus for determining blowing and inhaling actions according to an embodiment of the present application;
Fig. 13 is a block diagram of the intensity information determining module in one embodiment of the present application;
Fig. 14 is a block diagram of an apparatus for determining blowing and inhaling actions according to another embodiment of the present application;
Fig. 15 is a block diagram of an apparatus for determining blowing and inhaling actions according to another embodiment of the present application;
Fig. 16 is a block diagram of an apparatus for determining blowing and inhaling actions according to another embodiment of the present application;
Fig. 17 is a block diagram of an apparatus for determining blowing and inhaling actions according to another embodiment of the present application;
Fig. 18 is a block diagram of an apparatus for determining blowing and inhaling actions according to another embodiment of the present application;
Fig. 19 is a block diagram of the intensity information determining module in another embodiment of the present application;
Fig. 20 is a block diagram of an apparatus for determining blowing and inhaling actions according to another embodiment of the present application;
Fig. 21 is a block diagram of an apparatus for determining blowing and inhaling actions according to another embodiment of the present application;
Fig. 22 is a block diagram of an apparatus for determining blowing and inhaling actions according to another embodiment of the present application;
Fig. 23 is a block diagram of an apparatus for determining blowing and inhaling actions according to another embodiment of the present application;
Fig. 24 is a block diagram of an apparatus for determining blowing and inhaling actions according to another embodiment of the present application;
Fig. 25 is a block diagram of the intensity information determining module in another embodiment of the present application;
Fig. 26 is a block diagram of an apparatus for determining blowing and inhaling actions according to another embodiment of the present application;
Fig. 27 is a block diagram of an apparatus for determining blowing and inhaling actions according to another embodiment of the present application;
Fig. 28 is a block diagram of an apparatus for determining blowing and inhaling actions according to another embodiment of the present application;
Fig. 29 is a block diagram of an apparatus for determining blowing and inhaling actions according to another embodiment of the present application;
Fig. 30 is a block diagram of an apparatus for determining blowing and inhaling actions according to another embodiment of the present application;
Fig. 31 is a schematic diagram of a hardware structure of a user equipment according to an embodiment of the present application;
Fig. 32 is a schematic diagram of a hardware structure of a user equipment according to another embodiment of the present application.
Detailed Description
The following detailed description of embodiments of the present application will be made with reference to the accompanying drawings and examples. The following examples are intended to illustrate the present application but are not intended to limit the scope of the present application.
Those skilled in the art will understand that, in the embodiments of the present application, the sequence numbers of the steps described below do not imply an execution order; the execution order of the steps should be determined by their functions and internal logic, and should not be construed as limiting the implementation of the embodiments of the present application in any way.
During research, the inventor found that when a user performs a blowing action and an inhaling action with the mouth, the somatosensory information correspondingly collected from the user's body differs markedly between the two actions. On this basis, the blowing and inhaling actions of the user can be identified according to the user's somatosensory information.
Fig. 1 is a flowchart of a method for determining blowing and inhaling actions according to an embodiment of the present application. The method may be implemented, for example, on an apparatus for determining blowing and inhaling actions. As shown in fig. 1, the method includes:
S120: in response to a target action performed by a user, acquiring somatosensory information of the user, wherein the target action is a blowing action or an inhaling action;
S140: determining the action type of the target action according to the somatosensory information and reference information.
According to the method, in response to a target action performed by a user, the somatosensory information of the user is acquired, wherein the target action is a blowing action or an inhaling action, and the action type of the target action is then determined according to the somatosensory information and reference information. A method is thus provided for recognizing the blowing and inhaling actions of the user from somatosensory information, which frees the hands during human-machine interaction and improves the interaction capability of electronic devices.
The functions of steps S120 and S140 will be described in detail below with reference to specific embodiments.
S120: the method comprises the steps of responding to a target action executed by a user, and acquiring the body feeling information of the user, wherein the target action is a blowing action or a sucking action.
The somatosensory information can be electroencephalogram information, electro-oculogram information or eye myoelectricity information, and can be acquired through corresponding sensors. Each of the somatosensory information will be described in detail below.
S140: and determining the action type of the target action according to the somatosensory information and reference information.
The action type of the target action is one of two types, namely the blowing action and the inhaling action. That is, this step accurately identifies whether the target action is a blowing action or an inhaling action based on the somatosensory information and the reference information.
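As an illustrative Python sketch of the two steps (the function names are hypothetical, the sensor-reading call is a placeholder rather than an interface defined by the application, and the classification anticipates the amplitude-threshold variant described below):

```python
import numpy as np

def acquire_somatosensory_info(sensor):
    """S120: read one window of samples (EEG, EOG, or eye EMG) from a sensor.

    sensor.read_window() is a hypothetical placeholder for whatever acquisition
    interface the device actually provides.
    """
    return np.asarray(sensor.read_window(), dtype=float)

def determine_action_type(samples, reference_info):
    """S140: classify the target action as blowing or inhaling.

    reference_info carries calibrated thresholds (see the amplitude-based sketch
    further below); other variants of S140 compare against reference waveforms
    or signal characteristics instead.
    """
    if samples.max() > reference_info["first_amplitude"]:
        return "blow"
    if samples.min() < reference_info["second_amplitude"]:
        return "inhale"
    return None  # no blowing or inhaling action detected in this window
```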
a) In one embodiment, the somatosensory information is electroencephalogram information. Correspondingly, the step S140 further includes:
S140a: determining the action type of the target action according to the electroencephalogram information and electroencephalogram reference information.
The electroencephalogram information may be, for example, EEG (electroencephalography) information of the user, and may be acquired by, for example, an EEG sensor. In addition, the inventor has found through research that when the electroencephalogram information is acquired according to the 10/20 system method shown in fig. 2, the electroencephalogram information of different regions is affected to different degrees by the blowing and inhaling actions of the user, and the electroencephalogram information of the C3 and C4 regions is affected more obviously by the blowing and inhaling actions. In one embodiment, in step S120, the acquiring of the somatosensory information of the user further includes:
S120a: acquiring the electroencephalogram information at the C3 and/or C4 regions of the user's brain.
Fig. 3 is a waveform diagram of electroencephalogram information when a user performs an inhaling action and a blowing action, respectively. The abscissa represents time, the ordinate represents waveform amplitude values, dashed boxes A and B correspond to the waveform when the user is blowing, and dashed boxes C and D correspond to the waveform when the user is inhaling. As can be seen, when the user blows, the waveform amplitude value of the electroencephalogram information increases obviously; when the user inhales, the waveform amplitude value of the electroencephalogram information decreases obviously. Specifically, assuming that the waveform of the electroencephalogram information when the user does not perform any action is a standard electroencephalogram waveform (corresponding to the waveform outside the 4 dashed boxes in fig. 3), the amplitude values of a plurality of points on the waveform when the user blows are significantly higher than the peak amplitude value of the standard electroencephalogram waveform, and the amplitude values of a plurality of points on the waveform when the user inhales are significantly lower than the trough amplitude value of the standard electroencephalogram waveform. Based on this principle, the blowing and inhaling actions of the user can be recognized.
In one embodiment, the step S140a may include:
S140a': determining the action type of the target action according to the electroencephalogram information and a reference amplitude value.
The reference amplitude value can be set according to a peak amplitude value or a trough amplitude value of the standard electroencephalogram waveform.
In one embodiment, the step S140 a' further includes:
S141a': determining the action type of the target action as a blowing action in response to an amplitude value of the electroencephalogram information being greater than a first amplitude value.
The first amplitude value may be set according to the peak amplitude value of the standard electroencephalogram waveform; for example, an average value (denoted as a first average value) of the peak amplitude values of a plurality of standard electroencephalogram waveforms may be calculated as the first amplitude value. To reduce false identifications, the first amplitude value may be further increased.
In this step, in order to improve the real-time performance of the method and simplify the calculation, the electroencephalogram information may be scanned segment by segment in a predetermined time window, and the maximum amplitude value of the electroencephalogram information in each time window is compared with the first amplitude value; if the maximum amplitude value is greater than the first amplitude value, the action type of the target action is determined to be a blowing action.
In addition, as understood by those skilled in the art, if no amplitude value in the electroencephalogram information is greater than the first amplitude value, the action type of the target action may be determined to be an inhaling action.
In another embodiment, the step S140 a' further includes:
S142a': determining the action type of the target action as an inhaling action in response to an amplitude value of the electroencephalogram information being smaller than a second amplitude value.
The second amplitude value may be set according to the trough amplitude value of the standard electroencephalogram waveform; for example, an average value (denoted as a second average value) of the trough amplitude values of a plurality of standard electroencephalogram waveforms may be calculated as the second amplitude value. To reduce false identifications, the second amplitude value may be further reduced.
In this step, in order to improve the real-time performance of the method and simplify the calculation, the electroencephalogram information may be scanned segment by segment in a predetermined time window, and the minimum amplitude value of the electroencephalogram information in each time window is compared with the second amplitude value; if the minimum amplitude value is smaller than the second amplitude value, the action type of the target action is determined to be an inhaling action.
In addition, as understood by those skilled in the art, if no amplitude value in the electroencephalogram information is smaller than the second amplitude value, the action type of the target action may be determined to be a blowing action.
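A minimal sketch of S141a'/S142a' under stated assumptions: the electroencephalogram samples are available as a NumPy array, the first and second amplitude values are calibrated from a baseline (no-action) recording as described above, and the margin factor and window length are illustrative choices, not values given in the application:

```python
import numpy as np

def calibrate_thresholds(baseline, margin=1.2):
    """Derive the first/second amplitude values from a standard (no-action) waveform.

    The first amplitude value is based on the mean local peak amplitude and the
    second on the mean local trough amplitude; `margin` enlarges them slightly to
    reduce false identifications (assuming a roughly zero-centered signal, so
    scaling a negative trough mean lowers the second threshold further).
    """
    interior = baseline[1:-1]
    peaks = interior[(interior > baseline[:-2]) & (interior > baseline[2:])]
    troughs = interior[(interior < baseline[:-2]) & (interior < baseline[2:])]
    first_amplitude = peaks.mean() * margin     # set somewhat above the first average
    second_amplitude = troughs.mean() * margin  # set somewhat below the second average
    return first_amplitude, second_amplitude

def classify_by_window(samples, first_amplitude, second_amplitude, window=256):
    """S141a'/S142a': scan the signal segment by segment in a predetermined time window."""
    for start in range(0, len(samples) - window + 1, window):
        segment = samples[start:start + window]
        if segment.max() > first_amplitude:
            return "blow"    # maximum amplitude in this window exceeds the first amplitude value
        if segment.min() < second_amplitude:
            return "inhale"  # minimum amplitude in this window falls below the second amplitude value
    return None
```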
In another embodiment, the step S140a may include:
S140a'': determining the action type of the target action according to the electroencephalogram information and a reference waveform.
The reference waveform may be a waveform corresponding to a blowing action (referred to as a blowing reference waveform) or a waveform corresponding to an inhaling action (referred to as an inhaling reference waveform). In this step, the similarity between the waveform of the electroencephalogram information and the reference waveform may be calculated according to an existing waveform similarity calculation method, and the calculation result is compared with a waveform threshold value to determine the action type of the target action. For example, assuming that the reference waveform is a blowing reference waveform, if the similarity between the waveform of the electroencephalogram information and the blowing reference waveform is greater than the waveform threshold, the action type of the target action is a blowing action; otherwise, it is an inhaling action.
To improve recognition accuracy, the blowing reference waveform and the inhaling reference waveform may each include a plurality of sub-waveforms. For example, the blowing reference waveform may include a plurality of sub-waveforms corresponding to different blowing intensities, so that in this step the waveform of the electroencephalogram information is compared for similarity with the plurality of sub-waveforms one by one, and if any calculation result is greater than the waveform threshold, the action type is identified as a blowing action.
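A sketch of the waveform-matching variant S140a''. Normalized correlation is used here as one example of an existing waveform-similarity measure; the reference sub-waveforms and the waveform threshold are assumed to have been prepared in advance:

```python
import numpy as np

def waveform_similarity(signal, reference):
    """Normalized correlation between a signal segment and a reference waveform."""
    n = min(len(signal), len(reference))
    a = signal[:n] - np.mean(signal[:n])
    b = reference[:n] - np.mean(reference[:n])
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float(a @ b) / denom if denom else 0.0

def classify_by_waveform(signal, blowing_subwaveforms, waveform_threshold):
    """S140a'': match the signal against blowing reference sub-waveforms.

    Each sub-waveform corresponds to a different blowing intensity; if any
    similarity exceeds the threshold the action is a blowing action, otherwise
    it is treated as an inhaling action, mirroring the example in the text.
    """
    if any(waveform_similarity(signal, ref) > waveform_threshold
           for ref in blowing_subwaveforms):
        return "blow"
    return "inhale"
```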
In another embodiment, the step S140a may include:
S140a''': determining the action type of the target action according to the electroencephalogram information and a reference signal characteristic.
The reference signal characteristic may be a signal characteristic corresponding to a blowing action (referred to as a blowing reference signal characteristic) or a signal characteristic corresponding to an inhaling action (referred to as an inhaling reference signal characteristic). In this step, similarity calculation may be performed on the signal characteristic of the electroencephalogram information and the reference signal characteristic, and the calculation result is compared with a characteristic threshold value to determine the action type of the target action. For example, assuming that the reference signal characteristic is a blowing reference signal characteristic, if the similarity between the signal characteristic of the electroencephalogram information and the blowing reference signal characteristic is greater than the characteristic threshold, the action type of the target action is a blowing action; otherwise, it is an inhaling action.
Wherein the signal characteristics are related to at least one of amplitude, phase, frequency spectrum of the respective waveform. For example, the signal characteristics of the electroencephalogram information include: at least one of a fingerprint, an average, and a difference; the fingerprint consists of at least one item of amplitude, phase and frequency spectrum of the electroencephalogram information; the average value is the average value of at least one item of amplitude, phase and frequency spectrum of the electroencephalogram information; the difference is the difference of at least one item of amplitude, phase and frequency spectrum of the electroencephalogram information. Similarly, the blowing reference signal characteristics also include at least one of fingerprint, average value and difference; the fingerprint consists of at least one of amplitude, phase and frequency spectrum of the corresponding waveform of the blowing air; the average value is the average value of at least one of amplitude, phase and frequency spectrum of the corresponding waveform of the blowing air; the difference is the difference of at least one of the amplitude, the phase and the frequency spectrum of the corresponding waveform of the blowing air.
To improve identification accuracy, the blowing reference signal characteristic and the inhaling reference signal characteristic may each include a plurality of sub-signal characteristics. For example, the blowing reference signal characteristic may include a plurality of sub-signal characteristics corresponding to different blowing intensities, so that in this step the signal characteristic of the electroencephalogram information is compared for similarity with the plurality of sub-signal characteristics one by one, and if any calculation result is greater than the characteristic threshold, the action type is identified as a blowing action.
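For the signal-characteristic variant S140a''', one possible (assumed) feature vector combines the quantities named above, namely an amplitude average, an amplitude difference, and a coarse frequency spectrum; cosine similarity then plays the role of the comparison against the characteristic threshold:

```python
import numpy as np

def signal_features(signal, n_bins=8):
    """Build a simple feature vector: mean amplitude, amplitude spread, coarse spectrum.

    The exact composition is an assumption for illustration; the application only
    requires the features to relate to amplitude, phase, or frequency spectrum.
    """
    spectrum = np.abs(np.fft.rfft(signal))
    spectral = np.array([chunk.mean() for chunk in np.array_split(spectrum, n_bins)])
    stats = np.array([np.mean(signal), np.max(signal) - np.min(signal)])
    return np.concatenate([stats, spectral])

def feature_similarity(features_a, features_b):
    """Cosine similarity between two feature vectors, to be compared with the characteristic threshold."""
    denom = np.linalg.norm(features_a) * np.linalg.norm(features_b)
    return float(features_a @ features_b) / denom if denom else 0.0
```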
Fig. 4 is a waveform diagram of electroencephalogram information when a user performs a blowing action and an inhaling action with a greater force, respectively. The abscissa represents time, the ordinate represents waveform amplitude values, dashed boxes A' and B' correspond to the waveform when the user is blowing, and dashed boxes C' and D' correspond to the waveform when the user is inhaling. Comparing the waveform in dashed box A' (or B') with that in dashed box A (or B) in fig. 3, it can be seen that the maximum amplitude value of the waveform increases significantly as the user's blowing strength increases. Comparing the waveform in dashed box C' (or D') with that in dashed box C (or D) in fig. 3, it can be seen that the minimum amplitude value of the waveform decreases significantly as the user's inhaling strength increases. Thus, in one embodiment, the method may further comprise:
S150a: determining the intensity information of the target action according to at least one amplitude value of the electroencephalogram information.
The intensity information of the target action may be intensity values at a plurality of action moments, that is, intensity values corresponding to a plurality of sampling points of the electroencephalogram information, or may be the maximum intensity value of the target action. The intensity information may correspond, for example, to a pressure.
Specifically, in this step, the intensity information of the target action may be determined according to at least one amplitude value of the electroencephalogram information and a predetermined correspondence. The predetermined correspondence may be a correspondence, determined in advance through training, between an amplitude value of the electroencephalogram information and an intensity value of the target action.
In one embodiment, the step S150a further comprises:
S151a: in response to the action type of the target action being the blowing action, determining the maximum intensity value of the blowing action according to the maximum amplitude value of the electroencephalogram information;
S152a: in response to the action type of the target action being an inhaling action, determining the maximum intensity value of the inhaling action according to the minimum amplitude value of the electroencephalogram information.
The inventor has found through research that when the user's blowing action reaches its maximum intensity value, the amplitude value of the corresponding electroencephalogram information also reaches its maximum, corresponding to the wave peaks within dashed boxes A and B in fig. 3. Therefore, in step S151a, the maximum intensity value of the blowing action may be determined according to the maximum amplitude value of the electroencephalogram information.
The inventor has also found that when the user's inhaling action reaches its maximum intensity value, the corresponding amplitude value of the electroencephalogram information decreases to its minimum, corresponding to the troughs within dashed boxes C and D in fig. 3. Therefore, in step S152a, the maximum intensity value of the inhaling action may be determined according to the minimum amplitude value of the electroencephalogram information.
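A sketch of S150a/S151a/S152a, assuming the predetermined correspondence obtained through training is stored as sorted (amplitude, intensity) calibration pairs; linear interpolation between those points is an illustrative choice, not something specified by the application:

```python
import numpy as np

def intensity_from_amplitude(amplitude, calibrated_amplitudes, calibrated_intensities):
    """Map an amplitude value to an intensity value using the trained correspondence.

    calibrated_amplitudes must be sorted in ascending order; values outside the
    calibrated range are clamped to the end points by np.interp.
    """
    return float(np.interp(amplitude, calibrated_amplitudes, calibrated_intensities))

def max_intensity(samples, action_type, blowing_calibration, inhaling_calibration):
    """S151a/S152a: maximum intensity value of the target action.

    The maximum amplitude is used for a blowing action and the minimum amplitude
    for an inhaling action, since the waveform dips below the baseline trough when inhaling.
    """
    if action_type == "blow":
        return intensity_from_amplitude(samples.max(), *blowing_calibration)
    return intensity_from_amplitude(samples.min(), *inhaling_calibration)
```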
In one embodiment, the method may further comprise:
S160a: determining first input information corresponding to the intensity information.
The correspondence between the intensity information and the first input information may be preset. For example, the intensity information may be used as input information of an electronic game, with different intensity values corresponding to different input strengths in the game.
In some cases, the user may blow or inhale multiple times in succession to input different input information, such as one blow to input a select command and two blows to input an open command. Thus, in one embodiment, the method may further comprise:
S170a: determining the number of times of the target action according to the number of reference waveforms corresponding to the action type that are included in the electroencephalogram information.
In this step, if the action type is the blowing action, the number of blowing actions is determined according to the number of blowing reference waveforms included in the electroencephalogram information; if the action type is an inhaling action, the number of inhaling actions is determined according to the number of inhaling reference waveforms included in the electroencephalogram information.
In another embodiment, the method may further comprise:
S170a': determining the number of times of the target action according to the number of reference signal characteristics corresponding to the action type that are included in the electroencephalogram information.
In this step, if the action type is the blowing action, the number of blowing actions is determined according to the number of blowing reference signal characteristics included in the electroencephalogram information; if the action type is an inhaling action, the number of inhaling actions is determined according to the number of inhaling reference signal characteristics included in the electroencephalogram information.
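A sketch of S170a/S170a': the number of blowing or inhaling actions is estimated by counting how many disjoint segments of the signal match the corresponding reference, re-using the waveform_similarity() helper from the earlier sketch; the hop size is an assumed value:

```python
def count_actions(signal, reference_waveform, waveform_threshold, hop=128):
    """S170a/S170a': count segments of the signal that match the reference waveform.

    A matched segment is skipped in full to avoid counting the same action twice.
    """
    count = 0
    window = len(reference_waveform)
    start = 0
    while start + window <= len(signal):
        if waveform_similarity(signal[start:start + window], reference_waveform) > waveform_threshold:
            count += 1
            start += window
        else:
            start += hop
    return count
```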
In one embodiment, the method may further comprise:
S180a: determining second input information corresponding to the number of times of the target action.
The correspondence between the number of times of the target action and the second input information may be preset; for example, it may be preset and stored in a table as shown in Table 1, and in this step the second input information corresponding to the number of times of the target action may be determined by looking up the table (see the sketch after Table 1).
TABLE 1

Number of times of the target action    Second input information
1 blow                                  Select command
2 blows                                 Open command
1 inhale                                Switch to the next one
2 inhales                               Back to the previous one
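The correspondence in Table 1 can be held in a simple lookup keyed by action type and count; the command strings below mirror the table and are otherwise arbitrary:

```python
SECOND_INPUT_TABLE = {
    ("blow", 1): "select",
    ("blow", 2): "open",
    ("inhale", 1): "switch to next",
    ("inhale", 2): "back to previous",
}

def second_input_information(action_type, times):
    """S180a: look up the second input information for the detected action type and count."""
    return SECOND_INPUT_TABLE.get((action_type, times))
```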
b) In another embodiment, the somatosensory information is electro-oculogram information. Correspondingly, the step S140 further includes:
S140b: determining the action type of the target action according to the electro-oculogram information and electro-oculogram reference information.
The electro-oculogram information may be, for example, EOG (electro-oculography) information of the user, and may be acquired by, for example, an EOG sensor. Fig. 5 shows a schematic diagram of an acquisition position of the electro-oculogram information, wherein the positions of the two circular electrodes correspond to the acquisition positions. Of course, those skilled in the art will appreciate that the electro-oculogram information may also be collected above and below the left eye of the user, in addition to above and below the right eye as shown in fig. 5.
Fig. 6 is a waveform diagram of electro-oculogram information when a user performs an inhaling action and a blowing action, respectively. The abscissa represents time, the ordinate represents waveform amplitude values, dashed boxes E and F correspond to the waveform when the user is blowing, and dashed boxes G and H correspond to the waveform when the user is inhaling. It can be seen that when the user blows, the waveform amplitude value of the electro-oculogram information increases obviously; when the user inhales, the waveform amplitude value of the electro-oculogram information decreases obviously. Specifically, assuming that the waveform of the electro-oculogram information when the user does not perform any action is a standard electro-oculogram waveform (corresponding to the waveform outside the 4 dashed boxes in fig. 6), the amplitude values of a plurality of points on the waveform when the user blows are significantly higher than the peak amplitude value of the standard electro-oculogram waveform, and the amplitude values of a plurality of points on the waveform when the user inhales are significantly lower than the trough amplitude value of the standard electro-oculogram waveform. Based on this principle, the blowing and inhaling actions of the user can be recognized.
In one embodiment, the step S140b may include:
S140b': determining the action type of the target action according to the electro-oculogram information and a reference amplitude value.
The reference amplitude value can be set according to a peak amplitude value or a trough amplitude value of the standard electrooculogram waveform.
In one embodiment, the step S140 b' further includes:
S141b': determining the action type of the target action as a blowing action in response to an amplitude value of the electro-oculogram information being greater than a first amplitude value.
The first amplitude value may be set according to a peak amplitude value of the standard electro-ocular waveform, for example, an average value (denoted as a first average value) of peak amplitude values of a plurality of standard electro-ocular waveforms may be calculated as the first amplitude value. To reduce false identifications, the first amplitude value may be further increased.
In this step, in order to improve the real-time processing performance of the method and simplify the calculation, the electrooculogram information may be scanned segment by segment in a predetermined time window, the maximum amplitude value of the electrooculogram information in each time window is compared with the first amplitude value, and if the maximum amplitude value is larger than the first amplitude value, the motion type of the target motion is determined to be the blowing motion.
In addition, as understood by those skilled in the art, if no amplitude value in the electro-oculogram information is greater than the first amplitude value, the action type of the target action may be determined to be an inhaling action.
In another embodiment, the step S140 b' further includes:
S142b': determining the action type of the target action as an inhaling action in response to an amplitude value of the electro-oculogram information being smaller than a second amplitude value.
The second amplitude value may be set according to the valley amplitude value of the standard electro-ocular waveform, for example, an average value (denoted as a second average value) of the valley amplitude values of a plurality of standard electro-ocular waveforms may be calculated as the second amplitude value. To reduce false identifications, the second amplitude value may be further reduced.
In this step, in order to improve the real-time processing performance of the method and simplify the calculation, the electro-ocular information may be scanned segment by segment in a predetermined time window, the minimum amplitude value of the electro-ocular information in each time window is compared with the second amplitude value, and if the minimum amplitude value is smaller than the second amplitude value, the motion type of the target motion is determined to be an inhalation motion.
In addition, as understood by those skilled in the art, if no amplitude value in the electro-oculogram information is smaller than the second amplitude value, the action type of the target action may be determined to be a blowing action.
In another embodiment, the step S140b may include:
S140b'': determining the action type of the target action according to the electro-oculogram information and a reference waveform.
The reference waveform may be a waveform corresponding to a blowing action (referred to as a blowing reference waveform) or a waveform corresponding to an inhaling action (referred to as an inhaling reference waveform). In this step, the similarity between the waveform of the electro-oculogram information and the reference waveform may be calculated according to an existing waveform similarity calculation method, and the calculation result is compared with a waveform threshold value to determine the action type of the target action. For example, assuming that the reference waveform is a blowing reference waveform, if the similarity between the waveform of the electro-oculogram information and the blowing reference waveform is greater than the waveform threshold, the action type of the target action is a blowing action; otherwise, it is an inhaling action.
To improve recognition accuracy, the blowing reference waveform and the inhaling reference waveform may each include a plurality of sub-waveforms. For example, the blowing reference waveform may include a plurality of sub-waveforms corresponding to different blowing intensities, so that in this step the waveform of the electro-oculogram information is compared for similarity with the plurality of sub-waveforms one by one, and if any calculation result is greater than the waveform threshold, the action type is identified as a blowing action.
In another embodiment, the step S140b may include:
S140b''': determining the action type of the target action according to the electro-oculogram information and a reference signal characteristic.
The reference signal characteristic may be a signal characteristic corresponding to a blowing action (referred to as a blowing reference signal characteristic) or a signal characteristic corresponding to an inhaling action (referred to as an inhaling reference signal characteristic). In this step, similarity calculation may be performed on the signal characteristic of the electro-oculogram information and the reference signal characteristic, and the calculation result is compared with a characteristic threshold value to determine the action type of the target action. For example, assuming that the reference signal characteristic is a blowing reference signal characteristic, if the similarity between the signal characteristic of the electro-oculogram information and the blowing reference signal characteristic is greater than the characteristic threshold, the action type of the target action is a blowing action; otherwise, it is an inhaling action.
Wherein the signal characteristics are related to at least one of amplitude, phase, frequency spectrum of the respective waveform. For example, the signal characteristics of the electro-ocular information include: at least one of a fingerprint, an average, and a difference; the fingerprint is composed of at least one of amplitude, phase and frequency spectrum of the electrooculogram information; the average value is an average value of at least one of amplitude, phase, and frequency spectrum of the electrooculogram information; the difference is a difference in at least one of amplitude, phase, spectrum of the electro-ocular information. Similarly, the blowing reference signal characteristics also include at least one of fingerprint, average value and difference; the fingerprint consists of at least one of amplitude, phase and frequency spectrum of the corresponding waveform of the blowing air; the average value is the average value of at least one of amplitude, phase and frequency spectrum of the corresponding waveform of the blowing air; the difference is the difference of at least one of the amplitude, the phase and the frequency spectrum of the corresponding waveform of the blowing air.
To improve identification accuracy, the blowing reference signal characteristic and the inhaling reference signal characteristic may each include a plurality of sub-signal characteristics. For example, the blowing reference signal characteristic may include a plurality of sub-signal characteristics corresponding to different blowing intensities, so that in this step the signal characteristic of the electro-oculogram information is compared for similarity with the plurality of sub-signal characteristics one by one, and if any calculation result is greater than the characteristic threshold, the action type is identified as a blowing action.
Fig. 7 is a waveform diagram of electro-oculogram information when a user performs a blowing action and an inhaling action with a greater force, respectively. The abscissa represents time, the ordinate represents waveform amplitude values, dashed boxes E' and F' correspond to the waveform when the user is blowing, and dashed boxes G' and H' correspond to the waveform when the user is inhaling. Comparing the waveform in dashed box E' (or F') with that in dashed box E (or F) in fig. 6, it can be seen that the maximum amplitude value of the waveform increases significantly as the user's blowing strength increases. Comparing the waveform in dashed box G' (or H') with that in dashed box G (or H) in fig. 6, it can be seen that the minimum amplitude value of the waveform decreases significantly as the user's inhaling strength increases. Thus, in one embodiment, the method may further comprise:
S150b: determining the intensity information of the target action according to at least one amplitude value of the electro-oculogram information.
The intensity information of the target action may be intensity values at a plurality of action moments, that is, intensity values corresponding to a plurality of sampling points of the electro-oculogram information, or may be the maximum intensity value of the target action. The intensity information may correspond, for example, to a pressure.
Specifically, in this step, the intensity information of the target action may be determined according to at least one amplitude value of the electro-oculogram information and a predetermined correspondence. The predetermined correspondence may be a correspondence, determined in advance through training, between an amplitude value of the electro-oculogram information and an intensity value of the target action.
In one embodiment, the step S150b further comprises:
S151b: in response to the action type of the target action being a blowing action, determining the maximum intensity value of the blowing action according to the maximum amplitude value of the electro-oculogram information;
S152b: in response to the action type of the target action being an inhaling action, determining the maximum intensity value of the inhaling action according to the minimum amplitude value of the electro-oculogram information.
The inventor has found that when the user's blowing motion reaches the maximum intensity value, the corresponding amplitude value of the electro-oculogram information also reaches the maximum value, corresponding to the peak of the wave within the dashed box E, F in fig. 6. Therefore, in the step S151b, the maximum intensity value of the blowing action may be determined according to the maximum amplitude value of the electro-oculogram information.
The inventors have also found that when the user's inspiratory effort reaches a maximum intensity value, the corresponding magnitude of the electro-ocular information decreases to a minimum, corresponding to a trough within dashed box G, H in fig. 6. Therefore, in the step S152b, the maximum intensity value of the inhalation maneuver may be determined according to the minimum amplitude value of the electro-oculogram information.
In one embodiment, the method may further comprise:
S160b: determining first input information corresponding to the intensity information.
The correspondence between the intensity information and the first input information may be predetermined. For example, the intensity information may be used as input information of an electronic game, with different intensity values corresponding to different input strengths in the game.
In some cases, the user may blow or inhale multiple times in succession to input different input information, such as one blow to input a select command and two blows to input an open command. Thus, in one embodiment, the method may further comprise:
S170b: determining the number of times of the target action according to the number of reference waveforms corresponding to the action type that are included in the electro-oculogram information.
In this step, if the action type is the blowing action, the number of blowing actions is determined according to the number of blowing reference waveforms included in the electro-oculogram information; if the action type is an inhaling action, the number of inhaling actions is determined according to the number of inhaling reference waveforms included in the electro-oculogram information.
In another embodiment, the method may further comprise:
S170b': determining the number of times of the target action according to the number of reference signal characteristics corresponding to the action type that are included in the electro-oculogram information.
In this step, if the action type is the blowing action, the number of blowing actions is determined according to the number of blowing reference signal characteristics included in the electro-oculogram information; if the action type is an inhaling action, the number of inhaling actions is determined according to the number of inhaling reference signal characteristics included in the electro-oculogram information.
In one embodiment, the method may further comprise:
S180b: determining second input information corresponding to the number of times of the target action.
The correspondence between the number of times of the target action and the second input information may be preset; for example, it may be preset and stored in a table as shown in Table 1, and in this step the second input information corresponding to the number of times of the target action may be determined by looking up the table.
c) In another embodiment, the somatosensory information is electromyographic information of an eye. Correspondingly, the step S140 further includes:
s140 c: and determining the action type of the target action according to the electromyographic information and electromyographic reference information.
The electromyographic information may be, for example, EMG (electromyography) information of an eye of the user, which may be acquired by, for example, an EMG sensor. Fig. 8 shows a schematic diagram of an acquisition position of the electromyographic information, wherein the positions of the two circular electrodes correspond to the acquisition position. Of course, it is understood by those skilled in the art that the electromyographic information may be acquired above the right eye, or above or below the left eye, in addition to being acquired below the right eye as shown in fig. 8.
Fig. 9 is a waveform diagram of electromyographic information when the user performs a suction motion and a blowing motion, respectively. Where the abscissa represents time, the ordinate represents waveform amplitude values, the dashed box E, F corresponds to the waveform when the user is blowing, and the dashed box G, H corresponds to the waveform when the user is inhaling. As can be seen, when the user blows air, the waveform amplitude value of the myoelectric information is obviously increased; when a user inhales, the waveform amplitude value of the electromyographic information is obviously reduced. Specifically, assuming that the waveform of the electromyographic information when the user does not perform any action is a standard electromyographic waveform (corresponding to the waveform outside the 4 dashed boxes in fig. 9), the amplitude values of a plurality of points on the waveform when the user blows air are significantly higher than the amplitude value of the peak of the standard electromyographic waveform, and the amplitude values of a plurality of points on the waveform when the user inhales air are significantly lower than the amplitude value of the trough of the standard electromyographic waveform. Based on the principle, the method and the device can realize the recognition of the blowing and sucking actions of the user.
In one embodiment, the step S140c may include:
s140 c': and determining the action type of the target action according to the electromyographic information and a reference amplitude value.
The reference amplitude value may be set according to a peak amplitude value or a trough amplitude value of the standard electromyogram.
In one embodiment, the step S140 c' further includes:
s141 c': and determining the action type of the target action as the blowing action in response to that an amplitude value of the myoelectric information is larger than a first amplitude value.
The first amplitude value may be set according to a peak amplitude value of the standard electromyogram waveform, for example, an average value (denoted as a first average value) of peak amplitude values of a plurality of standard electromyograms may be calculated as the first amplitude value. To reduce false identifications, the first amplitude value may be further increased.
In this step, in order to improve the real-time processing performance of the method and simplify the calculation, the electromyographic information may be scanned segment by segment in a predetermined time window, the maximum amplitude value of the electromyographic information in each time window is compared with the first amplitude value, and if the maximum amplitude value is greater than the first amplitude value, the action type of the target action is determined to be an air blowing action.
In addition, as understood by those skilled in the art, if any amplitude value in the electromyographic information is not greater than the first amplitude value, the action type of the target action may be determined to be an inspiratory action.
In another embodiment, the step S140 c' further includes:
s142 c': and determining the action type of the target action as an inspiration action in response to an amplitude value of the electromyographic information being smaller than a second amplitude value.
The second amplitude value may be set according to a trough amplitude value of the standard electromyogram, for example, an average value (denoted as a second average value) of trough amplitude values of a plurality of standard electromyograms may be calculated as the second amplitude value. To reduce false identifications, the second amplitude value may be further reduced.
In this step, in order to improve the real-time processing performance of the method and simplify the calculation, the electromyographic information may be scanned segment by segment in a predetermined time window, the minimum amplitude value of the electromyographic information in each time window is compared with the second amplitude value, and if the minimum amplitude value is smaller than the second amplitude value, the action type of the target action is determined to be an inhalation action.
In addition, as understood by those skilled in the art, if any amplitude value in the myoelectric information is not less than the second amplitude value, the action type of the target action may be determined to be a blowing action.
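As an illustration of the threshold comparison described in the steps S141c' and S142c', the following sketch scans the electromyographic information window by window and compares the extreme amplitude of each window with the first and second amplitude values. It is only a minimal sketch: the window length, the threshold values and all function names are assumptions chosen for illustration and are not specified by this application.

```python
import numpy as np

def classify_by_amplitude(emg, first_amplitude, second_amplitude, window=250):
    """Scan the electromyographic samples segment by segment in a fixed time
    window and compare the extreme amplitude of each segment with the
    reference amplitude values."""
    for start in range(0, len(emg), window):
        segment = emg[start:start + window]
        if segment.max() > first_amplitude:    # well above the standard waveform peak
            return "blow"
        if segment.min() < second_amplitude:   # well below the standard waveform trough
            return "inhale"
    return "none"                              # no target action detected

# Synthetic example: a burst whose amplitude exceeds the first amplitude value.
signal = np.concatenate([np.random.normal(0, 1, 500),
                         np.random.normal(8, 1, 250),
                         np.random.normal(0, 1, 500)])
print(classify_by_amplitude(signal, first_amplitude=5.0, second_amplitude=-5.0))
```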
In another embodiment, the step S140c may include:
s140c ": and determining the action type of the target action according to the electromyographic information and a reference waveform.
The reference waveform may be a waveform corresponding to an insufflation action (referred to as an insufflation reference waveform) or a waveform corresponding to an inspiration action (referred to as an inspiration reference waveform). In this step, a similarity calculation may be performed between the waveform of the electromyographic information and the reference waveform according to an existing waveform similarity calculation method, and a calculation result may be compared with a waveform threshold value to determine the action type of the target action. For example, assuming that the reference waveform is an insufflation reference waveform, if the similarity between the waveform of the myoelectric information and the insufflation reference waveform is greater than the waveform threshold, the action type of the target action is an insufflation action, otherwise, the action type is an inhalation action.
To improve recognition accuracy, the insufflation reference waveform and the inspiration reference waveform may each include a plurality of sub-waveforms. For example, the insufflation reference waveform may include a plurality of sub-waveforms corresponding to different insufflation intensities, so that in this step, the waveform of the myoelectric information should be subjected to similarity calculation with the plurality of sub-waveforms one by one, and if any one calculation result is greater than the waveform threshold, the action type is identified as an insufflation action.
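As one possible concretization of the waveform comparison in step S140c'', the sketch below uses the Pearson correlation coefficient as the "existing waveform similarity calculation method" and checks the acquired waveform against every sub-waveform of the blowing reference waveform. The similarity measure, the threshold value of 0.6 and all names are illustrative assumptions.

```python
import numpy as np

def waveform_similarity(wave, reference):
    """Pearson correlation between the acquired waveform and a reference
    waveform, truncated to a common length (a simple stand-in for any
    existing waveform similarity calculation method)."""
    n = min(len(wave), len(reference))
    return float(np.corrcoef(wave[:n], reference[:n])[0, 1])

def classify_by_waveform(emg, blow_sub_waveforms, waveform_threshold=0.6):
    # One sub-waveform (e.g. one blowing intensity) exceeding the threshold is
    # enough to identify a blowing action; otherwise the action type is taken
    # to be an inhalation action.
    for reference in blow_sub_waveforms:
        if waveform_similarity(emg, reference) > waveform_threshold:
            return "blow"
    return "inhale"

# Example: a noisy copy of a recorded blowing waveform matches the reference.
t = np.linspace(0, 1, 200)
blow_reference = np.exp(-((t - 0.5) ** 2) / 0.01)          # bell-shaped burst
measured = blow_reference + 0.1 * np.random.randn(200)
print(classify_by_waveform(measured, [blow_reference]))     # expected: "blow"
```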
In another embodiment, the step S140c may include:
s140 c': and determining the action type of the target action according to the electromyographic information and a reference signal characteristic.
The reference signal characteristic may be a signal characteristic corresponding to an insufflation action (referred to as an insufflation reference signal characteristic) or a signal characteristic corresponding to an inspiration action (referred to as an inspiration reference signal characteristic). In this step, a similarity calculation may be performed between the signal feature of the electromyographic information and the reference signal feature, and a calculation result may be compared with a feature threshold value to determine the action type of the target action. For example, if the reference signal feature is an insufflation reference signal feature, if the similarity between the signal feature of the myoelectric information and the insufflation reference signal feature is greater than the feature threshold, the action type of the target action is an insufflation action, otherwise, the action type is an inhalation action.
Wherein the signal characteristics are related to at least one of amplitude, phase, frequency spectrum of the respective waveform. For example, the signal characteristics of the electromyographic information include: at least one of a fingerprint, an average, and a difference; the fingerprint is composed of at least one item of amplitude, phase and frequency spectrum of the electromyographic information; the average value is an average value of at least one of amplitude, phase and frequency spectrum of the electromyographic information; the difference is a difference of at least one of amplitude, phase and frequency spectrum of the electromyographic information. Similarly, the blowing reference signal characteristics also include at least one of fingerprint, average value and difference; the fingerprint consists of at least one of amplitude, phase and frequency spectrum of the corresponding waveform of the blowing air; the average value is the average value of at least one of amplitude, phase and frequency spectrum of the corresponding waveform of the blowing air; the difference is the difference of at least one of the amplitude, the phase and the frequency spectrum of the corresponding waveform of the blowing air.
To improve identification accuracy, the insufflation reference signal feature and the inspiration reference signal feature may each include a plurality of sub-signal features. For example, the insufflation reference signal feature may include a plurality of sub-signal features corresponding to different insufflation intensities, so that in this step, the signal features of the myoelectric information should be subjected to similarity calculation with the plurality of sub-signal features one by one, and if any one calculation result is greater than the feature threshold, the action type is identified as an insufflation action.
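The signal-feature comparison of this embodiment can be sketched as follows: a small feature vector built from the amplitude, phase and spectrum of the electromyographic waveform is matched against the blowing reference signal feature with a cosine similarity. The particular features, the cosine measure and the threshold of 0.9 are assumptions chosen only to make the idea concrete.

```python
import numpy as np

def signal_features(wave):
    """A toy feature vector touching amplitude, phase and spectrum."""
    spectrum = np.fft.rfft(wave)
    return np.array([
        wave.mean(),                       # average amplitude
        wave.max() - wave.min(),           # amplitude difference (peak to trough)
        np.abs(spectrum)[:8].mean(),       # coarse spectral fingerprint
        np.angle(spectrum)[:8].mean(),     # coarse phase fingerprint
    ])

def feature_similarity(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-9))

def classify_by_features(emg, blow_reference_features, feature_threshold=0.9):
    features = signal_features(emg)
    # Any sub-signal feature above the threshold identifies a blowing action.
    if any(feature_similarity(features, ref) > feature_threshold
           for ref in blow_reference_features):
        return "blow"
    return "inhale"
```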
Fig. 10 is a waveform diagram of eye electromyography information when a user performs an air blowing action and an air suction action with a large force, respectively. The abscissa represents time, the ordinate represents a waveform amplitude value, the dotted line boxes J 'and K' correspond to waveforms when the user blows air, and the dotted line boxes L 'and M' correspond to waveforms when the user inhales air. Comparing the waveforms in the dashed box J ' (or K ') and the dashed box J (or K) in fig. 9, it can be seen that the maximum amplitude value of the waveform is significantly increased as the user's blowing strength is increased. Comparing the waveform in the dashed box L ' (or M ') and the dashed box L (or M) in fig. 9, it can be seen that the minimum amplitude value of the waveform decreases significantly as the user's inspiratory effort increases. Thus, in one embodiment, the method may further comprise:
s150 c: and determining the intensity information of the target action according to at least one amplitude value of the electromyographic information.
The intensity information of the target action may be intensity values at a plurality of action moments, that is, intensity values corresponding to a plurality of sampling points of the electromyographic information, or may be a maximum intensity value of the target action. The intensity information may correspond to, for example, a force or a pressure.
Specifically, in this step, the intensity information of the target action may be determined according to at least one amplitude value of the electromyographic information and a predetermined correspondence. The predetermined correspondence may be a correspondence, determined in advance through training, between amplitude values of the electromyographic information and intensity values of the target action.
In one embodiment, the step S150c further comprises:
s151 c: responding to the fact that the action type of the target action is an air blowing action, and determining the maximum intensity value of the air blowing action according to the maximum amplitude value of the myoelectric information;
s152 c: and in response to the action type of the target action being an inspiratory action, determining a maximum intensity value of the inspiratory action according to the minimum amplitude value of the electromyographic information.
The inventor has found that when the user's blowing action reaches its maximum intensity value, the corresponding amplitude value of the electromyographic information also reaches its maximum, corresponding to the peaks within the dashed boxes E, F in fig. 9. Therefore, in the step S151c, the maximum intensity value of the blowing action may be determined according to the maximum amplitude value of the electromyographic information.
The inventor has also found that when the user's inhalation action reaches its maximum intensity value, the corresponding amplitude value of the electromyographic information decreases to its minimum, corresponding to the troughs within the dashed boxes G, H in fig. 9. Therefore, in the step S152c, the maximum intensity value of the inhalation action may be determined according to the minimum amplitude value of the electromyographic information.
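A minimal sketch of the intensity determination in steps S150c to S152c is given below: the maximum (or minimum) amplitude value of the electromyographic information is mapped to a maximum intensity value through a correspondence assumed to have been obtained by prior training. The calibration pairs and the use of linear interpolation are illustrative assumptions, not values given by this application.

```python
import numpy as np

# Hypothetical (amplitude value, intensity value) pairs determined by training.
BLOW_CALIBRATION = [(2.0, 0.1), (5.0, 0.5), (9.0, 1.0)]
INHALE_CALIBRATION = [(-9.0, 1.0), (-5.0, 0.5), (-2.0, 0.1)]

def max_intensity(emg, action_type):
    """Map the extreme amplitude of the waveform to a maximum intensity value."""
    if action_type == "blow":
        amplitudes, intensities = zip(*BLOW_CALIBRATION)
        return float(np.interp(emg.max(), amplitudes, intensities))
    amplitudes, intensities = zip(*INHALE_CALIBRATION)
    return float(np.interp(emg.min(), amplitudes, intensities))

# A stronger blow (larger maximum amplitude) yields a larger intensity value.
print(max_intensity(np.array([0.0, 3.0, 7.0, 2.0]), "blow"))   # ~0.75
```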
In one embodiment, the method may further comprise:
s160 c: and determining first input information corresponding to the strength information.
The corresponding relationship between the intensity information and the first input information may be predetermined, for example, the intensity information may be used as input information of an electronic game, and different intensity values correspond to different intensity inputs of the game.
In some cases, the user may blow or inhale multiple times in succession to input different input information, such as one blow to input a select command and two blows to input an open command. Thus, in one embodiment, the method may further comprise:
s170 c: and determining the frequency of the target action according to the number of the reference waveforms corresponding to the action types in the electromyographic information.
In the step, if the action type is the air blowing action, determining the times of the air blowing action according to the number of the air blowing reference waveforms in the myoelectric information; and if the action type is an inspiration action, determining the frequency of the inspiration action according to the number of inspiration reference waveforms included in the electromyographic information.
In another embodiment, the method may further comprise:
s170 c': and determining the frequency of the target action according to the number of the reference signal characteristics corresponding to the action type in the electromyographic information.
In the step, if the action type is the air blowing action, determining the times of the air blowing action according to the number of the myoelectric information including the characteristics of an air blowing reference signal; and if the action type is an inspiration action, determining the frequency of the inspiration action according to the number of the electromyographic information including inspiration reference signal characteristics.
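The counting described in steps S170c and S170c' can be sketched as follows: the electromyographic record is cut into non-overlapping segments, each segment is compared with the reference waveform for the identified action type, and the matching segments are counted. The segmentation scheme, the correlation measure and the threshold are assumptions; a practical implementation could equally count matches of the reference signal features.

```python
import numpy as np

def count_actions(emg, reference, waveform_threshold=0.6):
    """Count how many non-overlapping segments of the record match the
    reference waveform corresponding to the identified action type."""
    window = len(reference)
    ref = (reference - reference.mean()) / (reference.std() + 1e-9)
    count = 0
    for start in range(0, len(emg) - window + 1, window):
        segment = emg[start:start + window]
        seg = (segment - segment.mean()) / (segment.std() + 1e-9)
        if float(np.dot(seg, ref) / window) > waveform_threshold:
            count += 1
    return count

# Example: two blowing bursts separated by a quiet period yield a count of 2.
t = np.linspace(0, 1, 100)
blow_reference = np.exp(-((t - 0.5) ** 2) / 0.01)
record = np.concatenate([blow_reference, np.zeros(100), blow_reference])
print(count_actions(record, blow_reference))
```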
In one embodiment, the method may further comprise:
s180 c: and determining second input information corresponding to the times of the target action.
The corresponding relationship between the number of the target actions and the second input information may be preset, for example, the corresponding relationship may be preset and stored in a table shown in table 1, and in this step, the second input information corresponding to the number of the target actions may be determined by looking up the table.
In addition, in one embodiment, the method may further include the steps of:
s190: and determining third input information corresponding to the action type of the target action.
That is, the two action types of the target action may correspond to different input information. For example, if the action type is a blowing action, a "select current application" command is input; if the action type is an inhalation action, a "switch to the next application" command is input.
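One possible realization of the correspondences used in steps S180c and S190 is a pair of pre-stored look-up tables in the spirit of table 1 (whose actual contents are not reproduced here); the command names below are purely illustrative assumptions.

```python
# Hypothetical correspondence tables; the real table 1 is defined elsewhere.
SECOND_INPUT_TABLE = {1: "select", 2: "open", 3: "back"}
THIRD_INPUT_TABLE = {"blow": "select current application",
                     "inhale": "switch to next application"}

def look_up_inputs(action_type, number_of_times):
    second_input = SECOND_INPUT_TABLE.get(number_of_times)
    third_input = THIRD_INPUT_TABLE.get(action_type)
    return second_input, third_input

print(look_up_inputs("blow", 2))   # ('open', 'select current application')
```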
In summary, the method can identify whether the target action is a blowing action or an inhalation action according to the corresponding somatosensory information, can further identify the intensity information and the number of times of the target action, and can determine the corresponding input information from the identification results, so it facilitates human-machine interaction that does not rely on the hands and helps improve the interaction capability of the device.
Fig. 11 is a block diagram of a device for determining blowing and suction air according to an embodiment of the present application. The device may be disposed in a wearable device such as a smart hat or smart glasses, or may itself be a standalone wearable device worn by the user. As shown in fig. 11, the apparatus 1100 may include:
the obtaining module 1110 is configured to obtain somatosensory information of a user in response to the user performing a target action, where the target action is a blowing action or a suction action;
an action type determining module 1120, configured to determine an action type of the target action according to the somatosensory information and reference information.
The device responds to a target action executed by a user by acquiring the somatosensory information of the user, and then determines the action type of the target action according to the somatosensory information and reference information. A way of recognizing the user's blowing and inhaling actions according to somatosensory information is thus provided, which helps free the hands during human-machine interaction and improves the interaction capability of the electronic device.
The functions of the obtaining module 1110 and the action type determining module 1120 will be described in detail below with reference to specific embodiments.
The obtaining module 1110 is configured to obtain somatosensory information of the user in response to a target action performed by the user, where the target action is a blowing action or a suction action.
The somatosensory information can be electroencephalogram information, electro-oculogram information or eye electromyographic information, and can be acquired through corresponding sensors. Each kind of somatosensory information is described in detail below.
The action type determining module 1120 is configured to determine an action type of the target action according to the somatosensory information and a reference information.
Wherein the motion types of the target motion are divided into two types, namely the blowing motion and the inhaling motion. That is, the motion type determining module 1120 is to accurately identify whether the target motion is a blowing motion or an inhaling motion according to the somatosensory information and the reference information.
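Purely as a structural sketch (and not an implementation prescribed by this application), the apparatus 1100 can be pictured as two cooperating objects: an obtaining module that reads the somatosensory signal from a sensor, and an action type determining module that applies one of the comparisons described below to the acquired information. The sensor interface, the classifier callback and all names are hypothetical.

```python
class ObtainingModule:                      # cf. the obtaining module 1110
    def __init__(self, sensor):
        self.sensor = sensor

    def obtain(self):
        # Read the somatosensory information (EEG, EOG or EMG samples).
        return self.sensor.read()

class ActionTypeDeterminingModule:          # cf. the determining module 1120
    def __init__(self, reference_information, classify):
        self.reference_information = reference_information
        self.classify = classify            # e.g. a threshold or waveform comparison

    def determine(self, somatosensory_information):
        return self.classify(somatosensory_information, self.reference_information)

class BlowSuctionDeterminingDevice:         # cf. the apparatus 1100
    def __init__(self, sensor, reference_information, classify):
        self.obtaining_module = ObtainingModule(sensor)
        self.action_type_module = ActionTypeDeterminingModule(
            reference_information, classify)

    def run_once(self):
        info = self.obtaining_module.obtain()
        return self.action_type_module.determine(info)

# Toy usage with a stand-in sensor and a trivial amplitude-threshold classifier.
class FakeSensor:
    def read(self):
        return [0.0] * 100

device = BlowSuctionDeterminingDevice(
    FakeSensor(),
    reference_information={"first_amplitude_value": 5.0},
    classify=lambda info, ref: "blow" if max(info) > ref["first_amplitude_value"]
                               else "inhale",
)
print(device.run_once())
```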
a) In one embodiment, the somatosensory information is electroencephalogram information. Correspondingly, the action type determining module 1120 is configured to determine the action type of the target action according to the electroencephalogram information and an electroencephalogram reference information.
The electroencephalogram information may be, for example, EEG information of the user, and may be acquired by, for example, an EEG sensor. In addition, the inventor has found that when the electroencephalogram information is acquired according to the 10/20 system method shown in fig. 2, the electroencephalogram information of different areas is affected to different degrees by the user's blowing and inhaling actions, and the electroencephalogram information of the C3 and C4 areas is more obviously affected. Accordingly, in one embodiment, the obtaining module 1110 is configured to obtain the electroencephalogram information in the C3 and/or C4 region of the brain of the user.
In one embodiment, the motion type determining module 1120 is configured to determine the motion type of the target motion according to the electroencephalogram information and a reference amplitude value.
Wherein the reference amplitude value can be set according to a peak amplitude value or a trough amplitude value of the standard brain waveform.
In one embodiment, the motion type determining module 1120 is configured to determine that the motion type of the target motion is an air blowing motion in response to an amplitude value of the electroencephalogram information being greater than a first amplitude value.
The first amplitude value may be set according to a peak amplitude value of the standard brain waveform, for example, an average value (denoted as a first average value) of peak amplitude values of a plurality of standard brain waveforms may be calculated as the first amplitude value. To reduce false identifications, the first amplitude value may be further increased.
In order to improve the real-time processing performance of the method and simplify the calculation, the motion type determination module 1120 may scan the electroencephalogram information segment by segment in a predetermined time window, compare the maximum amplitude value of the electroencephalogram information in each time window with the first amplitude value, and if the maximum amplitude value is greater than the first amplitude value, determine that the motion type of the target motion is an air blowing motion.
In addition, as understood by those skilled in the art, if any amplitude value in the electroencephalogram information is not greater than the first amplitude value, the action type of the target action may be determined to be an inspiratory action.
In another embodiment, the motion type determining module 1120 is configured to determine that the motion type of the target motion is an inhalation motion in response to an amplitude value of the electroencephalogram information being smaller than a second amplitude value.
The second amplitude value may be set according to a trough amplitude value of the standard brain waveform, for example, an average value (denoted as a second average value) of trough amplitude values of a plurality of standard brain waveforms may be calculated as the second amplitude value. To reduce false identifications, the second amplitude value may be further reduced.
In order to improve the real-time processing performance of the method and simplify the calculation, the motion type determination module 1120 may scan the electroencephalogram information segment by segment in a predetermined time window, compare the minimum amplitude value of the electroencephalogram information in each time window with the second amplitude value, and if the minimum amplitude value is smaller than the second amplitude value, determine that the motion type of the target motion is an inhalation motion.
In addition, as understood by those skilled in the art, if any amplitude value in the electroencephalogram information is not smaller than the second amplitude value, the action type of the target action may be determined to be a blowing action.
In another embodiment, the motion type determining module 1120 is configured to determine the motion type of the target motion according to the electroencephalogram information and a reference waveform.
The reference waveform may be a waveform corresponding to an insufflation action (referred to as an insufflation reference waveform) or a waveform corresponding to an inspiration action (referred to as an inspiration reference waveform). The motion type determining module 1120 may perform similarity calculation on the waveform of the electroencephalogram information and the reference waveform according to an existing waveform similarity calculation method, and compare the calculation result with a waveform threshold to determine the motion type of the target motion. For example, assuming that the reference waveform is an insufflation reference waveform, if the similarity between the waveform of the electroencephalogram information and the insufflation reference waveform is greater than the waveform threshold, the action type of the target action is an insufflation action, otherwise, the action type is an inspiration action.
To improve recognition accuracy, the insufflation reference waveform and the inspiration reference waveform may each include a plurality of sub-waveforms. For example, the insufflation reference waveform may include a plurality of sub-waveforms corresponding to different insufflation intensities, so that the action type determination module 1120 may perform similarity calculation on the waveform of the electroencephalogram information and the plurality of sub-waveforms one by one, and if any calculation result is greater than the waveform threshold, identify the action type as an insufflation action.
In another embodiment, the motion type determining module 1120 is configured to determine the motion type of the target motion according to the electroencephalogram information and a reference signal feature.
The reference signal characteristic may be a signal characteristic corresponding to an insufflation action (referred to as an insufflation reference signal characteristic) or a signal characteristic corresponding to an inspiration action (referred to as an inspiration reference signal characteristic). The motion type determination module 1120 may perform similarity calculation on the signal characteristics of the electroencephalogram information and the reference signal characteristics, and compare the calculation result with a characteristic threshold value to determine the motion type of the target motion. For example, assuming that the reference signal feature is an insufflation reference signal feature, if the similarity between the signal feature of the electroencephalogram information and the insufflation reference signal feature is greater than the feature threshold, the action type of the target action is an insufflation action, otherwise, the action type is an inspiration action.
Wherein the signal characteristics are related to at least one of amplitude, phase, frequency spectrum of the respective waveform. For example, the signal characteristics of the electroencephalogram information include: at least one of a fingerprint, an average, and a difference; the fingerprint consists of at least one item of amplitude, phase and frequency spectrum of the electroencephalogram information; the average value is the average value of at least one item of amplitude, phase and frequency spectrum of the electroencephalogram information; the difference is the difference of at least one item of amplitude, phase and frequency spectrum of the electroencephalogram information. Similarly, the blowing reference signal characteristics also include at least one of fingerprint, average value and difference; the fingerprint consists of at least one of amplitude, phase and frequency spectrum of the corresponding waveform of the blowing air; the average value is the average value of at least one of amplitude, phase and frequency spectrum of the corresponding waveform of the blowing air; the difference is the difference of at least one of the amplitude, the phase and the frequency spectrum of the corresponding waveform of the blowing air.
To improve identification accuracy, the insufflation reference signal feature and the inspiration reference signal feature may each include a plurality of sub-signal features. For example, the insufflation reference signal feature may include a plurality of sub-signal features corresponding to different insufflation intensities, so that the action type determination module 1120 may perform similarity calculation on the signal features of the electroencephalogram information and the plurality of sub-signal features one by one, and if any calculation result is greater than the feature threshold, identify the action type as an insufflation action.
In one embodiment, referring to fig. 12, the apparatus 1100 further comprises:
the intensity information determining module 1130a is configured to determine intensity information of the target motion according to at least one amplitude value of the electroencephalogram information.
The intensity information of the target action may be intensity values at a plurality of action moments, that is, intensity values corresponding to a plurality of sampling points of the electroencephalogram information, or may be a maximum intensity value of the target action. The intensity information may correspond to, for example, a force or a pressure.
Specifically, the intensity information determining module 1130a may determine the intensity information of the target motion according to at least one amplitude value of the electroencephalogram information and a predetermined corresponding relationship. The predetermined correspondence may be a correspondence between an amplitude value of the electroencephalogram information determined in advance through training and an intensity value of the target motion.
In one embodiment, referring to fig. 13, the strength information determination module 1130a includes:
a first determining unit 1131a, configured to determine, in response to that the motion type of the target motion is an air blowing motion, a maximum strength value of the air blowing motion according to the maximum amplitude value of the electroencephalogram information;
a second determining unit 1132a, configured to, in response to that the motion type of the target motion is an inhalation motion, determine a maximum intensity value of the inhalation motion according to the minimum amplitude value of the electroencephalogram information.
In one embodiment, referring to fig. 14, the apparatus 1100 further comprises:
a first input information determining module 1140a, configured to determine a first input information corresponding to the intensity information.
The corresponding relationship between the intensity information and the first input information may be preset; for example, the intensity information may be used as input information of an electronic game, with different intensity values corresponding to different intensity inputs of the game.
In one embodiment, referring to fig. 15, the apparatus 1100 further comprises:
a number determining module 1150a, configured to determine the number of times of the target action according to the number of reference waveforms included in the electroencephalogram information and corresponding to the action type.
If the action type is a blowing action, the number of blowing actions is determined according to the number of blowing reference waveforms included in the electroencephalogram information; if the action type is an inhalation action, the number of inhalation actions is determined according to the number of inhalation reference waveforms included in the electroencephalogram information.
In another embodiment, referring to fig. 16, the apparatus 1100 further comprises:
a number determining module 1150 a' configured to determine the number of times of the target action according to the number of reference signal features included in the electroencephalogram information and corresponding to the action type.
If the action type is a blowing action, the number of blowing actions is determined according to the number of blowing reference signal features included in the electroencephalogram information; if the action type is an inhalation action, the number of inhalation actions is determined according to the number of inhalation reference signal features included in the electroencephalogram information.
In one embodiment, referring to fig. 17, the apparatus 1100 further comprises:
a second input information determining module 1160a, configured to determine second input information corresponding to the number of times of the target action.
The corresponding relationship between the number of the target actions and the second input information may be preset, for example, the corresponding relationship may be preset and stored in a table shown in table 1, and the second input information determining module 1160a may determine the second input information corresponding to the number of the target actions by looking up a table.
b) In one embodiment, the somatosensory information is electrooculogram information. Correspondingly, the action type determining module 1120 is configured to determine the action type of the target action according to the electro-ocular information and the electro-ocular reference information.
In one embodiment, the action type determining module 1120 is configured to determine the action type of the target action according to the electro-oculogram information and a reference amplitude value.
The reference amplitude value can be set according to a peak amplitude value or a trough amplitude value of the standard electrooculogram waveform.
In one embodiment, the motion type determining module 1120 is configured to determine that the motion type of the target motion is a blowing motion in response to an amplitude value of the electro-oculogram information being greater than a first amplitude value.
The first amplitude value may be set according to a peak amplitude value of the standard electro-ocular waveform, for example, an average value (denoted as a first average value) of peak amplitude values of a plurality of standard electro-ocular waveforms may be calculated as the first amplitude value. To reduce false identifications, the first amplitude value may be further increased.
In order to improve the real-time processing performance of the method and simplify the calculation, the motion type determination module 1120 may scan the electro-ocular information segment by segment in a predetermined time window, compare the maximum amplitude value of the electro-ocular information in each time window with the first amplitude value, and if the maximum amplitude value is greater than the first amplitude value, determine that the motion type of the target motion is the air blowing motion.
In addition, as understood by those skilled in the art, if any amplitude value in the electro-oculogram information is not larger than the first amplitude value, the action type of the target action may be determined as an inhalation action.
In another embodiment, the motion type determining module 1120 is configured to determine the motion type of the target motion as an inhalation motion in response to an amplitude value of the electro-oculogram information being smaller than a second amplitude value.
The second amplitude value may be set according to the valley amplitude value of the standard electro-ocular waveform, for example, an average value (denoted as a second average value) of the valley amplitude values of a plurality of standard electro-ocular waveforms may be calculated as the second amplitude value. To reduce false identifications, the second amplitude value may be further reduced.
In order to improve the real-time processing performance of the method and simplify the calculation, the motion type determination module 1120 may scan the electro-ocular information segment by segment in a predetermined time window, compare the minimum amplitude value of the electro-ocular information in each time window with the second amplitude value, and determine the motion type of the target motion as an inhalation motion if the minimum amplitude value is smaller than the second amplitude value.
In addition, as understood by those skilled in the art, if any amplitude value in the electro-ocular information is not smaller than the second amplitude value, the action type of the target action may be determined to be a blowing action.
In another embodiment, the action type determining module 1120 is configured to determine the action type of the target action according to the electro-oculogram information and a reference waveform.
The reference waveform may be a waveform corresponding to an insufflation action (referred to as an insufflation reference waveform) or a waveform corresponding to an inspiration action (referred to as an inspiration reference waveform). The motion type determining module 1120 may perform similarity calculation on the waveform of the electro-ocular information and the reference waveform according to an existing waveform similarity calculation method, and compare the calculation result with a waveform threshold to determine the motion type of the target motion. For example, assuming that the reference waveform is an insufflation reference waveform, if the similarity between the waveform of the electrooculogram information and the insufflation reference waveform is greater than the waveform threshold, the action type of the target action is an insufflation action, otherwise, the action type is an inspiration action.
To improve recognition accuracy, the insufflation reference waveform and the inspiration reference waveform may each include a plurality of sub-waveforms. For example, the insufflation reference waveform may include a plurality of sub-waveforms corresponding to different insufflation intensities, so that the action type determination module 1120 may perform similarity calculation on the waveform of the electro-ocular information and the plurality of sub-waveforms one by one, and if any calculation result is greater than the waveform threshold, identify the action type as an insufflation action.
In another embodiment, the action type determining module 1120 is configured to determine the action type of the target action according to the electro-ocular information and a reference signal characteristic.
The reference signal characteristic may be a signal characteristic corresponding to an insufflation action (referred to as an insufflation reference signal characteristic) or a signal characteristic corresponding to an inspiration action (referred to as an inspiration reference signal characteristic). The action type determining module 1120 may perform similarity calculation on the signal feature of the electro-ocular information and the reference signal feature, and compare the calculation result with a feature threshold to determine the action type of the target action. For example, assuming that the reference signal feature is an insufflation reference signal feature, if the similarity between the signal feature of the electrooculogram information and the insufflation reference signal feature is greater than the feature threshold, the action type of the target action is an insufflation action, otherwise, the action type is an inhalation action.
Wherein the signal characteristics are related to at least one of amplitude, phase, frequency spectrum of the respective waveform. For example, the signal characteristics of the electro-ocular information include: at least one of a fingerprint, an average, and a difference; the fingerprint is composed of at least one of amplitude, phase and frequency spectrum of the electrooculogram information; the average value is an average value of at least one of amplitude, phase, and frequency spectrum of the electrooculogram information; the difference is a difference in at least one of amplitude, phase, spectrum of the electro-ocular information. Similarly, the blowing reference signal characteristics also include at least one of fingerprint, average value and difference; the fingerprint consists of at least one of amplitude, phase and frequency spectrum of the corresponding waveform of the blowing air; the average value is the average value of at least one of amplitude, phase and frequency spectrum of the corresponding waveform of the blowing air; the difference is the difference of at least one of the amplitude, the phase and the frequency spectrum of the corresponding waveform of the blowing air.
To improve identification accuracy, the insufflation reference signal feature and the inspiration reference signal feature may each include a plurality of sub-signal features. For example, the insufflation reference signal feature may include a plurality of sub-signal features corresponding to different insufflation intensities, so that the action type determination module 1120 may perform similarity calculation on the signal features of the electrooculogram information and the plurality of sub-signal features one by one, and if any calculation result is greater than the feature threshold, identify the action type as an insufflation action.
In one embodiment, referring to fig. 18, the device 1100 further comprises:
the intensity information determination module 1130b is configured to determine intensity information of the target action according to at least one amplitude value of the electro-ocular information.
The intensity information of the target action may be intensity values at a plurality of action moments, that is, intensity values corresponding to a plurality of sampling points of the electro-oculogram information, or may be a maximum intensity value of the target action. The intensity information may correspond to, for example, a force or a pressure.
Specifically, the intensity information determining module 1130b may determine the intensity information of the target action according to at least one amplitude value of the electro-ocular information and a predetermined corresponding relationship. The predetermined correspondence may be a correspondence between an amplitude value of the electro-oculogram information determined in advance by training and an intensity value of the target motion.
In one embodiment, referring to fig. 19, the strength information determination module 1130b includes:
a first determining unit 1131b, configured to determine, in response to that the motion type of the target motion is a blowing motion, a maximum intensity value of the blowing motion according to the maximum amplitude value of the electro-oculogram information;
a second determining unit 1132b, configured to, in response to that the motion type of the target motion is an inhalation motion, determine a maximum intensity value of the inhalation motion according to the minimum amplitude value of the electro-oculogram information.
In one embodiment, referring to fig. 20, the device 1100 further comprises:
a first input information determining module 1140b, configured to determine a first input information corresponding to the intensity information.
The corresponding relationship between the intensity information and the first input information may be preset; for example, the intensity information may be used as input information of an electronic game, with different intensity values corresponding to different intensity inputs of the game.
In one embodiment, referring to fig. 21, the apparatus 1100 further comprises:
a number determining module 1150b, configured to determine the number of times of the target action according to the number of reference waveforms included in the electro-ocular information and corresponding to the action type.
If the action type is a blowing action, the number of blowing actions is determined according to the number of blowing reference waveforms included in the electro-oculogram information; if the action type is an inhalation action, the number of inhalation actions is determined according to the number of inhalation reference waveforms included in the electro-oculogram information.
In another embodiment, referring to fig. 22, the apparatus 1100 further comprises:
a number determining module 1150 b' configured to determine the number of times of the target action according to the number of reference signal features corresponding to the action type included in the electro-ocular information.
If the action type is a blowing action, the number of blowing actions is determined according to the number of blowing reference signal features included in the electro-oculogram information; if the action type is an inhalation action, the number of inhalation actions is determined according to the number of inhalation reference signal features included in the electro-oculogram information.
In one embodiment, referring to fig. 23, the device 1100 further comprises:
a second input information determining module 1160b, configured to determine second input information corresponding to the number of times of the target action.
The corresponding relationship between the number of the target actions and the second input information may be preset, for example, the corresponding relationship may be preset and stored in a table shown in table 1, and the second input information determining module 1160b may determine the second input information corresponding to the number of the target actions by looking up the table.
c) In another embodiment, the somatosensory information is electromyographic information of an eye. Correspondingly, the action type determining module 1120 is configured to determine the action type of the target action according to the electromyographic information and electromyographic reference information.
In an embodiment, the action type determining module 1120 is configured to determine the action type of the target action according to the electromyographic information and a reference amplitude value.
The reference amplitude value can be set according to a peak amplitude value or a trough amplitude value of the standard electromyographic waveform.
In one embodiment, the motion type determining module 1120 is configured to determine that the motion type of the target motion is an air blowing motion in response to an amplitude value of the electromyographic information being greater than a first amplitude value.
The first amplitude value may be set according to a peak amplitude value of the standard electromyogram waveform, for example, an average value (denoted as a first average value) of peak amplitude values of a plurality of standard electromyograms may be calculated as the first amplitude value. To reduce false identifications, the first amplitude value may be further increased.
In order to improve the real-time processing performance of the device and simplify the calculation, the motion type determination module 1120 may scan the electromyographic information segment by segment in a predetermined time window, compare the maximum amplitude value of the electromyographic information in each time window with the first amplitude value, and determine the motion type of the target motion as the air blowing motion if the maximum amplitude value is greater than the first amplitude value.
In addition, as understood by those skilled in the art, if any amplitude value in the electromyographic information is not greater than the first amplitude value, the action type of the target action may be determined to be an inspiratory action.
In another embodiment, the motion type determining module 1120 is configured to determine that the motion type of the target motion is an inhalation motion in response to an amplitude value of the electromyographic information being smaller than a second amplitude value.
The second amplitude value may be set according to a trough amplitude value of the standard electromyogram, for example, an average value (denoted as a second average value) of trough amplitude values of a plurality of standard electromyograms may be calculated as the second amplitude value. To reduce false identifications, the second amplitude value may be further reduced.
In order to improve the real-time processing performance of the method and simplify the calculation, the motion type determination module 1120 may scan the electromyographic information segment by segment in a predetermined time window, compare the minimum amplitude value of the electromyographic information in each time window with the second amplitude value, and determine the motion type of the target motion as an inhalation motion if the minimum amplitude value is smaller than the second amplitude value.
In addition, as understood by those skilled in the art, if any amplitude value in the myoelectric information is not less than the second amplitude value, the action type of the target action may be determined to be a blowing action.
In another embodiment, the action type determining module 1120 is configured to determine the action type of the target action according to the electromyographic information and a reference waveform.
The reference waveform may be a waveform corresponding to an insufflation action (referred to as an insufflation reference waveform) or a waveform corresponding to an inspiration action (referred to as an inspiration reference waveform). The motion type determination module 1120 may perform similarity calculation on the waveform of the electromyogram information and the reference waveform according to an existing waveform similarity calculation method, and compare the calculation result with a waveform threshold value to determine the motion type of the target motion. For example, assuming that the reference waveform is an insufflation reference waveform, if the similarity between the waveform of the myoelectric information and the insufflation reference waveform is greater than the waveform threshold, the action type of the target action is an insufflation action, otherwise, the action type is an inhalation action.
To improve recognition accuracy, the insufflation reference waveform and the inspiration reference waveform may each include a plurality of sub-waveforms. For example, the insufflation reference waveform may include a plurality of sub-waveforms corresponding to different insufflation intensities, so that the action type determination module 1120 may perform similarity calculation on the waveform of the electromyogram information and the plurality of sub-waveforms one by one, and if any calculation result is greater than the waveform threshold, identify the action type as an insufflation action.
In another embodiment, the action type determining module 1120 is configured to determine the action type of the target action according to the electromyographic information and a reference signal characteristic.
The reference signal characteristic may be a signal characteristic corresponding to an insufflation action (referred to as an insufflation reference signal characteristic) or a signal characteristic corresponding to an inspiration action (referred to as an inspiration reference signal characteristic). The action type determining module 1120 may perform similarity calculation on the signal characteristics of the electromyographic information and the reference signal characteristics, and compare the calculation result with a characteristic threshold value to determine the action type of the target action. For example, if the reference signal feature is an insufflation reference signal feature, if the similarity between the signal feature of the myoelectric information and the insufflation reference signal feature is greater than the feature threshold, the action type of the target action is an insufflation action, otherwise, the action type is an inhalation action.
Wherein the signal characteristics are related to at least one of amplitude, phase, frequency spectrum of the respective waveform. For example, the signal characteristics of the electromyographic information include: at least one of a fingerprint, an average, and a difference; the fingerprint is composed of at least one item of amplitude, phase and frequency spectrum of the electromyographic information; the average value is an average value of at least one of amplitude, phase and frequency spectrum of the electromyographic information; the difference is a difference of at least one of amplitude, phase and frequency spectrum of the electromyographic information. Similarly, the blowing reference signal characteristics also include at least one of fingerprint, average value and difference; the fingerprint consists of at least one of amplitude, phase and frequency spectrum of the corresponding waveform of the blowing air; the average value is the average value of at least one of amplitude, phase and frequency spectrum of the corresponding waveform of the blowing air; the difference is the difference of at least one of the amplitude, the phase and the frequency spectrum of the corresponding waveform of the blowing air.
To improve identification accuracy, the insufflation reference signal feature and the inspiration reference signal feature may each include a plurality of sub-signal features. For example, the insufflation reference signal feature may include a plurality of sub-signal features corresponding to different insufflation intensities, so that the action type determination module 1120 may perform similarity calculation on the signal features of the electromyographic information and the plurality of sub-signal features one by one, and if any calculation result is greater than the feature threshold, identify the action type as an insufflation action.
In one embodiment, referring to fig. 24, the apparatus 1100 further comprises:
the intensity information determination module 1130c is configured to determine intensity information of the target motion according to at least one amplitude value of the electromyographic information.
The intensity information of the target action may be intensity values at a plurality of action moments, that is, intensity values corresponding to a plurality of sampling points of the electromyographic information, or may be a maximum intensity value of the target action. The intensity information may correspond to, for example, a force or a pressure.
Specifically, the intensity information determining module 1130c may determine the intensity information of the target action according to at least one amplitude value of the electromyographic information and a predetermined corresponding relationship. The predetermined correspondence may be a correspondence between an amplitude value of the myoelectric information determined by training in advance and an intensity value of the target motion.
In one embodiment, referring to fig. 25, the strength information determination module 1130c includes:
the first determining unit 1131c is configured to, in response to that the motion type of the target motion is an air blowing motion, determine a maximum strength value of the air blowing motion according to the maximum amplitude value of the myoelectric information;
a second determining unit 1132c, configured to, in response to that the motion type of the target motion is an inhalation motion, determine a maximum intensity value of the inhalation motion according to the minimum amplitude value of the electromyographic information.
In one embodiment, referring to fig. 26, the apparatus 1100 further comprises:
a first input information determining module 1140c, configured to determine a first input information corresponding to the intensity information.
The corresponding relationship between the intensity information and the first input information may be preset; for example, the intensity information may be used as input information of an electronic game, with different intensity values corresponding to different intensity inputs of the game.
In one embodiment, referring to fig. 27, the apparatus 1100 further comprises:
a number determining module 1150c, configured to determine the number of times of the target action according to the number of reference waveforms included in the electromyographic information and corresponding to the action type.
If the action type is a blowing action, the number of blowing actions is determined according to the number of blowing reference waveforms included in the electromyographic information; if the action type is an inhalation action, the number of inhalation actions is determined according to the number of inhalation reference waveforms included in the electromyographic information.
In another embodiment, referring to fig. 28, the apparatus 1100 further comprises:
a number determining module 1150 c', configured to determine the number of times of the target action according to the number of reference signal features included in the electromyographic information and corresponding to the action type.
If the action type is a blowing action, the number of blowing actions is determined according to the number of blowing reference signal features included in the electromyographic information; if the action type is an inhalation action, the number of inhalation actions is determined according to the number of inhalation reference signal features included in the electromyographic information.
In one embodiment, referring to fig. 29, the apparatus 1100 further comprises:
a second input information determining module 1160c, configured to determine second input information corresponding to the number of times of the target action.
The correspondence between the number of times of the target action and the second input information may be preset, for example, stored in a table such as Table 1; the second input information determining module 1160c may then determine the second input information corresponding to the number of times of the target action by table lookup.
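A hedged sketch of that table lookup follows; the command names are placeholders and the patent's own Table 1 is not reproduced here.

```python
# Hypothetical correspondence between the number of times of the target action
# and the second input information.
TIMES_TO_COMMAND = {
    1: "confirm",
    2: "cancel",
    3: "open_menu",
}

def second_input_for(times):
    """Look up the second input information for a given action count."""
    return TIMES_TO_COMMAND.get(times, "no_op")   # unlisted counts map to a no-op

print(second_input_for(2))   # -> cancel
```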
In one embodiment, referring to fig. 30, the apparatus 1100 further comprises:
a third input information determining module 1170, configured to determine third input information corresponding to the action type of the target action.
That is, the two action types of the target action may correspond to different input information. For example, if the action type is a blowing action, a command to select the current application is input; if the action type is an inhaling action, a command to switch to the next application is input.
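Illustratively (the command identifiers below are placeholders), the action type could be mapped to the third input information as follows:

```python
# Placeholder mapping following the example in the text above.
ACTION_TO_COMMAND = {
    "blow": "select_current_application",
    "inhale": "switch_to_next_application",
}

def third_input_for(action_type):
    """Return the third input information for the recognized action type."""
    return ACTION_TO_COMMAND[action_type]

print(third_input_for("inhale"))   # -> switch_to_next_application
```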
The hardware structure of the user equipment according to an embodiment of the present application is shown in fig. 31. The specific embodiments of the present application do not limit the specific implementation of the user equipment; referring to fig. 31, the device 3100 may include:
a processor 3110, a communications interface 3120, a memory 3130, a myoelectric sensor (not shown), and a communication bus 3140, wherein:
the processor 3110, the communication interface 3120, and the memory 3130 communicate with each other via a communication bus 3140.
The communication interface 3120 is configured to communicate with other network elements.
The processor 3110 is configured to execute the program 3132, and may specifically perform the relevant steps in the method embodiment shown in fig. 1.
In particular, the program 3132 may include program code including computer operating instructions.
The processor 3110 may be a central processing unit (CPU), an application-specific integrated circuit (ASIC), or one or more integrated circuits configured to implement the embodiments of the present application.
The memory 3130 is configured to store a program 3132. The memory 3130 may include a high-speed RAM and may also include a non-volatile memory, such as at least one disk memory. The program 3132 may specifically perform the following steps:
responding to a target action executed by a user, and acquiring myoelectric information of an eye of the user, wherein the target action is a blowing action or an inhaling action;
and determining the action type of the target action according to the electromyographic information and electromyographic reference information.
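To make the two program steps concrete, here is a minimal, non-authoritative sketch that uses the reference-amplitude variant of the classification described in the claims (an amplitude above a first reference value is treated as a blowing action, one below a second reference value as an inhaling action); the sensor read-out and both thresholds are stand-ins.

```python
import numpy as np

FIRST_AMPLITUDE = 0.6     # hypothetical blowing threshold
SECOND_AMPLITUDE = -0.6   # hypothetical inhaling threshold

def acquire_eye_emg():
    """Placeholder for reading a segment from the eye-region myoelectric sensor."""
    return np.array([0.05, 0.7, 0.9, 0.4, 0.1])

def determine_action_type(emg):
    """Classify the target action from the EMG segment and the reference amplitude values."""
    if np.max(emg) > FIRST_AMPLITUDE:
        return "blow"
    if np.min(emg) < SECOND_AMPLITUDE:
        return "inhale"
    return "unknown"

emg = acquire_eye_emg()            # step 1: acquire EMG information of the user's eye
print(determine_action_type(emg))  # step 2: determine the action type -> "blow"
```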
The specific implementation of each step in the program 3132 may refer to corresponding steps or modules in the foregoing embodiments, which are not described herein again. It can be clearly understood by those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described devices and modules may refer to the corresponding process descriptions in the foregoing method embodiments, and are not described herein again.
The hardware structure of the user equipment according to another embodiment of the present application is shown in fig. 32. The specific embodiment of the present application does not limit the specific implementation of the user equipment, and referring to fig. 32, the apparatus 3200 may include:
a processor 3210, a communications interface 3220, a memory 3230, a motion sensor (not shown), and a communication bus 3240, wherein:
processor 3210, communication interface 3220, and memory 3230 communicate with one another via a communication bus 3240.
The communication interface 3220 is configured to communicate with other network elements.
The processor 3210 is configured to execute the program 3232, and may specifically perform the relevant steps in the method embodiment shown in fig. 1.
In particular, program 3232 can include program code that includes computer operational instructions.
The processor 3210 may be a central processing unit (CPU), an application-specific integrated circuit (ASIC), or one or more integrated circuits configured to implement the embodiments of the present application.
The memory 3230 is configured to store a program 3232. The memory 3230 may include a high-speed RAM and may also include a non-volatile memory, such as at least one disk memory. The program 3232 may specifically execute the following steps:
responding to a target action executed by a user, and acquiring body feeling information of the user, wherein the target action is a blowing action or a sucking action;
and determining the action type of the target action according to the somatosensory information and reference information.
For specific implementation of each step in the program 3232, reference may be made to corresponding steps or modules in the foregoing embodiments, which are not described herein again. It can be clearly understood by those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described devices and modules may refer to the corresponding process descriptions in the foregoing method embodiments, and are not described herein again.
Those of ordinary skill in the art will appreciate that the various illustrative elements and method steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present application or portions thereof that substantially contribute to the prior art may be embodied in the form of a software product stored in a storage medium and including instructions for causing a computer device (which may be a personal computer, a controller, or a network device) to execute all or part of the steps of the method according to the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
The above embodiments are merely illustrative and not restrictive. Those skilled in the relevant art can make various changes and modifications without departing from the spirit and scope of the present application; therefore, all equivalent technical solutions also fall within the scope of the present application, which is defined by the appended claims.

Claims (7)

1. A method of determining a blowing or suctioning motion, the method comprising:
responding to a target action executed by a user, and acquiring myoelectric information of an eye of the user, wherein the target action is a blowing action or an inhaling action;
determining the action type of the target action according to the electromyographic information of the eye and electromyographic reference information, wherein the action type of the target action comprises: the blowing action and the inhaling action;
wherein the electromyographic reference information comprises reference signal characteristics; the reference signal characteristics include: a blowing reference signal characteristic corresponding to the blowing action, or an inhaling reference signal characteristic corresponding to the inhaling action; and the blowing reference signal characteristic comprises a plurality of sub-signal characteristics, and the inhaling reference signal characteristic comprises a plurality of sub-signal characteristics.
2. The method of claim 1, wherein the determining the action type of the target action according to the electromyographic information of the eye and electromyographic reference information comprises:
and determining the action type of the target action according to the electromyographic information of the eye and a reference amplitude value.
3. The method of claim 2, wherein the determining the action type of the target action according to the electromyographic information of the eye and a reference amplitude value comprises:
and determining the action type of the target action to be the blowing action in response to an amplitude value of the electromyographic information of the eye being greater than a first amplitude value.
4. The method of claim 2, wherein the determining the action type of the target action according to the electromyographic information of the eye and a reference amplitude value comprises:
and determining the action type of the target action to be the inhaling action in response to an amplitude value of the electromyographic information of the eye being smaller than a second amplitude value.
5. A blowing and suction air determination apparatus, characterized in that the apparatus comprises:
an acquisition module, configured to acquire, in response to a target action executed by a user, myoelectric information of an eye of the user, wherein the target action is a blowing action or an inhaling action;
an action type determining module, configured to determine an action type of the target action according to the electromyographic information of the eye and electromyographic reference information, wherein the action type of the target action comprises: the blowing action and the inhaling action;
wherein the electromyographic reference information comprises reference signal characteristics; the reference signal characteristics include: a blowing reference signal characteristic corresponding to the blowing action, or an inhaling reference signal characteristic corresponding to the inhaling action; and the blowing reference signal characteristic comprises a plurality of sub-signal characteristics, and the inhaling reference signal characteristic comprises a plurality of sub-signal characteristics.
6. A wearable device, characterized by comprising the blowing and suction air determination apparatus according to claim 5.
7. A user equipment, the user equipment comprising:
a myoelectric sensor;
a memory for storing instructions;
a processor to execute the memory-stored instructions, the instructions to cause the processor to:
responding to a target action executed by a user, and acquiring myoelectric information of an eye of the user, wherein the target action is a blowing action or an inhaling action;
determining the action type of the target action according to the electromyographic information of the eye and electromyographic reference information, wherein the action type of the target action comprises: the blowing action and the inhaling action;
wherein the electromyographic reference information comprises reference signal characteristics; the reference signal characteristics include: a blowing reference signal characteristic corresponding to the blowing action, or an inhaling reference signal characteristic corresponding to the inhaling action; and the blowing reference signal characteristic comprises a plurality of sub-signal characteristics, and the inhaling reference signal characteristic comprises a plurality of sub-signal characteristics.
CN201510512499.9A 2015-08-19 2015-08-19 Method and apparatus for determining blowing and suction air Active CN106371560B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510512499.9A CN106371560B (en) 2015-08-19 2015-08-19 Method and apparatus for determining blowing and suction air

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201510512499.9A CN106371560B (en) 2015-08-19 2015-08-19 Method and apparatus for determining blowing and suction air

Publications (2)

Publication Number Publication Date
CN106371560A CN106371560A (en) 2017-02-01
CN106371560B true CN106371560B (en) 2020-06-02

Family

ID=57880953

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510512499.9A Active CN106371560B (en) 2015-08-19 2015-08-19 Method and apparatus for determining blowing and suction air

Country Status (1)

Country Link
CN (1) CN106371560B (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102622605A (en) * 2012-02-17 2012-08-01 国电科学技术研究院 Surface electromyogram signal feature extraction and action pattern recognition method
CN102855370A (en) * 2011-06-30 2013-01-02 德信互动科技(北京)有限公司 Network game implementation system and method
CN102854973A (en) * 2011-06-30 2013-01-02 德信互动科技(北京)有限公司 Console game implementation device and method
CN104199543A (en) * 2014-08-26 2014-12-10 北京智谷技术服务有限公司 Leading limb identification method and system
CN104367320A (en) * 2014-11-07 2015-02-25 北京智谷睿拓技术服务有限公司 Method and device for determining dominant eye
CN104503593A (en) * 2015-01-23 2015-04-08 北京智谷睿拓技术服务有限公司 Control information determination method and device
CN104503592A (en) * 2015-01-23 2015-04-08 北京智谷睿拓技术服务有限公司 Method and device for determining head gestures

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6244873B1 (en) * 1998-10-16 2001-06-12 At&T Corp. Wireless myoelectric control apparatus and methods
CN102968072A (en) * 2012-11-09 2013-03-13 上海大学 Electro-oculogram control system and method based on correction/training

Also Published As

Publication number Publication date
CN106371560A (en) 2017-02-01

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant