CN110727353A - Control component control method and device based on two-dimensional intention definition - Google Patents

Control component control method and device based on two-dimensional intention definition

Info

Publication number
CN110727353A
Authority
CN
China
Prior art keywords
control
action
human body
signal
intention
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201911186185.9A
Other languages
Chinese (zh)
Inventor
Li Yuanqing
Xiao Jing
Zhai Jun
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
South China Brain Control (Guangdong) Intelligent Technology Co Ltd
Original Assignee
South China Brain Control (Guangdong) Intelligent Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by South China Brain Control (Guangdong) Intelligent Technology Co Ltd filed Critical South China Brain Control (Guangdong) Intelligent Technology Co Ltd
Publication of CN110727353A publication Critical patent/CN110727353A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/013 Eye tracking input arrangements
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/015 Input arrangements based on nervous system activity detection, e.g. brain waves [EEG] detection, electromyograms [EMG] detection, electrodermal response detection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F 2203/01 Indexing scheme relating to G06F3/01
    • G06F 2203/011 Emotion or mood input determined on the basis of sensed human body parameters such as pulse, heart rate or beat, temperature of skin, facial expressions, iris, voice pitch, brain activity patterns
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F 2203/01 Indexing scheme relating to G06F3/01
    • G06F 2203/012 Walk-in-place systems for allowing a user to walk in a virtual environment while constraining him to a given position in the physical environment

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Dermatology (AREA)
  • Neurosurgery (AREA)
  • Neurology (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Position Input By Displaying (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The control method and control device of the control component are defined on the basis of a two-dimensional intention, and comprise a human body posture action for expressing a direction control intention and a head expression action for expressing a confirmation intention, wherein the direction control intention controls the movement direction of the control component and the confirmation intention confirms the current operation of the control component and its operation type. By combining a human body posture action with a head expression action or a voice command in two dimensions, the method establishes both a direction-control command mechanism adapted to body actions and an adapted, independent 'confirmation' command mechanism. This overcomes the lack of controllability of voice technology used alone and the limitations of stand-alone EOG electro-oculography; through effective cooperation between the two mechanisms, it solves the problem of multi-dimensional control of human-computer interaction equipment and the problem of coordination between human body actions and control intentions.

Description

Control component control method and device based on two-dimensional intention definition
Technical Field
The invention belongs to the technical field of human-computer interaction, and particularly relates to a control method and control device for a control component defined on the basis of a two-dimensional intention, for example one based on an EOG signal and a head posture. EOG (electro-oculography) is an electrical recording of eye movement, i.e. a measurement of the standing potential of the eye.
Background
In the prior art, interaction between a person and an operated device (such as a head-mounted display, a computer, a mobile phone, or other everyday devices) is mainly manual. For example, when interacting with a head-mounted display, physical keys are pressed to raise the volume or to play or pause; when interacting with a computer, the user must operate a keyboard or mouse by hand to play or open content. But for disabled users, or users whose hands are temporarily occupied (for example with washing, cooking, or eating), realizing human-computer interaction with traditional input devices (mouse, keyboard, controller, and the like) is very difficult.
In the field of human-computer interaction, the eyes serve as another important channel of information exchange. Because the line of sight reflects the direction of a person's attention, gaze input is natural, direct, and interactive, and has attracted wide attention. Chinese patent publication No. CN104866100B discloses an eye control device, an eye control method, and an eye control system. The eye control device includes a fixation point acquisition unit, a human eye action detection unit, and a control signal generation unit. The fixation point acquisition unit acquires the position of the gaze point of the eyes on the device to be operated; the eye action detection unit detects whether the eyes perform a preset action and, when they do, causes the fixation point acquisition unit to send the current gaze position to the control signal generation unit; the control signal generation unit generates a control signal corresponding to the current position according to a pre-stored position-control correspondence table for the device to be operated and sends it to that device, which then executes the corresponding operation. This scheme can effectively control an operated device with eye actions alone, but the control method is too simple and one-dimensional, and is not suitable for human-computer interaction equipment with multi-dimensional control requirements.
Disclosure of Invention
To make human-computer interaction easy even in adverse living conditions, one naturally thinks first of voice control: the controlled device executes the meaning contained in a spoken command, such as moving up, down, left, or right, or walking. However, the latency of speech and its limited degree of control, i.e. its lack of fine controllability, mean that voice technology alone cannot be used completely reliably in the human-computer interaction field. One next thinks of the eye control technique mentioned in the prior art above, also known as EOG electro-oculography, which in terms of control is obviously also a single mode. Neither technique performs well when multi-dimensional control of a human-computer interaction device is required.
To solve the multi-dimensional control problem of human-computer interaction equipment while ensuring convenient and reliable control, several control methods are designed:
First, a control method of a control component defined based on a two-dimensional intention, characterized by comprising a human body posture action for expressing a direction control intention and a head expression action for expressing a confirmation intention, wherein the direction control intention controls the movement direction of the control component and the confirmation intention confirms the current operation of the control component and its operation type;
The human body posture action is implemented, and the posture characteristic values of the action representation characterizing it are picked up; a posture fusion calculation module is provided which, from the picked-up posture characteristic values, identifies the movement direction control command corresponding to the posture action, and the movement direction of the control component is controlled by that command.
The head expression action is implemented, and the head bioelectric signal characteristic values it causes are picked up; a bioelectric signal calculation module is provided which, from the picked-up bioelectric signal characteristic values, identifies the confirmation instruction corresponding to the head expression action, and the current operation of the control component and its operation type are confirmed by that instruction.
The present invention also provides a two-dimensional intention-based control device that implements the first control method, characterized by comprising:
the attitude characteristic value picking device is used for picking up an attitude characteristic value of an action representation representing human body attitude action and giving a corresponding attitude characteristic signal, and the human body attitude action is used for expressing a control intention for controlling the movement direction of the control component;
the head bioelectricity signal characteristic value pickup device is used for picking up a head bioelectricity signal characteristic value caused by the head expression action and giving out a corresponding bioelectricity characteristic signal, and the head expression action is used for expressing the intention of confirming the current operation and the operation type of the control component;
the control module comprises an attitude fusion calculation module and a bioelectrical signal calculation module, the attitude fusion calculation module is used for receiving the attitude characteristic signal and recognizing a motion direction control command corresponding to the human body attitude action, and the control module is used for further controlling the motion direction of the control component according to the motion direction control command; the bioelectrical signal calculation module is used for receiving the bioelectrical characteristic signal and identifying a confirmation instruction corresponding to the head expression action, and the control module is used for confirming the current operation and the operation type of the control component according to the confirmation instruction.
Second, a control method of a control component defined based on a two-dimensional intention, characterized by comprising a human body posture action for expressing a direction control intention, which controls the movement direction of the control component, and a voice command for expressing a confirmation intention, which confirms the current operation of the control component and its operation type;
The human body posture action is implemented, and the posture characteristic values of the action representation characterizing it are picked up; a posture fusion calculation module is provided which, from the picked-up posture characteristic values, identifies the movement direction control command corresponding to the posture action, and the movement direction of the control component is controlled by that command.
The voice command is issued, and the voice signal characteristic values characterizing it are picked up; a voice signal calculation module is provided which, from the picked-up voice signal characteristic values, identifies the confirmation instruction corresponding to the voice command, and the current operation of the control component and its operation type are confirmed by that instruction.
The present invention also provides a two-dimensional intention-based control device that implements the second control method, characterized by comprising:
the attitude characteristic value picking device is used for picking up an attitude characteristic value of an action representation representing human body attitude action and giving a corresponding attitude characteristic signal, and the human body attitude action is used for expressing a control intention for controlling the movement direction of the control component;
the voice command characteristic value picking device is used for picking up a voice signal characteristic value representing the voice command and giving out a corresponding voice characteristic signal, and the voice command is used for expressing the intention of confirming the current operation and the operation type of the control part;
the control module comprises an attitude fusion calculation module and a voice signal calculation module, the attitude fusion calculation module is used for receiving the attitude characteristic signal and recognizing a motion direction control instruction corresponding to the human body attitude action, and the control module is used for further controlling the motion direction of the control component according to the motion direction control instruction; the voice signal calculation module is used for receiving the voice characteristic signal and recognizing a confirmation instruction corresponding to the voice command, and the control module is used for confirming the current operation and the operation type of the control component according to the confirmation instruction.
In the above schemes, the control component is a real physical unit in the living environment, or a virtual physical unit performing a certain function in a virtual environment such as a display screen, AI, or VR, for example a game character; driven by the control software of the control module, the control component can perform not only movement but also confirmation operations.
In the above schemes, the posture characteristic value pickup device and the head expression action or voice command characteristic value pickup device may be integrated into one wearable device, connected to the control module by wire or by wireless transceiver. The control module may be integrated into the same control circuit, or disposed separately from the devices containing the posture characteristic value pickup device and the head expression action or voice command characteristic value pickup device.
Compared with the prior art, the technical scheme has the following beneficial effects: by the two-dimensional combination of a human body posture action with a head expression action or a voice command, it establishes both a direction-control command mechanism adapted to body actions and an adapted, independent 'confirmation' command mechanism; it overcomes the lack of controllability of voice technology used alone and the limitations of stand-alone EOG electro-oculography; and through effective cooperation between the two mechanisms it solves the problem of multi-dimensional control of human-computer interaction equipment and the problem of coordination between human body actions and control intentions.
Owing to these characteristics and advantages, the invention can be applied to human-computer interaction control components, human-computer interaction control equipment, and their control systems.
Drawings
Fig. 1 is a schematic flow chart of a first embodiment to which the technical solution of the present invention is applied.
Fig. 2 is a schematic flow chart of a second embodiment to which the technical solution of the present invention is applied.
Detailed Description
The control method and control device of a control component defined based on a two-dimensional intention, applying the technical scheme of the invention, are further explained below with reference to the accompanying drawings.
Embodiment one. As shown in Fig. 1, a control component control method defined based on a two-dimensional intention comprises
a human body posture action for expressing a direction control intention, which controls the movement direction of the control component 4, and a head expression action for expressing a confirmation intention, which confirms the current operation of the control component 4 and its operation type;
The human body posture action is implemented, and the posture characteristic values of the action representation characterizing it are picked up; a posture fusion calculation module is provided which, from the picked-up posture characteristic values, identifies the movement direction control command corresponding to the posture action, and the movement direction of the control component 4 is controlled by that command.
The head expression action is implemented, and the head bioelectric signal characteristic values it causes are picked up; a bioelectric signal calculation module 23 is provided which, from the picked-up bioelectric signal characteristic values, identifies the confirmation instruction corresponding to the head expression action, and the current operation of the control component 4 and its operation type are confirmed by that instruction.
The two-dimensional intention-based control device for implementing the method is characterized by comprising:
the attitude characteristic value pickup device 21 is used for picking up an attitude characteristic value of an action representation which represents the human body attitude action and giving out a corresponding attitude characteristic signal, and the human body attitude action is used for expressing a control intention for controlling the motion direction of the control component 4;
a head bioelectrical signal characteristic value pickup means 11, said head bioelectrical signal characteristic value pickup means 11 being configured to pick up a head bioelectrical signal characteristic value caused in association with a head expression action for expressing an intention of confirming a current operation and an operation type thereof of said control section 4 and giving a corresponding bioelectrical characteristic signal;
the control module 2 comprises an attitude fusion calculation module 22 and a bioelectrical signal calculation module 23, the attitude fusion calculation module 22 is used for receiving the attitude characteristic signal and recognizing a motion direction control command corresponding to the human body attitude action, and the control module 2 is used for further controlling the motion direction of the control component 4 according to the motion direction control command; the bioelectrical signal calculating module 23 is configured to receive the bioelectrical characteristic signal and identify a confirmation instruction corresponding to the head expression action, and the control module 2 is configured to confirm the current operation and the operation type of the control component 4 according to the confirmation instruction.
Here the control component is a patient bed 4, and the invention is further described by taking control of the patient bed 4 as an example.
The control device based on EOG (electro-oculographic) information and posture comprises a signal acquisition module, a wireless communication module, a control module 2, and a display screen 3. The wireless communication module contains a wireless transmitting unit, arranged at the acquisition end of the signal acquisition module, and a wireless receiving unit, arranged at the control-algorithm end of the control module 2, communicating over the WiFi protocol. The control module 2 is installed on a desktop computer.
The method uses the eye-controlled EOG technique and comprises a human body posture action for expressing a direction control intention, which controls the movement direction of the control component 4, such as a head-sway action, and an eye action producing an EOG (electro-oculogram) signal for expressing a confirmation intention, which confirms the current operation of the control component 4 and its operation type. The head-sway actions comprise four posture actions: swaying left, swaying right, raising the head, and lowering the head.
The head-sway action is implemented, and the posture characteristic values of its action representation, including the acceleration, angle, and movement-direction characteristic values of the action, are picked up; the posture fusion calculation module 22 is provided which, from the picked-up posture characteristic values, identifies the movement direction control command corresponding to the head-sway action, and the movement direction of the control component 4 is controlled by that command.
A blinking expression action of the head is implemented, and the EOG (electro-oculogram) signal characteristic values it causes are picked up; the bioelectric signal calculation module 23 is provided which, from the picked-up bioelectric signal characteristic values, identifies the confirmation instruction corresponding to the head expression action, and the current operation of the control component 4 and its operation type are confirmed by that instruction.
The signal acquisition module is a wearable device and comprises the head bioelectric signal characteristic value pickup device 11, which picks up the EOG signal produced when the user 1 blinks, and the posture characteristic value pickup device 21, which picks up the posture action characteristic values of the user 1; the signals each acquires are transmitted to the control module 2 through the wireless communication module. The signal acquisition module also comprises a microprocessor unit built around an STM32F103 chip, responsible for the synchronization and control of all components of the signal acquisition module.
In this embodiment, the posture characteristic value pickup device 21 is a wearable attitude sensor unit worn on the head of the user 1. It includes an MPU9250 sensor chip connected to the microprocessor unit; this chip is a nine-axis attitude sensor composed of a three-axis accelerometer, a three-axis gyroscope, and a three-axis magnetometer, which respectively collect the acceleration, angle, and movement-direction characteristic values of the head-sway motion of the user 1.
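As an illustration of how the three sensor families of such a nine-axis unit yield angle and direction characteristic values, the following Python sketch derives pitch and roll from the accelerometer and a coarse heading from the magnetometer. The axis conventions and sample values are assumptions for illustration only.

```python
import math

def attitude_from_accel(ax, ay, az):
    """Estimate pitch and roll (degrees) from a gravity-referenced accelerometer."""
    pitch = math.degrees(math.atan2(-ax, math.sqrt(ay * ay + az * az)))
    roll = math.degrees(math.atan2(ay, az))
    return pitch, roll

def heading_from_mag(mx, my):
    """Coarse movement-direction (yaw) estimate from the magnetometer, level case."""
    return math.degrees(math.atan2(my, mx)) % 360.0

# One fabricated sample: head tilted nose-up with a slight roll (units of g).
pitch, roll = attitude_from_accel(-0.42, 0.05, 0.90)
print(round(pitch, 1), round(roll, 1))         # ~25.0, ~3.2 degrees
print(round(heading_from_mag(0.30, 0.10), 1))  # ~18.4 degrees
```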
In this embodiment, the head bioelectric signal characteristic value pickup device 11 is a wearable electro-oculographic device based on the EOG signal, worn on the head of the user 1. It includes an electrode unit and an EOG signal amplification unit, connected in sequence to the microprocessor unit. The electrode unit comprises three conductive electrodes attached closely to different parts of the skin of the user's head: one on the forehead and the other two behind the ears. The EOG signal amplification unit is built around an AD8232 integrated instrumentation amplifier chip, which integrates an instrumentation amplifier, a high-pass filter, a low-pass filter, and a right-leg drive circuit.
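The following Python sketch illustrates the amplify-filter-threshold chain in software terms: band-pass filtering a raw EOG channel and flagging blink events by amplitude. The sampling rate, band edges, and threshold are illustrative assumptions (the AD8232 performs the analog portion in hardware).

```python
import numpy as np
from scipy.signal import butter, filtfilt

FS = 250  # Hz, assumed sampling rate after the analog front end

def bandpass_eog(raw, lo=0.5, hi=10.0, fs=FS):
    """Blink energy in EOG sits at low frequencies; keep roughly 0.5-10 Hz."""
    b, a = butter(2, [lo, hi], btype="bandpass", fs=fs)
    return filtfilt(b, a, raw)

def blink_indices(sig, threshold):
    """Return sample indices where the filtered signal exceeds the threshold."""
    return np.flatnonzero(np.abs(sig) > threshold)

# Fabricated one-second trace: baseline noise plus one blink-like pulse.
raw = 5.0 * np.random.randn(FS)
raw[100:115] += 200.0 * np.hanning(15)   # the "blink"
events = blink_indices(bandpass_eog(raw), threshold=60.0)
print(len(events) > 0)                    # True: a blink was detected
```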
The control module 2 includes the posture fusion calculation module 22, which algorithmically recognizes the head posture of the user 1 (for example the four posture actions of swaying left, swaying right, raising the head, and lowering the head) from the collected head-sway posture signal, gives the corresponding posture characteristic signal, determines the recognition result from it, and accordingly drives the control component 4 to move in the corresponding direction; for example, when the head is raised, the control component 4 is also raised, which makes control intuitive. The control module 2 further includes the bioelectric signal calculation module 23, which algorithmically recognizes the eye action of the user 1 from the collected EOG signal corresponding to the blinking action, gives the corresponding bioelectric characteristic signal, determines the recognition result, and thereby expresses a confirmation control instruction confirming the current operation type of the control component 4 (stop rising or stop falling). That is, the patient bed 4 moves in response to the movement direction control instruction implied by the posture action, and completes "confirmation" of the current operation and its type in response to the confirmation instruction implied by the head expression action.
In the above scheme, the human body posture action may be, besides a head-sway action, a whole-body action such as swaying left, swaying right, raising the chest, or bending at the waist. These posture signals have the property that each action is easily and clearly distinguished, and the direction control intention can be unified with the direction of the body posture, which facilitates both expression and control: the action representation of the posture signal is easy to recognize and its characteristic values are easy to extract. One way to arrange movement-direction control is to keep its result substantially consistent with the direction of the posture action; for example, when the user raises the head or the chest, the control module 2 preferably moves the moving cursor upward rather than left or right.
However, these human body posture actions are only expressions of control intentions; they are neither the signals that control the control component 4 nor the control signals to be expressed. Therefore, to achieve accurate expression of the control intention, it is preferable to define standard, normalized posture actions in advance, so that the control intention can be displayed and expressed conveniently and accurately without the control system struggling to recognize actions and extract characteristic values. To this end, a human body posture standard model concerning head sway or body sway may be established. The model defines the types of human action used to express the direction control intention, for example whether a head sway or body sway is valid, the sway speed, and the maximum and minimum angles; posture actions are specified against this model, and the control intention for the movement direction of the control component 4 is expressed by performing posture actions that conform to it. After the posture model is established, the user 1 learns the standardized, normalized actions before use and can then begin operating. For example, when the controlled device 4 is started in its initial state, the user 1 adjusts the body posture to the set initial posture, including squaring the body and facing the display 3; and the angle, speed, and so on with which the user 1 sways the body are required to meet the included specifications.
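A minimal sketch of such a posture standard model follows: per action type, it stores the angle and speed ranges inside which a sway counts as valid. All numeric bounds are assumptions; the disclosure only states that the model defines the action type, the sway speed, and the maximum and minimum angles.

```python
# Assumed posture standard model: bounds are illustrative, not from the patent.
POSTURE_MODEL = {
    "head_sway": {
        "min_angle_deg": 15.0,   # smaller sways are ignored as noise
        "max_angle_deg": 60.0,   # larger sways are rejected as out of spec
        "min_speed_dps": 10.0,   # degrees per second
        "max_speed_dps": 120.0,
    },
}

def action_is_valid(kind, peak_angle_deg, peak_speed_dps, model=POSTURE_MODEL):
    """Check a performed action against the standard model's ranges."""
    spec = model[kind]
    return (spec["min_angle_deg"] <= abs(peak_angle_deg) <= spec["max_angle_deg"]
            and spec["min_speed_dps"] <= peak_speed_dps <= spec["max_speed_dps"])

print(action_is_valid("head_sway", 25.0, 40.0))  # True: within the model
print(action_is_valid("head_sway", 8.0, 40.0))   # False: sway too small
```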
In the above scheme, the posture fusion calculation module 22 is a software operation module established from experience and data, used to recognize and interpret the control intention of the current posture action. In use, the posture fusion calculation module 22 may also collect the characteristic values of posture actions performed according to the specification, i.e. the human body posture model, as reference characteristic values for the corresponding actions, and compare the picked-up current posture characteristic values against them to identify the current posture action intention of the user 1 in real time.
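The reference-feature comparison might look like the following sketch, where each standardized action contributes a template vector and a picked-up feature vector is matched to the nearest template. The template values and the distance cutoff are illustrative assumptions.

```python
import numpy as np

# Assumed reference characteristic values: [roll deg, pitch deg, peak speed dps].
TEMPLATES = {
    "sway_left":  np.array([-30.0,   0.0, 50.0]),
    "sway_right": np.array([ 30.0,   0.0, 50.0]),
    "head_up":    np.array([  0.0,  25.0, 40.0]),
    "head_down":  np.array([  0.0, -25.0, 40.0]),
}

def classify_posture(features, templates=TEMPLATES, max_dist=25.0):
    """Return the nearest template's label, or None if nothing is close enough."""
    label, dist = min(((k, np.linalg.norm(features - v)) for k, v in templates.items()),
                      key=lambda kv: kv[1])
    return label if dist <= max_dist else None

print(classify_posture(np.array([27.0, 2.0, 55.0])))  # sway_right
print(classify_posture(np.array([5.0, 3.0, 5.0])))    # None: no clear action
```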
Here, picking up the posture characteristic values characterizing the human body posture action defines the starting point of control as characteristic-value pickup: the characteristic parameters of a three-dimensional behavioral action are picked up and provided to the control module 2.
The action representation of a human body posture action refers to the outward appearance of the posture action in three-dimensional space. Picking up the posture characteristic values of the action representation therefore amounts to picking up the trajectory that the posture action leaves in time and space.
The current operation of the control component 4 is the confirmation operation performed on it; a blinking action is analogous to a single left or right click of a mouse on an operation unit on the screen. The operation type is the kind of confirmation performed, for example starting or stopping the control component 4, accepting or rejecting, or accelerating or decelerating it; these operation types can be set and adjusted according to the control requirements of different control components 4.
In the above scheme, after a current posture action has ended and been confirmed valid, the reference posture characteristic value may be replaced with the current posture characteristic value. To prevent the recognition errors and slow recognition caused by drift of the reference posture characteristic value, the reference characteristic value is appropriately refreshed and reset by substituting the current value at the end of each action or periodically.
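One way to realize this refresh, sketched below, is to blend the newly observed feature value into the stored reference at the end of each valid action; the blend factor is an assumption, since the disclosure only states that the reference is refreshed at the end of each action or periodically.

```python
def refresh_reference(reference, current, alpha=0.3):
    """Exponentially blend the new feature value into the stored reference."""
    return [(1 - alpha) * r + alpha * c for r, c in zip(reference, current)]

# After a valid action, drift the reference toward what was just observed.
ref = [30.0, 0.0, 50.0]
ref = refresh_reference(ref, [33.0, 1.0, 47.0])
print([round(x, 2) for x in ref])   # [30.9, 0.3, 49.1]
```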
In the above scheme, the posture characteristic values may include the angle, movement direction, and acceleration of the human action recognizable from the action representation; that is, the posture characteristic value is defined in terms of action angle, action direction, and action acceleration. In the invention, a wearable rate gyroscope measures the angle of the action and an accelerometer measures its acceleration; in a further refinement, a magnetometer collects the movement-direction characteristic value as the sensor moves with the posture action. In other embodiments, a camera may instead capture images of the posture action, and the posture characteristic values may be obtained by image analysis.
In the above scheme, the head expression action is a micro-muscle action or mental activity of the head. Besides the EOG signal caused by blinking, other micro-muscle actions may therefore be used, such as rolling the eyeballs, twitching the face, jaw actions such as biting or grinding the teeth, or purely conscious mental activity. When these expression actions are performed, bioelectric signals with distinct characteristics, commonly known as electroencephalogram (EEG) and electromyogram (EMG) signals, are generated at the head; many methods and devices already exist for picking up such signals, for example EOG electro-oculography and EMG electromyography. The characteristic values of the bioelectric signals mainly comprise the amplitude and spectral composition of the electric signal, and the confirmation instruction is realized by picking up and distinguishing these characteristic values.
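The two feature families named here, amplitude and spectral composition, can be extracted from a short signal window as in the following sketch; the window length and sampling rate are illustrative assumptions.

```python
import numpy as np

def bioelectric_features(window, fs=250):
    """Return peak amplitude and dominant frequency of a bioelectric window."""
    amplitude = np.max(np.abs(window))
    spectrum = np.abs(np.fft.rfft(window - window.mean()))
    freqs = np.fft.rfftfreq(len(window), d=1.0 / fs)
    dominant = freqs[np.argmax(spectrum)]
    return amplitude, dominant

# Fabricated one-second window: a 4 Hz oscillation such as repeated eye movement.
t = np.arange(250) / 250.0
win = 80.0 * np.sin(2 * np.pi * 4 * t)
amp, f = bioelectric_features(win)
print(round(amp, 1), round(f, 1))   # ~79.9 (about 80 uV peak), 4.0 Hz
```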
Comparing the human body posture action with the head expression action, the difference between them is obvious: the former is a limb action of larger amplitude, used to directly express the control intention concerning movement direction, while the latter is only a micro-muscle or mental action of the head. The head expression action can be performed fully in synchrony with the posture action, or rapidly just before or after it to achieve quick confirmation, and the two do not interfere with each other, so operation is easy for an ordinary user.
To this end, the invention uses the performance of these expression actions to express or transmit a confirmation instruction. The control program may assign one action to represent confirmation, like a traditional single mouse click, or two actions in succession, like a traditional double click; that is, one action, or a repetition or combination of several actions, as sketched below.
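A sketch of this single-versus-double confirmation scheme follows; the 0.5-second pairing window is an assumption.

```python
def classify_blinks(blink_times, pair_window=0.5):
    """Group blink timestamps (seconds) into 'single' and 'double' confirms."""
    out, i = [], 0
    while i < len(blink_times):
        if i + 1 < len(blink_times) and blink_times[i + 1] - blink_times[i] <= pair_window:
            out.append(("double", blink_times[i]))   # like a double click
            i += 2
        else:
            out.append(("single", blink_times[i]))   # like a single click
            i += 1
    return out

print(classify_blinks([1.0, 1.3, 4.2]))   # [('double', 1.0), ('single', 4.2)]
```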
Further, the head expression action is not merely an independent action and its signal; it is defined with the intention of expressing confirmation control. Performing the action is not meant to move the control component 4, such as a mouse cursor (also called a specific mark), but to express a confirmed meaning, such as the start or stop conveyed by a mouse-click confirmation.
In the above scheme, after a current expression action has ended and been confirmed valid, the reference bioelectric signal characteristic value may be replaced with the current bioelectric signal characteristic value. That is, to prevent the recognition errors caused by drift of the reference value as the bioelectric characteristics of the user 1 change over time, or as participants change, the reference characteristic value is appropriately refreshed and reset by substituting the current value at the end of each action or periodically.
Embodiment two. As shown in Fig. 2, the invention also provides a control method defined based on a two-dimensional intention, comprising a human body posture action for expressing a direction control intention, which controls the movement direction of the control component 4, and a voice command for expressing a confirmation intention, which confirms the current operation of the control component 4 and its operation type;
The human body posture action is implemented, and the posture characteristic values of the action representation characterizing it are picked up; the posture fusion calculation module 22 is provided which, from the picked-up posture characteristic values, identifies the movement direction control command corresponding to the posture action, and the movement direction of the control component 4 is controlled by that command.
A voice command is issued, and the voice signal characteristic values characterizing it are picked up; the voice signal calculation module 24 is provided which, from the picked-up voice signal characteristic values, identifies the confirmation instruction corresponding to the voice command, and the current operation of the control component 4 and its operation type are confirmed by that instruction.
Compared with embodiment one, the main difference of embodiment two is that the head expression action expressing the confirmation intention is replaced by a voice command issued by the user 1. Accordingly, the voice signal calculation module may likewise establish voice signal reference characteristic values corresponding to the voice commands and compare the picked-up characteristic values of the current voice signal against them, so as to recognize the current voice command intention of the user 1 in real time.
In the foregoing scheme, after a current voice signal has ended and been confirmed valid, the voice signal calculation module may replace the reference characteristic value with the current voice signal characteristic value. In this way, when the user of the product changes or the accent of the user 1 changes, recognition ability can still be adjusted in time.
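The following sketch illustrates both steps for the voice channel: matching an utterance's feature vector against stored references, then refreshing the matched reference after a valid command. Representing an utterance by three spectral-band energies is a deliberate simplification for illustration.

```python
import numpy as np

# Assumed reference characteristic values for two commands (illustrative only).
REFERENCES = {"confirm": np.array([0.8, 0.3, 0.1]),
              "cancel":  np.array([0.2, 0.7, 0.4])}

def match_command(features, refs=REFERENCES, max_dist=0.5):
    """Return the nearest reference's command label, or None if too far."""
    label, dist = min(((k, np.linalg.norm(features - v)) for k, v in refs.items()),
                      key=lambda kv: kv[1])
    return label if dist <= max_dist else None

def refresh_reference(label, features, refs=REFERENCES, alpha=0.2):
    """Blend the just-confirmed utterance into the stored reference."""
    refs[label] = (1 - alpha) * refs[label] + alpha * features

utterance = np.array([0.75, 0.35, 0.15])
cmd = match_command(utterance)
print(cmd)                              # confirm
if cmd is not None:
    refresh_reference(cmd, utterance)   # adapt to the speaker's current accent
```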
The present invention also provides a control device based on a two-dimensional intention, comprising:
the attitude characteristic value pickup device 21 is used for picking up an attitude characteristic value of an action representation which represents the human body attitude action and giving out a corresponding attitude characteristic signal, and the human body attitude action is used for expressing a control intention for controlling the motion direction of the control component 4;
a voice command feature value pickup means 5, said voice command feature value pickup means 5 being configured to pick up a voice signal feature value characterizing said voice command and to give a corresponding voice feature signal, said voice command being used to express an intention to confirm the current operation of the control section 4 and its operation type;
the control module 2 comprises an attitude fusion calculation module 22 and a voice signal calculation module 24, the attitude fusion calculation module 22 is used for receiving the attitude characteristic signal and recognizing a motion direction control instruction corresponding to the human body attitude action, and the control module 2 is used for further controlling the motion direction of the control component 4 according to the motion direction control instruction; the voice signal calculating module 24 is configured to receive the voice feature signal and recognize a confirmation instruction corresponding to the voice command, and the control module 2 is configured to confirm the current operation and the operation type of the control component 4 according to the confirmation instruction.

Claims (27)

1. The control method of the control component defined based on the two-dimensional intention is characterized by comprising a human body posture action for expressing a direction control intention and a head expression action for expressing a confirmation intention, wherein the direction control intention is used for controlling the movement direction of the control component, and the confirmation intention is used for confirming the current operation of the control component and the operation type thereof;
implementing the human body posture action, picking up a posture characteristic value of an action representation representing the human body posture action, setting a posture fusion calculation module, identifying a motion direction control command corresponding to the human body posture action according to the picked posture characteristic value and based on the posture fusion calculation module, and controlling the motion direction of the control component through the motion direction control command;
the method comprises the steps of implementing the head expression action, picking up a head bioelectricity signal characteristic value caused by the head expression action, setting a bioelectricity signal calculation module, identifying a confirmation instruction corresponding to the head expression action according to the picked-up bioelectricity signal characteristic value and based on the bioelectricity signal calculation module, and confirming the current operation and the operation type of the control component through the confirmation instruction.
2. The control method according to claim 1, wherein the current gesture action intention of the user is recognized in real time by establishing a reference characteristic value corresponding to the gesture action by the gesture fusion calculation module and comparing the picked current gesture characteristic values.
3. The control method according to claim 2, wherein after a current posture action is finished and confirmed to be valid, the reference feature value is replaced with the current posture feature value.
4. The control method according to claim 2, wherein the posture characteristic values include angles, directions and accelerations of human body motions recognizable from motion representations.
5. The control method according to claim 4, wherein a human body posture model is established, the human body posture model defines a human body action type for expressing the direction control intention, and the control intention of the motion direction of the control component is expressed by implementing a human body posture action conforming to the human body posture model.
6. The control method according to claim 1, wherein the current expression action intention of the user is recognized in real time by the bioelectric signal calculating module establishing a bioelectric signal reference characteristic value corresponding to the expression action and comparing the characteristic values of the picked current bioelectric signals.
7. The control method according to claim 6, wherein the reference bioelectrical signal characteristic value is replaced with the current bioelectrical signal characteristic value after a current expressive action is ended and confirmed to be valid.
8. The control method according to claim 6, wherein the bioelectrical signal characteristic value includes a signal amplitude and a signal frequency.
9. Control method according to claim 1, characterized in that the type of operation comprises whether to start or stop, or whether to accelerate or decelerate.
10. The control method according to any one of claims 1 to 9, wherein the human body gesture actions include four gesture actions capable of independently expressing a movement direction control intention, each gesture action defining a movement direction control intention, the four gesture actions being used for expressing control intents for four directions of movement, namely front, rear, left and right, respectively.
11. The control method according to claim 10, wherein the human body gesture motion is a shaking motion of a body.
12. The control method according to claim 11, wherein the shaking motion of the body includes four posture motions: swaying the body to the left, swaying it to the right, raising the chest, and bending at the waist.
13. The control method according to claim 10, wherein the human body gesture motion is a head shaking motion.
14. The control method according to claim 13, wherein the head sway motion includes four posture motions: swaying the head to the left, swaying it to the right, raising the head, and lowering the head.
15. The control method according to any one of claims 1 to 9, wherein the head expressive action is a head muscle action or a mental action capable of causing the head to generate a pickable bioelectric signal.
16. The control method according to claim 15, wherein the head expression action includes one or a repetition or a combination of several of an eye action, a facial action, a chin action, or a brain-conscious action.
17. The control method according to any one of claims 1 to 9, wherein the control member is a hospital bed or a wheelchair, and the control member moves in response to a motion direction control instruction implied by the gesture motion and performs "confirmation" of the current operation and the operation type thereof in response to a confirmation instruction implied by the head expression motion.
18. The control method of the control component defined based on the two-dimensional intention is characterized by comprising a human body posture action for expressing a direction control intention and a voice command for expressing a confirmation intention, wherein the direction control intention is used for controlling the movement direction of the control component, and the confirmation intention is used for confirming the current operation of the control component and the operation type thereof;
implementing the human body posture action, picking up a posture characteristic value of an action representation representing the human body posture action, setting a posture fusion calculation module, identifying a motion direction control command corresponding to the human body posture action according to the picked posture characteristic value and based on the posture fusion calculation module, and controlling the motion direction of the control component through the motion direction control command;
implementing a voice command, picking up a voice signal characteristic value representing the voice command, setting a voice signal calculation module, identifying a confirmation instruction corresponding to the voice command according to the picked-up voice signal characteristic value and based on the voice signal calculation module, and confirming the current operation of the control component and the operation type thereof through the confirmation instruction.
19. The control method according to claim 18, wherein the current voice command intention of the user is recognized in real time by the voice signal calculation module establishing a voice signal reference feature value corresponding to the voice command and comparing the feature values of the picked-up current voice signal.
20. The control method according to claim 19, wherein the voice signal calculation module replaces the current voice signal feature value with the reference voice signal feature value after the current voice signal is ended and confirmed to be valid.
21. A control device based on two-dimensional intent, comprising:
the attitude characteristic value picking device is used for picking up an attitude characteristic value of an action representation representing human body attitude action and giving a corresponding attitude characteristic signal, and the human body attitude action is used for expressing a control intention for controlling the movement direction of the control component;
the head bioelectricity signal characteristic value pickup device is used for picking up a head bioelectricity signal characteristic value caused by the head expression action and giving out a corresponding bioelectricity characteristic signal, and the head expression action is used for expressing the intention of confirming the current operation and the operation type of the control component;
the control module comprises an attitude fusion calculation module and a bioelectrical signal calculation module, the attitude fusion calculation module is used for receiving the attitude characteristic signal and recognizing a motion direction control command corresponding to the human body attitude action, and the control module is used for further controlling the motion direction of the control component according to the motion direction control command; the bioelectrical signal calculation module is used for receiving the bioelectrical characteristic signal and identifying a confirmation instruction corresponding to the head expression action, and the control module is used for confirming the current operation and the operation type of the control component according to the confirmation instruction.
22. The control device according to claim 21, wherein the control member is a hospital bed or a wheelchair, and the control member moves in response to a motion direction control instruction implied by the gesture motion and completes "confirmation" of the current operation and the operation type thereof in response to a confirmation instruction implied by the head expression motion.
23. The control device according to claim 21, wherein the posture feature value pickup device includes a rate gyroscope detector for providing an angular velocity variation feature value among the posture feature values when moving with the human body posture action, and an acceleration sensor for providing an acceleration variation feature value among the posture feature values when moving with the human body posture action.
24. The control device according to claim 23, wherein the attitude feature value pickup device further comprises a magnetometer for providing a motion direction feature value among the attitude feature values when moving with the human body attitude motion.
25. The control device according to claim 21, wherein the posture characteristic value pickup device is a camera sampling device for taking an image of the human body posture action and acquiring the posture characteristic value of the human body posture action by using an image analysis technique.
26. The control device according to any one of claims 21 to 25, wherein the head bioelectrical signal characteristic value pickup device is a wearable device based on the EOG signal and includes an electrode unit and an EOG signal amplification unit; the electrode unit comprises three conductive electrodes tightly attached to the skin of the user's head, one arranged on the forehead and the other two behind the ears; and the EOG signal amplification unit contains an integrated instrumentation amplifier chip on which an instrumentation amplifier, a high-pass filter, a low-pass filter and a right-leg drive circuit are integrated.
27. A control device based on two-dimensional intent, comprising:
the attitude characteristic value picking device is used for picking up an attitude characteristic value of an action representation representing human body attitude action and giving a corresponding attitude characteristic signal, and the human body attitude action is used for expressing a control intention for controlling the movement direction of the control component;
the voice command characteristic value picking device is used for picking up a voice signal characteristic value representing the voice command and giving out a corresponding voice characteristic signal, and the voice command is used for expressing the intention of confirming the current operation and the operation type of the control part;
the control module comprises an attitude fusion calculation module and a voice signal calculation module, the attitude fusion calculation module is used for receiving the attitude characteristic signal and recognizing a motion direction control instruction corresponding to the human body attitude action, and the control module is used for further controlling the motion direction of the control component according to the motion direction control instruction; the voice signal calculation module is used for receiving the voice characteristic signal and recognizing a confirmation instruction corresponding to the voice command, and the control module is used for confirming the current operation and the operation type of the control component according to the confirmation instruction.
CN201911186185.9A 2019-05-21 2019-11-28 Control component control method and device based on two-dimensional intention definition Pending CN110727353A (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910425565.7A CN110134245A (en) 2019-05-21 2019-05-21 An eye control device and eye control method based on EOG and an attitude sensor
CN2019104255657 2019-05-21

Publications (1)

Publication Number Publication Date
CN110727353A true CN110727353A (en) 2020-01-24

Family

ID=67572108

Family Applications (5)

Application Number Title Priority Date Filing Date
CN201910425565.7A Pending CN110134245A (en) 2019-05-21 2019-05-21 An eye control device and eye control method based on EOG and an attitude sensor
CN201911186185.9A Pending CN110727353A (en) 2019-05-21 2019-11-28 Control component control method and device based on two-dimensional intention definition
CN201911186227.9A Pending CN110850987A (en) 2019-05-21 2019-11-28 Specific identification control method and device based on two-dimensional intention expressed by human body
CN201911186189.7A Pending CN111290572A (en) 2019-05-21 2019-11-28 Driving device and driving method based on EOG signal and head posture
CN202020852482.4U Active CN212112406U (en) 2019-05-21 2020-05-20 Driving device based on user EOG signal and head gesture


Country Status (1)

Country Link
CN (5) CN110134245A (en)

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113520740A (en) * 2020-04-13 2021-10-22 广东博方众济医疗科技有限公司 Wheelchair bed control method and device, electronic equipment and storage medium
CN112751882A (en) * 2021-01-19 2021-05-04 华南理工大学 Real-time communication method based on hybrid brain-computer interface
CN112860073A (en) * 2021-03-17 2021-05-28 华南脑控(广东)智能科技有限公司 Man-machine interactive closed-loop mouse identification control system
CN113156861A (en) * 2021-04-21 2021-07-23 华南脑控(广东)智能科技有限公司 Intelligent wheelchair control system
CN113448435B (en) * 2021-06-11 2023-06-13 北京数易科技有限公司 Eye control cursor stabilization method based on Kalman filtering
CN115741670B (en) * 2022-10-11 2024-05-03 华南理工大学 Wheelchair mechanical arm system based on multi-mode signal and machine vision fusion control
CN115890655B (en) * 2022-10-11 2024-02-09 人工智能与数字经济广东省实验室(广州) Mechanical arm control method, device and medium based on head gesture and electrooculogram
CN116880700A (en) * 2023-09-07 2023-10-13 华南理工大学 Raspberry group intelligent trolley control method and system based on wearable brain-computer interface
CN117357351B (en) * 2023-12-05 2024-06-18 华南脑控(广东)智能科技有限公司 Multi-mode intelligent control method and device for electric sickbed and household appliances

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101308400A (en) * 2007-05-18 2008-11-19 肖斌 Novel human-machine interaction device based on eye-motion and head motion detection
TW201028895A (en) * 2009-01-23 2010-08-01 Rui-Keng Chou Electro-oculogram control system
JP5888205B2 (en) * 2012-11-02 2016-03-16 ソニー株式会社 Image display device and information input device
JP2017049960A (en) * 2015-09-06 2017-03-09 株式会社ローレル・コード User interface program and device using sensors of hmd device
CN106933353A (en) * 2017-02-15 2017-07-07 南昌大学 A kind of two dimensional cursor kinetic control system and method based on Mental imagery and coded modulation VEP
WO2019001360A1 (en) * 2017-06-29 2019-01-03 华南理工大学 Human-machine interaction method based on visual stimulations
CN108703760A (en) * 2018-06-15 2018-10-26 安徽中科智链信息科技有限公司 Human motion gesture recognition system and method based on nine axle sensors

Also Published As

Publication number Publication date
CN111290572A (en) 2020-06-16
CN110134245A (en) 2019-08-16
CN212112406U (en) 2020-12-08
CN110850987A (en) 2020-02-28


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB03 Change of inventor or designer information

Inventor after: Li Yuanqing
Inventor after: Xiao Jing
Inventor after: Qu Jun
Inventor before: Li Yuanqing
Inventor before: Xiao Jing
Inventor before: Zhai Jun