CN115590695A - Wheelchair control system based on electro-oculogram and face recognition - Google Patents

Wheelchair control system based on electro-oculogram and face recognition

Info

Publication number
CN115590695A
Authority
CN
China
Prior art keywords
user
wheelchair
electric wheelchair
face
module
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202211221489.6A
Other languages
Chinese (zh)
Other versions
CN115590695B (en)
Inventor
Li Yuanqing (李远清)
Lu Zilin (陆子霖)
Zhu Junbiao (朱俊标)
Hu Li (胡力)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
South China Brain Control Guangdong Intelligent Technology Co ltd
Original Assignee
South China Brain Control Guangdong Intelligent Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by South China Brain Control Guangdong Intelligent Technology Co ltd filed Critical South China Brain Control Guangdong Intelligent Technology Co ltd
Priority to CN202211221489.6A priority Critical patent/CN115590695B/en
Publication of CN115590695A publication Critical patent/CN115590695A/en
Application granted granted Critical
Publication of CN115590695B publication Critical patent/CN115590695B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61G TRANSPORT, PERSONAL CONVEYANCES, OR ACCOMMODATION SPECIALLY ADAPTED FOR PATIENTS OR DISABLED PERSONS; OPERATING TABLES OR CHAIRS; CHAIRS FOR DENTISTRY; FUNERAL DEVICES
    • A61G5/00 Chairs or personal conveyances specially adapted for patients or disabled persons, e.g. wheelchairs
    • A61G5/04 Chairs or personal conveyances specially adapted for patients or disabled persons, e.g. wheelchairs motor-driven
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61G TRANSPORT, PERSONAL CONVEYANCES, OR ACCOMMODATION SPECIALLY ADAPTED FOR PATIENTS OR DISABLED PERSONS; OPERATING TABLES OR CHAIRS; CHAIRS FOR DENTISTRY; FUNERAL DEVICES
    • A61G5/00 Chairs or personal conveyances specially adapted for patients or disabled persons, e.g. wheelchairs
    • A61G5/10 Parts, details or accessories
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61G TRANSPORT, PERSONAL CONVEYANCES, OR ACCOMMODATION SPECIALLY ADAPTED FOR PATIENTS OR DISABLED PERSONS; OPERATING TABLES OR CHAIRS; CHAIRS FOR DENTISTRY; FUNERAL DEVICES
    • A61G5/00 Chairs or personal conveyances specially adapted for patients or disabled persons, e.g. wheelchairs
    • A61G5/10 Parts, details or accessories
    • A61G5/1051 Arrangements for steering
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/015 Input arrangements based on nervous system activity detection, e.g. brain waves [EEG] detection, electromyograms [EMG] detection, electrodermal response detection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04812 Interaction techniques based on cursor appearance or behaviour, e.g. being affected by the presence of displayed objects
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16 Human faces, e.g. facial parts, sketches or expressions
    • G06V40/161 Detection; Localisation; Normalisation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16 Human faces, e.g. facial parts, sketches or expressions
    • G06V40/168 Feature extraction; Face representation
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61G TRANSPORT, PERSONAL CONVEYANCES, OR ACCOMMODATION SPECIALLY ADAPTED FOR PATIENTS OR DISABLED PERSONS; OPERATING TABLES OR CHAIRS; CHAIRS FOR DENTISTRY; FUNERAL DEVICES
    • A61G2203/00 General characteristics of devices
    • A61G2203/10 General characteristics of devices characterised by specific control means, e.g. for adjustment or steering
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61G TRANSPORT, PERSONAL CONVEYANCES, OR ACCOMMODATION SPECIALLY ADAPTED FOR PATIENTS OR DISABLED PERSONS; OPERATING TABLES OR CHAIRS; CHAIRS FOR DENTISTRY; FUNERAL DEVICES
    • A61G2203/00 General characteristics of devices
    • A61G2203/10 General characteristics of devices characterised by specific control means, e.g. for adjustment or steering
    • A61G2203/18 General characteristics of devices characterised by specific control means, e.g. for adjustment or steering by patient's head, eyes, facial muscles or voice

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Theoretical Computer Science (AREA)
  • General Health & Medical Sciences (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Veterinary Medicine (AREA)
  • Public Health (AREA)
  • Animal Behavior & Ethology (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Multimedia (AREA)
  • Neurology (AREA)
  • Neurosurgery (AREA)
  • Dermatology (AREA)
  • Biomedical Technology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention discloses a wheelchair control system based on electro-oculogram (EOG) and face recognition, comprising a user head signal acquisition and recognition module, a user head posture detection module, a Bluetooth module, a face recognition module, a human-computer interaction interface, a voice module, an instruction generation module and an electric wheelchair execution module. Wheelchair control instructions are generated by fusing multimodal signals: the EOG signal, the face pose orientation signal and the voice signal. The user turns the head to change the face orientation and thereby steer the electric wheelchair, and controls the wheelchair's motion through the human-computer interaction interface by blink-clicking buttons with a virtual cursor.

Description

Wheelchair control system based on electro-oculogram and face recognition
Technical Field
The invention belongs to the technical field of wheelchair motion control, and particularly relates to a wheelchair control system based on electro-oculography and face recognition.
Background
Electric wheelchairs are in great demand among elderly people with limited mobility and paralyzed patients. Most common electric wheelchairs are controlled manually via a joystick or keypad, which requires good upper-limb motor ability. For people with severe motor impairment caused by amyotrophic lateral sclerosis (ALS), spinal cord injury (SCI) and similar conditions, operating a wheelchair through conventional manual controls is difficult. There is therefore a need to provide such users with a non-manual human-machine interface to assist their mobility.
A brain-computer interface (BCI) is a communication channel that lets the brain issue control commands without relying on peripheral nerves and muscles; it has been widely studied and applied in neural prostheses, neurofeedback training, emotion classification, military, entertainment and other fields. After brain signals are processed and decoded, a user's actions or intentions can be converted into instructions that control assistive devices such as wheelchairs and robotic arms, allowing the user to interact with the environment directly through the brain and, to a certain degree, solving the communication problem of patients with damaged muscles or nerve endings. Characteristic neural signals can also be extracted from the cerebral motor cortex and translated into electrical stimulation parameters that drive peripheral nerves or muscle tissue, helping paralyzed patients recover limb movement. BCIs can further be used to improve the equipment-operating skills of personnel in special working environments, such as pilots and astronauts, and, applied to games, can provide a new interactive interface.
At present, brain-computer interface wheelchair control systems based on electroencephalogram (EEG) signals mainly use paradigms such as motor imagery, P300 and SSVEP. Motor imagery provides only a limited set of control instructions, is generally used just for steering, and requires a large amount of training data to build a model; this is time-consuming, training results vary widely between individuals, and generalization is difficult. P300 and SSVEP signals can provide rich control instructions, but they require the user to stare continuously at a display that applies repetitive visual stimulation, which quickly causes fatigue and makes them unsuitable for long-term control. All of these methods also require the user to wear an electrode cap with many channels, which is inconvenient. Wheelchair control systems based on the electro-oculogram (EOG) trigger control instructions mainly by detecting eye-movement and blink signals. However, EOG-only control requires distinct eye movements for different commands; it is usually complicated, prone to false triggering, low in accuracy and tiring for the eyes, so it too is unsuitable for long sessions.
With the continuous development of artificial intelligence and computer vision, face recognition has attracted increasing attention as a hot topic in computer vision and pattern recognition research. As the biometric technology most readily accepted by people, it is direct, friendly and safe, conforms to human visual habits, and a face image can normally be captured under ordinary viewing conditions. Head pose detection is important for face recognition: it maps a 2D image to a 3D model to obtain the orientation of the face, the main detected parameters being the pitch angle (raising or lowering the head), the yaw angle (turning the head left or right) and the roll angle (tilting the head toward a shoulder). Severely paralyzed patients often retain only head mobility while upper-limb motor ability is impaired or lost. Adding face recognition to a wheelchair control system, so that the electric wheelchair is steered by the face orientation, lets such patients issue control instructions quickly and conveniently, and combining this with blink-clicked buttons on a human-computer interaction interface provides a richer command set. The invention therefore provides a wheelchair control system based on electro-oculography and face recognition.
Disclosure of Invention
The invention aims to provide a wheelchair control system based on electro-oculogram and face recognition, so as to solve the problems identified in the background above.
In order to achieve this purpose, the invention provides the following technical scheme: a wheelchair control system based on electro-oculogram and face recognition comprises a user head signal acquisition and recognition module, a user head posture detection module, a Bluetooth module, a face recognition module, a human-computer interaction interface, a voice module, an instruction generation module and an electric wheelchair execution module;
the user head signal acquisition and recognition module is used for acquiring the user's electro-oculogram signal and attention signal, and then, after amplification and filtering, performing feature extraction and recognition on the acquired multimodal signals;
the user head posture detection module is a nine-axis inertial measurement unit, is integrated on a brain-computer intelligent head ring worn on the head of a user, and is used for acquiring a real-time azimuth posture angle of the head of the user;
the Bluetooth module transmits signals and control instructions among the brain-computer intelligent head ring, the tablet personal computer and the electric wheelchair through Bluetooth;
the face recognition module is used for acquiring a face signal of a user, resolving the real-time face posture orientation of the user after data processing, sending the face posture orientation to the instruction generation module, and generating an electric wheelchair steering instruction to control the left-right steering of the wheelchair;
the human-computer interaction interface is connected with the brain-computer intelligent head ring and the electric wheelchair through the Bluetooth module and is used for displaying the system state and providing a wheelchair control interface for a user;
the voice module is used for receiving a voice signal of a user and triggering the instruction generation module to generate a control instruction after receiving the voice signal;
the instruction generation module generates wheelchair left-right steering instructions according to the transmitted face pose orientation to control the steering of the electric wheelchair; outputs corresponding control instructions to the electric wheelchair execution module according to the different buttons that the user blink-clicks with the virtual cursor on the human-computer interaction interface; and generates a stop instruction according to the user's voice signal received by the voice module to stop the electric wheelchair;
and the electric wheelchair execution module controls the electric wheelchair to execute corresponding actions according to the received control instruction.
Preferably, the blink algorithm adopted by the user head signal acquisition and recognition module judges whether the user blinks intentionally based on the user's electro-oculogram signal features and electroencephalogram attention features. The user's electro-oculogram signal and attention signal are detected in real time; if the algorithm recognizes a blink and the current attention level is above a preset threshold, the user is judged to have made an intentional blink. The judgment result is then sent to the tablet computer via the Bluetooth module and the virtual cursor on the human-computer interaction interface performs one click, so that when the user moves the cursor onto a button on the interface and an intentional blink is detected, the corresponding control signal is triggered.
Preferably, the face recognition module acquires the user's face image through a camera on the tablet computer. From the captured image, a face key point detection algorithm first extracts 212 key points covering facial features including the eyes, eyebrows, nose, mouth and face contour; local features of these key points are then computed, a 3D model is matched to determine the affine transformation matrix from the 3D model to the 2D image, and the Euler angles are solved from this matrix to obtain the real-time orientation of the user's face. For the actual use scenario of the electric wheelchair, only the yaw angle data are used: the computed values are divided by thresholds into three classes (turn left, go straight, turn right), and the user steers the electric wheelchair by turning the head.
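As an illustration of this thresholding step, the minimal sketch below maps a computed yaw angle to the three steering classes. The ±15° threshold is an assumed value, since the patent does not disclose the actual division points.

```python
# Minimal sketch of the yaw-threshold classification described above.
# The +/-15 degree threshold is an assumption for illustration; the patent
# does not disclose the actual division values.
def classify_yaw(yaw_deg: float, threshold_deg: float = 15.0) -> str:
    """Map a face yaw angle (degrees) to one of three steering classes."""
    if yaw_deg < -threshold_deg:
        return "turn_left"
    if yaw_deg > threshold_deg:
        return "turn_right"
    return "straight"
```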
Preferably, the human-computer interaction interface is displayed through a display screen of a tablet personal computer, the tablet personal computer is fixed on the wheelchair, the front face of the tablet personal computer faces a user, and the human-computer interaction interface is divided into a function selection interface and an electric wheelchair operation interface.
Preferably, the Bluetooth module transmits a blink signal acquired by the brain-computer intelligent head ring to the tablet personal computer, and controls the virtual cursor to trigger a click operation on a human-computer interaction interface; transmitting the real-time head azimuth attitude angle of the user to a tablet personal computer for controlling the position of a virtual cursor on a human-computer interaction interface; and transmitting the control instruction generated by the instruction generating module to the electric wheelchair to control the electric wheelchair to move.
Preferably, the instruction generation module fuses multimodal signals of the eye electric signals, the face posture orientation signals and the voice signals to generate the wheelchair control instruction.
Preferably, when the voice module recognizes the stop keyword spoken by the user, it sends a trigger signal to the instruction generation module, which immediately generates a stop instruction to bring the electric wheelchair to a halt.
Preferably, the steering of the electric wheelchair is controlled by the user's face pose orientation signal. The control algorithm first solves and records the yaw angle of the user's face pose while the wheelchair is stationary and the user squarely faces the tablet computer. Once the electric wheelchair starts moving, the user's real-time face pose orientation is detected, the corresponding yaw angle is solved, and its deviation from the recorded reference angle is computed. A fixed angle-difference threshold range is set: the sign of the deviation determines the steering direction and its magnitude determines the steering speed. In the straight-travel mode, the user turns the head: when the face is detected facing left, a left-turn instruction is generated and the wheelchair turns left; when the face faces right, a right-turn instruction is generated and the wheelchair turns right; when the face faces forward, no steering instruction is generated and the wheelchair travels straight. In the in-place rotation mode, when the face is detected facing left, a left-turn instruction is generated and the wheelchair rotates left; when the face faces right, a right-turn instruction is generated and the wheelchair rotates right; when the face faces forward, no steering instruction is generated and the wheelchair remains stationary.
Compared with the prior art, the invention has the following beneficial effects: the system can quickly and accurately detect the user's blink and attention signals, and thus quickly and accurately judge whether a blink is intentional; it can detect the azimuth angle of the user's head pose in real time and use it to control the position of the virtual cursor on the human-computer interaction interface; combined with cursor-click operations, the interface provides richer control instructions, and compared with other EOG control schemes the control is faster, more accurate and more functional; a face recognition algorithm detects the user's face orientation, so the user steers the electric wheelchair left and right simply by turning the head, which is simpler to operate and faster to respond than other control schemes and better suited to patients with severe loss of motor function; multiple measures are provided to prevent false triggering of control instructions and accidental starting of the wheelchair, and a voice signal is added to stop the wheelchair, so stopping is faster and system safety is high; the invention also has the advantages of low equipment cost, simple operation and good practicability.
Drawings
FIG. 1 is a front view of the wearing and mounting positions of the devices according to an embodiment of the present invention;
FIG. 2 is a side view of the wearing and mounting positions of the devices according to an embodiment of the present invention;
FIG. 3 is a top view of the wearing and mounting positions of the devices according to an embodiment of the present invention;
FIG. 4 is a schematic structural diagram of an embodiment of the present invention;
FIG. 5 is a waveform diagram of the electro-oculogram signal of a single blink;
FIG. 6 is a schematic view of the function selection interface of the human-computer interaction interface of the present invention;
FIG. 7 is a schematic view of the initial interface of the forward function of the human-computer interaction interface of the present invention;
FIG. 8 is a schematic view of the operation interface of the forward function of the human-computer interaction interface of the present invention;
FIG. 9 is a schematic view of the initial interface of the backward function of the human-computer interaction interface of the present invention;
FIG. 10 is a schematic view of the operation interface of the backward function of the human-computer interaction interface of the present invention;
FIG. 11 is a schematic view of the initial interface of the in-place rotation function of the human-computer interaction interface of the present invention;
FIG. 12 is a schematic view of the operation interface of the in-place rotation function of the human-computer interaction interface of the present invention;
FIG. 13 is a schematic view of the steering function of the electric wheelchair of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Referring to fig. 1, fig. 2 and fig. 3, which show the wearing and mounting positions of the devices of an embodiment of the invention in front, side and top views respectively: the user wears a self-developed brain-computer intelligent head ring on the forehead and sits in the modified electric wheelchair; the tablet computer is fixed on the electric wheelchair with its display screen facing the user, who sits directly in front of it, ready to control the wheelchair.
As shown in fig. 4, the wheelchair control system based on electro-oculogram and face recognition implemented by the invention comprises a user head signal acquisition and recognition module 1, a user head posture detection module 2, a Bluetooth module 3, a face recognition module 4, a human-computer interaction interface 5, a voice module 6, an instruction generation module 7 and an electric wheelchair execution module 8.
The user head signal acquisition and recognition module 1 is used for acquiring the user's electro-oculogram signal and attention signal; after amplification and filtering, feature extraction and recognition are performed on the acquired multimodal signals to realize blink detection and attention detection. The signal waveform feature parameters are checked against threshold conditions to judge whether the user made an intentional blink, and according to the detection result the virtual cursor on the human-computer interaction interface is controlled to click a button on the interface;
the user head posture detection module 2 is integrated on a brain-computer intelligent head ring worn on the head of a user, and a head posture sensor is formed by a nine-axis Inertial Measurement Unit (IMU) and used for acquiring a real-time azimuth posture angle of the head of the user;
the Bluetooth module 3 directly transmits signals and control instructions through Bluetooth on a brain-computer intelligent head ring, a tablet personal computer and an electric wheelchair, transmits blink signals acquired by the user head signal acquisition and recognition module 1 to the tablet personal computer, and controls a virtual cursor to trigger click operation on a human-computer interaction interface 5; transmitting the real-time user head azimuth attitude angle acquired by the user head attitude detection module 2 to a tablet personal computer for controlling the position of a virtual cursor on a human-computer interaction interface 5; the control instruction generated by the instruction generating module 7 is transmitted to the electric wheelchair executing module 8, and the electric wheelchair is controlled to move;
the face recognition module 4 acquires a face signal of a user through a camera on the tablet personal computer, performs data processing, then calculates the real-time face posture orientation of the user, sends the face signal to the instruction generation module 7, and generates an electric wheelchair steering instruction to control the left and right steering of the wheelchair;
the human-computer interaction interface 5 is displayed through a display screen of a tablet personal computer, the tablet personal computer is fixed on the wheelchair, the front face of the tablet personal computer faces a user, and the tablet personal computer is connected with the brain-computer intelligent head ring and the electric wheelchair execution module 8 through the Bluetooth module 3 and used for displaying the system state and providing a wheelchair control interface for the user;
the voice module 6 is integrated on the tablet computer and used for receiving a voice signal of a user and triggering the instruction generation module 7 to generate a control instruction after receiving the voice signal;
the instruction generating module 7 generates a left-right wheelchair steering instruction according to the transmitted human face posture orientation to control the electric wheelchair to steer; clicking different buttons on the human-computer interaction interface 5 according to the blinking of a virtual cursor controlled by a user to output corresponding control instructions to the electric wheelchair execution module 8; generating a parking instruction according to the user voice signal received by the voice module 6 to control the electric wheelchair to park;
the electric wheelchair execution module 8 controls the electric wheelchair to execute corresponding actions according to the received control instruction.
In the user head signal acquisition and recognition module 1, the user's electro-oculogram signal and attention signal are detected in real time; the acquired multimodal signals are amplified and filtered and then subjected to feature extraction and recognition to realize blink detection and attention detection. As shown in fig. 5, when the module detects a blink waveform of the illustrated shape, and the user's attention feature parameter at that moment is greater than the preset attention threshold, the user is judged to have made an intentional blink and the virtual cursor is controlled to execute one click operation; otherwise the cursor executes no click.
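A minimal sketch of this decision rule follows: a blink is accepted only when a blink-like EOG peak is detected and the concurrent attention index exceeds the threshold. The sampling rate, peak amplitude and attention threshold are assumed values, since fig. 5 only shows the waveform shape.

```python
# Minimal sketch of the "intentional blink" decision described above.
# Sampling rate, peak amplitude and attention threshold are assumptions.
import numpy as np
from scipy.signal import find_peaks

FS = 250                  # assumed EOG sampling rate, Hz
BLINK_AMPLITUDE = 150.0   # assumed blink peak height, microvolts
ATTENTION_THRESHOLD = 60  # assumed attention index threshold (0-100 scale)

def detect_intentional_blink(eog: np.ndarray, attention: float) -> bool:
    """Return True if an intentional blink is present in this EOG window."""
    peaks, _ = find_peaks(eog, height=BLINK_AMPLITUDE,
                          distance=int(0.3 * FS))  # peaks >= 0.3 s apart
    return len(peaks) > 0 and attention > ATTENTION_THRESHOLD
```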
The face recognition module 4 acquires the user's face image through a camera on the tablet computer. From the captured image, a face key point detection algorithm first extracts 212 key points covering facial features such as the eyes, eyebrows, nose, mouth and face contour; local features of these key points are then computed, a 3D model is matched to determine the affine transformation matrix from the 3D model to the 2D image, and the Euler angles are solved from this matrix to obtain the real-time orientation of the user's face. For the actual use scenario of the electric wheelchair, only the yaw angle data are used: the computed values are divided by thresholds into three classes (turn left, go straight, turn right), and the user steers the electric wheelchair by turning the head.
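The 212-point detector and its 3D model are proprietary, so the sketch below substitutes a generic 6-landmark correspondence and OpenCV's solvePnP to recover Euler angles; the 3D model coordinates and the pinhole camera approximation are assumptions.

```python
# Hedged sketch of head pose (Euler angles) from facial key points via
# OpenCV solvePnP. A generic 6-point model stands in for the proprietary
# 212-point detector; model coordinates and camera intrinsics are assumed.
import cv2
import numpy as np

# Approximate 3D coordinates (mm) of six generic facial landmarks.
MODEL_POINTS = np.array([
    (0.0, 0.0, 0.0),        # nose tip
    (0.0, -63.6, -12.5),    # chin
    (-43.3, 32.7, -26.0),   # left eye outer corner
    (43.3, 32.7, -26.0),    # right eye outer corner
    (-28.9, -28.9, -24.1),  # left mouth corner
    (28.9, -28.9, -24.1),   # right mouth corner
], dtype=np.float64)

def head_pose_euler(image_points: np.ndarray, frame_w: int, frame_h: int):
    """image_points: 6x2 float64 pixel coordinates in MODEL_POINTS order."""
    focal = frame_w  # common pinhole approximation
    camera = np.array([[focal, 0, frame_w / 2],
                       [0, focal, frame_h / 2],
                       [0, 0, 1]], dtype=np.float64)
    ok, rvec, tvec = cv2.solvePnP(MODEL_POINTS, image_points, camera,
                                  distCoeffs=None,
                                  flags=cv2.SOLVEPNP_ITERATIVE)
    rot, _ = cv2.Rodrigues(rvec)
    # Decompose the rotation matrix into Euler angles (degrees).
    angles, *_ = cv2.RQDecomp3x3(rot)
    pitch, yaw, roll = angles
    return pitch, yaw, roll
```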
and the human-computer interaction interface 5 comprises a function selection interface, a function initial interface and a function operation interface three-level interface. FIG. 6 shows a function selection interface, which has 6 buttons; the three functional buttons of forward movement, backward movement and rotation respectively represent three movement modes of forward movement, backward movement and in-situ rotation of the electric wheelchair, and a user can jump to a corresponding movement initial interface after clicking a determined button within 10 seconds after clicking a required movement mode button; clicking an exit button to exit to an entry interface; clicking the SOS button may send a distress message. The wheelchair remains stopped at this interface. Fig. 7 shows an initial interface of the forward function, which has 2 buttons, and the wheelchair remains stopped. After the user clicks the start button, the electric wheelchair starts to move straight forward, the interface is switched to the forward function operation interface shown in fig. 8, the start button disappears, and the acceleration, deceleration, stop and SOS buttons are displayed. The wheelchair can act only after the start button is clicked; clicking an SOS button to send help-seeking information, jumping to a function selection interface, and enabling a user to reselect an operation function; clicking the start button within 10 seconds is effective, otherwise the interface jumps back to the function selection interface and the user needs to re-select the function. Fig. 8 shows a forward function execution interface, which has 4 buttons. The electric wheelchair advances at a low-speed gear by default, and after a user clicks an acceleration button, the electric wheelchair is accelerated to a high-speed gear to operate; in a high-speed state, a user clicks a speed reduction button, and the electric wheelchair is decelerated to a low-speed gear to run; after a stop button is clicked, the electric wheelchair is stopped, the interface jumps to a function selection interface, and a user needs to reselect an operation function; and after the SOS button is clicked, the electric wheelchair is stopped, help seeking information is sent, the interface jumps to a function selection interface, and the user needs to reselect an operation function. Fig. 9 shows a backward function initial interface, which has 2 buttons, and the wheelchair is kept stopped. After the user clicks the back button, the electric wheelchair starts to move straight backwards, the interface is switched to the back function operation interface shown in fig. 10, the back button disappears, and the stop and SOS buttons are displayed. The wheelchair can act only after the reversing button is clicked; clicking an SOS button to send help-seeking information, jumping to a function selection interface, and enabling a user to reselect an operation function; the key-down is only valid within 10 seconds, otherwise the interface jumps back to the function selection interface and the user needs to re-select the function. Fig. 10 shows a back function execution interface, which has 2 buttons. After the user clicks the stop button, the wheelchair is stopped, the interface jumps to the function selection interface, and the user needs to reselect the operation function. 
And after the user clicks the SOS button, the wheelchair is stopped, help seeking information is sent, the interface jumps to a function selection interface, and the user needs to reselect an operation function. Fig. 11 shows a pivot function initiation interface, which has 2 buttons, and the wheelchair remains stopped. After a user clicks the in-place rotation button, the interface is switched to the in-place rotation function operation interface, the in-place rotation button disappears, the stop button and the SOS button are displayed, the electric wheelchair is in a stop state firstly, and the wheelchair starts to rotate in place after the rotation angle of the user exceeds a set threshold value. The wheelchair can act only after the in-situ rotation button is clicked; clicking an SOS button to send help-seeking information, jumping to a function selection interface by the interface, and enabling a user to reselect an operation function; clicking the pivot button within 10 seconds is effective, otherwise the interface jumps back to the function selection interface and the user needs to re-select the function. Fig. 12 shows a pivot function execution interface, which has 2 buttons. And after the user clicks the stop button, the wheelchair is stopped, the interface jumps to the function selection interface, and the user needs to reselect the operation function. The wheelchair is stopped after the user clicks the SOS button, the distress message is sent, the interface jumps to the function selection interface, and the user needs to reselect the operation function.
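A hedged sketch of the forward branch of this three-level interface as a state machine follows, with the 10-second confirmation timeout taken from the description; the other branches behave analogously, and the class and state names are illustrative only.

```python
# Hedged sketch of the forward branch of the three-level interface as a
# state machine with the 10-second confirmation timeout described above.
import time
from enum import Enum, auto

class UI(Enum):
    FUNCTION_SELECT = auto()
    FORWARD_INITIAL = auto()
    FORWARD_RUNNING = auto()

class ForwardFlow:
    TIMEOUT_S = 10.0  # confirmation window from the description

    def __init__(self):
        self.state = UI.FUNCTION_SELECT
        self.entered = time.monotonic()

    def _goto(self, state: UI):
        self.state, self.entered = state, time.monotonic()

    def on_click(self, button: str):
        # Fall back to function selection if the confirmation window lapsed.
        if (self.state == UI.FORWARD_INITIAL
                and time.monotonic() - self.entered > self.TIMEOUT_S):
            self._goto(UI.FUNCTION_SELECT)
            return
        if self.state == UI.FUNCTION_SELECT and button == "forward":
            self._goto(UI.FORWARD_INITIAL)
        elif self.state == UI.FORWARD_INITIAL and button == "start":
            self._goto(UI.FORWARD_RUNNING)   # wheelchair starts moving
        elif self.state == UI.FORWARD_RUNNING and button in ("stop", "SOS"):
            self._goto(UI.FUNCTION_SELECT)   # wheelchair stops
```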
The instruction generation module 7 generates wheelchair control instructions by fusing the multimodal signals: the electro-oculogram signal, the face pose orientation signal and the voice signal. The specific process is as follows. With the electric wheelchair stationary, the user blink-clicks the AI wheelchair button on the human-computer interaction interface 5 to enter the function selection interface, blink-clicks the desired motion state, and then blink-clicks the corresponding start button, upon which the electric wheelchair starts to move. All instructions triggered by blink-clicked buttons control only the motion mode of the wheelchair or its action in the straight-travel direction. Steering is controlled by the user's face pose orientation signal, as shown in fig. 13: the control algorithm first solves and records the yaw angle θ1 of the user's face pose while the wheelchair is stationary and the user squarely faces the tablet computer. Once the electric wheelchair starts moving, the user's real-time face pose is detected, the corresponding yaw angle θ2 is solved, and the deviation Δθ = θ2 − θ1 from the recorded reference is computed. A fixed angle-difference threshold range is set; the sign of Δθ determines the steering direction and its magnitude determines the steering speed. In the straight-travel mode (forward or backward), the user turns the head: when the face is detected facing left, a left-turn instruction is generated and the wheelchair turns left; when the face faces right, a right-turn instruction is generated and the wheelchair turns right; when the face faces forward, no steering instruction is generated and the wheelchair travels straight. In the in-place rotation mode, when the face is detected facing left, a left-turn instruction is generated and the wheelchair rotates left; when the face faces right, a right-turn instruction is generated and the wheelchair rotates right; when the face faces forward, no steering instruction is generated and the wheelchair remains stationary. When the voice module 6 recognizes the stop keyword, the instruction generation module 7 generates a stop instruction and transmits it to the electric wheelchair execution module 8 to stop the wheelchair.
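The deviation rule Δθ = θ2 − θ1 can be sketched as follows. The deadband and the speed scaling are assumed values (the patent states only that a fixed threshold range exists), and the sign convention (negative Δθ meaning left) is likewise an assumption.

```python
# Minimal sketch of the deviation-based steering rule above: the sign of
# delta = theta2 - theta1 selects the direction, its magnitude the speed.
# Deadband, speed scaling and sign convention are assumptions.
DEADBAND_DEG = 10.0   # assumed |delta| below which no steering is issued
MAX_DELTA_DEG = 45.0  # assumed angle at which full steering speed is reached

def steering_command(theta1_deg: float, theta2_deg: float):
    """Return (direction, speed in [0, 1]) from reference and current yaw."""
    delta = theta2_deg - theta1_deg
    if abs(delta) <= DEADBAND_DEG:
        return "straight", 0.0
    speed = min((abs(delta) - DEADBAND_DEG)
                / (MAX_DELTA_DEG - DEADBAND_DEG), 1.0)
    return ("turn_left" if delta < 0 else "turn_right"), speed
```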
In this embodiment, the MPU6050 chip from InvenSense and the QMC5883L chip from Shanghai QST Corporation are selected. The MPU6050 integrates a three-axis MEMS gyroscope and a three-axis MEMS accelerometer, and can transmit the raw six-axis sensor data to the controller through a 400 kHz I2C interface or a 20 MHz SPI interface. The QMC5883L chip is derived from the Honeywell HMC5883L; it is a surface-mount three-axis magnetic sensor with an integrated signal processing circuit, mainly applied in high-precision scenarios such as compasses, navigation, drones, robots and handheld devices, and has the advantages of low noise, high precision, low power consumption, magnetic interference elimination and temperature compensation. It provides an I2C interface supporting the 100 kHz standard mode and the 400 kHz fast mode for transmitting raw three-axis magnetic field data.
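As a hedged illustration of talking to the named parts, the snippet below reads raw accelerometer samples from an MPU6050 over I2C using the `smbus2` library; the register addresses follow the public MPU6050 register map, while the bus number and wiring are assumptions.

```python
# Hedged sketch: read raw MPU6050 accelerometer data over I2C with smbus2
# on a Linux host. Registers follow the public MPU6050 register map; the
# bus number and wiring are assumptions.
from smbus2 import SMBus

MPU_ADDR = 0x68        # default MPU6050 I2C address (AD0 pin low)
PWR_MGMT_1 = 0x6B      # power management register
ACCEL_XOUT_H = 0x3B    # first of six accelerometer output registers

def read_accel(bus_num: int = 1):
    with SMBus(bus_num) as bus:
        bus.write_byte_data(MPU_ADDR, PWR_MGMT_1, 0)  # wake from sleep
        raw = bus.read_i2c_block_data(MPU_ADDR, ACCEL_XOUT_H, 6)
        def to_int16(hi, lo):
            v = (hi << 8) | lo
            return v - 65536 if v > 32767 else v
        ax = to_int16(raw[0], raw[1])
        ay = to_int16(raw[2], raw[3])
        az = to_int16(raw[4], raw[5])
        return ax, ay, az  # raw counts; +/-2 g default => 16384 LSB per g
```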
Compared with the prior art, the invention provides rich control instructions with fast instruction response. The motion mode of the electric wheelchair is selected by blink-clicking buttons on the human-computer interaction interface with the virtual cursor: an attitude sensor moves the cursor rapidly to a specific button as the user's head pose changes, and a blink detection algorithm detects the user's intentional blink to trigger the button. A face recognition algorithm detects the user's face pose orientation and steers the electric wheelchair according to the different face directions, so control instructions are issued quickly, accurately and in sufficient variety. Existing non-manual wheelchair control techniques such as the motor-imagery EEG paradigm generally provide only 2 to 3 control instructions, and their accuracy depends strongly on the user's prior training; the P300 paradigm typically needs 4 to 6 seconds to generate a control command, and the SSVEP paradigm readily causes fatigue and may induce epileptic seizures. Existing EOG-based wheelchair control techniques need at least 2 to 3 seconds to generate a control command.
Although embodiments of the present invention have been shown and described, it will be appreciated by those skilled in the art that changes, modifications, substitutions and alterations can be made in these embodiments without departing from the principles and spirit of the invention, the scope of which is defined in the appended claims and their equivalents.

Claims (8)

1. A wheelchair control system based on electro-oculogram and face recognition, characterized in that the system comprises a user head signal acquisition and recognition module, a user head posture detection module, a Bluetooth module, a face recognition module, a human-computer interaction interface, a voice module, an instruction generation module and an electric wheelchair execution module;
the user head signal acquisition and recognition module is used for acquiring an electro-oculogram signal and an attention signal of a user, and then performing feature extraction, calculation and recognition on the acquired multi-mode signal after amplification and filtering;
the user head posture detection module is a nine-axis inertial measurement unit, is integrated on a brain-computer intelligent head ring worn on the head of a user, and is used for acquiring a real-time azimuth posture angle of the head of the user;
the Bluetooth module transmits signals and control instructions among the brain-computer intelligent head ring, the tablet personal computer and the electric wheelchair through Bluetooth;
the face recognition module is used for acquiring a face signal of a user, resolving the real-time face posture orientation of the user after data processing, sending the face posture orientation to the instruction generation module, and generating an electric wheelchair steering instruction to control the left-right steering of the wheelchair;
the human-computer interaction interface is connected with the brain-computer intelligent head ring and the electric wheelchair through the Bluetooth module and is used for displaying the system state and providing a wheelchair control interface for a user;
the voice module is used for receiving a voice signal of a user and triggering the instruction generation module to generate a control instruction after receiving the voice signal;
the instruction generation module generates wheelchair left-right steering instructions according to the transmitted face pose orientation to control the steering of the electric wheelchair; outputs corresponding control instructions to the electric wheelchair execution module according to the different buttons that the user blink-clicks with the virtual cursor on the human-computer interaction interface; and generates a stop instruction according to the user's voice signal received by the voice module to stop the electric wheelchair;
and the electric wheelchair execution module controls the electric wheelchair to execute corresponding actions according to the received control instruction.
2. The wheelchair control system based on electro-oculogram and face recognition according to claim 1, characterized in that: the blink algorithm adopted by the user head signal acquisition and recognition module judges whether the user blinks intentionally based on the user's electro-oculogram signal features and electroencephalogram attention features; the user's electro-oculogram signal and attention signal are detected in real time, and if the algorithm recognizes a blink signal and the current attention level is higher than a preset threshold, the user is judged to have made an intentional blink; the judgment result is then sent to the tablet computer through the Bluetooth module to control the virtual cursor on the human-computer interaction interface to perform one click, so that when the user moves the virtual cursor onto a button on the interface and an intentional blink is detected, the cursor clicks the button once and the corresponding control signal is triggered.
3. The wheelchair control system based on electro-oculogram and face recognition according to claim 1, characterized in that: the face recognition module acquires the user's face image through a camera on the tablet computer; from the captured image, a face key point detection algorithm first extracts 212 key points covering the eyes, eyebrows, nose, mouth and face contour; local features of these key points are then computed, a 3D model is matched to determine the affine transformation matrix from the 3D model to the 2D image, and the Euler angles are solved from this matrix to determine the real-time orientation of the user's face; for the actual use scenario of the electric wheelchair, only the yaw angle data are used, the computed values are divided by thresholds into the three classes of turning left, going straight and turning right, and the user steers the electric wheelchair by turning the head.
4. The wheelchair control system based on electro-oculogram and face recognition according to claim 1, characterized in that: the human-computer interaction interface is displayed through the display screen of a tablet computer fixed on the wheelchair with its front facing the user, and is divided into a function selection interface and electric wheelchair operation interfaces.
5. The wheelchair control system based on electro-oculogram and face recognition according to claim 1, characterized in that: the Bluetooth module transmits the blink signal acquired by the brain-computer intelligent head ring to the tablet computer to control the virtual cursor to trigger click operations on the human-computer interaction interface; transmits the user's real-time head azimuth attitude angle to the tablet computer to control the position of the virtual cursor on the human-computer interaction interface; and transmits the control instructions generated by the instruction generation module to the electric wheelchair to control its movement.
6. The wheelchair control system based on electro-oculogram and face recognition according to claim 1, characterized in that: the instruction generation module fuses the multimodal signals (the electro-oculogram signal, the face pose orientation signal and the voice signal) to generate wheelchair control instructions; with the electric wheelchair stationary, the user blink-clicks the AI wheelchair button on the human-computer interaction interface to enter the function selection interface, blink-clicks the desired motion state and then blink-clicks the corresponding start button, upon which the electric wheelchair starts to move; all instructions triggered by blink-clicked buttons control only the motion mode of the electric wheelchair or its action in the straight-travel direction.
7. The wheelchair control system based on electro-oculogram and face recognition according to claim 1, characterized in that: when the voice module recognizes the stop keyword spoken by the user, it sends a trigger signal to the instruction generation module, which immediately generates a stop instruction to control the electric wheelchair to stop.
8. The wheelchair control system based on electro-oculogram and face recognition according to claim 1, characterized in that: the steering of the electric wheelchair is controlled by the user's face pose orientation signal; the control algorithm first solves and records the yaw angle of the user's face pose while the wheelchair is stationary and the user squarely faces the tablet computer; when the electric wheelchair starts to move, the user's real-time face pose orientation is detected, the corresponding yaw angle is solved, and its deviation from the recorded reference angle is computed; a fixed angle-difference threshold range is set, the steering direction of the electric wheelchair is controlled according to the sign of the deviation, and the steering speed is controlled according to its magnitude; in the straight-travel mode, the user turns the head: when the face is detected facing left, a left-turn instruction is generated and the wheelchair turns left; when the face faces right, a right-turn instruction is generated and the wheelchair turns right; when the face faces forward, no steering instruction is generated and the wheelchair travels straight; in the in-place rotation mode, when the face is detected facing left, a left-turn instruction is generated and the wheelchair rotates left; when the face faces right, a right-turn instruction is generated and the wheelchair rotates right; when the face faces forward, no steering instruction is generated and the wheelchair remains stationary.
CN202211221489.6A 2022-10-08 2022-10-08 Wheelchair control system based on electrooculogram and face recognition Active CN115590695B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211221489.6A CN115590695B (en) 2022-10-08 2022-10-08 Wheelchair control system based on electrooculogram and face recognition

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211221489.6A CN115590695B (en) 2022-10-08 2022-10-08 Wheelchair control system based on electrooculogram and face recognition

Publications (2)

Publication Number Publication Date
CN115590695A (en) 2023-01-13
CN115590695B (en) 2024-06-14

Family

ID=84845004

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211221489.6A Active CN115590695B (en) 2022-10-08 2022-10-08 Wheelchair control system based on electrooculogram and face recognition

Country Status (1)

Country Link
CN (1) CN115590695B (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107007407A (en) * 2017-04-12 2017-08-04 华南理工大学 Wheelchair control system based on eye electricity
CN107981997A (en) * 2017-11-23 2018-05-04 郑州布恩科技有限公司 A kind of method for controlling intelligent wheelchair and system based on human brain motion intention
CN108904163A (en) * 2018-06-22 2018-11-30 北京信息科技大学 wheelchair control method and system
CN110134240A (en) * 2019-05-14 2019-08-16 南京邮电大学 Robot wheel chair control system based on brain electricity Yu head appearance hybrid interface
CN110658742A (en) * 2019-09-05 2020-01-07 四川省康复辅具技术服务中心 Multi-mode cooperative control wheelchair control system and method
CN110840666A (en) * 2019-11-19 2020-02-28 华南理工大学 Wheelchair mechanical arm integrated system based on electro-oculogram and machine vision and control method thereof
CN114569350A (en) * 2022-04-13 2022-06-03 北京信息科技大学 Head-mounted type eye-controlled intelligent wheelchair and control method thereof


Also Published As

Publication number Publication date
CN115590695B (en) 2024-06-14

Similar Documents

Publication Publication Date Title
CN104083258B (en) A kind of method for controlling intelligent wheelchair based on brain-computer interface and automatic Pilot technology
Rechy-Ramirez et al. Head movements based control of an intelligent wheelchair in an indoor environment
Nam et al. GOM-Face: GKP, EOG, and EMG-based multimodal interface with application to humanoid robot control
CN102309366B (en) Control system and control method for controlling upper prosthesis to move by using eye movement signals
CN110850987A (en) Specific identification control method and device based on two-dimensional intention expressed by human body
Lu A motion control method of intelligent wheelchair based on hand gesture recognition
CN109933205A (en) A kind of vehicle-mounted expression in the eyes interactive device
CN111487988B (en) Brain-controlled unmanned aerial vehicle method based on steady-state visual evoked potential brain-computer interface
Wang et al. Human-centered, ergonomic wearable device with computer vision augmented intelligence for VR multimodal human-smart home object interaction
CN115890655B (en) Mechanical arm control method, device and medium based on head gesture and electrooculogram
CN110658742A (en) Multi-mode cooperative control wheelchair control system and method
Mahmud et al. A multi-modal human machine interface for controlling a smart wheelchair
CN110353899B (en) Intelligent wheelchair
Wei et al. Evaluating the performance of a face movement based wheelchair control interface in an indoor environment
Chatzidimitriadis et al. Non-intrusive head movement control for powered wheelchairs: A vision-based approach
Wang et al. A Human‐Machine Interface Based on an EOG and a Gyroscope for Humanoid Robot Control and Its Application to Home Services
CN115590695B (en) Wheelchair control system based on electrooculogram and face recognition
CN115741670B (en) Wheelchair mechanical arm system based on multi-mode signal and machine vision fusion control
CN105955486A (en) Method for assisting teleoperation based on visual stimulation of brainwaves
Chae et al. Noninvasive brain-computer interface-based control of humanoid navigation
Gupta et al. A portable & cost effective human computer interface device for disabled
Wang et al. EXG wearable human-machine interface for natural multimodal interaction in VR environment
US20220035453A1 (en) Apparatus and method for user interfacing in display glasses
CN115804695A (en) Multi-modal brain-computer interface wheelchair control system integrating double attitude sensors
CN113599096B (en) Automatic wheelchair system of eye movement control based on pupil corneal reflex technique

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant