CN110840666B - Wheelchair mechanical arm integrated system based on electro-oculogram and machine vision and control method thereof - Google Patents


Info

Publication number
CN110840666B
CN110840666B
Authority
CN
China
Prior art keywords
eog
user
wheelchair
mechanical arm
eye movement
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201911133337.9A
Other languages
Chinese (zh)
Other versions
CN110840666A (en)
Inventor
李远清
黄骐云
张智军
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
South China University of Technology SCUT
Original Assignee
South China University of Technology SCUT
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by South China University of Technology SCUT filed Critical South China University of Technology SCUT
Priority to CN201911133337.9A priority Critical patent/CN110840666B/en
Publication of CN110840666A publication Critical patent/CN110840666A/en
Application granted granted Critical
Publication of CN110840666B publication Critical patent/CN110840666B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61G TRANSPORT, PERSONAL CONVEYANCES, OR ACCOMMODATION SPECIALLY ADAPTED FOR PATIENTS OR DISABLED PERSONS; OPERATING TABLES OR CHAIRS; CHAIRS FOR DENTISTRY; FUNERAL DEVICES
    • A61G5/00 Chairs or personal conveyances specially adapted for patients or disabled persons, e.g. wheelchairs
    • A61G5/04 Chairs or personal conveyances specially adapted for patients or disabled persons, e.g. wheelchairs, motor-driven
    • A61G5/10 Parts, details or accessories
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1602 Programme controls characterised by the control system, structure, architecture
    • B25J9/161 Hardware, e.g. neural networks, fuzzy logic, interfaces, processor
    • B25J9/1694 Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B25J9/1697 Vision controlled systems
    • B25J19/00 Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
    • B25J19/02 Sensing devices

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Veterinary Medicine (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Health & Medical Sciences (AREA)
  • Automation & Control Theory (AREA)
  • Artificial Intelligence (AREA)
  • Physics & Mathematics (AREA)
  • Evolutionary Computation (AREA)
  • Fuzzy Systems (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention discloses a wheelchair mechanical arm integrated system based on electro-oculography and machine vision, which comprises an EOG acquisition amplifier, a graphical user interface, a multi-eye-movement EOG signal detection and identification unit, a target object detection unit, a comprehensive identification unit and an execution unit; the invention also discloses a wheelchair mechanical arm integrated control method based on electro-oculography and machine vision. The disclosed human-computer interface output mode, based on two eye movements, shortens the average reaction time for generating one command to about 2 seconds while eliminating interference from unconscious eye movements; a user can output 14 different commands through eye movement with an average accuracy of 97.3%, whereas the highest accuracy of existing electro-oculography-based wheelchair control technology is 80%, so performance is greatly improved over existing EOG-based human-computer interfaces. The human intelligence reflected by the electro-oculogram and machine intelligence are fused with each other, which ensures control accuracy while reducing the user's operating load.

Description

Wheelchair mechanical arm integrated system based on electro-oculogram and machine vision and control method thereof
Technical Field
The invention belongs to the technical field of human-computer interaction, and particularly relates to a wheelchair mechanical arm integration system based on electro-oculography and machine vision and a wheelchair mechanical arm integration control method based on electro-oculography and machine vision.
Background
Paralyzed patients who cannot use their hands and feet (such as spinal cord injury patients) need outside help to complete daily tasks, such as moving and grasping, that require limb participation. Usually a nurse or family member attends to these patients and assists them with the movements necessary to meet their daily needs. Such care, however, costs nursing staff a great deal of time and energy, and many family members even give up work to stay at home or in the ward and look after the patient around the clock. To reduce the burden on caregivers and improve patients' ability to care for themselves, various rehabilitation devices have been tried for assisting patients with basic daily activities, such as a wheelchair to help the patient move about and a mechanical arm to help the patient perform grasping actions.
Many paralyzed patients, despite losing limb movement, retain some degree of eye-movement ability. The reason is that the nerves controlling eyeball and eyelid movement all originate in the brain: the nerves related to eye movement, such as the oculomotor nerve, the trochlear nerve and the abducens nerve, are cranial nerves that do not pass through the cervical vertebrae or the spinal cord, and they are well protected by the skull and not easily injured. At the same time, eye movements are usually easier and more convenient to perform than limb movements, and easier to detect than electroencephalogram signals, so the electro-oculography (EOG) signal is very suitable as the input signal of a human-computer interface that helps disabled persons control assistive equipment such as wheelchairs and mechanical arms.
Because the EOG signal has a high signal-to-noise ratio, stable output and small individual differences, an EOG-based human-computer interface performs stably, responds quickly, is simple to operate, and can be mastered after short training. Although recognition accuracy may differ between eye movements, overall recognition accuracy and stability are better than for EEG signals. Meanwhile, a subject with normal eye-movement ability can easily complete any common eye action (such as looking up, looking down, looking left, looking right, glancing up-left, down-left, up-right or down-right, raising the eyebrows, frowning, blinking or winking), and the movement time is short, which improves the operability and quick-response capability of the human-computer interface. In addition, ocular potentials differ little between people, and for a given subject the EOG signals generated at different times also differ little, so the characteristics of the EOG signal are relatively stable as a whole, which helps reduce training time and enhance the generalization ability of the EOG model. Since most paralyzed patients retain normal eye function, EOG-based human-computer interfaces have a huge potential application market.
At present, EOG-based human-computer interfaces generate commands mainly by recognizing various eye actions, so the number of commands can be increased to gain control freedom; however, increasing the number of eye-action types not only reduces accuracy but also makes the system more complex to operate and increases the user's workload. In addition, most wheelchair systems and mechanical arm systems for disabled people work independently; for common daily tasks, such as moving from some position in a room to a desk and then grabbing an object on the desk, the wheelchair and the mechanical arm need to be integrated into the same system and controlled in an efficient and accurate manner. Compared with controlling a single device, integrating the two devices brings new challenges:
(i) the wheelchair must be controlled accurately enough to approach and stop near the target to be grasped, so that the target lies within the grasping range of the mechanical arm; this demands far higher movement precision than conventional wheelchair motion control;
(ii) the workload of controlling the integrated system is far higher than that of controlling a single device and easily causes fatigue, so measures are needed to reduce the task load.
Disclosure of Invention
The invention aims to provide a wheelchair mechanical arm integrated system based on electro-oculography and machine vision, and a control method thereof, to solve the problems identified in the background art: EOG-based human-computer interfaces currently generate commands mainly by recognizing various eye actions, so that adding commands to gain control freedom reduces accuracy, complicates operation and increases the user's workload; moreover, most wheelchair systems and mechanical arm systems for disabled people work independently, and for common daily tasks, such as moving from some position in a room to a desk and then grabbing an object on the desk, the wheelchair and the mechanical arm need to be integrated into the same system and controlled in an efficient and accurate manner, which increases the operational load.
In order to achieve this purpose, the invention provides the following technical scheme: a wheelchair mechanical arm integrated system based on electro-oculography and machine vision, comprising an EOG acquisition amplifier, a graphical user interface, a multi-eye-movement EOG signal detection and identification unit, a target object detection unit, a comprehensive identification unit and an execution unit, wherein:
the EOG acquisition amplifier is used for acquiring and preprocessing eye movement EOG signals around the eyes of a user;
a graphical user interface for giving the user visual cues that indicate when to perform the eye movements corresponding to the control commands;
the multi-eye movement EOG signal detection and identification unit is used for extracting eye movement waveform characteristic parameters from an EOG signal section and identifying different eye movement modes;
a target object detection unit for identifying and locating the coordinates of potential target objects in the environment;
the comprehensive identification unit is used for sending an action instruction to a controller of the electric wheelchair according to an identification result;
and the execution unit, which comprises a wheelchair and a mechanical arm, controls forward, backward, steering and other wheelchair motions according to the command identified from the EOG, and controls the mechanical arm to complete grasping actions according to the coordinates of the target object and the EOG commands.
As a preferred technical solution of the present invention, the acquisition frequency of the EOG acquisition amplifier is 250 Hz and the total gain is 1000; 3 wet electrodes serve as the EOG signal acquisition channel, the reference channel and the common-mode denoising channel, and are attached behind the brow tip and on the left and right earlobes respectively.
As a preferred technical solution of the invention, the graphical user interface is displayed on the screen of a notebook computer mounted at the front of the wheelchair seat and comprises two levels of interface: (i) a wheelchair control interface; (ii) a mechanical arm control interface.
The invention also comprises a multi-eye-movement EOG detection and identification algorithm, which is divided into blink detection, pre-target selection and eyebrow-raise verification.
As a preferred technical solution of the present invention, the mechanical arm is controlled using a cooperative control strategy, and the equipment used comprises the EOG human-computer interface, a first somatosensory camera and a second somatosensory camera; the first camera is used for identifying objects in front of the wheelchair and obtaining their three-dimensional coordinates in camera space, and the second camera faces the face of the user on the wheelchair and is used for recognizing the user's face and locating the coordinates of the user's mouth in the camera's three-dimensional space.
As a preferred technical solution of the present invention, the execution unit includes an electric wheelchair and a mechanical arm, wherein the electric wheelchair is used for realizing the movement of a user; the robotic arm may autonomously plan a path of movement according to the provided target coordinates.
As a preferred technical solution of the present invention, the present invention further includes a matching unit, and the matching unit is configured to match the eye movement waveform features extracted from the EOG signal segment.
The invention also discloses a wheelchair mechanical arm integrated control method based on electro-oculography and machine vision, which comprises the following steps:
the method comprises the following steps: the user blinks according to the blinking prompt of the buttons on the screen, the system identifies the blinking EOG signal after detecting the blinking EOG signal, and the blinking EOG signal is distributed to a certain button to serve as a preselected target;
step two: the user judges whether the preselected target is correct and confirms it with an eyebrow-raise action; once the verification passes, the button is formally selected and triggers the corresponding control command, and in this way the user can control the forward movement, backward movement, acceleration and deceleration, left and right turning, and stopping of the electric wheelchair; controlling the mechanical arm requires the camera to return the detected three-dimensional spatial coordinates of the surrounding objects, after which the user selects a target object through eye movement and the mechanical arm plans a grasping path according to the target coordinates; while the arm is moving, the user can issue pause and reset commands through eye movement.
Compared with the prior art, the invention has the beneficial effects that:
(1) the human-computer interface command output mode based on two eye movements disclosed by the invention shortens the average response time for generating a command to about 2 seconds while ensuring that interference from unconscious eye movements is eliminated (the false-alarm rate is 0); a user can output 14 different commands through eye movement with an average accuracy of 97.3%, whereas the highest accuracy of existing electro-oculography-based wheelchair control technology is 80%, so performance is greatly improved over existing EOG-based human-computer interfaces;
(2) the wheelchair mechanical arm integrated control mode disclosed by the invention combines electro-oculography with machine vision and fuses the human intelligence reflected in the electro-oculogram with machine intelligence, which ensures control accuracy, reduces the user's operating load, and resolves the conflict between control diversity and operating complexity;
(3) the wheelchair mechanical arm integrated system based on electro-oculography and machine vision is simple to operate and practical: the system's electro-oculography amplifier has low hardware cost and uses only three common silver chloride electrodes, and to select a button the user only needs to blink at the flash of the target button and raise the eyebrows, which is simple and clear, can be mastered after short training, requires no body movement, and saves time and effort;
(4) the designed matching unit facilitates matching of the eye-movement waveform features extracted from the EOG signal segment, improving recognition efficiency.
Drawings
FIG. 1 is an overall workflow diagram of the wheelchair mechanical arm integrated system based on electro-oculography and machine vision of the present invention;
FIG. 2 is a schematic diagram of the effective range within which a target object can be grasped according to the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Referring to fig. 1 and fig. 2, the present invention provides a technical solution: a wheelchair mechanical arm integrated system based on electro-oculography and machine vision, comprising an EOG acquisition amplifier, a graphical user interface, a multi-eye-movement EOG signal detection and identification unit, a target object detection unit, a comprehensive identification unit and an execution unit, wherein:
the EOG acquisition amplifier is used for acquiring and preprocessing the eye-movement EOG signals around the user's eyes. Further: the acquisition frequency is 250 Hz, the total gain is 1000, and 3 wet electrodes serve as the EOG signal acquisition channel, the reference channel and the common-mode denoising channel, attached behind the brow tip and on the left and right earlobes respectively;
the graphical user interface is used for giving the user visual cues that indicate when to perform the eye movements corresponding to the control commands. Further: the interface is displayed on the screen of a notebook computer mounted at the front of the wheelchair seat and comprises two levels of interface: (i) a wheelchair control interface; (ii) a mechanical arm control interface. Each control interface contains several virtual command buttons, each corresponding to a specific control command. When the system starts, the wheelchair control interface is shown first; only the switch button 'On/Off' flashes, at a frequency of 1 Hz, while the other buttons remain static and inactive, so the whole wheelchair control interface is in a standby state. When the user selects the switch button through eye movement, the wheelchair control interface is formally activated and its 14 buttons flash cyclically in a fixed order; each flash lasts 50 milliseconds and the flash onsets of adjacent buttons are 80 milliseconds apart, so a complete flash cycle (in which every button flashes exactly once) is 1.12 seconds. The user can also select the 'Switch' button to switch to the mechanical arm control interface, whose 9 buttons likewise flash in order; each flash lasts 50 milliseconds, the flash interval is 120 milliseconds, and the flash cycle is 1.08 seconds;
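For illustration only, the timing scheme above can be reproduced with a short scheduler; the sketch below is a minimal stand-in (the button labels and the flash callback are hypothetical, not part of the patented interface) showing how a 50 ms flash with an 80 ms onset-to-onset spacing yields the stated 1.12-second cycle for 14 buttons:

```python
import time

def run_flicker_cycle(buttons, flash_ms=50, onset_interval_ms=80, flash=print):
    """Flash each button once, in a fixed order, with a constant onset-to-onset
    interval: 14 buttons x 80 ms = 1.12 s (wheelchair interface);
    9 buttons x 120 ms = 1.08 s (mechanical arm interface)."""
    for name in buttons:
        t0 = time.monotonic()
        flash(f"{name}: ON")                      # highlight the button
        time.sleep(flash_ms / 1000.0)             # 50 ms flash
        flash(f"{name}: OFF")                     # restore normal appearance
        remaining = onset_interval_ms / 1000.0 - (time.monotonic() - t0)
        time.sleep(max(0.0, remaining))           # wait until the next onset

# one full wheelchair-interface cycle with hypothetical button labels
run_flicker_cycle([f"button {i}" for i in range(1, 15)])
```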
the multi-eye-movement EOG signal detection and identification unit is used for extracting eye-movement waveform characteristic parameters from an EOG signal segment and identifying different eye-movement patterns. Further: the invention proposes a multi-eye-movement EOG detection and identification algorithm based on ordered visual cues, comprising three main steps, blink detection, pre-target selection and eyebrow-raise verification:
(1) blink detection: to select a target button, the user needs to blink immediately after that button flashes. After each button flash, the system takes the flash time point as the starting point and intercepts an EOG signal segment 600 milliseconds long, containing 150 sampling points (the sampling frequency is 250 Hz). The whole segment is first band-pass filtered (1-10 Hz) to remove interference from high-frequency EMG and EEG and from low-frequency head motion, and is then differentiated to obtain a differential signal segment. The waveform characteristic parameters to be extracted are: the maximum value a_max of the filtered signal segment; the maximum value d_max and the minimum value d_min of the differential signal segment; and the time points t_p and t_v of the maximum and minimum of the differential signal segment. The waveform span parameter is obtained by the following formula:

t_d = |t_p - t_v|    (1)
a successful blink detection requires that all of the following conditions be met:
a_max ≥ a_th1    (2)
d_min ≤ d_th1, d_max ≥ d_th2    (3)
t_th1 ≤ t_d ≤ t_th2    (4)
t_p ≤ t_v    (5)
wherein a_th1 is the blink amplitude threshold, d_th1 and d_th2 are the lower and upper thresholds of the blink differential signal respectively, and [t_th1, t_th2] is the allowed range of the blink waveform span parameter;
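For illustration, the blink detection step can be sketched as follows; this is a minimal example under stated assumptions (all threshold constants and the example segment are hypothetical placeholders, obtained in the patent from a calibration procedure), not the patented implementation itself:

```python
import numpy as np
from scipy.signal import butter, filtfilt

FS = 250                                  # sampling frequency (Hz)
# hypothetical thresholds; the patent derives them from user calibration
A_TH1 = 80.0                              # blink amplitude threshold
D_TH1, D_TH2 = -8.0, 8.0                  # differential lower/upper thresholds
T_TH1, T_TH2 = 0.04, 0.30                 # span limits (seconds)

def blink_features(segment):
    """Extract a_max, d_max, d_min, t_p, t_v and t_d from one 600 ms segment."""
    b, a = butter(4, [1.0, 10.0], btype="band", fs=FS)
    filtered = filtfilt(b, a, segment)    # 1-10 Hz band-pass filtering
    diff = np.diff(filtered)              # differential signal segment
    t_p = np.argmax(diff) / FS            # time of differential maximum
    t_v = np.argmin(diff) / FS            # time of differential minimum
    return filtered.max(), diff.max(), diff.min(), t_p, t_v, abs(t_p - t_v)

def is_blink(segment):
    """Apply conditions (2)-(5) to one intercepted EOG segment."""
    a_max, d_max, d_min, t_p, t_v, t_d = blink_features(segment)
    return (a_max >= A_TH1                          # condition (2)
            and d_min <= D_TH1 and d_max >= D_TH2   # condition (3)
            and T_TH1 <= t_d <= T_TH2               # condition (4)
            and t_p <= t_v)                         # condition (5)

print(is_blink(np.random.randn(150)))     # stand-in for 150 real EOG samples
```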
(2) pre-target selection: since involuntary blinks occur in what is nearly a random process, the algorithm further tests whether a deliberate blink waveform lies within a relatively fixed time period after the button flash, by the following formula:

|t_p - T_p| ≤ e    (6)

wherein t_p is the peak time point of the differential signal, T_p is the expected value of t_p obtained from the calibration procedure, and e defines the maximum deviation allowed between the actual peak time point and its expected value. As long as the detected blink waveform meets this timing requirement, the system selects the button corresponding to the EOG signal segment as the pre-target of the current round;
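As a sketch, the pre-target timing test of formula (6) is a single comparison (the tolerance value used here is a hypothetical placeholder; T_p and e come from calibration):

```python
def passes_pretarget_test(t_p, T_p, e=0.1):
    """Formula (6): |t_p - T_p| <= e.  T_p is the expected differential-peak
    time learned in calibration; e (0.1 s here, hypothetical) is the maximum
    allowed deviation.  Returning True makes the flashed button the
    pre-target of the current round."""
    return abs(t_p - T_p) <= e
```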
(3) eyebrow-raise verification: the above two steps alone are still not enough to exclude the influence of unintentional blinks; if an unintentional blink satisfies the threshold conditions of blink detection and happens to fall within the pre-target selection window of some button, a wrong output results. A verification step is therefore added to the algorithm. Specifically, after a button is selected as the pre-target, the system opens a 600-millisecond verification time window; if the pre-target is the user's real target, the user must raise the eyebrows as soon as possible to confirm it, and if not, the user simply does nothing. Only if an eyebrow-raise occurs within the 600 ms verification window does the algorithm extract, from the EOG signal recorded in that window, waveform characteristic parameters analogous to those of blink detection and use them for eyebrow-raise detection. A successful eyebrow-raise detection must satisfy all of the following conditions:
a_th2 ≤ a_max ≤ a_th3    (7)
d_th3 ≤ d_max ≤ d_th4    (8)
t_th3 ≤ t_d ≤ t_th4    (9)
t_p ≥ t_v    (10)
wherein a_th2 and a_th3 are the lower and upper limits of the amplitude of the filtered eyebrow-raise waveform respectively, d_th3 and d_th4 are the lower and upper limits of the eyebrow-raise differential waveform, and [t_th3, t_th4] is the allowed range of the eyebrow-raise waveform span. Besides using different threshold parameters from blink detection, the last condition constrains the relative order of the peak and valley of the differential signal to be the opposite of that of the blink differential signal, forming a pair of mutually exclusive conditions that guarantees an absolute distinction between the two;
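A companion sketch for the eyebrow-raise test (thresholds again hypothetical placeholders); note how condition (10), t_p ≥ t_v, mirrors the blink condition (5), t_p ≤ t_v, which is what makes the two detections mutually exclusive:

```python
# hypothetical eyebrow-raise thresholds, likewise obtained from calibration
A_TH2, A_TH3 = 40.0, 120.0   # filtered-waveform amplitude lower/upper limits
D_TH3, D_TH4 = 5.0, 20.0     # differential-maximum lower/upper limits
T_TH3, T_TH4 = 0.05, 0.40    # waveform-span limits (seconds)

def is_eyebrow_raise(a_max, d_max, t_p, t_v, t_d):
    """Apply conditions (7)-(10) to features extracted from the 600 ms
    verification window (same extraction as for blink detection)."""
    return (A_TH2 <= a_max <= A_TH3      # condition (7)
            and D_TH3 <= d_max <= D_TH4  # condition (8)
            and T_TH3 <= t_d <= T_TH4    # condition (9)
            and t_p >= t_v)              # condition (10): valley before peak
```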
the target object detection unit is used for identifying and locating the coordinates of potential target objects in the environment using a depth camera and machine vision technology. Further: a cooperative control strategy is adopted to control the mechanical arm, and the equipment used comprises the EOG-based human-computer interface, a first somatosensory camera, a second somatosensory camera and the intelligent control system built into the mechanical arm. The first camera is used for identifying objects in front of the wheelchair and obtaining their three-dimensional coordinates in camera space; the second camera faces the face of the user on the wheelchair and is used for recognizing the user's face and locating the coordinates of the user's mouth in the camera's three-dimensional space. As described above, the user needs to stop the wheelchair near the target and then adjust the angle and distance so that the target object lies within the range the mechanical arm can grasp. To determine the size of this range, an effective grasping range is defined in three-dimensional space with the first camera as the origin. As shown in fig. 2, the effective grasping range is a cuboid 0.8 m long, 0.4 m wide and 0.6 m high whose distance from the origin of the first camera is 0.4 m, with its length, width and height edges parallel to the X, Y and Z axes of the mechanical arm space respectively. This spatial range is determined by the length of the mechanical arm and the depth sensing of the first camera (the arm can hardly reach a target that is too far away, and the somatosensory camera cannot judge depth information accurately when the target is too close);
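Testing whether a detected object lies inside this cuboid reduces to three interval checks; the sketch below assumes, purely for illustration, that X is lateral, Y vertical and Z the depth axis, with the cuboid centred on the optical axis (the patent does not specify this axis mapping):

```python
def in_grasp_range(x, y, z):
    """Point-in-cuboid test in the first camera's frame (metres).
    Assumed axis mapping (illustrative only): x lateral (0.8 m edge,
    centred), y vertical (0.6 m edge, centred), z depth (0.4 m edge,
    starting 0.4 m in front of the camera origin)."""
    return -0.4 <= x <= 0.4 and -0.3 <= y <= 0.3 and 0.4 <= z <= 0.8

# example: an object 0.6 m ahead, slightly right of and below the origin
print(in_grasp_range(0.15, -0.10, 0.60))   # True
```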
after the first camera returns a two-dimensional image of the objects in front together with depth information, the system can calculate the three-dimensional coordinates of each pixel in camera space. The image is then segmented with a region-growing algorithm: for each pixel, 3 pixels are chosen from its neighbours; these 3 pixels determine a plane in space, and the normal vector perpendicular to that plane is used as the feature vector of the pixel. A standard normal vector is also defined, perpendicular to the vertical plane of the camera. The feature vector of each pixel is then compared with the standard normal vector, and pixels whose feature vectors are almost parallel to it are selected as 'growth seeds'. The system randomly picks one of the 'growth seeds' as a starting point and grows outward to the surrounding pixels according to a specified growth rule, stopping where the three-dimensional coordinates jump significantly (which probably marks the edge of an object). These steps are repeated until all 'growth seeds' are used up, yielding several regional pixel sets, each representing a possible object. Taking grasping a cup of water as an example, the algorithm further analyses the number of pixels in each regional pixel set and the variation of the pixel feature vectors, and screens out the regions in which the pixel count reaches a certain scale and the direction of the pixel feature vectors varies beyond a certain degree (indicating an object of a certain size and curvature, similar to a cup). These regions are marked as identified objects and fed back to the screen, and the three-dimensional coordinates of their centre points are also transmitted to the control centre;
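A condensed sketch of such a region-growing segmentation is given below; it substitutes a cross product of neighbour differences for the 3-neighbour plane fit and assumes the Z coordinate is depth and the reference normal points along the viewing axis (all simplifying assumptions, not the patented algorithm itself):

```python
import numpy as np
from collections import deque

def normals_from_points(points):
    """Per-pixel unit normal from an HxWx3 array of camera-space coordinates,
    via the cross product of horizontal and vertical neighbour differences
    (a cheap stand-in for fitting a plane through 3 neighbouring pixels)."""
    dx = np.roll(points, -1, axis=1) - points
    dy = np.roll(points, -1, axis=0) - points
    n = np.cross(dx, dy)
    return n / (np.linalg.norm(n, axis=2, keepdims=True) + 1e-9)

def region_grow(points, angle_deg=10.0, depth_jump=0.02):
    """Segment the point image into candidate-object pixel sets.  Pixels whose
    normal is nearly parallel to the reference normal become 'growth seeds';
    each unvisited seed is expanded breadth-first over 4-neighbours, and the
    growth stops where the depth coordinate jumps by more than depth_jump
    metres, i.e. at a likely object edge."""
    h, w, _ = points.shape
    normals = normals_from_points(points)
    reference = np.array([0.0, 0.0, 1.0])         # assumed 'standard normal'
    seeds = np.abs(normals @ reference) >= np.cos(np.radians(angle_deg))
    visited = np.zeros((h, w), dtype=bool)
    regions = []
    for sy, sx in zip(*np.nonzero(seeds)):
        if visited[sy, sx]:
            continue
        visited[sy, sx] = True
        region, queue = [], deque([(sy, sx)])
        while queue:
            y, x = queue.popleft()
            region.append((y, x))
            for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                if (0 <= ny < h and 0 <= nx < w and not visited[ny, nx]
                        and abs(points[ny, nx, 2] - points[y, x, 2]) < depth_jump):
                    visited[ny, nx] = True
                    queue.append((ny, nx))
        regions.append(region)                    # one candidate object
    return regions
```

The resulting pixel sets would then be filtered by pixel count and normal-direction spread, as described above, to keep cup-like candidates.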
for locating the user's face and lips, the face recognition function provided by the Kinect development library is called, which accurately returns the three-dimensional coordinates of the user's lips in the space of the second camera. It should be pointed out that both the coordinates of recognized objects and the coordinates of the user's lips are finally converted into the mechanical arm space through a spatial coordinate transformation matrix;
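The final coordinate hand-off is an ordinary rigid-body transform; in the sketch below the 4x4 matrix is a hypothetical placeholder for the camera-to-arm extrinsic calibration, which the text does not detail:

```python
import numpy as np

# Hypothetical 4x4 homogeneous camera-to-arm transform; in practice it is
# obtained once by extrinsic calibration between each camera and the arm base.
T_CAM_TO_ARM = np.array([
    [0.0,  0.0, 1.0, 0.30],
    [-1.0, 0.0, 0.0, 0.05],
    [0.0, -1.0, 0.0, 0.40],
    [0.0,  0.0, 0.0, 1.00],
])

def camera_to_arm(p_cam):
    """Map a 3-D point from camera space into mechanical-arm space."""
    p = np.append(np.asarray(p_cam, dtype=float), 1.0)   # homogeneous point
    return (T_CAM_TO_ARM @ p)[:3]

# example: hypothetical lip coordinates returned by the second camera
mouth_in_arm_space = camera_to_arm([0.02, -0.10, 0.55])
```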
the process of completing a successful 'moving grab task' is as follows: first the user observes the approximate direction of the target object, then uses the EOG human-machine interface to steer the wheelchair around any obstacles towards the target and adjusts the angle and distance so that the target lies within the effective grasping range of the mechanical arm. The first camera identifies and marks the potential objects in front of the wheelchair, feeds them back to the screen and assigns them to 3 virtual object buttons; the user selects and confirms one target object through eye movement. The second camera then returns the coordinates of the user's mouth, and the mechanical arm console automatically plans, under certain constraints, a path for grasping the object and delivering it to the user's mouth, and controls the arm to complete the grasping task. Throughout this process the user can supervise the grasping through the EOG human-machine interface, for example pausing, restarting, homing, putting the object back, or rotating the hand by 15 degrees. For convenient drinking, a straw is placed in the bottle; when the mechanical arm reaches the vicinity of the user's mouth, its fingers rotate and tilt by a certain angle to ensure the straw faces the user and help the user finish drinking smoothly;
the comprehensive identification unit is used for sending an action instruction to a controller of the electric wheelchair according to an identification result;
the execution unit comprises a wheelchair and a mechanical arm; it controls forward, backward, steering and other wheelchair motions according to the command identified from the EOG, and controls the mechanical arm to complete the grasping action according to the coordinates of the target object and the EOG commands. Further: an electric wheelchair is adopted, action commands are sent to its controller according to the result of the comprehensive identification unit, and the mechanical arm is a KINOVA 6-axis arm that can autonomously plan a moving path according to the provided target coordinates;
the execution unit comprises an electric wheelchair and a mechanical arm, wherein the electric wheelchair is used for realizing the movement of a user; the robotic arm may autonomously plan a path of movement according to the provided target coordinates.
In this embodiment, preferably, the system further includes a matching unit, and the matching unit is configured to match the eye movement waveform features extracted from the EOG signal segment.
Fig. 2 shows the effective range, defined in this embodiment, within which a target object can be grasped. With the camera as the origin, an effective grasping range is defined in three-dimensional space: as shown in fig. 2, it is a cuboid 0.8 m long, 0.4 m wide and 0.6 m high, 0.4 m from the camera origin, with its length, width and height edges parallel to the X, Y and Z axes of the mechanical arm space respectively. This spatial range is determined by the length of the mechanical arm and the depth sensing of the camera (the arm can hardly reach a target that is too far away, and the camera cannot judge depth information accurately when the target is too close).
A wheelchair mechanical arm integrated control method based on electro-oculography and machine vision comprises the following steps:
step one: the user blinks in response to the flashing prompt of the buttons on the screen; after the system detects and identifies the blink EOG signal, the blink is assigned to a certain button, which becomes the preselected target;
step two: the user judges whether the preselected target is correct and confirms it with an eyebrow-raise action; once the verification passes, the button is formally selected and triggers the corresponding control command, and in this way the user can control the electric wheelchair to move forward, move backward, accelerate and decelerate, turn left and right, and stop; controlling the mechanical arm requires the camera to return the detected three-dimensional spatial coordinates of the surrounding objects, after which the user selects a target object through eye movement and the mechanical arm plans a grasping path according to the target coordinates; while the arm is moving, the user can issue pause and reset commands through eye movement.
Although embodiments of the present invention have been shown and described, it will be appreciated by those skilled in the art that changes, modifications, substitutions and alterations can be made in these embodiments without departing from the principles and spirit of the invention, the scope of which is defined in the appended claims and their equivalents.

Claims (1)

1. The control method of the wheelchair mechanical arm integrated system based on the electro-oculogram and the machine vision is characterized in that the integrated system comprises an EOG acquisition amplifier, a graphical user interface, a multi-eye EOG signal detection and identification unit, a multi-eye EOG detection and identification algorithm, a target object detection unit, a comprehensive identification unit, a matching unit and an execution unit, wherein:
the EOG acquisition amplifier is used for acquiring and preprocessing an eye movement EOG signal around the eyes of a user; the acquisition frequency of the EOG acquisition amplifier is 250 Hz, the total gain is 1000, and 3 wet electrodes are used as an EOG signal acquisition channel, a reference channel and a common-mode signal denoising channel and are respectively attached to the back of the brow tip and left and right earlobes;
a graphical user interface for giving the user visual cues that indicate when to perform the eye movements corresponding to the control commands; the graphical user interface is displayed on the screen of a notebook computer mounted at the front of the wheelchair seat and comprises two levels of interface: (i) a wheelchair control interface; (ii) a mechanical arm control interface; the mechanical arm is controlled using a cooperative control strategy, and the equipment used comprises: the EOG human-computer interface, a first somatosensory camera and a second somatosensory camera, wherein the first somatosensory camera is used for identifying objects in front of the wheelchair and obtaining their three-dimensional coordinates in camera space; the second somatosensory camera faces the face of the user on the wheelchair and is used for identifying the user's face and locating the coordinates of the user's mouth in the three-dimensional space of the camera;
the multi-eye movement EOG signal detection and identification unit is used for extracting eye movement waveform characteristic parameters from an EOG signal section and identifying different eye movement modes;
the matching unit is used for matching the eye movement waveform characteristics extracted from the EOG signal section;
the multi-eye movement EOG detection and identification algorithm comprises blink detection, pre-target selection and eyebrow lifting verification;
a target object detection unit for identifying and locating the coordinates of potential target objects in the environment;
the comprehensive identification unit is used for sending an action instruction to a controller of the electric wheelchair according to an identification result;
the execution unit comprises a wheelchair and a mechanical arm; it controls the forward, backward and steering actions of the wheelchair according to the command identified from the EOG and controls the mechanical arm to complete the grasping action according to the coordinates of the target object and the EOG commands; the wheelchair is used for realizing the movement of the user, and the mechanical arm can autonomously plan a moving path according to the provided target coordinates;
the control method comprises the following steps:
step one: the user blinks in response to the flashing prompt of the buttons on the screen; after the system detects and identifies the blink EOG signal, the blink is assigned to a certain button, which becomes the preselected target;
step two: the user judges whether the preselected target is correct and confirms it with an eyebrow-raise action; once the verification passes, the button is formally selected and triggers the corresponding control command, and in this way the user can control the forward movement, backward movement, acceleration and deceleration, left and right turning, and stopping of the electric wheelchair; controlling the mechanical arm requires the camera to return the detected three-dimensional spatial coordinates of the surrounding objects, after which the user selects a target object through eye movement and the mechanical arm plans a grasping path according to the target coordinates; while the arm is moving, the user can issue pause and reset commands through eye movement.
CN201911133337.9A 2019-11-19 2019-11-19 Wheelchair mechanical arm integrated system based on electro-oculogram and machine vision and control method thereof Active CN110840666B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911133337.9A CN110840666B (en) 2019-11-19 2019-11-19 Wheelchair mechanical arm integrated system based on electro-oculogram and machine vision and control method thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911133337.9A CN110840666B (en) 2019-11-19 2019-11-19 Wheelchair mechanical arm integrated system based on electro-oculogram and machine vision and control method thereof

Publications (2)

Publication Number Publication Date
CN110840666A CN110840666A (en) 2020-02-28
CN110840666B (en) 2022-09-09

Family

ID=69602330

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911133337.9A Active CN110840666B (en) 2019-11-19 2019-11-19 Wheelchair mechanical arm integrated system based on electro-oculogram and machine vision and control method thereof

Country Status (1)

Country Link
CN (1) CN110840666B (en)

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106775023B (en) * 2017-01-09 2023-07-18 成都信息工程大学 Eye electric signal acquisition method and Bluetooth mouse system based on eye electric signal control
CN111283689A (en) * 2020-03-26 2020-06-16 长春大学 Device for assisting movement of limb dysfunction patient and control method
CN113778070B (en) * 2020-07-17 2024-04-19 北京京东振世信息技术有限公司 Robot control method and device
CN112089541A (en) * 2020-09-21 2020-12-18 深兰科技(上海)有限公司 Intelligent wheelchair control system and method
CN112381173B (en) * 2020-11-30 2022-06-14 华南理工大学 Image recognition-based mechanical arm multitask autonomous learning control method and system
CN112959342B (en) * 2021-03-08 2022-03-15 东南大学 Remote operation method for grabbing operation of aircraft mechanical arm based on operator intention identification
CN114452096A (en) * 2022-04-14 2022-05-10 季华实验室 Wheelchair control method and device
CN114947485A (en) * 2022-05-10 2022-08-30 深兰科技(上海)有限公司 Sound control visual drinking cup and intelligent drinking method based on sound control visual drinking cup
CN115590695A (en) * 2022-10-08 2023-01-13 华南脑控(广东)智能科技有限公司(Cn) Wheelchair control system based on electro-oculogram and face recognition

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101441513A (en) * 2008-11-26 2009-05-27 北京科技大学 System for performing non-contact type human-machine interaction by vision
KR20100127121A (en) * 2009-05-25 2010-12-03 한양대학교 산학협력단 Electric wheel chair using eog signal
CN105710885A (en) * 2016-04-06 2016-06-29 济南大学 Service-oriented movable manipulator system
CN107007407A (en) * 2017-04-12 2017-08-04 华南理工大学 Wheelchair control system based on eye electricity
CN109366508A (en) * 2018-09-25 2019-02-22 中国医学科学院生物医学工程研究所 A kind of advanced machine arm control system and its implementation based on BCI
CN209207525U (en) * 2018-12-05 2019-08-06 济南大学 Wheelchair arm-and-hand system based on multimodal information fusion

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR2898824B1 (en) * 2006-03-27 2009-02-13 Commissariat Energie Atomique INTELLIGENT INTERFACE DEVICE FOR ENTRYING AN OBJECT BY A MANIPULATOR ROBOT AND METHOD OF IMPLEMENTING SAID DEVICE
US10179407B2 (en) * 2014-11-16 2019-01-15 Robologics Ltd. Dynamic multi-sensor and multi-robot interface system

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101441513A (en) * 2008-11-26 2009-05-27 北京科技大学 System for performing non-contact type human-machine interaction by vision
KR20100127121A (en) * 2009-05-25 2010-12-03 한양대학교 산학협력단 Electric wheel chair using eog signal
CN105710885A (en) * 2016-04-06 2016-06-29 济南大学 Service-oriented movable manipulator system
CN107007407A (en) * 2017-04-12 2017-08-04 华南理工大学 Wheelchair control system based on eye electricity
CN109366508A (en) * 2018-09-25 2019-02-22 中国医学科学院生物医学工程研究所 A kind of advanced machine arm control system and its implementation based on BCI
CN209207525U (en) * 2018-12-05 2019-08-06 济南大学 Wheelchair arm-and-hand system based on multimodal information fusion

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
"An EOG-Based Human–Machine Interface"; Qiyun Huang et al.; IEEE Transactions on Biomedical Engineering; 2018-09-30; pp. 2023-2032 *

Also Published As

Publication number Publication date
CN110840666A (en) 2020-02-28

Similar Documents

Publication Publication Date Title
CN110840666B (en) Wheelchair mechanical arm integrated system based on electro-oculogram and machine vision and control method thereof
Barea et al. System for assisted mobility using eye movements based on electrooculography
Lee et al. A brain-controlled exoskeleton with cascaded event-related desynchronization classifiers
Singh et al. Human eye tracking and related issues: A review
Barea et al. Wheelchair guidance strategies using EOG
Duan et al. Design of a multimodal EEG-based hybrid BCI system with visual servo module
Zhang et al. An EOG-based human–machine interface to control a smart home environment for patients with severe spinal cord injuries
US20190387995A1 (en) Brain-Computer Interface Based Robotic Arm Self-Assisting System and Method
CN109366508A (en) A kind of advanced machine arm control system and its implementation based on BCI
CN108646915B (en) Method and system for controlling mechanical arm to grab object by combining three-dimensional sight tracking and brain-computer interface
Huang et al. An EOG-based wheelchair robotic arm system for assisting patients with severe spinal cord injuries
CN110727353A (en) Control component control method and device based on two-dimensional intention definition
Molina-Cantero et al. Communication technologies based on voluntary blinks: Assessment and design
Navarro et al. EOG-based wheelchair control
Taher et al. EEG control of an electric wheelchair for disabled persons
Jose Development of eog based human machine interface control system for motorized wheelchair
Ianez et al. Multimodal human-machine interface based on a brain-computer interface and an electrooculography interface
Vojtech et al. Integrated head-tilt and electromyographic cursor control
Law et al. A cap as interface for wheelchair control
Lenhardt et al. An augmented-reality based brain-computer interface for robot control
Pingali et al. Eye-gesture controlled intelligent wheelchair using Electro-Oculography
Tharwat et al. Eye-controlled wheelchair
Novak Biomechatronic applications of brain-computer interfaces
Chang et al. Bio-inspired gaze-driven robotic neck brace
Bos et al. Looking around with your brain in a virtual world

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant