CN113274206A - Electric wheelchair implementation method based on eye movement and deep learning - Google Patents

Electric wheelchair implementation method based on eye movement and deep learning

Info

Publication number
CN113274206A
Authority
CN
China
Prior art keywords
module
eye movement
electric wheelchair
deep learning
wheelchair
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110543762.6A
Other languages
Chinese (zh)
Inventor
徐军
刘刚
欧阳文佳
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Harbin University of Science and Technology
Original Assignee
Harbin University of Science and Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Harbin University of Science and Technology
Priority to CN202110543762.6A
Publication of CN113274206A
Legal status: Pending (current)

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61G TRANSPORT, PERSONAL CONVEYANCES, OR ACCOMMODATION SPECIALLY ADAPTED FOR PATIENTS OR DISABLED PERSONS; OPERATING TABLES OR CHAIRS; CHAIRS FOR DENTISTRY; FUNERAL DEVICES
    • A61G5/00 Chairs or personal conveyances specially adapted for patients or disabled persons, e.g. wheelchairs
    • A61G5/04 Chairs or personal conveyances specially adapted for patients or disabled persons, e.g. wheelchairs motor-driven
    • A61G5/041 Chairs or personal conveyances specially adapted for patients or disabled persons, e.g. wheelchairs motor-driven having a specific drive-type
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61G TRANSPORT, PERSONAL CONVEYANCES, OR ACCOMMODATION SPECIALLY ADAPTED FOR PATIENTS OR DISABLED PERSONS; OPERATING TABLES OR CHAIRS; CHAIRS FOR DENTISTRY; FUNERAL DEVICES
    • A61G5/00 Chairs or personal conveyances specially adapted for patients or disabled persons, e.g. wheelchairs
    • A61G5/10 Parts, details or accessories
    • A61G5/1051 Arrangements for steering
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61G TRANSPORT, PERSONAL CONVEYANCES, OR ACCOMMODATION SPECIALLY ADAPTED FOR PATIENTS OR DISABLED PERSONS; OPERATING TABLES OR CHAIRS; CHAIRS FOR DENTISTRY; FUNERAL DEVICES
    • A61G2203/00 General characteristics of devices
    • A61G2203/10 General characteristics of devices characterised by specific control means, e.g. for adjustment or steering
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61G TRANSPORT, PERSONAL CONVEYANCES, OR ACCOMMODATION SPECIALLY ADAPTED FOR PATIENTS OR DISABLED PERSONS; OPERATING TABLES OR CHAIRS; CHAIRS FOR DENTISTRY; FUNERAL DEVICES
    • A61G2203/00 General characteristics of devices
    • A61G2203/10 General characteristics of devices characterised by specific control means, e.g. for adjustment or steering
    • A61G2203/18 General characteristics of devices characterised by specific control means, e.g. for adjustment or steering by patient's head, eyes, facial muscles or voice
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61G TRANSPORT, PERSONAL CONVEYANCES, OR ACCOMMODATION SPECIALLY ADAPTED FOR PATIENTS OR DISABLED PERSONS; OPERATING TABLES OR CHAIRS; CHAIRS FOR DENTISTRY; FUNERAL DEVICES
    • A61G2203/00 General characteristics of devices
    • A61G2203/30 General characteristics of devices characterised by sensor means
    • A61G2203/42 General characteristics of devices characterised by sensor means for inclination
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61G TRANSPORT, PERSONAL CONVEYANCES, OR ACCOMMODATION SPECIALLY ADAPTED FOR PATIENTS OR DISABLED PERSONS; OPERATING TABLES OR CHAIRS; CHAIRS FOR DENTISTRY; FUNERAL DEVICES
    • A61G2203/00 General characteristics of devices
    • A61G2203/70 General characteristics of devices with special adaptations, e.g. for safety or comfort
    • A61G2203/72 General characteristics of devices with special adaptations, e.g. for safety or comfort for collision prevention

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Traffic Control Systems (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses an electric wheelchair implementation method based on eye movement and deep learning, and relates to the technical field of electric wheelchair control. The system comprises a data processing system, a safety system and an electric wheelchair control system. The data processing system comprises an image acquisition and preprocessing module and an eye movement deep learning module, and the image acquisition and preprocessing module is electrically connected with the eye movement deep learning module. The safety system comprises an ultrasonic array and a three-axis sensor. The electric wheelchair control system comprises a main control module and a steering engine module; the main control module is electrically connected with the steering engine module, and the eye movement deep learning module, the ultrasonic array and the three-axis sensor are each electrically connected with the main control module. The invention combines eye movement technology with the electric wheelchair to realize a brand-new wheelchair control mode, and uses steering engines to actuate the electric wheelchair, which makes it convenient to retrofit an existing electric wheelchair. A safety system is also designed to improve safety during operation.

Description

Electric wheelchair implementation method based on eye movement and deep learning
Technical Field
The invention belongs to the technical field of electric wheelchair control, and particularly relates to an electric wheelchair implementation method based on eye movement and deep learning.
Background
With the development of society and the advancement of technology, wheelchair technology is also continuously developing, and different kinds of wheelchairs are available on the market for different user groups. However, neither ordinary nor electric wheelchairs can be operated well by people who cannot move their hands freely, such as patients with amyotrophic lateral sclerosis (ALS), Parkinson's disease or high paraplegia. Wheelchairs for these special groups have therefore become a research hotspot.
Eye movement control technology is used for human-computer interaction: it frees people's hands for other important tasks and, at the same time, can help disabled users control equipment. Reliable eye movement control can be widely applied in medicine, transportation, military affairs and other fields. At present there are two main methods for acquiring eye movement information. The first is a contact method, in which electrodes are attached near the user's temples to collect the electrical signals generated when the eyeballs rotate left and right, and the wheelchair is controlled from these signals. In practical use, the electrodes must be attached to the user's head every time, which affects appearance and comfort, and the electrodes can fall off after long-term use. The second is a non-contact method, in which eye movement information is acquired with an eye tracker; although it is more comfortable to use, its cost is high.
Deep learning is a machine learning method for representation learning on data; its advantage is that unsupervised or semi-supervised feature learning and efficient hierarchical feature extraction replace manual feature engineering. Traditional eye-image processing usually adopts template matching, classifying an input image according to how well its features match a template, and its accuracy is low. A deep learning method can automatically extract eye-image features and achieve efficient, accurate classification.
Disclosure of Invention
The invention aims to solve the technical problem that patients with ALS, Parkinson's disease, high paraplegia and similar conditions cannot use an electric wheelchair independently with prior-art controls, and provides an electric wheelchair implementation method based on eye movement and deep learning.
The electric wheelchair implementation method based on eye movement and deep learning disclosed by the invention comprises a data processing system, a safety system and an electric wheelchair control system. The data processing system comprises an image acquisition and preprocessing module and an eye movement deep learning module, and the image acquisition and preprocessing module is electrically connected with the eye movement deep learning module. The safety system comprises an ultrasonic array and a three-axis sensor. The electric wheelchair control system comprises a main control module and a steering engine module; the main control module is electrically connected with the steering engine module, and the eye movement deep learning module, the ultrasonic array and the three-axis sensor are each electrically connected with the main control module.
Preferably, the image acquisition and preprocessing module acquires images of the user's face region with a high-definition camera and preprocesses the acquired images with a key point detection method, cropping the left-eye and right-eye regions and splicing them together horizontally.
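By way of illustration, this preprocessing step could be sketched as follows in Python with OpenCV and dlib's 68-point facial landmark detector; the patent only requires "a key point detection method", so the choice of detector, the landmark model file and the crop sizes below are assumptions:

```python
import cv2
import dlib
import numpy as np

# Assumption: dlib's standard 68-point landmark model file is available locally;
# the patent does not name this (or any) particular model.
detector = dlib.get_frontal_face_detector()
predictor = dlib.shape_predictor("shape_predictor_68_face_landmarks.dat")

def crop_eye(gray, points, margin=10, size=(60, 36)):
    """Crop a padded bounding box around one eye region and resize it."""
    x1 = max(int(points[:, 0].min()) - margin, 0)
    x2 = int(points[:, 0].max()) + margin
    y1 = max(int(points[:, 1].min()) - margin, 0)
    y2 = int(points[:, 1].max()) + margin
    return cv2.resize(gray[y1:y2, x1:x2], size)

def preprocess(frame):
    """Return the left and right eye regions spliced horizontally, or None if no face is found."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = detector(gray)
    if not faces:
        return None
    shape = predictor(gray, faces[0])
    pts = np.array([(shape.part(i).x, shape.part(i).y) for i in range(68)])
    left_eye = crop_eye(gray, pts[36:42])    # landmarks 36-41 in the 68-point scheme
    right_eye = crop_eye(gray, pts[42:48])   # landmarks 42-47
    return np.hstack([left_eye, right_eye])  # transverse splicing of the two eye crops
```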
Preferably, the eye movement deep learning module adopts ResNet18 as the backbone network to extract features, obtains the three required classification scores through a fully connected layer, and takes the class with the highest probability as the final prediction. During network training, the loss function is the cross entropy:
$$\mathrm{Loss} = -\frac{1}{N}\sum_{i=1}^{N}\sum_{c=1}^{M} y_{ic}\,\log\left(p_{ic}\right)$$
where N is the number of input samples in each batch, c indexes the current category, M is the total number of categories, $y_{ic}$ indicates whether the ith sample belongs to category c, and $p_{ic}$ is the predicted probability that the ith sample belongs to category c.
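By way of illustration, the classifier described above could be built as follows in PyTorch (the framework is an assumption; the patent does not name one). ResNet18's final fully connected layer is replaced by a 3-way output, and nn.CrossEntropyLoss implements the cross-entropy loss given above:

```python
import torch
import torch.nn as nn
from torchvision import models

NUM_CLASSES = 3  # the three eye-movement classes; their exact meaning is an assumption

def build_eye_movement_model():
    model = models.resnet18(weights=None)  # torchvision >= 0.13 API; random initialisation
    model.fc = nn.Linear(model.fc.in_features, NUM_CLASSES)  # replace the final fully connected layer
    return model

model = build_eye_movement_model()
criterion = nn.CrossEntropyLoss()  # batch-averaged -1/N * sum_i sum_c y_ic * log(p_ic)

# Inference: the class with the highest probability is taken as the prediction
logits = model(torch.randn(1, 3, 224, 224))        # dummy batch standing in for a spliced eye image
prediction = torch.softmax(logits, dim=1).argmax(dim=1)
```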
Preferably, the ultrasonic array acquires the distances between the wheelchair and its surroundings by placing one ultrasonic sensor on each of the front, back, left and right sides of the wheelchair.
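By way of illustration, the four-sensor ultrasonic array could be read as follows, assuming HC-SR04-style trigger/echo sensors driven from a Raspberry Pi via RPi.GPIO; the sensor model, controller board and pin assignments are assumptions, not specified by the patent:

```python
import time
import RPi.GPIO as GPIO

# Hypothetical BCM pin numbers (trigger, echo) for the front, back, left and right sensors
SENSORS = {"front": (5, 6), "back": (13, 19), "left": (20, 21), "right": (23, 24)}

GPIO.setmode(GPIO.BCM)
for trig, echo in SENSORS.values():
    GPIO.setup(trig, GPIO.OUT)
    GPIO.setup(echo, GPIO.IN)

def read_distance_cm(trig, echo, timeout=0.03):
    """Fire one trigger pulse and convert the echo pulse width to centimetres."""
    GPIO.output(trig, True)
    time.sleep(10e-6)                  # 10 microsecond trigger pulse
    GPIO.output(trig, False)
    start = time.time()
    deadline = start + timeout
    while GPIO.input(echo) == 0 and time.time() < deadline:
        start = time.time()            # wait for the echo pulse to begin
    stop = start
    while GPIO.input(echo) == 1 and time.time() < deadline:
        stop = time.time()             # wait for the echo pulse to end
    return (stop - start) * 34300 / 2  # speed of sound ~343 m/s, halved for the round trip

def read_all_distances():
    """Distance in centimetres from each side of the wheelchair to the nearest obstacle."""
    return {name: read_distance_cm(trig, echo) for name, (trig, echo) in SENSORS.items()}
```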
Preferably, the electric wheelchair control system uses the main control module to process the data from the eye movement deep learning module, the ultrasonic array and the three-axis sensor, and the processed result controls the wheelchair through the steering engine module. The steering engine module uses two steering engines (servos) fixed on the electric wheelchair controller to push the controller's joystick forward/backward and left/right respectively; by combining different rotation angles of the two steering engines, the electric wheelchair can be moved in any direction.
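By way of illustration, the mapping from a direction instruction to the angles of the two steering engines could look like the sketch below; the neutral and deflection angles are hypothetical, and set_servo_angle is a placeholder for whatever PWM driver an actual retrofit uses:

```python
# Hypothetical angles: 90 degrees is the joystick's neutral position on each axis,
# and +/-30 degrees deflects the stick fully along that axis.
NEUTRAL, DEFLECT = 90, 30

# (forward/backward servo angle, left/right servo angle) per direction instruction
COMMANDS = {
    "stop":     (NEUTRAL,           NEUTRAL),
    "forward":  (NEUTRAL + DEFLECT, NEUTRAL),
    "backward": (NEUTRAL - DEFLECT, NEUTRAL),
    "left":     (NEUTRAL,           NEUTRAL - DEFLECT),
    "right":    (NEUTRAL,           NEUTRAL + DEFLECT),
}

def set_servo_angle(channel: int, angle: float) -> None:
    # Placeholder for the real PWM call (e.g. RPi.GPIO PWM or a PCA9685 driver).
    print(f"servo {channel} -> {angle:.0f} deg")

def drive(command: str) -> None:
    """Push the wheelchair controller's joystick via the two steering engines."""
    fb_angle, lr_angle = COMMANDS.get(command, COMMANDS["stop"])
    set_servo_angle(0, fb_angle)  # servo acting on the forward/backward axis
    set_servo_angle(1, lr_angle)  # servo acting on the left/right axis
```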
Compared with the prior art, the invention has the beneficial effects that:
(1) The invention combines eye movement technology with the electric wheelchair to realize a novel electric wheelchair control mode, which can greatly help patients with ALS, high paraplegia and similar conditions.
(2) The invention uses deep learning, extracting features from the eye images with a convolutional neural network to complete the classification. This is faster and more accurate than the traditional template matching method.
(3) Steering engines are used to operate the control joystick in place of the user's hand, so an existing hand-controlled electric wheelchair can be retrofitted conveniently.
(4) The invention designs a safety system for the electric wheelchair that prevents accidental collisions caused by misoperation, detects the gradient of the road being travelled, and keeps the electric wheelchair off dangerously steep slopes.
Drawings
FIG. 1 is a block diagram of the system of the present invention;
FIG. 2 is a flow chart of deep learning module training of the present invention;
FIG. 3 is a control flow chart of the present invention.
Detailed Description
In order to make the objects, aspects and advantages of the invention clearer, the invention is described below by way of example with reference to the accompanying drawings. It should be understood that this description is merely illustrative and is not intended to limit the scope of the invention. The structures, proportions and sizes shown in the drawings are provided only so that those skilled in the art can understand and read the disclosure in conjunction with the specification; they do not limit the conditions under which the invention can be implemented, and any structural modification, change of proportion or adjustment of size that does not affect the effects and objects achievable by the invention still falls within the scope of the technical content disclosed herein. In addition, descriptions of well-known structures and techniques are omitted below so as not to obscure the concepts of the invention unnecessarily.
It should be noted that, to avoid obscuring the invention with unnecessary details, the drawings show only the structures and/or processing steps closely related to the scheme of the invention; other details of little relevance are omitted.
As shown in fig. 1, this embodiment adopts the following technical solution: the system comprises a data processing system, a safety system and an electric wheelchair control system. The data processing system comprises an image acquisition and preprocessing module and an eye movement deep learning module, and the image acquisition and preprocessing module is electrically connected with the eye movement deep learning module. The safety system comprises an ultrasonic array and a three-axis sensor. The electric wheelchair control system comprises a main control module and a steering engine module; the main control module is electrically connected with the steering engine module, and the eye movement deep learning module, the ultrasonic array and the three-axis sensor are each electrically connected with the main control module. The electric wheelchair control system acquires and analyses the data from the data processing system and the safety system, and the resulting control instructions drive the joystick on the electric wheelchair controller through the steering engine module, thereby controlling the electric wheelchair.
The eye movement deep learning module is obtained by training repeatedly and saving the weights that give the best result. First, parameters such as the learning rate, the number of iterations and the batch size are set, and the weights are initialised randomly. Eye movement images are then fed into the model in batches of the configured size, and the loss between the model output and the true eye movement class labels is computed to optimise the model weights. After a full pass over the eye images, the accuracy of the model's predictions is used to judge whether its performance is good enough, and the trained weight file is finally saved. The specific training process is shown in fig. 2.
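By way of illustration, a condensed version of this training procedure in PyTorch could look as follows (the dataset objects, the Adam optimiser and the hyperparameter values are assumptions):

```python
import torch
from torch.utils.data import DataLoader

def train(model, train_set, val_set, epochs=50, batch_size=32, lr=1e-3, device="cpu"):
    """Train the eye-movement classifier and keep the weights with the best validation accuracy."""
    model.to(device)
    criterion = torch.nn.CrossEntropyLoss()
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)  # optimiser choice is an assumption
    train_loader = DataLoader(train_set, batch_size=batch_size, shuffle=True)
    val_loader = DataLoader(val_set, batch_size=batch_size)

    best_acc = 0.0
    for epoch in range(epochs):
        model.train()
        for images, labels in train_loader:
            images, labels = images.to(device), labels.to(device)
            optimizer.zero_grad()
            loss = criterion(model(images), labels)  # cross entropy against the true class labels
            loss.backward()
            optimizer.step()

        # Evaluate prediction accuracy after each full pass and keep the best weight file
        model.eval()
        correct = total = 0
        with torch.no_grad():
            for images, labels in val_loader:
                images, labels = images.to(device), labels.to(device)
                preds = model(images).argmax(dim=1)
                correct += (preds == labels).sum().item()
                total += labels.numel()
        accuracy = correct / max(total, 1)
        if accuracy > best_acc:
            best_acc = accuracy
            torch.save(model.state_dict(), "best_eye_movement_model.pth")
    return best_acc
```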
The electric wheelchair control system judges whether the wheelchair's current environment is safe from the data of the ultrasonic array and the three-axis sensor in the safety system; an ultrasonic safety-distance threshold and a safe inclination-angle threshold can be preset in the system. When the collected data exceed a preset threshold, a stop instruction is issued and the steering engine module performs the stop operation. When the collected data are within the safety thresholds, the classification result from the deep learning module is converted into the corresponding direction-control instruction and the steering engine module performs the corresponding direction-control operation. The specific control flow is shown in fig. 3.
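By way of illustration, the control flow of fig. 3 could be sketched as the loop below; the threshold values and the mapping from the three gaze classes to direction instructions are hypothetical, and the four callables stand for the safety-system, eye-movement and steering-engine interfaces sketched above:

```python
import time

SAFE_DISTANCE_CM = 50.0   # hypothetical preset ultrasonic safety-distance threshold
SAFE_TILT_DEG = 10.0      # hypothetical preset safe inclination-angle threshold
GAZE_TO_COMMAND = {0: "forward", 1: "left", 2: "right"}  # assumed meaning of the three classes

def control_loop(read_all_distances, read_tilt_deg, classify_gaze, drive, period=0.1):
    """Stop if any safety threshold is exceeded; otherwise follow the gaze classification."""
    while True:
        distances = read_all_distances()   # ultrasonic array, cm per direction
        tilt = read_tilt_deg()             # inclination from the three-axis sensor
        if min(distances.values()) < SAFE_DISTANCE_CM or abs(tilt) > SAFE_TILT_DEG:
            drive("stop")                  # obstacle too close or slope too steep
        else:
            drive(GAZE_TO_COMMAND.get(classify_gaze(), "stop"))
        time.sleep(period)
```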
It will be evident to those skilled in the art that the invention is not limited to the details of the foregoing illustrative embodiments, and that the present invention may be embodied in other specific forms without departing from the spirit or essential attributes thereof. The present embodiments are therefore to be considered in all respects as illustrative and not restrictive, the scope of the invention being indicated by the appended claims rather than by the foregoing description, and all changes which come within the meaning and range of equivalency of the claims are therefore intended to be embraced therein.
Furthermore, it should be understood that although this specification is described in terms of embodiments, not every embodiment contains only a single independent technical solution. This manner of description is adopted merely for clarity, and those skilled in the art should treat the specification as a whole; the technical solutions in the various embodiments may also be combined appropriately to form other embodiments understandable to those skilled in the art.

Claims (6)

1. An electric wheelchair implementation method based on eye movement and deep learning, characterized in that: the system comprises a data processing system, a safety system and an electric wheelchair control system; the data processing system comprises an image acquisition and preprocessing module and an eye movement deep learning module, and the image acquisition and preprocessing module is electrically connected with the eye movement deep learning module; the safety system comprises an ultrasonic array and a three-axis sensor; the electric wheelchair control system comprises a main control module and a steering engine module, the main control module is electrically connected with the steering engine module, and the eye movement deep learning module, the ultrasonic array and the three-axis sensor are each electrically connected with the main control module.
2. The electric wheelchair implementation method based on eye movement and deep learning of claim 1, characterized in that: the image acquisition and preprocessing module acquires images of the user's face region with a high-definition camera and preprocesses the acquired images with a key point detection method, cropping the left-eye and right-eye regions and splicing them together horizontally.
3. The electric wheelchair implementation method based on eye movement and deep learning of claim 1, characterized in that: the eye movement deep learning module adopts ResNet18 as the backbone network to extract features, obtains the three required classification scores through a fully connected layer, and takes the class with the highest probability as the final prediction; during network training, the loss function is the cross entropy:
$$\mathrm{Loss} = -\frac{1}{N}\sum_{i=1}^{N}\sum_{c=1}^{M} y_{ic}\,\log\left(p_{ic}\right)$$
where N is the number of input samples in each batch, c indexes the current category, M is the total number of categories, $y_{ic}$ indicates whether the ith sample belongs to category c, and $p_{ic}$ is the predicted probability that the ith sample belongs to category c.
4. The electric wheelchair implementation method based on eye movement and deep learning of claim 1, characterized in that: the ultrasonic array acquires the distances between the wheelchair and its surroundings by placing one ultrasonic sensor on each of the front, back, left and right sides of the wheelchair.
5. The electric wheelchair implementation method based on eye movement and deep learning of claim 1, characterized in that: the three-axis sensor is installed below the wheelchair and acquires the inclination-angle information of the wheelchair.
6. The electric wheelchair implementation method based on eye movement and deep learning of claim 1, characterized in that: the electric wheelchair control system uses the main control module to process the data from the eye movement deep learning module, the ultrasonic array and the three-axis sensor, and the processed result controls the wheelchair through the steering engine module; the steering engine module uses two steering engines fixed on the electric wheelchair controller to push the controller's joystick forward/backward and left/right respectively, and by combining different rotation angles of the two steering engines the electric wheelchair can be moved in any direction.
CN202110543762.6A 2021-05-19 2021-05-19 Electric wheelchair implementation method based on eye movement and deep learning Pending CN113274206A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110543762.6A CN113274206A (en) 2021-05-19 2021-05-19 Electric wheelchair implementation method based on eye movement and deep learning

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110543762.6A CN113274206A (en) 2021-05-19 2021-05-19 Electric wheelchair implementation method based on eye movement and deep learning

Publications (1)

Publication Number Publication Date
CN113274206A (en) 2021-08-20

Family

ID=77279823

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110543762.6A Pending CN113274206A (en) 2021-05-19 2021-05-19 Electric wheelchair implementation method based on eye movement and deep learning

Country Status (1)

Country Link
CN (1) CN113274206A (en)


Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20110101851A (en) * 2010-03-10 2011-09-16 주식회사 콤슨테크놀러지 Wheelchair which can stand up
KR20110118965A (en) * 2010-04-26 2011-11-02 대구대학교 산학협력단 Autonomous wheelchair system using gaze recognition
CN203169463U (en) * 2013-03-22 2013-09-04 浙江大学 Electrically-powered wheelchair based on eye movement signal
CN106389029A (en) * 2016-08-31 2017-02-15 河南纵横精工机械科技有限公司 Multifunctional wheelchair seat
CN107616880A (en) * 2017-08-01 2018-01-23 南京邮电大学 A kind of intelligent electric wheelchair implementation method based on brain electricity idea and deep learning
CN107247949A (en) * 2017-08-02 2017-10-13 北京智慧眼科技股份有限公司 Face identification method, device and electronic equipment based on deep learning

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115227494A (en) * 2022-07-20 2022-10-25 哈尔滨理工大学 Intelligent eye movement wheelchair based on deep learning

Similar Documents

Publication Publication Date Title
CN102542281B (en) Non-contact biometric feature identification method and system
CN102521505B (en) Brain electric and eye electric signal decision fusion method for identifying control intention
CN105559802A (en) Tristimania diagnosis system and method based on attention and emotion information fusion
CN106203497B (en) Finger vena area-of-interest method for screening images based on image quality evaluation
de San Roman et al. Saliency driven object recognition in egocentric videos with deep CNN: toward application in assistance to neuroprostheses
CN113729710A (en) Real-time attention assessment method and system integrating multiple physiological modes
CN113951900A (en) Motor imagery intention recognition method based on multi-mode signals
CN114155512A (en) Fatigue detection method and system based on multi-feature fusion of 3D convolutional network
CN113743471A (en) Driving evaluation method and system
CN114220130A (en) Non-contact gesture and palm print and palm vein fused identity recognition system and method
CN113274206A (en) Electric wheelchair implementation method based on eye movement and deep learning
CN115393830A (en) Fatigue driving detection method based on deep learning and facial features
CN111611963B (en) Face recognition method based on neighbor preservation canonical correlation analysis
CN111281403A (en) Fine-grained human body fatigue detection method and device based on embedded equipment
CN106446822A (en) Blink detection method based on circle fitting
WO2019218571A1 (en) Fatigued driving early warning system based on opencv technology
CN107016372A (en) Face identification method based on neutral net
Ma et al. Research on drowsy-driving monitoring and warning system based on multi-feature comprehensive evaluation
CN113989887A (en) Equipment operator fatigue state detection method based on visual characteristic information fusion
CN114067187A (en) Infrared polarization visible light face translation method based on countermeasure generation network
CN106384096A (en) Fatigue driving monitoring method based on blink detection
Pangestu et al. Electric Wheelchair Control Mechanism Using Eye-mark Key Point Detection.
Hu et al. Comprehensive driver state recognition based on deep learning and PERCLOS criterion
CN110458049A (en) A kind of behavior measure and analysis method based on more visions
CN109522810A (en) A kind of myoelectric limb hand gesture identification method based on community vote mechanism

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination