CN108652851B - Eye-controlled wheelchair control method based on visual positioning technology - Google Patents

Eye-controlled wheelchair control method based on visual positioning technology

Info

Publication number
CN108652851B
CN108652851B (application CN201810051470.9A)
Authority
CN
China
Prior art keywords
eye
user
wheelchair
angle
controlled
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201810051470.9A
Other languages
Chinese (zh)
Other versions
CN108652851A (en)
Inventor
王泽新
焦李成
王浩然
孙其功
李玲玲
郭雨薇
唐旭
张梦旋
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Xidian University
Original Assignee
Xidian University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Xidian University filed Critical Xidian University
Priority to CN201810051470.9A priority Critical patent/CN108652851B/en
Publication of CN108652851A publication Critical patent/CN108652851A/en
Application granted granted Critical
Publication of CN108652851B publication Critical patent/CN108652851B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61G TRANSPORT, PERSONAL CONVEYANCES, OR ACCOMMODATION SPECIALLY ADAPTED FOR PATIENTS OR DISABLED PERSONS; OPERATING TABLES OR CHAIRS; CHAIRS FOR DENTISTRY; FUNERAL DEVICES
    • A61G5/00 Chairs or personal conveyances specially adapted for patients or disabled persons, e.g. wheelchairs
    • A61G5/10 Parts, details or accessories
    • A61G2203/00 General characteristics of devices
    • A61G2203/10 General characteristics of devices characterised by specific control means, e.g. for adjustment or steering
    • A61G2203/18 General characteristics of devices characterised by specific control means, e.g. for adjustment or steering by patient's head, eyes, facial muscles or voice
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02T CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T10/00 Road transport of goods or passengers
    • Y02T10/60 Other road transportation technologies with climate change mitigation effect
    • Y02T10/72 Electric energy management in electromobility

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Image Analysis (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention discloses an eye-controlled wheelchair control method based on visual positioning technology, addressing two problems of the prior art: the limited eye information obtainable from a single sensor, and the low reliability of communication among multiple sensors. The implementation process is as follows: (1) the central camera transmits each color frame captured in real time; (2) the user's face position is determined in each color frame and enclosed in a rectangular box; (3) the user's eye region is determined within the face rectangle; (4) the pupil position is precisely located within the eye region; (5) the rotational orientation of the user's eyeball is determined; and (6) a motor control system steers the wheelchair accordingly. The invention improves the stability of controlling a wheelchair through eye information, can continuously and reliably change the wheelchair's moving speed and direction in complex environments, and reduces the information loss of communication among multi-sensor modules.

Description

Eye-controlled wheelchair control method based on visual positioning technology
Technical Field
The invention belongs to the technical field of computers, and further relates to an eye-controlled wheelchair control method based on visual positioning technology in the field of human-machine interaction. The invention enables users with severely limited mobility, including ALS patients (the "gradually frozen"), to control an eye-controlled wheelchair through eyeball rotation alone.
Background
At present, the wheelchair has become an indispensable mobility aid for the elderly and for disabled people with limited movement, and its range of users is very wide. For disabled people, including ALS patients, a wheelchair enables movement and social activity, but many patients cannot operate a wheelchair by hand; often the eyes are the only part of the body they can still move freely, so research into controlling a wheelchair by recognizing eyeball movement is of great significance. The technology for controlling a wheelchair by recognizing eyeball movement is not yet fully developed. The more mature approach is to detect the eye-movement state with sensors, build a line-of-sight model from mathematical geometric relations, and finally recognize the movement information of the eyes, thereby issuing the corresponding wheelchair controls.
Nanjing University of Posts and Telecommunications proposed a method for controlling an intelligent wheelchair by recognizing eye actions in the patent "An intelligent wheelchair human-machine interaction control system and method based on eye action recognition" (application number CN201710531252.0, publication number CN107260420A). In that method, eye electromyographic (EMG) signals are collected by an EMG acquisition module; blinks are detected from the change in EMG features during a blink; the wheelchair's direction is controlled by combinations of several blinks; and its speed is controlled by computing the difference between successive EMG signals. The drawback of this method is that the EMG module recognizes only blinks: with a single action carrying every command, the effective information is limited, the one-to-many mapping easily destabilizes direction and speed control, and blinks alone can hardly provide the continuous speed and direction changes required in a complex environment.
Han Jun, in the thesis "Study on electric wheelchair control methods based on EEG and EOG" (Hangzhou Dianzi University, 2015), proposed a method for controlling a wheelchair by processing collected electro-oculographic (EOG) signals. The method records EOG signals while the subject glances in different directions, and controls the electric wheelchair to advance, turn left, turn right, and stop by preprocessing the EOG signals, extracting features, classifying patterns, and mapping the classification result to a command. Its drawback is that the eye information is inferred through processing spread across several sensor modules; signal communication among so many modules easily loses information, and control accuracy and real-time performance are hard to guarantee.
At present, the technology for controlling a wheelchair by recognizing eyeball rotation still has major shortcomings: eye information is obtained through multiple sensors, the recognized information is limited, practical application is difficult, and no wheelchair suited to disabled users such as ALS patients is yet available.
Disclosure of Invention
The invention aims to overcome the defects of the prior art and provides an eye-controlled wheelchair control method based on visual positioning technology.
The specific steps for achieving the purpose of the invention are as follows:
the method controls the movement of an eye-controlled wheelchair using visual positioning technology. Left and right brackets are mounted over the wheelchair's left and right universal wheels; a central camera that captures the user's eye-movement information is mounted on the right bracket; and a central processor, which runs the eye-control algorithm, and a motor control system, which drives the wheelchair, are fixed to the back of the wheelchair. The specific steps are as follows:
(1) The central camera transmits each color frame captured in real time:
(1a) Capture the user's eye-movement information with a camera placed in front of the eye-controlled wheelchair;
(1b) Transmit each color frame captured by the central camera in real time to a central processor fixed at the back of the eye-controlled wheelchair;
(2) Determine a rectangular box around the user's face:
(2a) Divide each color frame into 25 rectangular areas, and extract the user's face linear Haar characteristic value in each area using the face linear Haar characteristic value calculation method;
(2b) Enclose the rectangular area in which the face linear Haar characteristic value is located with a rectangular frame, and take it as the user's face rectangle;
(3) Determine the user's eye region:
(3a) Divide the user's face rectangle equally into 25 rectangular areas, and compute the gradient value of each pixel in each area;
(3b) Sum the gradient values in each area and take the average; enclose the areas whose gradient sum exceeds the average with a rectangular frame and take them as the user's eye region;
(4) Determine the optimal center point:
determine the optimal center point of the user's eye region among the candidate center points using the center point positioning formula;
(5) Determine the rotation direction and rotation angle of the user's eyeball:
(5a) Taking the position of the optimal center point when the user's eyes look straight ahead as a reference coordinate point, subtract the reference point's horizontal and vertical coordinate values from those of the optimal center point of the user's eye region in each frame, and locate the user's eyeball rotation direction from the signs of the two coordinate differences;
(5b) Take the absolute value of each difference and normalize it into the interval [0, 1];
(5c) Map the [0, 1] interval continuously onto a rotation angle of the eye-controlled wheelchair, where 0 corresponds to a rotation of 0 degrees and 1 to a rotation of 90 degrees;
(5d) Convert the located eyeball rotation orientation and angle information into an orientation signal and an angle signal, and transmit them to the motor control system;
(6) Control the rotation of the eye-controlled wheelchair:
the motor control system controls the direction and angle of motor rotation according to the received orientation signal and angle signal, thereby controlling the direction and angle of the eye-controlled wheelchair's movement.
Compared with the prior art, the invention has the following advantages:
First, because the central camera that captures the user's eye-movement information is mounted on the right bracket, the user's whole facial area is photographed by a single camera. This overcomes the prior-art problems that a single sensor acquires little effective eye information and that one-to-many control easily destabilizes the control of the wheelchair's moving direction and speed. The invention improves the stability of controlling the wheelchair by recognizing eye-movement information, and can continuously and reliably change the wheelchair's moving speed and direction in complex environments.
Second, the invention converts the processing result of the captured image directly into a control instruction and transmits it to the motor control system, rather than passing eye signals through a chain of modules for processing. No inter-module communication is needed, which solves the prior-art problem of information loss during multi-module communication. The invention improves the reliability of information transmission and the accuracy and real-time performance of wheelchair motion control.
Drawings
FIG. 1 is a flow chart of the present invention;
fig. 2 is a structural diagram of the eye-controlled wheelchair of the present invention.
Detailed Description
The invention is further described below with reference to the accompanying drawings.
The specific steps of the present invention are further described with reference to fig. 1.
Step 1, the central camera transmits each color frame captured in real time.
The user's eye-movement information is captured with a camera placed in front of the eye-controlled wheelchair.
Each color frame captured by the central camera in real time is transmitted to a central processor fixed at the back of the eye-controlled wheelchair.
Step 2, determine a rectangular box around the user's face.
Each color frame is divided into 25 rectangular areas, and the user's face linear Haar characteristic value is extracted in each area using the face linear Haar characteristic value calculation method.
The specific steps of the face linear Haar characteristic value calculation method are as follows:
Average the red, green, and blue (RGB) values of each pixel in each region to obtain the pixel's gray value.
Sum the maximum and minimum gray values over all pixels and halve the result to obtain the intermediate gray value.
Set the pixels brighter than the intermediate gray value as white pixels and the pixels darker than it as black pixels.
Subtract the sum of all black pixel values from the sum of all white pixel values to obtain the user's face linear Haar characteristic value for the region.
Enclose the rectangular regions whose face linear Haar characteristic value is greater than 0 with a rectangular frame, and take them as the user's face rectangle.
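The binarize-and-difference procedure above can be sketched as follows. This is a minimal illustration, not the patented implementation: the function name is ours, and summing the original gray values of the "white" and "black" pixels is one reading of the ambiguous translated text.

```python
import numpy as np

def linear_haar_value(region):
    """Simplified linear Haar value for one rectangular region.

    region: H x W x 3 RGB block (uint8). Pixels brighter than the
    mid-gray threshold count as white, darker ones as black; the
    feature is the white sum minus the black sum of gray values.
    """
    gray = region.mean(axis=2)             # average R, G, B per pixel
    mid = (gray.max() + gray.min()) / 2.0  # intermediate gray value
    white = gray[gray > mid].sum()         # sum over white pixels
    black = gray[gray < mid].sum()         # sum over black pixels
    return white - black
```

A region whose value exceeds 0 would then be framed as part of the face rectangle, as described above.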
Step 3, determine the user's eye region.
The user's face rectangle is divided equally into 25 rectangular areas, and the gradient value of each pixel in each area is computed.
The gradient values in each area are summed and averaged; the areas whose gradient sum exceeds the average are enclosed with a rectangular frame and taken as the user's eye region.
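Under the same 5 x 5 split, the gradient-sum selection can be sketched like this (a sketch with our naming; the text does not specify the gradient operator, so NumPy's finite differences stand in):

```python
import numpy as np

def find_eye_blocks(face, grid=5):
    """Split a grayscale face box into grid x grid cells, sum the gradient
    magnitude per cell, and keep cells above the mean as eye candidates."""
    gy, gx = np.gradient(face.astype(float))   # finite-difference gradients
    mag = np.hypot(gx, gy)                     # per-pixel gradient magnitude
    h, w = face.shape
    bh, bw = h // grid, w // grid
    sums = np.array([[mag[r*bh:(r+1)*bh, c*bw:(c+1)*bw].sum()
                      for c in range(grid)] for r in range(grid)])
    return sums > sums.mean()                  # boolean mask of kept cells
```

Because the eyes concentrate strong edges (lids, iris, sclera boundary), their cells dominate the gradient sums and survive the mean threshold.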
Step 4, determine the optimal center point.
The optimal center point of the user's eye region is determined among the candidate center points using the center point positioning formula.
The center point positioning formula is as follows:

c^{*} = \arg\max_{c} \frac{1}{N} \sum_{i} \left( \frac{(x_i - c)^{T}}{\left\| x_i - c \right\|_{2}} \, g_i \right)^{2}

where c^{*} denotes the optimal center point in the user's eye region, \arg\max_{c} denotes taking the value of c that maximizes the expression, c denotes a candidate center point in the user's eye region, N denotes the total number of candidate center points in the user's eye region, \sum denotes summation, i denotes the serial number of a pixel on the edge of the user's eye region, x_i denotes the i-th pixel on the edge of the user's eye region, \| \cdot \|_2 denotes the 2-norm operation, T denotes the transpose operation, and g_i denotes the gradient value of the i-th pixel on the edge of the user's eye region.
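Numerically, the formula above is the standard gradient-based eye-center objective: score each candidate center by how well the unit displacement vectors to the edge pixels align with those pixels' gradients, and keep the argmax. The sketch below assumes the projection is squared and averaged over the summed terms (the translated text is ambiguous there); the function and variable names are ours.

```python
import numpy as np

def best_center(candidates, edge_points, gradients):
    """Return the candidate center c maximizing
    (1/N) * sum_i ( ((x_i - c)^T / ||x_i - c||_2) g_i )^2."""
    best, best_score = None, -np.inf
    for c in candidates:
        d = edge_points - c                                # x_i - c
        d = d / np.linalg.norm(d, axis=1, keepdims=True)   # unit displacement
        score = np.mean(np.sum(d * gradients, axis=1) ** 2)
        if score > best_score:
            best, best_score = c, score
    return best
```

For a circular edge (a pupil boundary) with radial gradients, the true center aligns perfectly with every gradient and wins the argmax.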
Step 5, determine the rotation direction and rotation angle of the user's eyeball.
The position of the optimal center point when the user's eyes look straight ahead is taken as a reference coordinate point. The reference point's horizontal and vertical coordinate values are subtracted from those of the optimal center point of the user's eye region in each frame, and the user's eyeball rotation direction is located from the signs of the two coordinate differences.
Locating the user's eyeball rotation direction from the differences of the two points' coordinate values means:
When the difference of the two points' horizontal coordinate values is positive, the user's eyeball is located as rotating to the left.
When the difference of the two points' horizontal coordinate values is negative, the user's eyeball is located as rotating to the right.
When the difference of the two points' vertical coordinate values is positive, the user's eyeball is located as rotating upward.
When the difference of the two points' vertical coordinate values is negative, the user's eyeball is located as rotating downward.
The absolute value of each difference is taken and normalized into the interval [0, 1].
The [0, 1] interval is mapped continuously onto a rotation angle of the eye-controlled wheelchair, where 0 corresponds to a rotation of 0 degrees and 1 to a rotation of 90 degrees.
The located eyeball rotation orientation and angle information are converted into an orientation signal and an angle signal and transmitted to the motor control system.
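Steps (5a) through (5d) amount to a small pure function. In this sketch, `max_offset` (the largest expected pupil excursion in pixels) and the dominant-axis tie-break are our assumptions, and the left/right signs follow the text's convention, which may need flipping for a mirrored camera:

```python
def offset_to_command(cx, cy, rx, ry, max_offset):
    """Map the pupil-center offset from the straight-ahead reference
    (rx, ry) to an orientation string and a turn angle in [0, 90]."""
    dx = cx - rx                    # current minus reference, per step (5a)
    dy = cy - ry
    if abs(dx) >= abs(dy):          # dominant axis decides the orientation
        orientation = "left" if dx > 0 else "right"   # positive dx -> left
        offset = abs(dx)
    else:
        orientation = "up" if dy > 0 else "down"      # positive dy -> up
        offset = abs(dy)
    # Steps (5b)-(5c): clamp-normalize into [0, 1], then scale to 0-90 deg.
    angle = min(offset / max_offset, 1.0) * 90.0
    return orientation, angle
```

The clamp makes the mapping continuous and bounded, so a large glance saturates at a 90-degree turn rather than producing an out-of-range command.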
Step 6, control the rotation of the eye-controlled wheelchair.
The motor control system controls the direction and angle of motor rotation according to the received orientation signal and angle signal, thereby controlling the direction and angle of the eye-controlled wheelchair's movement.
The direction and angle of motor rotation are controlled according to the received orientation signal and angle signal as follows:
If the orientation signal indicates that the user's eyeball has rotated to the left, the motor control system turns the motor to the left and sets the turning angle according to the received angle signal, thereby turning the eye-controlled wheelchair to the left.
If the orientation signal indicates that the user's eyeball has rotated to the right, the motor control system turns the motor to the right and sets the turning angle according to the received angle signal, thereby turning the eye-controlled wheelchair to the right.
If the orientation signal indicates that the user's eyeball has rotated upward, the motor control system drives the motor forward and sets the angle according to the received angle signal, thereby moving the eye-controlled wheelchair forward.
If the orientation signal indicates that the user's eyeball has rotated downward, the motor control system drives the motor backward and sets the angle according to the received angle signal, thereby moving the eye-controlled wheelchair backward.
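The dispatch described above is a fixed four-way mapping; a sketch with our naming follows (a real system would emit motor-driver signals rather than strings):

```python
def motor_command(orientation, angle):
    """Translate an orientation signal into a wheelchair action:
    eye left/right -> turn left/right, eye up/down -> move forward/backward.
    The angle signal passes through to set how far the motors turn."""
    actions = {"left": "turn_left", "right": "turn_right",
               "up": "forward", "down": "backward"}
    return actions[orientation], angle
```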
The specific implementation is further described with reference to the structural diagram of the eye-controlled wheelchair in fig. 2.
The eye-controlled wheelchair is fitted with a left bracket 1 and a right bracket 2. A central camera 3 for capturing the user's eye-movement information is mounted on the right bracket 2. The color images of the user's eyes captured by the camera in real time are sent to a central processor 4, which processes them in real time, recognizes the eyeball rotation direction, angle, and related information, converts this information into the corresponding control signals, and transmits them to a motor control system 5 in real time. The motor control system 5 controls the direction and angle of motor rotation according to the received control signals, thereby moving the eye-controlled wheelchair left, right, forward, or backward.
To use the eye-controlled wheelchair, first open the left and right brackets to help the user sit down; close them once the user has adjusted to a comfortable sitting position; adjust the position and orientation of the central camera to the user's posture; and switch the wheelchair on. Once started, the central camera captures color images of the user's eyes and transmits them to the central processor for processing; the processor sends the resulting control signals to the motor control system, which turns the motors accordingly and so drives the wheelchair. Specifically, when the user's eyeball rotates to the left (right), the central processor, via the motor control system, drives the eye-controlled wheelchair to turn left (right); when the user's eyeball rotates upward (downward), the wheelchair moves forward (backward). During use, the closed left and right brackets protect the user's safety. When finished, switch off the wheelchair's power supply and open the left and right brackets to help the user get up and leave.

Claims (5)

1. An eye-controlled wheelchair control method based on visual positioning technology, characterized in that a central camera for capturing the user's eye-movement information is mounted on the right bracket of the eye-controlled wheelchair, the color images of the user's eyes captured by the camera in real time are sent to a central processor for processing, and the result is used to control the direction and angle of motor rotation, thereby controlling the direction and angle of the eye-controlled wheelchair's movement, the method comprising the following specific steps:
(1) The central camera transmits each color frame captured in real time:
(1a) Capture the user's eye-movement information with a camera placed in front of the eye-controlled wheelchair;
(1b) Transmit each color frame captured by the central camera in real time to a central processor fixed at the back of the eye-controlled wheelchair;
(2) Determine a rectangular box around the user's face:
(2a) Divide each color frame into 25 rectangular areas, and extract the user's face linear Haar characteristic value in each area using the face linear Haar characteristic value calculation method;
(2b) Enclose the rectangular area in which the face linear Haar characteristic value is located with a rectangular frame, and take it as the user's face rectangle;
(3) Determine the user's eye region:
(3a) Divide the user's face rectangle equally into 25 rectangular areas, and compute the gradient value of each pixel in each area;
(3b) Sum the gradient values in each area and take the average; enclose the areas whose gradient sum exceeds the average with a rectangular frame and take them as the user's eye region;
(4) Determine the optimal center point:
determine the optimal center point of the user's eye region among the candidate center points using the center point positioning formula;
(5) Determine the rotation direction and rotation angle of the user's eyeball:
(5a) Taking the position of the optimal center point when the user's eyes look straight ahead as a reference coordinate point, subtract the reference point's horizontal and vertical coordinate values from those of the optimal center point of the user's eye region in each frame, and locate the user's eyeball rotation direction from the signs of the two coordinate differences;
(5b) Take the absolute value of each difference and normalize it into the interval [0, 1];
(5c) Map the [0, 1] interval continuously onto a rotation angle of the eye-controlled wheelchair, where 0 corresponds to a rotation of 0 degrees and 1 to a rotation of 90 degrees;
(5d) Convert the located eyeball rotation orientation and angle information into an orientation signal and an angle signal, and transmit them to the motor control system;
(6) Control the rotation of the eye-controlled wheelchair:
the motor control system controls the direction and angle of motor rotation according to the received orientation signal and angle signal, thereby controlling the direction and angle of the eye-controlled wheelchair's movement.
2. The eye-controlled wheelchair control method based on visual positioning technology according to claim 1, wherein the face linear Haar characteristic value calculation method in step (2a) comprises the following specific steps:
first, average the red, green, and blue (RGB) values of each pixel in each region to obtain the pixel's gray value;
second, sum the maximum and minimum gray values over all pixels and halve the result to obtain the intermediate gray value;
third, set the pixels whose gray value exceeds the intermediate gray value as white pixels, and the pixels whose gray value is below it as black pixels;
fourth, subtract the sum of all black pixel values from the sum of all white pixel values to obtain the user's face linear Haar characteristic value for the region.
3. The eye-controlled wheelchair control method based on visual positioning technology according to claim 1, wherein the center point positioning formula in step (4) is as follows:

c^{*} = \arg\max_{c} \frac{1}{N} \sum_{i} \left( \frac{(x_i - c)^{T}}{\left\| x_i - c \right\|_{2}} \, g_i \right)^{2}

where c^{*} denotes the optimal center point in the user's eye region, \arg\max_{c} denotes taking the value of c that maximizes the expression, c denotes a candidate center point in the user's eye region, N denotes the total number of candidate center points in the user's eye region, \sum denotes summation, i denotes the serial number of a pixel on the edge of the user's eye region, x_i denotes the i-th pixel on the edge of the user's eye region, \| \cdot \|_2 denotes the 2-norm operation, T denotes the transpose operation, and g_i denotes the gradient value of the i-th pixel on the edge of the user's eye region.
4. The eye-controlled wheelchair control method based on visual positioning technology according to claim 1, wherein locating the user's eyeball rotation orientation from the differences of the two points' coordinate values in step (5a) means:
A. when the difference of the two points' horizontal coordinate values is positive, the user's eyeball is located as rotating to the left;
B. when the difference of the two points' horizontal coordinate values is negative, the user's eyeball is located as rotating to the right;
C. when the difference of the two points' vertical coordinate values is positive, the user's eyeball is located as rotating upward;
D. when the difference of the two points' vertical coordinate values is negative, the user's eyeball is located as rotating downward.
5. The eye-controlled wheelchair control method based on visual positioning technology according to claim 1, wherein in step (6) the direction and angle of motor rotation are controlled according to the received orientation signal and angle signal as follows:
if the orientation signal indicates that the user's eyeball has rotated to the left, the motor control system turns the motor to the left and sets the turning angle according to the received angle signal, thereby turning the eye-controlled wheelchair to the left;
if the orientation signal indicates that the user's eyeball has rotated to the right, the motor control system turns the motor to the right and sets the turning angle according to the received angle signal, thereby turning the eye-controlled wheelchair to the right;
if the orientation signal indicates that the user's eyeball has rotated upward, the motor control system drives the motor forward and sets the angle according to the received angle signal, thereby moving the eye-controlled wheelchair forward;
if the orientation signal indicates that the user's eyeball has rotated downward, the motor control system drives the motor backward and sets the angle according to the received angle signal, thereby moving the eye-controlled wheelchair backward.
CN201810051470.9A 2018-01-19 2018-01-19 Eye-controlled wheelchair control method based on visual positioning technology Active CN108652851B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810051470.9A CN108652851B (en) 2018-01-19 2018-01-19 Eye-controlled wheelchair control method based on visual positioning technology

Publications (2)

Publication Number Publication Date
CN108652851A CN108652851A (en) 2018-10-16
CN108652851B true CN108652851B (en) 2023-06-30

Family

ID=63784054

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810051470.9A Active CN108652851B (en) 2018-01-19 2018-01-19 Eye-controlled wheelchair control method based on visual positioning technology

Country Status (1)

Country Link
CN (1) CN108652851B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20200229998A1 (en) * 2018-11-14 2020-07-23 Ohad Paz Smart tilting/lifting chair with the ability to tilt user to vertical position
CN112835444A (en) * 2019-11-25 2021-05-25 七鑫易维(深圳)科技有限公司 Method, device and equipment for adjusting use angle of eye control all-in-one machine and storage medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105105938A (en) * 2015-07-14 2015-12-02 南京邮电大学 Intelligent wheelchair control method and system based on face orientation identification and tracking
CN105892691A (en) * 2016-06-07 2016-08-24 京东方科技集团股份有限公司 Method and device for controlling travel tool and travel tool system
KR101781361B1 (en) * 2017-01-25 2017-09-26 대한민국 A Method Identifying A Personnel By Comparing Face Area
CN107260420A (en) * 2017-07-03 2017-10-20 南京邮电大学 Intelligent wheel chair human-computer interactive control system and method based on eye motion recognition

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Research on an intelligent wheelchair control system based on hybrid EMG, EOG and vision; Zhang Yi; Cui Rundong; Luo Yuan; Control Engineering of China (No. 01); full text *

Also Published As

Publication number Publication date
CN108652851A (en) 2018-10-16

Similar Documents

Publication Publication Date Title
US11602300B2 (en) Brain-computer interface based robotic arm self-assisting system and method
CN110210323B (en) Drowning behavior online identification method based on machine vision
CN108171218A Gaze estimation method based on a deep appearance-based gaze network
CN102841354B (en) Vision protection implementation method of electronic equipment with display screen
CN104238732B (en) Device, method and computer readable recording medium for detecting facial movements to generate signals
CN108652851B (en) Eye-controlled wheelchair control method based on visual positioning technology
CN110728241A (en) Driver fatigue detection method based on deep learning multi-feature fusion
CN109366508A BCI-based advanced robotic arm control system and its implementation method
CN112114670B (en) Man-machine co-driving system based on hybrid brain-computer interface and control method thereof
CN113221865A (en) Single-camera binocular iris image acquisition method and device
CN111735541A (en) Mobile body temperature monitoring robot
CN110716578A (en) Aircraft control system based on hybrid brain-computer interface and control method thereof
Karuppiah et al. Automation of a wheelchair mounted robotic arm using computer vision interface
CN108268858A Highly robust real-time sight-line detection method
CN113160260B (en) Head-eye double-channel intelligent man-machine interaction system and operation method
CN108509025A Intelligent crane hoisting system based on limb action recognition
McMurrough et al. 3D point of gaze estimation using head-mounted RGB-D cameras
CN108743086B (en) Intelligent wheelchair control method for tracking targets of front accompanying personnel
CN113887374B Brain-controlled water-drinking system based on a dynamic convergent differential neural network
CN215814080U (en) Head-eye double-channel intelligent man-machine interaction system
CN110674751A (en) Device and method for detecting head posture based on monocular camera
Taher et al. An extended eye movement tracker system for an electric wheelchair movement control
Viswanatha et al. An Intelligent Camera Based Eye Controlled Wheelchair System: Haar Cascade and Gaze Estimation Algorithms
CN111178216B (en) Examination room abnormal behavior identification method based on gesture space-time characteristics
Shao et al. Design of intentional eye-blink signal acquisition and control system based on pressure sensor

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant