GB2464486A - Control of a robotic arm by the recognition and analysis of facial gestures. - Google Patents

Control of a robotic arm by the recognition and analysis of facial gestures.

Info

Publication number
GB2464486A
GB2464486A
Authority
GB
United Kingdom
Prior art keywords
disabled person
face
robotic arm
facial
mark
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB0818942A
Other versions
GB0818942D0 (en)
Inventor
Aaron Shafir
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shafir Production Systems Ltd
Original Assignee
Shafir Production Systems Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shafir Production Systems Ltd filed Critical Shafir Production Systems Ltd
Priority to GB0818942A priority Critical patent/GB2464486A/en
Publication of GB0818942D0 publication Critical patent/GB0818942D0/en
Priority to PCT/IB2009/054546 priority patent/WO2010044073A1/en
Publication of GB2464486A publication Critical patent/GB2464486A/en
Withdrawn legal-status Critical Current

Classifications

    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61FFILTERS IMPLANTABLE INTO BLOOD VESSELS; PROSTHESES; DEVICES PROVIDING PATENCY TO, OR PREVENTING COLLAPSING OF, TUBULAR STRUCTURES OF THE BODY, e.g. STENTS; ORTHOPAEDIC, NURSING OR CONTRACEPTIVE DEVICES; FOMENTATION; TREATMENT OR PROTECTION OF EYES OR EARS; BANDAGES, DRESSINGS OR ABSORBENT PADS; FIRST-AID KITS
    • A61F4/00Methods or devices enabling patients or disabled persons to operate an apparatus or a device not forming part of the body 
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/002Specific input/output arrangements not covered by G06F3/01 - G06F3/16
    • G06F3/005Input arrangements through a video camera
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06K9/00221
    • G06K9/00335
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20Movements or behaviour, e.g. gesture recognition

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Health & Medical Sciences (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Multimedia (AREA)
  • Vascular Medicine (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Animal Behavior & Ethology (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Social Psychology (AREA)
  • Psychiatry (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Manipulator (AREA)

Abstract

A system for aiding a disabled person in activities such as eating and drinking. The system includes a robotic arm; at least one camera to produce image(s) of the face of the disabled person; an image processing module that processes the image(s); and a control sub-system for controlling the position of the robotic arm based on repositioning of one or more facial elements by the disabled person. The facial elements can be naturally occurring facial features or marks applied to the face, for example stickers which may be visible to the camera(s) but transparent to the eye, or marks projected onto the face by the camera(s).

Description

SYSTEM AND METHOD FOR AIDING A DISABLED PERSON
FIELD OF THE INVENTION
The present invention relates to a system for aiding a disabled person, particularly a disabled person with limited use of the upper limbs.
BACKGROUND OF THE INVENTION
Disabled people have difficulty carrying out simple everyday activities such as eating and drinking.
To help such disabled persons, patent application JP09224965 discloses a meal support robot dedicated to handling food. A person with an upper-limb handicap can operate the robot independently: the robot is provided with a light-projecting part for projecting a directional light, a light-receiving part for receiving the light, and a holding part which is elastically deformed by contact with the operator, directing the light to a desired position on the light-receiving part.
U.S. patent 6,961,623 discloses a remote control method for use by a disabled person, e.g. to actuate a muscle stimulator cuff, in which a signal is triggered by detected mechanical vibrations to control the operation of a device or process.
SUMMARY OF THE INVENTION
According to one aspect of the present invention there is provided a system for aiding a disabled person comprising: at least one robotic arm; at least one camera for producing at least one image of the face of the disabled person; an image processing module for processing the image; and a control sub-system for controlling the position of the at least one robotic arm, based on repositioning of one or more facial elements by the disabled person.
The term "facial elements" herein the specification and claims is used in its broadest sense and includes: facial movement or gestures (e.g. eye, ears, mouth and nose; smiling, tongue movement, raising an eyebrow, and the like); head movements (e.g. nodding yes or no or tilting, etc). The term also relates to the use of features of the face including natural features (wrinkles, the tip of the nose, edge of a lip, and so on) and artificial features such as marks applied on the face (e.g. a sticker with an "x" written placed on the face or an "x" projected onto the face such as by a projector, laser, etc., which capability according to some embodiments is incorporated into the camera(s)).
According to another aspect, the present invention relates to a method for aiding a disabled person comprising detecting one or more facial elements of the disabled person and moving an appliance, attachable to a controllable robotic arm, in correspondence with the detected facial movement.
The present system and method can be used for a variety of activities including eating/feeding, drawing, writing, teeth brushing and the like.
BRIEF DESCRIPTION OF THE DRAWINGS
The invention will be understood more clearly and other features and advantages will become apparent from the following detailed description of exemplary embodiments, with reference to the following figures, wherein: Fig. 1 is an isometric view of an exemplary system in accordance with the present invention as operated by a person in a wheelchair; Fig. 2A is an isometric front view of the disabled person's face; Fig. 2B is an isometric profile view of the disabled person's face; Fig. 3 is an isometric side view of a robotic arm; Fig. 4A is a front view of the disabled person's face including arrows designating the person's facial movements; and Fig. 4B is a profile view of the disabled person's face including arrows designating the person's facial movements.
DESCRIPTION OF EMBODIMENTS OF THE PRESENT INVENTION
Fig. 1 illustrates an embodiment of a system for aiding a disabled person in accordance with the present invention, operated by the disabled person seated, for example, in a chair or wheelchair 32. A digital camera 34 is attached to a chair support 36 of chair 32. Resting upon a table 38 are a food plate 40, a robotic arm 42, and a storage means 44 which stores appliances that can be attached to a robotic catch 45 disposed on an end 46 of the robotic arm. Examples of such appliances include a fork, a cup and so forth. Robotic arm 42 has freedom to move along one or more axes and preferably about those axes. Examples of such robotic arms are the VP 5-axis series and VP 6-axis series arms (DENSO Robotics, 3900 Via Oro Avenue, Long Beach, CA 90810, USA), which provide five and six degrees of freedom of movement, respectively.
Typically, an additional camera 48 is attached to table 38. Camera 48 images a front view of face 58 of disabled person 30, and camera 34 is used for imaging the profile of the disabled person's face. The digital images are stored in a digital memory means, not shown, for further processing.
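By way of illustration only (this sketch is not part of the original specification), the paired capture of front and profile views might look as follows in Python, using the OpenCV library as an assumed implementation choice; the device indices, frame count and buffer length are hypothetical.

    import collections

    import cv2  # OpenCV is an assumed choice; the patent does not name a library

    FRONT_CAM, PROFILE_CAM = 0, 1  # hypothetical device indices for cameras 48 and 34
    # the deque stands in for the "digital memory means" that stores the images
    frame_buffer = collections.deque(maxlen=30)

    front = cv2.VideoCapture(FRONT_CAM)
    profile = cv2.VideoCapture(PROFILE_CAM)
    for _ in range(300):  # capture a short burst of paired frames
        ok_f, front_img = front.read()
        ok_p, profile_img = profile.read()
        if not (ok_f and ok_p):
            break  # a camera failed to deliver a frame; stop capturing
        frame_buffer.append((front_img, profile_img))  # paired views for later processing
    front.release()
    profile.release()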
According to some embodiments, the system further includes a voice recognition sub-system, not shown, for recognizing voice commands of disabled person 30. An example of such a voice command is "change tool", which commands robotic arm 42 to substitute the appliance currently attached to catch 45 with another appliance stored in storage means 44. The command "open catch", for example, is another voice command, which commands catch 45 of robotic arm 42 to open.
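A minimal sketch of how such voice-command dispatch might work, assuming the recognizer has already produced a text transcript (the speech recognition step itself is not sketched); the handler functions are hypothetical placeholders for commands sent to the arm.

    def change_tool():
        # placeholder: command robotic arm 42 to swap the appliance on catch 45
        # for another appliance held in storage means 44
        print("changing tool")

    def open_catch():
        # placeholder: command catch 45 of robotic arm 42 to open
        print("opening catch")

    # the two commands named in the description; a real system may define more
    VOICE_COMMANDS = {
        "change tool": change_tool,
        "open catch": open_catch,
    }

    def handle_utterance(transcript):
        """Dispatch a recognized utterance to the matching arm command, if any."""
        action = VOICE_COMMANDS.get(transcript.strip().lower())
        if action is not None:
            action()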
An isometric front view and an isometric profile view of disabled person 30 are shown in Figs. 2A and 2B respectively, to which reference is now made.
In this example, four marks 60, 62, 64 and 66 are placed on the face 58 of the disabled person 30 (e.g. via stickers or projected thereon). These marks can be natural marks appearing on face 58, facial gestures, or marks artificially placed on the disabled person's face 58. Marks 60, 62, 64 and 66 are detected by cameras 34 and 48 as known in the art, implemented by an image processing module, not shown. Examples of useful natural marks are wrinkles and moles.
In some embodiments of the present invention, a transparent sticker which is invisible to the human eye but is detectable by cameras 34 and 48 can be applied to the disabled person's face 58. Mark 60 is positioned on the forehead of disabled person 30. Movement of the disabled person's forehead is accompanied by a corresponding movement of mark 60. Mark 64 is disposed on the chin of the disabled person. When the disabled person moves his chin, mark 64 moves as well. Marks 62 and 66 are positioned on the cheeks of the disabled person. In some embodiments of the present invention, marks 62 and 66 are moved by the disabled person moving his tongue towards his right and left cheeks, respectively.
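One plausible way for the image processing module to locate such a mark in a frame is normalized template matching, sketched below with OpenCV; the template image of the mark and the acceptance threshold are assumptions, and a real system might instead use blob, feature or infrared detection.

    import cv2
    import numpy as np

    def find_mark(frame_gray, template_gray, threshold=0.8):
        """Return the (x, y) pixel centre of the best template match, or None.

        Both inputs are single-channel grayscale images; the 0.8 acceptance
        threshold is an illustrative assumption.
        """
        result = cv2.matchTemplate(frame_gray, template_gray, cv2.TM_CCOEFF_NORMED)
        _, max_val, _, max_loc = cv2.minMaxLoc(result)
        if max_val < threshold:
            return None  # the mark is not visible in this frame
        h, w = template_gray.shape
        return (max_loc[0] + w // 2, max_loc[1] + h // 2)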
A side view of robotic arm 42 is shown in Fig. 3, to which reference is now made. Robotic arm 42 can be described as being controllable with reference to a Cartesian grid. Double-headed arrow 70 designates the movement direction of robotic arm 42 along the x-axis. Double-headed arrow 72 designates the movement direction of robotic arm 42 along the z-axis. Double-headed arrow 74 designates the movement direction of robotic arm 42 along the y-axis. Rotational movements around axes 70, 72 and 74 are designated by arrows 76, 78 and 80, respectively.
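The six motion directions 70-80 can be gathered into a small command structure, as in the sketch below; this is an assumed representation for illustration, not an interface of the DENSO arms mentioned earlier.

    from dataclasses import dataclass

    @dataclass
    class ArmMotion:
        """One incremental Cartesian motion of robotic arm 42.

        dx, dy and dz follow arrows 70, 74 and 72 (x-, y- and z-axis);
        rx, ry and rz follow rotations 76, 80 and 78 about those axes.
        """
        dx: float = 0.0
        dy: float = 0.0
        dz: float = 0.0
        rx: float = 0.0
        ry: float = 0.0
        rz: float = 0.0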
A front view and a profile view of the face 58 of disabled person 30, including arrows describing the facial movements, are shown in Figs. 4A and 4B, to which reference is now made. The movements of the disabled person's head are analyzed by determining the distance between any of marks 60, 62, 64 and 66 before and after a facial movement or facial gesture is performed. Nodding of the disabled person's head up and down, in the direction designated by double-headed arrow 90, is accompanied by a corresponding movement of marks 60 and 64. Sideways turning of the disabled person's head, in the direction designated by double-headed arrow 92, is accompanied by a corresponding movement of marks 60 and 64 to the left or to the right.
Sideways tilting of the disabled person's head, in the direction designated by double-headed arrow 94, is accompanied by a corresponding movement of mark 60 to the right and mark 64 to the left, or of mark 60 to the left and mark 64 to the right. Movement of the disabled person's chin in the direction designated by double-headed arrow 96 moves mark 64 up or down with respect to mark 60. Movement of the disabled person's chin in the direction designated by arrows 98 is accompanied by a corresponding movement of mark 64 left or right with respect to mark 60. A movement of the disabled person's tongue in the direction designated by arrow 100 can cause mark 66 to move to the left with respect to marks 60 and 64. A movement of the disabled person's tongue in the direction designated by arrow 102 moves mark 62 to the right with respect to marks 60 and 64.
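Under the mark-tracking assumptions of the earlier sketches, this distance analysis can be expressed as comparing mark positions before and after a movement; the sketch below classifies only three of the gestures, and the pixel tolerance and gesture labels are illustrative assumptions.

    def classify_gesture(before, after, tol=5.0):
        """Name the gesture from mark positions before and after a movement.

        `before` and `after` map mark ids (60, 62, 64, 66) to (x, y) pixel
        coordinates; only three gestures are shown here.
        """
        dx60 = after[60][0] - before[60][0]
        dy60 = after[60][1] - before[60][1]
        dy64 = after[64][1] - before[64][1]
        if abs(dy60) > tol and abs(dy64) > tol and dy60 * dy64 > 0:
            return "nod"           # marks 60 and 64 move together vertically (arrow 90)
        if abs(dx60) > tol:
            return "turn"          # sideways head turn (arrow 92)
        if abs(dy64 - dy60) > tol:
            return "chin_up_down"  # mark 64 moves vertically relative to mark 60 (arrow 96)
        return None                # no recognized gesture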
A control sub-system, not shown, controls the movement of robotic arm 42. Robotic arm 42 is controlled/driven by analyzing the aforementioned movements of the head of disabled person 30 and issuing commands to driving mechanisms, not shown, for controlling the robotic arm (Fig. 3). Head movement in the directions designated by double-headed arrow 90 will actuate robotic arm 42 in the directions designated by double-headed arrow 70. A movement in direction 92 will actuate robotic arm 42 in the direction designated by double-headed arrow 72. A movement in direction 96 will actuate robotic arm 42 in the direction designated by double-headed arrow 74. A movement in direction 98 will actuate robotic arm 42 in the direction designated by arrow 76.
A movement in direction 94 will actuate robotic arm 42 in the direction designated by arrow 78. A movement in direction 100 or direction 102 will actuate robotic arm 42 in the direction designated by arrow 80.
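Given the ArmMotion structure sketched after Fig. 3 and the gesture labels sketched above, the correspondence between head movements 90-102 and arm movements 70-80 reduces to a lookup table; the step size and signs are hypothetical gains, and in practice the sign would follow the direction of the detected movement.

    STEP = 10.0  # hypothetical gain per detected gesture (mm or degrees)

    # arrow 90 -> 70, 92 -> 72, 96 -> 74, 98 -> 76, 94 -> 78, 100/102 -> 80
    GESTURE_TO_MOTION = {
        "nod":           ArmMotion(dx=STEP),
        "turn":          ArmMotion(dz=STEP),
        "chin_up_down":  ArmMotion(dy=STEP),
        "chin_sideways": ArmMotion(rx=STEP),
        "tilt":          ArmMotion(rz=STEP),
        "tongue":        ArmMotion(ry=STEP),
    }

    def command_for(gesture):
        """Look up the arm motion corresponding to a detected gesture."""
        return GESTURE_TO_MOTION.get(gesture)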
In some embodiments of the present invention, marks 60, 62, 64 and 66 can be projected onto the person's face 58 by one of the cameras 34 and 48 or by an additional component, and robotic arm 42 is controlled by the relative movement of the face and the mark.

Claims (13)

  1. A system for aiding a disabled person comprising: at least one robotic arm; at least one camera for producing at least one image of the face of said disabled person; an image processing module for processing said image; and a control sub-system for controlling the position of said at least one robotic arm, based on repositioning of one or more facial elements by said disabled person.
  2. A system as in claim 1, comprising two cameras, wherein one camera images a front view of the face of said disabled person and a second camera images a profile of the face of said disabled person.
  3. A system as in claim 1, wherein said system further comprises storage means which stores appliances that can be attached to said at least one robotic arm.
  4. A system as in claim 1, wherein said robotic arm is adapted to move in a plurality of axes.
  5. A system as in claim 1, further comprising a voice recognition sub-system for recognizing voice commands of said disabled person.
  6. A system as in claim 1, wherein the facial element is a natural mark existing on the disabled person's face.
  7. A system as in claim 6, wherein said artificial mark is a transparent sticker which is invisible to a human eye but is detectable by said camera(s).
  8. A system as in claim 1, wherein the facial element(s) are applied on the face of the disabled person.
  9. A system as in claim 8, wherein the applied facial element(s) are mark(s) disposed on the face of the disabled person via a sticker.
  10. A system as in claim 8, wherein the applied facial element(s) are mark(s) projected on the face of the disabled person.
  11. A system as in claim 10, wherein the camera(s) are adapted to project the mark.
  12. A method for aiding a disabled person comprising detecting one or more facial elements of said disabled person and moving a robotic arm corresponding to movement of the detected facial element(s).
  13. A method according to claim 12, wherein prior to detecting one or more facial elements, a facial element is applied to the face of the disabled person.
GB0818942A 2008-10-16 2008-10-16 Control of a robotic arm by the recognition and analysis of facial gestures. Withdrawn GB2464486A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
GB0818942A GB2464486A (en) 2008-10-16 2008-10-16 Control of a robotic arm by the recognition and analysis of facial gestures.
PCT/IB2009/054546 WO2010044073A1 (en) 2008-10-16 2009-10-15 System and method for aiding a disabled person

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB0818942A GB2464486A (en) 2008-10-16 2008-10-16 Control of a robotic arm by the recognition and analysis of facial gestures.

Publications (2)

Publication Number Publication Date
GB0818942D0 GB0818942D0 (en) 2008-11-19
GB2464486A true GB2464486A (en) 2010-04-21

Family

ID=40084106

Family Applications (1)

Application Number Title Priority Date Filing Date
GB0818942A Withdrawn GB2464486A (en) 2008-10-16 2008-10-16 Control of a robotic arm by the recognition and analysis of facial gestures.

Country Status (2)

Country Link
GB (1) GB2464486A (en)
WO (1) WO2010044073A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102720249A (en) * 2011-03-29 2012-10-10 梁剑文 Washbowl with mechanic hand

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5323470A (en) * 1992-05-08 1994-06-21 Atsushi Kara Method and apparatus for automatically tracking an object
US5463463A (en) * 1994-01-25 1995-10-31 Mts System Corporation Optical motion sensor
US20020126090A1 (en) * 2001-01-18 2002-09-12 International Business Machines Corporation Navigating and selecting a portion of a screen by utilizing a state of an object as viewed by a camera
JP2008522708A (en) * 2004-12-07 2008-07-03 Tylerton International Inc. Apparatus and methods for training, rehabilitation, and/or support
JP5258558B2 (en) * 2005-05-31 2013-08-07 Koninklijke Philips Electronics N.V. Method for control of equipment

Patent Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4207959A (en) * 1978-06-02 1980-06-17 New York University Wheelchair mounted control apparatus
EP1055224A2 (en) * 1996-12-09 2000-11-29 Tracer Round Associates, Ltd. Wheelchair voice control apparatus
US6163322A (en) * 1998-01-19 2000-12-19 Taarna Studios Inc. Method and apparatus for providing real-time animation utilizing a database of postures
WO1999056274A1 (en) * 1998-04-28 1999-11-04 Deluca Michael J Vision pointer method and apparatus
WO1999064961A1 (en) * 1998-06-08 1999-12-16 Microsoft Corporation Method and system for capturing and representing 3d geometry, color and shading of facial expressions
EP1179765A2 (en) * 2000-07-06 2002-02-13 Universita Di Modena E Reggio Emilia, Dipartimento Di Scienze Dell Ingegneria, Dipartimento Di Scienze Biomediche System for the interaction between the ocular movement of a subject and a personal computer
US20040174496A1 (en) * 2003-03-06 2004-09-09 Qiang Ji Calibration-free gaze tracking under natural head movement
US20040179008A1 (en) * 2003-03-13 2004-09-16 Sony Corporation System and method for capturing facial and body motion
EP1667049A2 (en) * 2004-12-03 2006-06-07 Invacare International Sàrl Facial feature analysis system
US20070217891A1 (en) * 2006-03-15 2007-09-20 Charles Folcik Robotic feeding system for physically challenged persons
JP2007310914A (en) * 2007-08-31 2007-11-29 Nippon Telegr & Teleph Corp <Ntt> Mouse substitution method, mouse substitution program and recording medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Bley, F. et al., "Supervised navigation and manipulation for impaired wheelchair users", Systems, Man and Cybernetics, vol. 3, pp. 2790-2796, 10 October 2004, ISBN 978-0-7803-8566-5 *

Also Published As

Publication number Publication date
WO2010044073A1 (en) 2010-04-22
GB0818942D0 (en) 2008-11-19

Similar Documents

Publication Publication Date Title
US9517559B2 (en) Robot control system, robot control method and output control method
JP2019535928A (en) Eyelash evaluation method and apparatus
US20070265495A1 (en) Method and apparatus for field of view tracking
CN109571513B (en) Immersive mobile grabbing service robot system
JP5186723B2 (en) Communication robot system and communication robot gaze control method
Maimon-Mor et al. Towards free 3D end-point control for robotic-assisted human reaching using binocular eye tracking
KR20110098966A (en) Electronic data input system
WO2004041078A3 (en) Housing device for head-worn image recording and method for control of the housing device
WO2019087495A1 (en) Information processing device, information processing method, and program
CN106214163B Artificial psychological counseling device for postoperative recovery from lower-limb deformity correction
JP2024009862A (en) Information processing apparatus, information processing method, and program
Li et al. An egocentric computer vision based co-robot wheelchair
JPH1080886A (en) Vision control robot
Buckley et al. Sensor suites for assistive arm prosthetics
GB2464486A (en) Control of a robotic arm by the recognition and analysis of facial gestures.
KR20110049703A (en) Surgical robot system and laparoscope handling method thereof
Ababneh et al. Gesture controlled mobile robotic arm for elderly and wheelchair people assistance using kinect sensor
US20220257441A1 (en) Nursing bed system and nursing bed posture changing device
Chu et al. Hands-free assistive manipulator using augmented reality and tongue drive system
JP6158665B2 (en) Robot, robot control method, and robot control program
KR101114234B1 (en) Surgical robot system and laparoscope handling method thereof
Sharma et al. Eye gaze controlled robotic arm for persons with ssmi
JP2003266353A (en) Robot device and control method therefor
JP2004227276A (en) Human communication behavior recording system and method
CN206671687U A kind of head-mounted intelligent display

Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)