CN112757302A - Control method of portable dining-assistant robot - Google Patents

Control method of portable dining-assistant robot

Info

Publication number
CN112757302A
CN112757302A (application CN202110012450.2A)
Authority
CN
China
Prior art keywords
assistant robot
mouth
dining
eyes
face recognition
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110012450.2A
Other languages
Chinese (zh)
Inventor
陈殿生 (CHEN Diansheng)
赵学毅 (ZHAO Xueyi)
袁福 (YUAN Fu)
罗亚哲 (LUO Yazhe)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beihang University
Original Assignee
Beihang University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beihang University
Priority application: CN202110012450.2A
Published as: CN112757302A
Legal status: Pending


Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/0003 Home robots, i.e. small robots for domestic use
    • B25J9/16 Programme controls
    • B25J9/1602 Programme controls characterised by the control system, structure, architecture
    • B25J9/1694 Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B25J9/1697 Vision controlled systems
    • B25J11/00 Manipulators not otherwise provided for
    • B25J11/008 Manipulators for service tasks
    • B25J11/009 Nursing, e.g. carrying sick persons, pushing wheelchairs, distributing drugs

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Nursing (AREA)
  • Image Processing (AREA)
  • Image Analysis (AREA)
  • Manipulator (AREA)

Abstract

The invention provides a control method for a portable dining-assistant robot based on face-recognition positioning and a stop-trigger mode. To identify the position of the user's mouth, the robot measures distance by binocular camera imaging with point calibration, thereby obtaining the spatial position of the mouth. To identify the user's intention, it detects whether the elderly user is nodding: on a nod, the robotic arm starts; on a head shake, the execution unit stops working, and interference from other movements such as trembling is rejected. The technique achieves a low-cost, high-efficiency positioning scheme.

Description

Control method of portable dining-assistant robot
Technical Field
The invention relates to the field of robotics, in particular to robot control, and discloses a control method based on face recognition and binocular ranging.
Background
The limited self-care ability of empty-nest elderly people is a pressing social problem. About nine million empty-nest elderly people have difficulty feeding themselves.
A dining-assistant robot designed for the elderly therefore has substantial market demand. Such a robot can strengthen the self-care ability of elderly users, ease their psychological burden, and lighten the load on their children. For economic reasons, the robot should be affordable and practical; given users' varying levels of education, it should also be easy to operate.
Disclosure of Invention
The invention provides a control method for a portable dining-assistant robot based on face-recognition positioning and a stop-trigger mode, solving the cost and ease-of-operation problems mentioned in the background art.
The control method of the portable dining-assistant robot is characterized in that the system comprises a camera group and a stop-trigger system.
The camera group collects images, recognizes the face, takes the lower half of the face as the region of interest, and then recognizes the mouth within it.
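The patent does not name a specific detector for this step; the following is a minimal sketch of face detection with a lower-half region of interest, using OpenCV's stock Haar cascades. The cascade choice and detection parameters are illustrative assumptions, not values from the patent.

```python
import cv2

# Stock cascades shipped with opencv-python; the detector actually used by the
# patent is not disclosed, so these are illustrative choices.
face_cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
mouth_cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_smile.xml")

def locate_mouth(frame_bgr):
    """Detect a face, search only its lower half, and return the mouth centre in pixels."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5):
        roi = gray[y + h // 2 : y + h, x : x + w]  # region of interest: lower half of the face
        for (mx, my, mw, mh) in mouth_cascade.detectMultiScale(roi, scaleFactor=1.3, minNeighbors=10):
            # Convert the ROI-relative detection back to full-image coordinates.
            return (x + mx + mw // 2, y + h // 2 + my + mh // 2)
    return None
```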
Preferably, after the mouth is recognized, the binocular ranging principle is applied: the cameras are first calibrated with Zhang Zhengyou's calibration method, and the position of the mouth is then located using a parallax principle analogous to human binocular vision.
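As a concrete illustration of the calibration step, here is a minimal sketch of Zhang's method via OpenCV, assuming a printed planar chessboard of known square size. The pattern size, square size, and function name are assumptions for illustration only.

```python
import cv2
import numpy as np

def calibrate_camera(images, pattern_size=(9, 6), square_mm=25.0):
    """Zhang-style intrinsic calibration from several views of a planar chessboard."""
    # 3-D corner coordinates of the flat board (Z = 0), in millimetres.
    objp = np.zeros((pattern_size[0] * pattern_size[1], 3), np.float32)
    objp[:, :2] = np.mgrid[0:pattern_size[0], 0:pattern_size[1]].T.reshape(-1, 2) * square_mm
    obj_pts, img_pts = [], []
    for img in images:
        gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
        found, corners = cv2.findChessboardCorners(gray, pattern_size)
        if found:
            obj_pts.append(objp)
            img_pts.append(corners)
    # K is the 3x3 intrinsic matrix; dist holds the lens-distortion coefficients.
    rms, K, dist, _, _ = cv2.calibrateCamera(obj_pts, img_pts, gray.shape[::-1], None, None)
    return K, dist
```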
Based on eye recognition, the stop-trigger system judges whether the elderly user is shaking his or her head by detecting the distance and frequency of eye movement per unit time; when a head shake is detected, the execution unit stops working.
The invention offers high recognition accuracy at low hardware cost; the elderly user can choose whether or not to eat simply by nodding or shaking the head, so operation is simple.
Drawings
FIG. 1 is a system architecture diagram of the present invention.
FIG. 2 is a flow chart of binocular positioning.
Detailed Description
The present invention is directed to a control method and a novel human-computer interaction scheme that address the problems identified in the background art.
The first technical scheme provided by the invention is how the dining-assistant robot recognizes the user's intention. The system detects the distance and frequency of eye movement per unit time to judge whether the elderly user is nodding, and starts the robotic arm on a nod; when a head shake is detected, the execution unit stops working, and interference from other motions such as trembling is rejected. The processing works as follows. First the face is calibrated, registering the positions of the eyes and of the eyeballs. A position-coordinate matrix and a translation matrix are then formed, and the moving pose of the eyeballs is judged by matrix transformation. Only a parallel movement whose left-right extent exceeds 4 mm counts as valid data; everything else is judged to be interference. Three dishes are arranged on the plate: when the eyeballs move left, the dish selection switches left; when they move right, it switches right.
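The patent gives only the 4 mm threshold; everything else in the sketch below (the window representation, the vertical-tolerance value, and the function and variable names) is an assumption used for illustration.

```python
import numpy as np

VALID_TRAVEL_MM = 4.0  # from the description: only parallel moves beyond 4 mm are valid

def classify_eye_motion(eye_positions_mm):
    """Classify a short window of eye-centre positions (N x 2 array, millimetres).

    Returns 'switch_left', 'switch_right', or 'interference' (e.g. trembling).
    """
    p = np.asarray(eye_positions_mm, dtype=float)
    dx = p[-1, 0] - p[0, 0]        # net horizontal travel over the window
    dy_spread = np.ptp(p[:, 1])    # vertical spread; large values suggest trembling
    # Valid data: a near-parallel horizontal movement beyond the 4 mm limit.
    if abs(dx) > VALID_TRAVEL_MM and dy_spread < 2.0:  # 2 mm tolerance is assumed
        return 'switch_left' if dx < 0 else 'switch_right'
    return 'interference'
```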
The second technical scheme provided by the invention is how the dining-assistant robot identifies the position of the user's mouth. The images collected by the cameras are first undistorted. Computing the disparity that a target point forms between the left and right views requires matching the point's two corresponding image points across the views. To shrink the search range, the epipolar constraint reduces this correspondence matching from a two-dimensional search to a one-dimensional one. Binocular rectification is then applied: it brings the two undistorted images into strict correspondence so that their epipolar lines lie exactly on the same horizontal lines. Any point in one image and its correspondence in the other then share the same row number, and the corresponding point can be found with a one-dimensional search along that row.
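A minimal sketch of this rectification step with OpenCV, assuming the stereo calibration (intrinsics K1, K2, distortion coefficients D1, D2, and the rotation R and translation T of the right camera relative to the left) is already known; the function name is a placeholder.

```python
import cv2

def build_rectify_maps(K1, D1, K2, D2, R, T, image_size):
    """Stereo rectification: after remapping, the epipolar lines of the two views
    lie on the same image rows, so correspondence search becomes one-dimensional."""
    R1, R2, P1, P2, Q, _, _ = cv2.stereoRectify(K1, D1, K2, D2, image_size, R, T)
    map_l = cv2.initUndistortRectifyMap(K1, D1, R1, P1, image_size, cv2.CV_32FC1)
    map_r = cv2.initUndistortRectifyMap(K2, D2, R2, P2, image_size, cv2.CV_32FC1)
    return map_l, map_r, P1, P2

# Usage: left_rect = cv2.remap(left_img, *map_l, interpolation=cv2.INTER_LINEAR)
```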
The binocular camera adopts a ranging principle similar to that of human eyes. Humans can perceive the distance of an object because the two eyes see slightly different images of the same object; this difference is called parallax, or disparity. The farther the object, the smaller the disparity; the nearer the object, the greater the disparity. The magnitude of the disparity therefore corresponds to the distance between the object and the cameras, from which the spatial position of the mouth is obtained.
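For rectified views this relation reduces to the standard pinhole-stereo formula Z = f * B / d, where f is the focal length in pixels, B the baseline between the cameras, and d the disparity in pixels. The numbers in the worked example below are assumed for illustration, not taken from the patent.

```python
def depth_from_disparity(fx_px, baseline_mm, disparity_px):
    """Z = f * B / d: larger disparity means a nearer object, as described above."""
    return fx_px * baseline_mm / disparity_px

# Assumed illustrative values: f = 700 px, B = 60 mm, d = 35 px
# -> Z = 700 * 60 / 35 = 1200 mm, i.e. the mouth is about 1.2 m from the cameras.
print(depth_from_disparity(700.0, 60.0, 35.0))  # 1200.0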
The invention obtains depth information through binocular ranging and then computes the coordinates of the mouth using the cameras' calibrated parameters. The intrinsic parameters of each camera are obtained first; the relative pose between the two cameras (that is, the translation vector t and the rotation matrix R of the right camera with respect to the left camera) is measured by calibration; and the three-dimensional coordinates at the measured depth are obtained through the transformation of the projection matrices.
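A sketch of the final triangulation, assuming the mouth centre has already been matched in both rectified views and using the 3x4 projection matrices P1 and P2 produced by the rectification step above; the function name is a placeholder.

```python
import cv2
import numpy as np

def mouth_xyz(P1, P2, uv_left, uv_right):
    """Triangulate the mouth centre; returns (X, Y, Z) in the left-camera frame."""
    pts4 = cv2.triangulatePoints(P1, P2,
                                 np.float32(uv_left).reshape(2, 1),
                                 np.float32(uv_right).reshape(2, 1))
    return (pts4[:3] / pts4[3]).ravel()  # homogeneous -> Euclidean coordinates
```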

Claims (4)

1. A control method of a portable dining-assistant robot, characterized in that the method is based on face-recognition positioning and a stop-trigger mode, and the system comprises a camera group and a stop-trigger system.
2. The control method of claim 1, wherein the camera group collects images, recognizes the face, takes the lower half of the face as the region of interest, and recognizes the mouth within it; after the mouth is recognized, the binocular ranging principle is applied, first calibrating with Zhang Zhengyou's method and then locating the position of the mouth using the human-eye parallax principle.
3. The control method of claim 1, wherein, based on eye recognition, the stop-trigger system judges whether the elderly user is shaking the head by detecting the distance and frequency of eye movement per unit time, and the execution unit stops working when a head shake is detected.
4. The control method of claim 1, wherein the face recognition determines whether to switch dishes by detecting the distance and frequency of eye movement per unit time; this function can help patients with quadriplegia or aphasia select suitable dishes.
CN202110012450.2A (filed 2021-01-06, priority 2021-01-06): Control method of portable dining-assistant robot. Published as CN112757302A; status: Pending.

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110012450.2A CN112757302A (en) 2021-01-06 2021-01-06 Control method of portable dining-assistant robot


Publications (1)

Publication Number Publication Date
CN112757302A (en) 2021-05-07

Family

ID=75701527

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110012450.2A Pending CN112757302A (en) 2021-01-06 2021-01-06 Control method of portable dining-assistant robot

Country Status (1)

Country Link
CN (1) CN112757302A (en)


Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020064438A1 (en) * 2000-05-08 2002-05-30 Osborne William Joseph Self-feeding apparatus with hover mode
JP2008125696A (en) * 2006-11-20 2008-06-05 Yamaguchi Univ Meal support system
US20130203024A1 (en) * 2011-10-10 2013-08-08 Jonathan P. Dekar Method and apparatus for monitoring food consumption by an individual
US20170095382A1 (en) * 2014-03-21 2017-04-06 Rensselaer Polytechnic Institute Mobile human-friendly assistive robot
CN109531590A (en) * 2018-11-28 2019-03-29 台州学院 A kind of dining assistant robot
CN109605385A (en) * 2018-11-28 2019-04-12 东南大学 A kind of rehabilitation auxiliary robot of mixing brain-computer interface driving
CN110827974A (en) * 2019-11-08 2020-02-21 上海第二工业大学 Intelligent auxiliary feeding nursing system and auxiliary feeding method thereof

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
李彦涛 (LI Yantao), et al.: "Research on a Real-Time Control System for a Meal-Assistance Robot Based on xPC", Rehabilitation Medicine Engineering *
李彦涛 (LI Yantao): "Prototype Development and Control Research of a Meal-Assistance Robot", China Doctoral Dissertations Full-Text Database, Information Science and Technology *


Legal Events

PB01: Publication
SE01: Entry into force of request for substantive examination
WD01: Invention patent application deemed withdrawn after publication (application publication date: 2021-05-07)