WO2021118388A1 - Object manipulation system - Google Patents

Object manipulation system

Info

Publication number
WO2021118388A1
WO2021118388A1 (PCT/RU2019/000928)
Authority
WO
WIPO (PCT)
Prior art keywords
manipulator
module
unit
control
user
Prior art date
Application number
PCT/RU2019/000928
Other languages
English (en)
Russian (ru)
Inventor
Андрей Владимирович НОВИКОВ
Владимир Николаевич ГЕРАСИМОВ
Роман Александрович ГОРБАЧЕВ
Никита Евгеньевич ШВИНДТ
Владимир Иванович НОВИКОВ
Андрей Евгеньевич ЕФРЕМЕНКО
Дмитрий Леонидович ШИШКОВ
Михаил Нилович ЗАРИПОВ
Филипп Александрович КОЗИН
Алексей Михайлович СТАРОСТЕНКО
Original Assignee
федеральное государственное автономное образовательное учреждение высшего образования "Московский физико-технический институт (национальный исследовательский университет)"
Общество С Ограниченной Ответственностью "Нейроассистивные Технологии"
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by федеральное государственное автономное образовательное учреждение высшего образования "Московский физико-технический институт (национальный исследовательский университет)", Общество С Ограниченной Ответственностью "Нейроассистивные Технологии" filed Critical федеральное государственное автономное образовательное учреждение высшего образования "Московский физико-технический институт (национальный исследовательский университет)"
Priority to PCT/RU2019/000928 priority Critical patent/WO2021118388A1/fr
Publication of WO2021118388A1 publication Critical patent/WO2021118388A1/fr

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J13/00Controls for manipulators
    • B25J13/08Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J19/00Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
    • B25J19/02Sensing devices
    • B25J19/04Viewing devices
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1656Programme controls characterised by programming, planning systems for manipulators

Definitions

  • the invention relates to robotics, in particular to robotic systems for manipulating objects by means of a manipulator with a gripping device, and can be used, inter alia, to help people with impaired motor functions caused by the consequences of brain and spinal cord injuries, stroke, neurodegenerative diseases and other conditions.
  • the declared object manipulation system performs the functions of gripping, holding and moving objects in the environment of a user with limited or impaired motor functions, often moving in a wheelchair.
  • the system can have a modular architecture, personalized for a specific user, and can be used in a hospital or at home.
  • manipulator must be capable of gripping objects of various shapes under the control of the user, while it must be protected from adverse factors such as dust and water.
  • RF patent RU2700246 discloses a method and a system for gripping an object using a robotic device, in which the robotic device is trained using a machine learning algorithm to recognize and memorize the gripping points of objects. Training is performed on data characterizing photographic images of objects and the corresponding three-dimensional models of objects in various views. This makes it possible to improve the accuracy with which the robotic device recognizes the gripping area of an object. Even so, the recognition accuracy of the gripping area may be insufficient for the known method and system to be used by a user with limited or impaired motor functions. In addition, such a system requires rather long training, and no means of choosing one manipulation object from among several objects is provided at all.
  • a robot operating in a work environment can be instructed to determine the work offset.
  • Work offset can describe the location and angular orientation of the work plane of the work environment relative to the base plane of the robot.
  • the robot can identify the work plane.
  • the robot can be controlled to contact one or more points on the work plane.
  • the robot can determine the corresponding locations of the contact points relative to the reference plane based on the corresponding positions of the robot at the respective moments of contact.
  • the robot can determine the location and angular orientation of the work plane relative to the reference plane based on the determined locations of the contact points.
  • U.S. Patent Application US2019240843 describes a method for manipulating a deformable object, which includes determining the appropriate three-dimensional position of one or more markers on a deformable object held by a robotic arm, as well as determining a deformable model of the deformable object by displaying the movement of the robot arm and the movement of one or more markers.
  • a robotic arm is controlled based on a defined deformation model to manipulate the deformable object and move one or more markers to the appropriate position.
  • US patent application US2019143506 describes a system comprising a robotic arm that includes a kinematic chain containing a first joint, a first link and a second link, the second link lying between the first joint and the first link in the chain.
  • the system further includes a processing unit including one or more processors.
  • the processing unit is configured to receive first-joint data from a first sensor system located at the first joint, to generate a first estimate of the state of the first joint based on the first-joint data and a kinematic model of the robotic arm, and to control the first joint based on that first estimate of the joint state.
  • the technical result of the invention is to provide high accuracy in identifying the object of manipulation among a variety of surrounding objects, combined with a compact and lightweight design and an unambiguous determination of the user's intention to manipulate the selected objects.
  • the object manipulation system comprises a manipulator module, at least one gripper module, a control module, a user interface module, and a fastening module.
  • the manipulator module contains at least one manipulator drive, which includes a manipulator motor and a manipulator gear.
  • Gripper module is installed on the manipulator module and contains at least one gripper drive including the gripper motor.
  • the control module contains a control unit for the manipulator module, which controls the manipulator drive and executes the manipulator module control algorithm, as well as a control unit for the gripper module, which controls the gripper drive and executes the gripper module control algorithm.
  • the user interface module is designed to control the manipulation system, to select an object and to perform an operation on it. It contains an eye tracking unit for determining the direction of the user's gaze; a computer vision unit for performing a stereo reconstruction of the area around the object (that is, the area within the manipulator's working zone where the manipulation object may be located), constructing a disparity map and determining the characteristics of the object, including at least its geometric dimensions and the distance to it; and a neuro-headset unit for recording the bioelectric potentials of the user's brain activity and transferring them to the control module.
  • the attachment module is designed to attach the manipulator module to an external device, such as a wheelchair.
  • the declared object manipulation system makes it possible to determine the objects that the user wants to control and to actually control them, where control is understood to include, in particular, gripping an object, holding it and moving it.
  • an absolute angle sensor can be installed on the manipulator motor shaft.
  • control module may additionally contain PID controllers for current, speed, position and torque.
  • An absolute angle sensor can be installed on the output shaft of the manipulator gearbox.
  • the eye tracking unit is configured to determine the coordinates of the gaze direction vector.
  • the eye tracking unit can be configured to determine the coordinates of the intersection of the gaze direction vector with the plane in which the object is located.
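Determining where the gaze direction vector meets the plane in which the object lies reduces to a standard ray-plane intersection. A minimal sketch follows; the function name, coordinate conventions and numeric values are illustrative assumptions, not taken from the patent:

```python
def gaze_plane_intersection(eye_pos, gaze_dir, plane_point, plane_normal):
    """Point where the user's gaze ray meets the object plane, or None if
    the gaze is parallel to the plane or the plane lies behind the user."""
    dot = lambda a, b: sum(x * y for x, y in zip(a, b))
    sub = lambda a, b: [x - y for x, y in zip(a, b)]
    denom = dot(plane_normal, gaze_dir)
    if abs(denom) < 1e-9:
        return None  # gaze is parallel to the plane
    t = dot(plane_normal, sub(plane_point, eye_pos)) / denom
    if t < 0:
        return None  # plane is behind the user
    return [e + t * d for e, d in zip(eye_pos, gaze_dir)]

# Example: user at the origin looking along +Z, object plane at z = 2 m
p = gaze_plane_intersection([0, 0, 0], [0, 0, 1], [0, 0, 2], [0, 0, 1])
# p == [0.0, 0.0, 2.0]
```

In practice the eye tracking unit would supply `gaze_dir` in the camera frame and the vision unit would supply the plane parameters from its stereo reconstruction.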
  • the vision unit is configured to perform stereo reconstruction of the area around the object with a given frequency.
  • the technical vision unit contains a stereo pair, including the stereo pair cameras, a stereo image processing unit and a unit for matching images of the area around the object obtained from different angles from the stereo pair cameras and from the eye tracking unit camera.
  • the technical vision unit is preferably located above the head of the operator, while the eye tracking unit is mounted in glasses worn on the operator's head; in this way two images are obtained from different angles.
  • the headset unit can contain dry active electrodes for recording bioelectric potentials of the user's brain activity.
  • MM 1 - manipulator module
  • MZU 2 - gripper module
  • MU 3 - control module
  • BUMM 3.1 - manipulator module control unit
  • MIP 4 - user interface module
  • BTG 4.1 - eye tracking unit
  • the object manipulation system is designed to perform operations on objects, in particular, to grip, hold and move objects.
  • the manipulation system should allow the specified operations to be performed by a user with limited mobility, in particular, with limited mobility of the upper limbs, and at the same time guarantee the exact execution of exactly those movements that the user wishes to perform.
  • the manipulation system contains a manipulator module 1 (also MM), at least one gripper module 2 (also MZU), a control module 3 (also MU), a user interface module 4 (also MIP) and a fastening module 5 (also MK).
  • MM 1 is designed to move MZU 2 in space and can be installed on any object, including a wheelchair, by means of MK 5 installed at or near one end of MM 1.
  • any known design of a robotic arm can be selected, for example one described in RU2700246 or US2007095582.
  • Preferably, the number of links of MM 1 is from four to six: a smaller number may not provide accurate and fast positioning of MZU 2 in a space with many objects, while a larger number not only complicates the design of the entire manipulation system but also increases its weight, which is critical, in particular, when it is used on a wheelchair.
  • US2007095582 also shows a possible option for attaching MM 1 to a wheelchair by means of MK 5.
  • MM 1 contains at least one manipulator drive, which includes a manipulator motor and a manipulator gearbox (not shown in the figure). It is preferable, but not necessary, that each link of MM 1 contains its own manipulator drive, which makes it possible to position MZU 2 in space more accurately.
  • An angle sensor can additionally be installed on the motor shaft of one or more manipulator drives. Even when precision manipulator motors are used, positioning errors of the MM 1 links are possible; these are especially critical, in a confined space and/or among a large number of objects, for the link farthest from MK 5, on which MZU 2 is installed.
  • the presence of a rotation angle sensor on the manipulator motor shaft allows precise control of the shaft rotation angle. In this case, the most preferable is the use of an absolute encoder of the angle of rotation, which avoids the accumulation of errors in determining the angle of rotation of the shaft.
  • As for the output shaft of the manipulator gearbox of one or more manipulator drives, the higher the accuracy of determining the rotation of this shaft, the more accurately MZU 2 is positioned in space. Therefore, here too it is preferable to install a rotation angle sensor on the output shaft of the manipulator gearbox, most preferably an absolute angle sensor, which avoids accumulation of errors in determining the shaft rotation angle.
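The point of an absolute encoder is that each reading directly encodes the shaft position, so no error accumulates between readings. A small sketch of converting a motor-shaft encoder reading to a joint angle through the gearbox; the counts-per-revolution and gear-ratio values are illustrative assumptions:

```python
def joint_angle_deg(encoder_count, counts_per_rev=4096, gear_ratio=100.0):
    """Joint angle implied by an absolute encoder on the motor shaft.

    An absolute encoder reports the shaft position directly, so errors do
    not accumulate between readings; the gearbox divides the motor-shaft
    angle by its reduction ratio to give the joint (output-shaft) angle.
    """
    motor_angle_deg = 360.0 * encoder_count / counts_per_rev
    return motor_angle_deg / gear_ratio

# Half a motor revolution (2048 of 4096 counts) through a 100:1 gearbox
angle = joint_angle_deg(2048)  # 1.8 degrees at the joint
```

An encoder on the gearbox output shaft, as the text prefers, would report the joint angle directly and also remove any gearbox backlash from the measurement.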
  • MZU 2 is installed at or near the end of MM 1 opposite MK 5 and is designed to grip and hold the objects to be manipulated. Any known device of similar purpose, for example one described in RU118579 or US8936289, can be selected as MZU 2. It is preferable if MZU 2 is designed so that a single gripper drive, including the gripper motor, is sufficient to control the fingers of MZU 2. This gives MZU 2 compactness and low weight, and hence accurate and fast positioning in a space with many objects.
  • MU 3 is an electronic device designed to execute the control algorithm of MM 1 and thereby control MM 1, as well as to execute the control algorithm of MZU 2 and thereby control MZU 2.
  • MU 3 contains, respectively, a control unit for the manipulator module 3.1 (also BUMM) and a control unit for the gripper module 3.2 (also BUMZU).
  • MU 3 can be an electronic device based on a 32-bit microcontroller, for example, STM32F103C8T6.
  • On the MU 3 board there are voltage stabilizers, a CAN bus driver, drivers for controlling the manipulator motors (in BUMM 3.1), drivers for controlling the gripper motors (in BUMZU 3.2), ports for connecting absolute position and speed sensors (if any are used in MM 1 and/or MZU 2), an expansion port for connecting external modules, and interfaces to encoders and strain gauges.
  • BUMM 3.1 and BUMZU 3.2 can share the specified microcontroller to execute the MM 1 control algorithm and the MZU 2 control algorithm.
  • MU 3 additionally contains proportional-integral-derivative controllers (PID controllers) for current, speed, position and torque to control the manipulator motors and the gripper motors.
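Each loop of such a current/speed/position/torque cascade is an ordinary discrete PID controller. A minimal sketch, with illustrative gains and sample time (the patent does not specify an implementation):

```python
class PID:
    """Discrete PID controller: one loop of the cascade described above.

    update() returns the control output for a given setpoint and
    measurement, sampled every `dt` seconds.
    """
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measurement):
        error = setpoint - measurement
        self.integral += error * self.dt          # accumulate the I term
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return (self.kp * error
                + self.ki * self.integral
                + self.kd * derivative)

# Pure proportional loop: output is kp * error
pos_loop = PID(kp=2.0, ki=0.0, kd=0.0, dt=0.01)
u = pos_loop.update(setpoint=1.0, measurement=0.0)  # u == 2.0
```

In a cascade, the position loop's output typically becomes the speed loop's setpoint, whose output in turn feeds the current (torque) loop.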
  • MIP 4 contains an eye tracking unit 4.1 (also BTG), a technical vision unit 4.2 (also BTZ), and a neuro-headset unit 4.3 (also BN).
  • BTG 4.1 is designed to determine the direction of the user's gaze, which is then used to accomplish the task of user interaction with the manipulation system.
  • BTG 4.1, by means of the camera (or cameras) included in it, can determine the coordinates of the gaze direction vector and, additionally, the coordinates of the intersection of the gaze direction vector with the plane in which the manipulation object is located.
  • BTZ 4.2 carries out a stereo reconstruction of the area around the object, that is, the area within the movement zone of MZU 2 in which the manipulation object can be located. Stereo reconstruction methods are widely presented in the prior art and need no further explanation here. The stereo reconstruction can be carried out at a given frequency. BTZ 4.2 also builds a disparity map, that is, a map of the relative displacement of corresponding points in the left and right images of the stereo pair. In addition, BTZ 4.2 determines the characteristics of the manipulated object, such as its geometric dimensions and the distance to it.
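The distance to the object follows from the disparity map via the standard rectified-stereo relation Z = f·B/d. A sketch under assumed camera parameters (the patent does not give focal length or baseline):

```python
def depth_from_disparity(disparity_px, focal_px, baseline_m):
    """Depth of a point from its disparity in a rectified stereo pair.

    Z = f * B / d, with the focal length f in pixels, the camera baseline
    B in metres and the disparity d in pixels. Larger disparity means the
    point is closer to the cameras.
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_m / disparity_px

# Assumed parameters: f = 800 px, baseline = 10 cm, disparity = 40 px
z = depth_from_disparity(40, focal_px=800, baseline_m=0.1)  # 2.0 m
```

Applying this per pixel of the disparity map yields the 3D reconstruction from which the object's dimensions and its distance from MZU 2 can be measured.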
  • BTZ 4.2 may contain a stereo pair including the stereo pair cameras, a stereo image processing unit and a unit for matching images of the area around the object obtained from different angles from the stereo pair cameras and from the BTG 4.1 camera. The different angles are achieved in particular by placing BTZ 4.2 approximately above the user's head and BTG 4.1 in glasses, thus providing two images of the area around the object from different angles.
  • BN 4.3 is designed to record bioelectric potentials of the user's brain activity and transfer them to MU 3.
  • BN 4.3 can contain dry active electrodes.
  • the user fixes attention on the object with which he wants to perform manipulations.
  • BTG 4.1 determines the direction of the user's gaze.
  • BTZ 4.2, through its cameras, for example the stereo pair cameras, captures the surrounding environment and, based on data from BTG 4.1, calculates the fixation point of the user's gaze on the resulting image, recognizes the object at this point, and determines the distance from the user's head to the object and the distance from MZU 2 to the object.
  • the user chooses the action he wants to perform with the recognized object, for example, move the object from one point in space to another, press a key or switch, etc.
  • Recognition of the user's intention in the claimed manipulation system can be implemented in different ways.
  • the first option for recognizing the user's intention is carried out by means of BN 4.3, in which signals from the surface electroencephalogram are recorded.
  • the second option for recognizing the user's intention is carried out by means of BTG 4.1: the user selects one of the proposed actions by fixing his gaze on one of the icons displayed on the optically transparent augmented reality display of BTG 4.1.
  • BTG 4.1 and BTZ 4.2 jointly define a new point.
  • MU 3, by means of BUMM 3.1 and BUMZU 3.2, issues the corresponding commands: to MM 1 to bring MZU 2 to the location of the manipulation object, to MZU 2 to grip and hold this object, to MM 1 to move the object to the new point, and to MZU 2 to release the object.
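The command flow above is a fixed pick-and-place cycle. A sketch of it as an ordered sequence of phases; the phase names and data shapes are illustrative, not from the patent:

```python
from enum import Enum, auto

class Phase(Enum):
    APPROACH = auto()  # MM 1 brings MZU 2 to the object's location
    GRIP = auto()      # MZU 2 grips and holds the object
    MOVE = auto()      # MM 1 carries the object to the new point
    RELEASE = auto()   # MZU 2 releases the object

def command_sequence(object_point, target_point):
    """Ordered (phase, argument) commands for one pick-and-place cycle.

    `object_point` comes from the vision unit's reconstruction and
    `target_point` from the user's selected action.
    """
    return [
        (Phase.APPROACH, object_point),
        (Phase.GRIP, None),
        (Phase.MOVE, target_point),
        (Phase.RELEASE, None),
    ]

seq = command_sequence((0.4, 0.1, 0.3), (0.2, -0.3, 0.5))
```

Keeping the cycle as explicit phases makes it easy for the control module to dispatch each step to BUMM 3.1 (arm motion) or BUMZU 3.2 (gripper actuation).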
  • the user can use the object manipulation system for different needs. The manipulation system can open and close doors, turn on the light, pour drinks, hold a glass of water, and so on. Its functions do not end there: it can also pick up toys after children, feed a dog, support a person during trips, etc.
  • a distinctive feature of the declared manipulation system is the high accuracy of determining the object of manipulation among many other objects, a compact and lightweight design, and an unambiguous determination of the user's intention to manipulate the selected objects.

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Human Computer Interaction (AREA)
  • Manipulator (AREA)

Abstract

The invention relates to the field of robotic systems for manipulating objects by means of a manipulator comprising a gripping device, and can be used to help people with impaired motor functions. The object manipulation system comprises a manipulator module (MM), at least one gripper module (MZU) mounted on the MM, a control module (MU), a user interface module (MIP) and a fastening module (MK). The MM comprises at least one manipulator drive including a manipulator motor and a manipulator gearbox. The MU comprises an MM control unit and an MZU control unit. The MIP comprises an eye tracking unit, a technical vision unit and a neuro-headset unit. Through the MIP the user can open and close doors, turn on the light, pour a drink, hold a glass of water, etc. The manipulation system provides high accuracy in determining the object to be manipulated among a plurality of surrounding objects, as well as a strict determination of the user's intentions regarding the manipulation of the selected objects, and has a compact and lightweight structure.
PCT/RU2019/000928 2019-12-10 2019-12-10 Object manipulation system WO2021118388A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/RU2019/000928 WO2021118388A1 (fr) 2019-12-10 2019-12-10 Object manipulation system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/RU2019/000928 WO2021118388A1 (fr) 2019-12-10 2019-12-10 Object manipulation system

Publications (1)

Publication Number Publication Date
WO2021118388A1 (fr) 2021-06-17

Family

ID=76330574

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/RU2019/000928 WO2021118388A1 (fr) 2019-12-10 2019-12-10 Object manipulation system

Country Status (1)

Country Link
WO (1) WO2021118388A1 (fr)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
SU1134361A1 (ru) * 1982-10-06 1985-01-15 Институт Технической Кибернетики Ан Бсср Очувствленный схват манипулятора
SU1296404A1 (ру) * 1985-03-04 1987-03-15 Донецкое Отделение Института "Гипроуглеавтоматизация" Следящая система двустороннего действия
US20140276944A1 (en) * 2013-03-14 2014-09-18 Board Of Regents Of The University Of Nebraska Methods, Systems, and Devices Relating to Robotic Surgical Devices, End Effectors, and Controllers
EP2813194A1 (fr) * 2013-06-12 2014-12-17 Georg-August Universität Göttingen Commande de dispositif de membre
RU182738U1 (ru) * 2018-03-12 2018-08-29 Общество с ограниченной ответственностью "Нейроботикс Трейдинг" Сухой активный электрод для нейрокомпьютерного интерфейса
CN108646915A (zh) * 2018-05-03 2018-10-12 东南大学 结合三维视线跟踪和脑机接口控制机械臂抓取物体的方法和系统
CN109875777A (zh) * 2019-02-19 2019-06-14 西安科技大学 一种带有取物功能的轮椅及其取物控制方法

Similar Documents

Publication Publication Date Title
US10755096B2 (en) 3D gaze control of robot for navigation and object manipulation
US10582974B2 (en) Estimation of a position and orientation of a frame used in controlling movement of a tool
Asfour et al. Toward humanoid manipulation in human-centred environments
US10660717B2 (en) Robotic interface positioning determination systems and methods
CN107921645B (zh) 远程操作机器人系统
US8498745B2 (en) Robot apparatus and gripping method for use in robot apparatus
EP2739231B1 (fr) Dispositif de support d'opération
EP2617530B1 (fr) Dispositif d'entrée de contrôle maître et manipulateur maître-esclave
Natale et al. A sensitive approach to grasping
WO2011065034A1 (fr) Procédé de commande de l'action d'un robot, et système de robot
JPWO2018097223A1 (ja) ロボット制御システム、機械制御システム、ロボット制御方法、機械制御方法、および記録媒体
Vogel et al. EDAN: An EMG-controlled daily assistant to help people with physical disabilities
Fishel et al. Tactile telerobots for dull, dirty, dangerous, and inaccessible tasks
CN110709211B (zh) 机器人系统和机器人系统的控制方法
CN111319039B (zh) 机器人
Wang et al. Free-view, 3d gaze-guided, assistive robotic system for activities of daily living
CN112008692A (zh) 示教方法
Song et al. KARES: intelligent rehabilitation robotic system for the disabled and the elderly
Schwaner et al. MOPS: A modular and open platform for surgical robotics research
CN113084784A (zh) 辅助头顶作业的穿戴式外肢体机器人
JP3482228B2 (ja) 視線検出によるマニピュレータ制御システム
WO2021118388A1 (fr) Object manipulation system
Chu et al. Hands-free assistive manipulator using augmented reality and tongue drive system
Farahmand et al. An intelligent assistive robotic manipulator
Lee et al. A self-reliance assistive tool for disable people

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19955505

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19955505

Country of ref document: EP

Kind code of ref document: A1