EP3645218A1 - Dispositif de gestion des deplacements d'un robot et robot de soin associe (Device for managing the movements of a robot and associated care robot) - Google Patents

Dispositif de gestion des deplacements d'un robot et robot de soin associe (Device for managing the movements of a robot and associated care robot)

Info

Publication number
EP3645218A1
Authority
EP
European Patent Office
Prior art keywords
generic model
point
dimensional representation
robot
treated
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
EP18731147.7A
Other languages
German (de)
English (en)
French (fr)
Inventor
François EYSSAUTIER
Guillaume GIBERT
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Capsix
Original Assignee
Capsix
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Capsix filed Critical Capsix
Publication of EP3645218A1 publication Critical patent/EP3645218A1/fr
Pending legal-status Critical Current

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1694Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B25J9/1697Vision controlled systems
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/0059Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence
    • A61B5/0062Arrangements for scanning
    • A61B5/0064Body surface scanning
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/103Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/107Measuring physical dimensions, e.g. size of the entire body or parts thereof
    • A61B5/1077Measuring of profiles
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61FFILTERS IMPLANTABLE INTO BLOOD VESSELS; PROSTHESES; DEVICES PROVIDING PATENCY TO, OR PREVENTING COLLAPSING OF, TUBULAR STRUCTURES OF THE BODY, e.g. STENTS; ORTHOPAEDIC, NURSING OR CONTRACEPTIVE DEVICES; FOMENTATION; TREATMENT OR PROTECTION OF EYES OR EARS; BANDAGES, DRESSINGS OR ABSORBENT PADS; FIRST-AID KITS
    • A61F7/00Heating or cooling appliances for medical or therapeutic treatment of the human body
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61HPHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
    • A61H23/00Percussion or vibration massage, e.g. using supersonic vibration; Suction-vibration massage; Massage with moving diaphragms
    • A61H23/02Percussion or vibration massage, e.g. using supersonic vibration; Suction-vibration massage; Massage with moving diaphragms with electric or magnetic drive
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61HPHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
    • A61H9/00Pneumatic or hydraulic massage
    • A61H9/005Pneumatic massage
    • A61H9/0057Suction
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J11/00Manipulators not otherwise provided for
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1602Programme controls characterised by the control system, structure, architecture
    • B25J9/161Hardware, e.g. neural networks, fuzzy logic, interfaces, processor
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1656Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1664Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00Programme-control systems
    • G05B19/02Programme-control systems electric
    • G05B19/18Numerical control [NC], i.e. automatically operating machines, in particular machine tools, e.g. in a manufacturing environment, so as to execute positioning, movement or co-ordinated operations by means of programme data in numerical form
    • G05B19/401Numerical control [NC], i.e. automatically operating machines, in particular machine tools, e.g. in a manufacturing environment, so as to execute positioning, movement or co-ordinated operations by means of programme data in numerical form characterised by control arrangements for measuring, e.g. calibration and initialisation, measuring workpiece for machining purposes
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61HPHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
    • A61H2201/00Characteristics of apparatus not provided for in the preceding codes
    • A61H2201/16Physical interface with patient
    • A61H2201/1657Movement of interface, i.e. force application means
    • A61H2201/1659Free spatial automatic movement of interface within a working area, e.g. Robot
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61HPHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
    • A61H2230/00Measuring physical parameters of the user
    • A61H2230/85Contour of the body
    • A61H2230/855Contour of the body used as a control parameter for the apparatus
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/37Measurements
    • G05B2219/37205Compare measured, vision data with computer model, cad data
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/37Measurements
    • G05B2219/37208Vision, visual inspection of workpiece
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/45Nc applications
    • G05B2219/45084Service robot

Definitions

  • the invention relates to the field of management of the movements of a robot operating on surfaces of very different geometries.
  • the invention can be applied in many technical fields in which the working surface of the robot is not known a priori.
  • the invention can be implemented for a robot that paints hand-made porcelain, in which the robot must adapt to the different shapes of porcelain made by a craftsman.
  • the invention finds a particularly advantageous application for moving a motorized arm of a care robot, such as a massage robot.
  • a robot operating on an unknown surface must include displacement management means capable of analyzing the surface to be treated to determine a trajectory.
  • exploration robots generally integrate at least one camera and image-processing means for analyzing, over time, the surface being explored and determining the trajectory the robot should follow.
  • This method of analyzing an unknown surface requires a great deal of computing power to precisely guide the movements of the robot over time. It follows that such robots move slowly, so as to allow the movement-management device to optimize the robot's movements according to the information acquired by the camera and processed by the image-processing means.
  • this massage robot is not autonomous because a practitioner must be present to use his expertise and program a trajectory of the massage robot on a digital model.
  • the technical problem consists in automating the management of the movements of a robot operating on an unknown surface with a high precision of movement.
  • the present invention aims to solve this technical problem by means of several displacement sequences known for a generic model, associated with means for matching the surface to be treated to the generic model, so as to adapt the known displacement sequences and apply them to the surface to be treated.
  • the robot is able to automatically adapt the sequences of displacements by adapting the shape of the generic model on the shape of the surface to be treated.
  • the invention relates to a mobility management device of a robot configured to treat a surface, said device comprising:
  • said determination means incorporate at least one three-dimensional generic model for which several displacement sequences are known; said device comprising means for adapting said generic model to said three-dimensional representation of said surface to be treated, capable of deforming said generic model to correspond to said three-dimensional representation of said surface to be treated, the deformations of said generic model being applied to the known displacement sequences so as to obtain at least one new displacement sequence adapted to the dimensions of said surface to be treated; said robot being configured to treat said surface according to one of the new displacement sequences.
  • the invention thus makes it possible to use several sequences of known displacements on a generic model to apply them to a surface to be treated whose geometry is not known during the learning of the displacement sequences.
  • a painter can set standard patterns to paint on mugs by recording these patterns against a generic model of a standard mug.
  • a robot can then scan the surface of a newly created cup and apply one of the standard patterns by deforming the generic model depending on the surface of the newly created cup.
  • a generic model of a human body is made from measurements on real people. This generic model is then represented in three dimensions so that a practitioner can define a massage path through various sensitive points of the generic model so as to obtain an effective massaging effect.
  • These sensitive points of the generic model may correspond to areas, for example an area extending within a radius of 50 mm around a sensitive point, in a direction normal to the surface at that point.
  • the body of the patient is scanned and the shape of the generic model is adapted to the shape of the patient, so that the deformation of the generic model makes it possible to obtain a deformation of the massage path and thus to adapt the movements of the robot to the shape of the patient's body while respecting the massage precision recorded by the practitioner.
  • the invention makes it possible to reproduce a very high-quality massage, with a sensation very close to, or even identical to, that of a practitioner.
  • several massage trajectories can be digitized to perform several different types of massage.
  • At least one known movement sequence includes positions for which actions are preprogrammed for said robot.
  • This embodiment makes it possible to control the operation of actuators during movements of the robot.
  • the robot can perform specific surface treatments in certain places.
  • certain positions of the robot can control the triggering of heating means to improve the comfort and / or the effect of the massage.
  • the known displacement sequence can comprise several trajectories performed with a palpate-and-roll movement, whereas other displacements are carried out with another type of movement.
  • said generic model and said three-dimensional representation of said surface to be treated being formalized in the form of point clouds, said adaptation means comprise:
  • search means for finding, for each point of the point cloud of said three-dimensional representation, the point of the generic model in a close neighborhood for which the difference between the normal direction of the generic-model point and the normal direction of the point of interest is the smallest;
  • the normal directions make it possible to obtain information relating to the orientation of the faces of the generic model and of the three-dimensional representation of the surface to be treated. Unlike a simple comparison of point-to-point coordinates, a comparison of the faces makes it possible to obtain a more efficient recognition.
  • the adaptation of the generic model is carried out step by step by gradually modifying the generic model according to the average distances. It follows that this embodiment makes it possible to adapt the generic model effectively by comparing the normal directions of each point of the generic model and the normal directions of the three-dimensional representation of the surface to be treated.
  • said search means are configured to search the points of the generic model in a predefined sphere around the point of interest.
  • This embodiment aims to limit the search area of the points of the generic model so as to limit the calculation time.
  • the limitation of the search area also makes it possible to limit the amplitude of the modification of the generic model between two comparisons so as to increase the accuracy of the modification of the generic model.
  • the normal directions are determined by constructing a face by means of the coordinates of the three or four points closest to the point of interest.
  • This embodiment makes it possible to efficiently construct the faces of the generic model and the three-dimensional representation of the surface to be treated.
  • said adaptation means comprise:
  • the characteristic points may correspond to the upper and lower ends of the porcelain.
  • the characteristic points may correspond to the upper end of the skull, the position of the armpits and the position of the crotch.
  • said acquisition means comprise means for pre-processing said three-dimensional representation by capturing several three-dimensional representations and averaging the coordinates of the points across the different three-dimensional representations. This embodiment makes it possible to improve the accuracy of the three-dimensional representation and thus of the adaptation of the generic model.
  • said pre-processing means perform a filtering of said average of the coordinates of the points between the different three-dimensional representations.
  • the invention relates to a care robot comprising:
  • This second aspect of the invention relates to a care robot for which the precision of the movements of the robot is an essential criterion to avoid hurting the patient.
  • said acquisition means are arranged on said articulated arm or on said effector. This embodiment makes it possible to move the acquisition means to acquire precisely the three-dimensional representation.
  • FIG. 1 which constitutes a flowchart of the steps operating a device for managing the movements of a massage robot according to one embodiment of the invention.
  • the invention is described with reference to a massage robot so as to show the adaptability of the robot, since the surfaces of human bodies obviously show large disparities.
  • the invention is not limited to this specific application and it can be used for several robots working on a surface whose geometry is not predetermined and for which precise movements of the robot must be made.
  • the analysis of the surface to be treated is performed by acquisition means 14 capable of providing a three-dimensional representation Re of the surface to be treated.
  • the three-dimensional representation Re takes the form of a point cloud in which each point has three coordinates of an orthonormal system: x, y and z.
  • These acquisition means 14 may correspond to a set of photographic sensors, a set of infrared sensors, a tomographic sensor, a stereoscopic sensor or any other known sensor for acquiring a three-dimensional representation of a surface.
  • the Kinect® camera from Microsoft® can be used to obtain this three-dimensional representation Re.
  • To obtain this three-dimensional representation Re without capturing the environment, it is possible to capture a first point cloud corresponding to the environment alone and a second point cloud corresponding to the surface to be treated in its environment. Only the points that differ between the two point clouds are kept, so as to extract from the environment the points corresponding to the surface to be treated.
  • This method makes it possible to dispense with a standardized recording environment and to adapt to any environment.
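  • As an illustrative sketch (not the patent's implementation), this environment subtraction amounts to keeping only the points with no close counterpart in the empty-environment capture; the function name, tolerance and toy clouds below are assumptions:

```python
import math

def subtract_environment(scene, environment, tol=0.01):
    """Keep only the scene points with no close counterpart in the
    environment cloud, i.e. the points belonging to the surface to be
    treated. Brute-force nearest-neighbour test on small toy clouds."""
    kept = []
    for p in scene:
        nearest = min(math.dist(p, q) for q in environment)
        if nearest > tol:  # point absent from the empty-environment capture
            kept.append(p)
    return kept

# First capture: empty environment; second capture: environment plus body.
environment = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]
body = [(0.5, 0.5, 0.2), (0.6, 0.5, 0.25)]
scene = environment + body

surface = subtract_environment(scene, environment)
```

In practice the clouds contain far more points and a spatial index (for example a KD-tree) would replace the brute-force search.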
  • these sensors 14 are often implemented with preprocessing means 15 to provide a three-dimensional representation Re with improved quality or accuracy.
  • the preprocessing means 15 may correspond to an algorithm for equalizing histograms, filtering, averaging the representation over several successive representations, and so on.
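  • A minimal sketch of the averaging pre-processing, assuming the successive captures are index-aligned point for point (the names and toy values are illustrative assumptions):

```python
def average_captures(captures):
    """Average the coordinates of corresponding points across several
    successive captures of the same scene (points are index-aligned)."""
    n = len(captures)
    n_pts = len(captures[0])
    return [tuple(sum(cap[i][k] for cap in captures) / n for k in range(3))
            for i in range(n_pts)]

# Three noisy captures of the same two points.
caps = [
    [(1.00, 2.00, 3.00), (0.00, 0.00, 0.00)],
    [(1.02, 1.98, 3.01), (0.01, -0.01, 0.02)],
    [(0.98, 2.02, 2.99), (-0.01, 0.01, -0.02)],
]
averaged = average_captures(caps)
```

A filtering pass, as mentioned for the pre-processing means, could then be applied to the averaged coordinates.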
  • the device then implements computerized processing in order to adapt a generic model m1, m2, m3 to the three-dimensional representation Re, so as to transfer to the three-dimensional representation Re the displacement sequences Tx preprogrammed on each generic model m1, m2, m3.
  • the displacement sequences Tx can be projected on the second cloud of points corresponding to the surface to be treated.
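  • This projection can be illustrated by a crude nearest-point snap; the hypothetical `project_trajectory` helper below is only a sketch (the patent deforms the generic model carrying the sequence rather than snapping points):

```python
import math

def project_trajectory(trajectory, cloud):
    """Snap each point of a displacement sequence Tx onto the nearest
    point of the target point cloud."""
    return [min(cloud, key=lambda q: math.dist(p, q)) for p in trajectory]

cloud = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.1), (2.0, 0.0, 0.0)]
tx = [(0.1, 0.0, 0.0), (1.9, 0.1, 0.0)]
projected = project_trajectory(tx, cloud)
```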
  • these displacement sequences Tx may include positions for which actions are preprogrammed for said robot.
  • the generic models m1, m2, m3 are also formalized in the form of a point cloud in which each point has three coordinates in an orthonormal system: x, y and z.
  • the generic model is composed of an average model ModMoy of N vertices with three coordinates each and a deformation matrix ModSigma of M morphological components with 3N coordinates, that is to say three coordinates for each of the N vertices.
  • a principal component analysis is applied to reduce the size of the data.
  • when a principal component analysis is applied to these data, it is possible to determine the variance in the data and to gather the common variance on each component.
  • each generic model m1, m2, m3 stores about twenty components that explain the majority of the variance over the thousand people measured.
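  • The roles of ModMoy and ModSigma can be illustrated with a toy model: a deformed vertex configuration is obtained as ModMoy + CompVec·ModSigma. The two-vertex model and single component below are illustrative assumptions, not the patent's data:

```python
def synthesize(mod_moy, mod_sigma, comp_vec):
    """Pts3D = ModMoy + CompVec . ModSigma: deform the average model by a
    weighted sum of the morphological components (each row of ModSigma
    holds the 3N coordinates of one component)."""
    pts = list(mod_moy)
    for weight, component in zip(comp_vec, mod_sigma):
        pts = [p + weight * c for p, c in zip(pts, component)]
    return pts

# Toy model: N = 2 vertices (6 coordinates), M = 1 morphological component.
mod_moy = [0.0, 0.0, 0.0, 1.0, 0.0, 0.0]       # average model ModMoy
mod_sigma = [[0.0, 0.0, 0.0, 1.0, 0.0, 0.0]]   # one "stretch in x" component
tall = synthesize(mod_moy, mod_sigma, [0.5])
```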
  • the generic models m1, m2, m3 are stored in a memory accessible by the image processing means of the device, which perform the adaptation of a generic model m1, m2, m3 to the three-dimensional representation Re.
  • the device implements a detection of the characteristic points Pref of this three-dimensional representation Re by digital processing means 16.
  • characteristic points correspond to the upper end of the skull, the position of the armpits and the position of the crotch.
  • digital processing means 16 can implement any known method for detecting elements in an image, such as the method of Viola and Jones, for example.
  • the point cloud is transformed into a depth image, that is to say a grayscale image, for example coded on 12 bits to encode depths ranging from 0 to 4095 mm.
  • This depth image is then thresholded and binarized to bring out with a value 1 the pixels corresponding to the object / body of interest and with a value 0 the pixels corresponding to the environment.
  • contour detection is applied to this binarized image using, for example, the method described in Suzuki, S. and Abe, K., Topological Structural Analysis of Digitized Binary Images by Border Following, CVGIP 30(1), pp. 32-46 (1985).
  • the contour thus highlighted and its convexity defects (determined using, for example, the method of Sklansky, J., Finding the Convex Hull of a Simple Polygon, Pattern Recognition Letters 1(2), pp. 79-83 (1982)) are used as characteristic points Pref.
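  • A minimal sketch of the thresholding, binarization and characteristic-point extraction on a toy depth image; a real implementation would run the Suzuki-Abe contour following and convexity-defect analysis cited above (e.g. via OpenCV), and the depth range used here is an assumption:

```python
def binarize(depth, lo, hi):
    """Threshold a 12-bit depth image: 1 for pixels whose depth falls in
    the range of the body of interest, 0 for the environment."""
    return [[1 if lo <= v <= hi else 0 for v in row] for row in depth]

def top_of_mask(mask):
    """Return the (row, col) of the topmost foreground pixel, a stand-in
    for a characteristic point such as the top of the skull."""
    for r, row in enumerate(mask):
        for c, v in enumerate(row):
            if v:
                return (r, c)
    return None

depth = [
    [4095, 4095, 4095],   # far background
    [4095, 1200, 4095],   # body pixel about 1.2 m away
    [1150, 1210, 1180],
]
mask = binarize(depth, 900, 1500)
head = top_of_mask(mask)
```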
  • Selection means 17 of the generic model m1, m2, m3 are then implemented to select the generic model m1, m2, m3 closest to the three-dimensional representation Re.
  • this selection can be made by calculating the distance between the characteristic point Pref at the top of the skull and the crotch characteristic point, so as to roughly estimate the overall height of the three-dimensional representation Re and to select the generic model m1, m2, m3 whose height is closest.
  • alternatively, the selection of the generic model m1, m2, m3 can be performed by using the width of the three-dimensional representation Re, calculated as the distance between the characteristic points Pref of the armpits.
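  • The selection step can be sketched as a nearest match on these two spacings; the model dimensions below are illustrative assumptions:

```python
def select_model(models, scan_height, scan_width):
    """Pick the generic model whose characteristic-point spacings
    (skull-to-crotch height, armpit-to-armpit width) are closest to the
    spacings measured on the scanned representation."""
    return min(models,
               key=lambda m: abs(m["height"] - scan_height)
                           + abs(m["width"] - scan_width))

models = [
    {"name": "m1", "height": 1.60, "width": 0.38},
    {"name": "m2", "height": 1.75, "width": 0.42},
    {"name": "m3", "height": 1.90, "width": 0.46},
]
best = select_model(models, scan_height=1.78, scan_width=0.41)
```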
  • the generic model m1, m2, m3 can be articulated with virtual bones representing the most important bones of the human skeleton.
  • fifteen virtual bones can be modeled on the generic model m1, m2, m3 to define the position and shape of the vertebral column, femurs, shins, ulnas, humeri and skull.
  • the orientation of these virtual bones makes it possible to define the pose of the generic model, that is to say whether the generic model m1, m2, m3 has an arm in the air, the legs apart, and so on.
  • This pose of the generic model m1, m2, m3 can also be determined by the selection means 17 by comparing distances calculated, for example, using the method of Hu (Visual Pattern Recognition by Moment Invariants, IRE Transactions on Information Theory, 8:2, pp. 179-187, 1962).
  • a first adaptation is then performed by adaptation means 18 by deforming the selected generic model to approach the three-dimensional representation Re.
  • this first adaptation can simply deform the selected generic model in width and height so that the spacing of the characteristic points Pref of the selected generic model corresponds to the spacing of the characteristic points Pref of the three-dimensional representation Re.
  • This first adaptation can also define the pose of the virtual bones of the generic model m1, m2, m3.
  • the position of the points of the Tx motion sequence preprogrammed on the generic model is adapted in the same way.
  • the device includes means for calculating the normals 19 of each surface of the three-dimensional representation Re and the selected generic model.
  • the normal directions can be determined by constructing each face of the three-dimensional representation Re by means of the coordinates of the three or four points closest to the point of interest.
  • the normal directions of the generic model can be calculated during the definition step of the generic model.
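  • The normal computation from the points closest to a point of interest can be sketched as the normalized cross product of two edge vectors of the constructed face (toy coordinates assumed):

```python
import math

def normal_from_triangle(a, b, c):
    """Normal direction of the face built from the three points closest
    to a point of interest: normalized cross product of two edges."""
    u = [b[i] - a[i] for i in range(3)]
    v = [c[i] - a[i] for i in range(3)]
    n = [u[1] * v[2] - u[2] * v[1],
         u[2] * v[0] - u[0] * v[2],
         u[0] * v[1] - u[1] * v[0]]
    norm = math.sqrt(sum(x * x for x in n))
    return [x / norm for x in n]

# Three points lying in the z = 0 plane: the normal points along z.
n = normal_from_triangle((0, 0, 0), (1, 0, 0), (0, 1, 0))
```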
  • the two adaptations can be performed simultaneously in one and the same step.
  • the device uses search means 20 capable of detecting, for each point of the point cloud of the three-dimensional representation Re, the point of the selected generic model in a close neighborhood for which the difference between the normal direction of the generic-model point and the normal direction of the point of interest is the smallest.
  • the search means 20 adapt the position and size of the virtual bones by varying the characteristics of each virtual bone to fit the virtual bones to the position of the corresponding body elements.
  • the search means 20 may be configured to search the points of the generic model in a predefined sphere around the point of interest.
  • the radius of this sphere is determined according to the number of vertices of the generic model and the size of the object/body of interest, so that about a dozen points are included in this sphere.
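  • A minimal sketch of this sphere-limited search comparing normal directions (unit normals compared by dot product; the point data are illustrative assumptions):

```python
import math

def best_match(point, normal, model_pts, model_normals, radius):
    """Among the generic-model points inside a sphere around the point of
    interest, return the index of the one whose normal direction differs
    least, i.e. the largest dot product between unit normals."""
    best_i, best_dot = None, -2.0
    for i, (q, nq) in enumerate(zip(model_pts, model_normals)):
        if math.dist(point, q) <= radius:
            dot = sum(a * b for a, b in zip(normal, nq))
            if dot > best_dot:
                best_i, best_dot = i, dot
    return best_i

model_pts = [(0.0, 0.0, 0.0), (0.05, 0.0, 0.0), (5.0, 0.0, 0.0)]
model_normals = [(0.0, 0.0, 1.0), (0.0, 1.0, 0.0), (0.0, 0.0, 1.0)]
# Point of interest with an upward normal: the in-sphere upward-facing
# model point (index 0) is selected; the far point is outside the sphere.
i = best_match((0.02, 0.0, 0.0), (0.0, 0.0, 1.0),
               model_pts, model_normals, radius=0.2)
```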
  • the device can then calculate the difference between the selected generic model and the three-dimensional representation Re by using determining means 21 able to calculate the distance between the points of interest and the points detected by the search means on the selected generic model.
  • The set of these distances forms transformation vectors that should be applied to each point of interest so that it corresponds to the detected point.
  • Search means 22 aim to determine an average of these transformation vectors so as to obtain an overall transformation of the selected generic model.
  • the search means 22 calculate the difference DiffMod between the three-dimensional configuration of the vertices Pts3D and the average model ModMoy, as well as the pseudo-inverse matrix ModSigmaInv of the deformation matrix ModSigma.
  • the pseudo-inverse matrix ModSigmaInv can be calculated by decomposing the ModSigma matrix into singular values using the following relationships: ModSigma = U·S·V* and ModSigmaInv = V·S⁺·U*,
  • V* being the transconjugate matrix of V and S⁺ the pseudo-inverse of the diagonal matrix S, obtained by inverting each non-zero singular value.
  • the search means 22 then calculate the morphological components CompVec with the following formula:
  • CompVec = DiffMod · ModSigmaInv, which makes it possible to obtain the morphological components for a specific patient.
  • the CompVec transformation vector is then applied to the selected generic model, the pose is estimated again as before, the generic model is adjusted if necessary, and a new search is performed until the generic model is close enough to the three-dimensional representation Re.
  • the loop stops when the average Euclidean distance between all the vertices of the generic model and their correspondents on the point cloud is less than a threshold defined according to the number of vertices of the generic model and the size of the object/body of interest (2 mm for example), or when a maximum number of iterations (100 iterations for example) is reached without the average distance falling below the threshold.
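  • The fitting loop can be sketched on a toy model. To keep the sketch free of a full singular value decomposition, it assumes the rows of ModSigma are orthonormal, in which case the pseudo-inverse ModSigmaInv reduces to the transpose; the thresholds mirror the 2 mm and 100-iteration examples, and all names and values are illustrative:

```python
import math

def fit(target, mod_moy, mod_sigma, threshold=0.002, max_iter=100):
    """Iteratively estimate CompVec so that ModMoy + CompVec . ModSigma
    approaches the target vertex configuration. With orthonormal
    component rows, CompVec = DiffMod . ModSigma^T (transpose standing in
    for the pseudo-inverse). The RMS coordinate difference stands in for
    the average vertex distance of the stopping criterion."""
    m = len(mod_sigma)
    comp_vec = [0.0] * m
    for _ in range(max_iter):
        current = [mm + sum(comp_vec[k] * mod_sigma[k][j] for k in range(m))
                   for j, mm in enumerate(mod_moy)]
        diff_mod = [t - c for t, c in zip(target, current)]
        mean_dist = math.sqrt(sum(d * d for d in diff_mod) / len(diff_mod))
        if mean_dist < threshold:
            break
        comp_vec = [comp_vec[k] + sum(diff_mod[j] * mod_sigma[k][j]
                                      for j in range(len(diff_mod)))
                    for k in range(m)]
    return comp_vec

mod_moy = [0.0, 0.0, 0.0, 1.0, 0.0, 0.0]
mod_sigma = [[0.0, 0.0, 0.0, 1.0, 0.0, 0.0]]   # orthonormal: unit length
target = [0.0, 0.0, 0.0, 1.3, 0.0, 0.0]        # patient "taller" by 0.3
comp = fit(target, mod_moy, mod_sigma)
```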
  • the displacement sequences Tx are defined on the generic model m1, m2, m3 in the orthonormal coordinate system of the sensor 14, whereas the robot receives commands in its own orthonormal coordinate system, which differs from that of the sensor 14.
  • To calibrate the vision sensor 14 and the robot, it is possible to record the coordinates of at least three points common to the two coordinate systems. In practice, a number N of points greater than three is preferably used.
  • the robot is moved into the work area and stopped N times. At each stop, the position of the robot is recorded by calculating the displacements effected by the robot's movement instructions, and a detection makes it possible to know the three-dimensional position of this stop by means of the vision sensor 14.
  • the covariance matrix C is then determined by the relation C = Aᵀ·B, where A and B respectively contain the centered coordinates of the N recorded points in the sensor frame and in the robot frame.
  • the rotation matrix is obtained from a singular value decomposition of C (C = U·S·Vᵀ) by the relation R = V·Uᵀ; if the determinant of R is negative, it is possible to multiply the third column of the rotation matrix R by -1.
  • the translation to be applied between the two frames is then determined by the relation T = Cr - R·Cs, where Cs and Cr are the centroids of the recorded points in the sensor frame and the robot frame respectively.
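  • This frame calibration can be illustrated in two dimensions, where the optimal rotation has a closed form and no singular value decomposition is needed; the three point pairs are illustrative assumptions (the patent's method is three-dimensional and SVD-based):

```python
import math

def calibrate_2d(sensor_pts, robot_pts):
    """Rigid calibration between the sensor frame and the robot frame
    from N common points. In 2-D the optimal rotation angle is
    atan2(sum of cross products, sum of dot products) of the centered
    points; the translation is the robot centroid minus the rotated
    sensor centroid."""
    n = len(sensor_pts)
    cs = [sum(p[k] for p in sensor_pts) / n for k in range(2)]  # centroid
    cr = [sum(p[k] for p in robot_pts) / n for k in range(2)]
    num = den = 0.0
    for p, q in zip(sensor_pts, robot_pts):
        px, py = p[0] - cs[0], p[1] - cs[1]
        qx, qy = q[0] - cr[0], q[1] - cr[1]
        num += px * qy - py * qx   # cross products
        den += px * qx + py * qy   # dot products
    theta = math.atan2(num, den)
    c, s = math.cos(theta), math.sin(theta)
    tx = cr[0] - (c * cs[0] - s * cs[1])
    ty = cr[1] - (s * cs[0] + c * cs[1])
    return theta, (tx, ty)

# Robot stops seen in both frames:
# robot frame = sensor frame rotated 90 degrees, then shifted by (1, 2).
sensor = [(0.0, 0.0), (1.0, 0.0), (0.0, 2.0)]
robot = [(1.0, 2.0), (1.0, 3.0), (-1.0, 2.0)]
theta, t = calibrate_2d(sensor, robot)
```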
  • a displacement sequence Tx may be more complex than a simple set of points and may include different parameters for implementing the Tx displacement sequence, such as execution speed or execution pressure.
  • the displacement sequence Tx may include positions for which actions are preprogrammed for the robot, such as the application of vibration, suction or temperature.
  • the actions or parameters of the selected displacement sequence Tx may also be adjusted by the user or the operator before or during the execution of the movement sequence by the robot.
  • the operator can adjust the paint speed and the brush support power on the porcelain.
  • the user can himself choose the speed and force of the massage by adjusting the pressure of the robot in real time, to adapt these parameters to his sensations. He can also adjust a temperature emitted by the massage hand, or the effects achieved by this massage hand, such as the power of vibration or suction.

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Mechanical Engineering (AREA)
  • Robotics (AREA)
  • General Health & Medical Sciences (AREA)
  • Veterinary Medicine (AREA)
  • Public Health (AREA)
  • Animal Behavior & Ethology (AREA)
  • Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Pathology (AREA)
  • Pain & Pain Management (AREA)
  • Physical Education & Sports Medicine (AREA)
  • Rehabilitation Therapy (AREA)
  • Epidemiology (AREA)
  • Human Computer Interaction (AREA)
  • Manufacturing & Machinery (AREA)
  • General Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Computation (AREA)
  • Fuzzy Systems (AREA)
  • Mathematical Physics (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Dentistry (AREA)
  • Radiology & Medical Imaging (AREA)
  • Vascular Medicine (AREA)
  • Manipulator (AREA)
  • Image Processing (AREA)
  • Percussion Or Vibration Massage (AREA)
  • Massaging Devices (AREA)
EP18731147.7A 2017-06-26 2018-06-21 Dispositif de gestion des deplacements d'un robot et robot de soin associe Pending EP3645218A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
FR1755812A FR3067957B1 (fr) 2017-06-26 2017-06-26 Dispositif de gestion des deplacements d'un robot et robot de soin associe
PCT/EP2018/066672 WO2019002104A1 (fr) 2017-06-26 2018-06-21 Dispositif de gestion des deplacements d'un robot et robot de soin associe

Publications (1)

Publication Number Publication Date
EP3645218A1 true EP3645218A1 (fr) 2020-05-06

Family

ID=59521126

Family Applications (1)

Application Number Title Priority Date Filing Date
EP18731147.7A Pending EP3645218A1 (fr) 2017-06-26 2018-06-21 Dispositif de gestion des deplacements d'un robot et robot de soin associe

Country Status (9)

Country Link
US (1) US11338443B2
EP (1) EP3645218A1
JP (1) JP7097956B2
KR (1) KR102500626B1
CN (1) CN110785269B
CA (1) CA3067555A1
FR (1) FR3067957B1
SG (1) SG11201912547UA
WO (1) WO2019002104A1

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR3104054B1 (fr) * 2019-12-10 2022-03-25 Capsix Device for defining a sequence of movements on a generic model
USD1009283S1 (en) 2020-04-22 2023-12-26 Aescape, Inc. Therapy end effector
US11999061B2 (en) 2020-05-12 2024-06-04 Aescape, Inc. Method and system for autonomous object manipulation
WO2021231663A2 (en) 2020-05-12 2021-11-18 Aescape, Inc. Method and system for autonomous object interaction
US11858144B2 (en) 2020-05-12 2024-01-02 Aescape, Inc. Method and system for autonomous body interaction
CN111571611B (zh) * 2020-05-26 2021-09-21 广州纳丽生物科技有限公司 Facial-treatment robot trajectory planning method based on face and skin features
FR3127428A1 (fr) * 2021-09-30 2023-03-31 Exel Industries Method for painting a part, including generating a trajectory adapted to the actual part
FR3141853A1 2022-11-14 2024-05-17 Capsix Massage effector
US12083050B1 (en) 2023-09-06 2024-09-10 Aescape, Inc. Adjustable table system

Family Cites Families (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH02148307A (ja) 1988-11-30 1990-06-07 Kobe Steel Ltd Teaching data creation device for industrial robots
US5083552A (en) * 1990-06-05 1992-01-28 Harvey Lipowitz Computer controlled massage device
JP3670700B2 (ja) * 1994-04-27 2005-07-13 Hitachi Ltd Robot mechanism control method
JPH10128684A (ja) * 1996-10-30 1998-05-19 Asahi Sanac Kk Method and device for creating a robot operation program
US6267737B1 (en) * 1997-05-15 2001-07-31 Algis A. Meilus Robotic system for lengthening muscles and method of use
IL133551A0 (en) * 1999-12-16 2001-04-30 Nissim Elias Human touch massager
JP2001246582A (ja) 2000-02-29 2001-09-11 Mitsubishi Heavy Ind Ltd Working robot device
SE524818C2 (sv) 2003-02-13 2004-10-05 Abb Ab A method and a system for programming an industrial robot to move relative to defined positions on an object
US20050089213A1 (en) * 2003-10-23 2005-04-28 Geng Z. J. Method and apparatus for three-dimensional modeling via an image mosaic system
JP2005138223A (ja) 2003-11-06 2005-06-02 Fanuc Ltd Position data correction device for robots
FR2875043B1 (fr) * 2004-09-06 2007-02-09 Innothera Sa Lab Device for establishing a complete three-dimensional representation of a patient's limb from a reduced number of measurements taken on that limb
DE102006005958A1 (de) * 2006-02-08 2007-08-16 Kuka Roboter Gmbh Method for generating an image of the surroundings
CN100517060C (zh) * 2006-06-01 2009-07-22 高宏 Three-dimensional portrait photography method
US8578579B2 (en) * 2007-12-11 2013-11-12 General Electric Company System and method for adaptive machining
US8918211B2 (en) * 2010-02-12 2014-12-23 Intuitive Surgical Operations, Inc. Medical robotic system providing sensory feedback indicating a difference between a commanded state and a preferred pose of an articulated instrument
CN101751689B (zh) * 2009-09-28 2012-02-22 Institute of Automation, Chinese Academy of Sciences Three-dimensional face reconstruction method
US9817389B2 (en) * 2013-03-05 2017-11-14 Rolls-Royce Corporation Adaptively machining component surfaces and hole drilling
SG10201402803RA (en) * 2014-06-02 2016-01-28 Yizhong Zhang A mobile automatic massage apparatus
KR20160033325A (ko) * 2014-09-17 2016-03-28 Korea Electronics Technology Institute Medical laser device controlled by a robot arm
US20170360578A1 (en) * 2014-12-04 2017-12-21 James Shin System and method for producing clinical models and prostheses
JP6497953B2 (ja) 2015-02-03 2019-04-10 Canon Inc Offline teaching device, offline teaching method, and robot system
US20170266077A1 (en) 2016-03-21 2017-09-21 Christian Campbell Mackin Robotic massage machine and method of use
CN106407930A (zh) * 2016-09-18 2017-02-15 长沙军鸽软件有限公司 Service method for a massage robot

Also Published As

Publication number Publication date
JP7097956B2 (ja) 2022-07-08
KR20200023646A (ko) 2020-03-05
US11338443B2 (en) 2022-05-24
FR3067957A1 (fr) 2018-12-28
CA3067555A1 (fr) 2019-01-03
CN110785269B (zh) 2023-04-07
CN110785269A (zh) 2020-02-11
US20210154852A1 (en) 2021-05-27
SG11201912547UA (en) 2020-01-30
FR3067957B1 (fr) 2020-10-23
WO2019002104A1 (fr) 2019-01-03
JP2020525306A (ja) 2020-08-27
KR102500626B1 (ko) 2023-02-15

Similar Documents

Publication Publication Date Title
EP3645218A1 (fr) Device for managing the movements of a robot, and associated care robot
EP4072794B1 (fr) Device for defining a sequence of movements on a generic model
Boehnen et al. Accuracy of 3D scanning technologies in a face scanning scenario
US10048749B2 (en) Gaze detection offset for gaze tracking models
CN111480164B (zh) 头部姿势和分心估计
US20160202756A1 (en) Gaze tracking via eye gaze model
EP3941690A1 (fr) Method for guiding a robot arm, and guidance system
Clarkson et al. Assessing the suitability of the Microsoft Kinect for calculating person specific body segment parameters
EP3146504A1 (fr) Method for constructing a model of an individual's face, and method and device for posture analysis using such a model
Štrbac et al. Kinect in neurorehabilitation: computer vision system for real time hand and object detection and distance estimation
Manfredi et al. Skin surface reconstruction and 3D vessels segmentation in speckle variance optical coherence tomography
Grieve et al. Fingernail image registration using active appearance models
US11276184B2 (en) Method and device for determining the amplitude of a movement performed by a member of an articulated body
WO2020188063A1 (fr) Method for generating a three-dimensional working surface of a human body, and system
US11957217B2 (en) Method of measuring the shape and dimensions of human body parts
Wang et al. Hybrid model and appearance based eye tracking with kinect
Labati et al. Two-view contactless fingerprint acquisition systems: a case study for clay artworks
WO2020260635A1 (fr) Method for analysing an individual's gait
FR2855959A1 (fr) Method and equipment for manufacturing customized equipment
Paar et al. Photogrammetric fingerprint unwrapping
Win Curve and Circle Fitting of 3D Data Acquired by RGB-D Sensor
Mlambo et al. Complexity and distortion analysis on methods for unrolling 3D to 2D fingerprints
Kim et al. Nonintrusive 3-D face data acquisition system
Font Calafell et al. A proposal for automatic fruit harvesting by combining a low cost stereovision camera and a robotic arm

Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: UNKNOWN

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20191224

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

17Q First examination report despatched

Effective date: 20210129