CN111571611B - Facial operation robot track planning method based on facial and skin features - Google Patents

Facial operation robot track planning method based on facial and skin features

Info

Publication number
CN111571611B
CN111571611B
Authority
CN
China
Prior art keywords
facial
robot
track
face
skin
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010457760.0A
Other languages
Chinese (zh)
Other versions
CN111571611A (en)
Inventor
陈彦彪
翟敬梅
胡燕
唐骢
陈家骊
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangzhou Nali Biotechnology Co ltd
Original Assignee
Guangzhou Nali Biotechnology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangzhou Nali Biotechnology Co ltd filed Critical Guangzhou Nali Biotechnology Co ltd
Priority to CN202010457760.0A priority Critical patent/CN111571611B/en
Publication of CN111571611A publication Critical patent/CN111571611A/en
Application granted granted Critical
Publication of CN111571611B publication Critical patent/CN111571611B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J11/00Manipulators not otherwise provided for
    • B25J11/008Manipulators for service tasks
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/103Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/107Measuring physical dimensions, e.g. size of the entire body or parts thereof
    • A61B5/1072Measuring physical dimensions, e.g. size of the entire body or parts thereof measuring distances on the body, e.g. measuring length, height or thickness
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/103Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/107Measuring physical dimensions, e.g. size of the entire body or parts thereof
    • A61B5/1075Measuring physical dimensions, e.g. size of the entire body or parts thereof for measuring dimensions by non-invasive methods, e.g. for determining thickness of tissue layer
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61HPHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
    • A61H7/00Devices for suction-kneading massage; Devices for massaging the skin by rubbing or brushing not otherwise provided for
    • A61H7/002Devices for suction-kneading massage; Devices for massaging the skin by rubbing or brushing not otherwise provided for by rubbing or brushing
    • A61H7/004Devices for suction-kneading massage; Devices for massaging the skin by rubbing or brushing not otherwise provided for by rubbing or brushing power-driven, e.g. electrical
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61HPHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
    • A61H7/00Devices for suction-kneading massage; Devices for massaging the skin by rubbing or brushing not otherwise provided for
    • A61H7/007Kneading
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1656Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1664Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Animal Behavior & Ethology (AREA)
  • Veterinary Medicine (AREA)
  • Public Health (AREA)
  • General Health & Medical Sciences (AREA)
  • Robotics (AREA)
  • Physics & Mathematics (AREA)
  • Physical Education & Sports Medicine (AREA)
  • Pain & Pain Management (AREA)
  • Epidemiology (AREA)
  • Dermatology (AREA)
  • Mechanical Engineering (AREA)
  • Dentistry (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Rehabilitation Therapy (AREA)
  • Biophysics (AREA)
  • Pathology (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Toys (AREA)

Abstract

The invention discloses a facial operation robot based on facial and skin features and a track planning method thereof. The method comprises the following steps: (1) dividing the face into operation and non-operation areas based on adult head and face dimensions; (2) comprehensively analyzing the two skin tension lines of the facial skin, the Langer's lines and the wrinkle lines, and designing robot operation tracks for the facial operation areas at different positions; (3) obtaining the track curves of the facial operation robot by combining an iso-planar method with a bisection method; (4) optimizing the robot operation track based on the curvature of the facial skin; (5) selecting the optimal track-point sequence of the facial operation areas based on a nearest-neighbor algorithm, realizing the robot track planning, and obtaining the operation track of the facial operation robot. The invention designs and plans the track according to the characteristics of facial operation on the human body, such as facial viscoelasticity, skin anisotropy and human sensitivity, which improves the stability of the robot operation, reduces impact and realizes efficient, coherent operation over the facial area.

Description

Facial operation robot track planning method based on facial and skin features
Technical Field
The invention relates to the technical field of robots, in particular to a facial operation robot based on facial and skin characteristics and a track planning method thereof.
Background
Because the fields of medical rehabilitation, therapeutic massage and beauty care suffer from a large demand for practitioners, high skill requirements, long training periods and high labor costs, the demand for robots that operate in direct contact with the human face is steadily increasing. In current research on skin-operation robot track planning, at home and abroad, rehabilitation robots for limb rehabilitation training and treatment mainly follow preset fixed tracks; robots for body massage mainly use preset point-to-point motion tracks or straight-line tracks fitted through feature points; and robots for facial and oral rehabilitation obtain an initial track by marking and fitting feature points on a facial CT image and correct it with a skin-elasticity compensation amount, yielding only a simple, local oral massage track. These track planning methods produce few track points, have poor applicability and cover only local facial areas; a track planning method for the whole facial area has not yet been explored.
Unlike an industrial or mobile robot, a robot that operates on the surface of human skin works on an object, the skin, that is viscoelastic and anisotropic, has a complex anatomical structure and exhibits extremely complex mechanical properties. Skin at different facial positions behaves differently because of differences in the underlying tissue distribution, and the rich nervous tissue in facial skin perceives and responds differently to external stimuli, which directly affects how the robot's facial operation is perceived by the person. To reduce friction during facial operation and improve human comfort, the robot track must therefore be designed, optimized and planned with the viscoelasticity, anisotropy and large curvature variation of the facial skin taken into account.
The prior art cannot ensure that key parts such as the eyes, nose and mouth are avoided while the robot operates on the face, nor can it improve operation stability, reduce impact and complete the facial-area operation efficiently and coherently; a new facial skin operation robot based on facial skin features and a track planning method for it therefore need to be designed.
Disclosure of Invention
The invention aims to provide a facial operation robot based on facial and skin features and a track planning method thereof which, for a robot operating on the human face, improve human comfort while ensuring the safety and effectiveness of the robot's facial operation.
To solve the above technical problem, the invention adopts the following technical scheme:
a facial operation robot track planning method based on face and skin features is characterized in that the track planning of a facial operation robot is carried out based on skin features such as a facial safe operation area and skin anisotropy, so that key parts such as eyes, a nose and a mouth of the robot can be avoided in the operation process, the operation stability is improved, the impact is reduced, and the operation of the facial area is efficiently and continuously completed, and specifically comprises the following steps:
(1) dividing the face into operation and non-operation areas based on adult head and face dimensions;
(2) comprehensively analyzing the two skin tension lines of the facial skin, the Langer's lines and the wrinkle lines, and designing robot operation tracks for the facial operation areas at different positions;
(3) obtaining the track curve of the facial operation robot by combining an iso-planar method with a bisection method;
(4) optimizing the operation track of the robot based on the curvature of the facial skin;
(5) selecting the optimal track-point sequence of the facial operation areas based on a nearest-neighbor algorithm, realizing the robot track planning, and obtaining the operation track of the facial operation robot.
In step (1), size data of the female head and face items in the national standard 'Head-face dimensions of adults' (GB/T 2428-1998) are selected for analysis and used as the basis for dividing the face regions; the reference dimensions for the division are set to DG-N and DM1-M2; the feature points of the key facial parts are extracted and, taking the mid-eyebrow point (EB) and the nose tip point (S) as reference points, the facial operation area is delimited by the transverse width dimension DM1-M2 and the longitudinal height dimension DG-N;
furthermore, in step (2) the facial wrinkle lines and the Langer's lines, the two skin tension lines that most affect skin extension and tension, are analyzed, and different operation tracks are designed according to the skin characteristics of the different facial areas so as to achieve a better facial skin operation effect;
furthermore, the track curve obtained by the iso-planar method alone in step (3) bends sharply, and the bending of the track curve is improved by processing the intersection-point data with a successive bisection method;
further, in step (4) the normal curvature at each face track point is calculated by an approximate solution method and a threshold is set; when the normal-curvature deviation between adjacent track points on a track line exceeds the threshold, several points are interpolated between the two track points;
further, in step (5) an initial position of the robot is set, the face operation track points are processed with a nearest-neighbor algorithm, and the optimal track-point sequence of the facial operation areas is selected to obtain the operation track of the facial operation robot.
The invention has the advantages that:
compared with the existing skin operation robot and the track planning method thereof, the method comprehensively considers the facial skin characteristics, the human body sensitivity, the robot operation safety and the like to plan the track of the facial operation robot. Dividing a face operation area to ensure that key parts such as eyes, a nose, a mouth and the like can be avoided in the face operation process of the robot; the method mainly considers the anisotropy of facial skin, combines two skin tension lines of a facial wrinkle line and a Langerhans line, and designs the operation tracks of the skin in different facial areas so as to realize better facial operation effect; the operation track of the robot is optimized based on the curvature of the face, the operation stability is improved, and the impact is reduced; and planning the track of the facial operation robot based on a nearest neighbor algorithm to realize the coherent operation of the facial area of the robot.
Drawings
Fig. 1 is a flow chart of a facial operation robot trajectory planning method based on facial and skin features according to the invention.
FIG. 2a is a schematic diagram of the measurement items of the head and face of a woman in the division of the face work area according to the present invention.
FIG. 2b is a schematic diagram of the division of the facial working area according to the present invention.
FIG. 3a is a schematic view of a facial wrinkle line according to the present invention.
FIG. 3b is a schematic view of Langer's line on the face according to the present invention.
FIG. 4a is a schematic diagram of obtaining a face operation track point by an iso-planar method according to the present invention.
FIG. 4b is a schematic diagram of the track generation method based on successive bisection according to the present invention.
FIG. 5a is a schematic diagram of the approximate solution of the normal curvature of the face according to the present invention.
Fig. 5b is a simplified robot face operation trace diagram according to the present invention.
Fig. 6 is a schematic diagram of a simplified robot face operation track obtained based on a nearest neighbor algorithm.
Detailed Description
The invention is described in further detail below through specific examples; the embodiments of the invention are not limited to the following examples.
Referring to FIGS. 1 to 6, the facial operation robot based on facial and skin features and its track planning method provided by this embodiment of the invention can be used in the field of robot track planning. The facial operation areas are divided based on adult head and face dimensions; taking skin anisotropy and facial wrinkles into account, the two skin tension lines are analyzed together and robot operation tracks are designed for the facial operation areas at different positions; the operation track curves of the facial robot are obtained by combining an iso-planar method with a bisection method; the robot operation track is optimized based on the facial curvature; and the optimal track-point sequence of the facial operation areas is selected with a nearest-neighbor algorithm to realize the robot track planning and finally obtain the operation track of the facial operation robot. The method specifically comprises the following steps:
S1: part of the size data of the female head and face items in the national standard 'Head-face dimensions of adults' (GB/T 2428-1998) is selected for analysis and used as the basis for dividing the facial operation areas; a schematic diagram of the female head and face measurement items is shown in FIG. 2a. In the figure, point V is the vertex of the head; point G is the glabellar point; point N is the pre-auricular point; points M1 and M2 are the left and right mouth-corner points; DG-N is the distance between the glabellar point and the pre-auricular point; DM1-M2 is the transverse distance between the left and right mouth-corner points; and 1, 2, 3 are the female head and face measurement items.
The reference dimensions for dividing the face regions are set to DG-N and DM1-M2: the value of DG-N can be determined from the vertex-to-eyebrow distance and the head-ear height, and the value of DM1-M2 is the mouth-width dimension. As shown in FIG. 2b, the feature points of the key facial parts are extracted and, taking the mid-eyebrow point (EB) and the nose tip point (S) as reference points, the areas of the three key parts, the eyes, the nose and the mouth, are delimited by the transverse width dimension DM1-M2 and the longitudinal height dimension DG-N and defined as non-operation areas; the remaining facial areas, comprising the forehead area and the left and right cheek areas, are defined as operation areas.
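For illustration, the following Python sketch delimits non-operation boxes around the eyes, nose and mouth from the two reference points EB and S and the two reference dimensions DG-N and DM1-M2, and tests whether a point lies in the operation area. It is a minimal sketch: the coordinate convention, the box proportions and all function names are assumptions made for this example, not values taken from the patent.

    # Hypothetical sketch of step S1 (assumed proportions, not patent values):
    # delimit non-operation boxes around the eyes, nose and mouth from the
    # reference points EB (mid-eyebrow) and S (nose tip) and the reference
    # dimensions DG-N and DM1-M2, then classify a point as workable or not.
    from dataclasses import dataclass

    @dataclass
    class Box:                       # axis-aligned region; y grows downward
        x_min: float
        x_max: float
        y_min: float
        y_max: float

        def contains(self, x: float, y: float) -> bool:
            return self.x_min <= x <= self.x_max and self.y_min <= y <= self.y_max

    def non_operation_areas(eb, s, d_g_n, d_m1_m2):
        """Boxes around the eyes, nose and mouth; everything else on the face
        is the operation area (forehead plus left and right cheek areas)."""
        eb_x, eb_y = eb
        s_x, s_y = s
        eyes = Box(eb_x - d_m1_m2, eb_x + d_m1_m2, eb_y, eb_y + 0.4 * d_g_n)
        nose = Box(s_x - 0.3 * d_m1_m2, s_x + 0.3 * d_m1_m2, eb_y, s_y)
        mouth = Box(s_x - 0.5 * d_m1_m2, s_x + 0.5 * d_m1_m2, s_y, s_y + 0.5 * d_g_n)
        return {"eyes": eyes, "nose": nose, "mouth": mouth}

    def in_operation_area(x, y, areas):
        return not any(box.contains(x, y) for box in areas.values())

    areas = non_operation_areas(eb=(0.0, 0.0), s=(0.0, 60.0), d_g_n=110.0, d_m1_m2=50.0)
    print(in_operation_area(-60.0, 80.0, areas))   # a left-cheek point -> True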
S2: the facial wrinkle lines and the Langer's lines are the two skin tension lines that most affect skin extension and tension. The facial wrinkle lines are the lines of small ridges and furrows naturally formed on the skin surface; like the pleats of a skirt, they allow the skin to stretch and give it elasticity. In daily facial care, to counteract facial wrinkles, lifting and massage are usually performed perpendicular to the facial wrinkle lines, which promotes facial blood circulation, reduces wrinkles and delays skin aging. The Langer's lines indicate the preferential direction of skin extensibility and reflect the anisotropy of the skin. The elastin and collagen fibers inside the skin extend more easily along the direction of the Langer's lines, so performing facial care along the direction of skin extensibility reduces frictional resistance during the operation and improves human comfort.
FIGS. 3a and 3b show the distribution of the facial wrinkle lines and the Langer's lines on the face, respectively, where H is the forehead region and C is the cheek region. In region H the wrinkle lines and the Langer's lines run essentially in the same direction; because the skin viscoelasticity there is small, elasticity dominates over extensibility, the wrinkle-removing effect is given more weight, and the operation track is made perpendicular to the facial wrinkle lines. In region C the wrinkle lines are not exactly perpendicular to the Langer's lines and intersect them at varying angles at different positions; since the skin viscoelasticity in region C is pronounced and extensibility dominates, the operation track is made to follow the Langer's lines on the face in order to conform to the direction of skin extensibility, reduce friction, improve human comfort and still achieve a certain wrinkle-removing effect.
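This design rule, a stroke perpendicular to the wrinkle lines in region H and a stroke along the Langer's lines in region C, can be written compactly as a per-region direction selector. The sketch below assumes that the local wrinkle-line and Langer's-line directions are supplied as angles in the skin's local tangent plane; the function name and the angle convention are illustrative assumptions, not part of the patent.

    # Hypothetical sketch of step S2: pick the stroke direction of the operation
    # track per facial region from the two skin tension lines (assumed
    # convention: angles in radians within the local tangent plane of the skin).
    import math

    def stroke_direction(region: str, wrinkle_dir: float, langer_dir: float) -> float:
        """Forehead (H): perpendicular to the wrinkle line (anti-wrinkle lifting).
        Cheek (C): along the Langer's line (follow the skin extensibility)."""
        if region == "H":
            return wrinkle_dir + math.pi / 2.0
        if region == "C":
            return langer_dir
        raise ValueError(f"unknown region: {region!r}")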
S3: the forehead and the left and right cheek areas are each sectioned by the iso-planar method to obtain section lines. As shown in FIG. 4a, a set of section planes S = {S1, S2, …, Si, …, Sm} parallel to the X-Z plane is defined along the Y direction, where m is the total number of section planes and the offset between adjacent section planes is the line spacing L. Suppose a section plane Si intersects the curved facial skin surface and the resulting intersection line contains n intersection points; all triangular patches of the facial model that intersect the section plane S are traversed, the positional relationship between each triangular patch and the section plane S is judged, and the different intersection cases are analyzed to obtain the intersection points PC of the operation area.
The obtained intersection points PC are connected in bubble-sorted order to obtain the operation-area section lines C, which conform to the operation direction:
C = {C1, C2, …, Ci, …, Cm} (i = 1, 2, …, m)
To improve the bending of the operation track, the intersection-point data are processed by successive bisection. For a section line Ci, all its intersection points form a closed interval Bi1; the midpoint of Bi1 is taken as a boundary value of a new interval Bi2, whose midpoint is computed in turn; the midpoint of each interval is calculated cyclically in this way until the interval Bij is reached (j being the total number of bisection iterations), giving the midpoint value corresponding to each intersection-point interval Bij. The intersection points at corresponding positions on every section line are then connected in sequence according to Bij; where an interval contains no intersection point at that position, its midpoint value is connected instead.
FIG. 4b is a schematic diagram of the track generation method based on successive bisection, in which the purple points are the midpoints of the corresponding intersection-point intervals Bi1, Bi2, Bi3 (i = 1, 2, 3, 4, 5). Connecting the first intersection points of B13, B23, B33, B43, B53 gives the red track line TC1; connecting the second intersection points, and, because B13, B43 and B53 have no second intersection point, connecting their midpoints instead, gives TC2. Connecting Bi1, Bi2, Bi3 in the same manner gives the remaining track lines, and the bending of the resulting facial operation track is improved.
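Read this way, the slicing step is a plane-to-triangle intersection over the facial mesh, and the bisection step supplies interval midpoints so that rows with a missing intersection point can still be connected. The Python sketch below follows that reading; the mesh representation (a numpy vertex array plus index triples), the sorting along X as a stand-in for the bubble sort, and the function names are all assumptions, and the connection of points into track lines is left out.

    # Hypothetical sketch of step S3 (assumed data layout): intersect the facial
    # mesh with section planes y = const parallel to the X-Z plane (iso-planar
    # method), then successively bisect the intersection-point interval of a
    # section line to obtain substitute midpoints.
    import numpy as np

    def slice_mesh(vertices, triangles, y_values):
        """vertices: (N, 3) array; triangles: iterable of index triples.
        Returns one list of intersection points per section plane, sorted
        along X (stand-in for the bubble-sort ordering in the text)."""
        section_lines = []
        for y0 in y_values:
            points = []
            for i, j, k in triangles:
                tri = vertices[[i, j, k]]
                for a, b in ((0, 1), (1, 2), (2, 0)):      # each triangle edge
                    ya, yb = tri[a, 1], tri[b, 1]
                    if (ya - y0) * (yb - y0) < 0.0:        # edge crosses the plane
                        t = (y0 - ya) / (yb - ya)
                        points.append(tri[a] + t * (tri[b] - tri[a]))
            points.sort(key=lambda p: float(p[0]))
            section_lines.append(points)
        return section_lines

    def bisection_midpoints(first_pt, last_pt, depth):
        """Successively bisect the closed interval [first_pt, last_pt] of a
        section line; each midpoint becomes the boundary of the next interval."""
        lo, hi = np.asarray(first_pt, float), np.asarray(last_pt, float)
        mids = []
        for _ in range(depth):
            mid = 0.5 * (lo + hi)
            mids.append(mid)
            hi = mid
        return mids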
S4: the massage tracks of the different facial areas are optimized based on the facial curvature, and track points are interpolated where the curvature changes sharply to increase the number of track points. The three-dimensional reconstruction of the facial skin surface is a mesh model, on which the normal curvature at a mesh vertex cannot be calculated directly and can only be obtained by an approximate solution method.
As shown in FIG. 5a, a vertex v of the facial mesh model is taken together with the n triangular patches associated with it; this set of patches is Tv = {Tv1, Tv2, …, Tvn}, where vij and vi(j+1) are the two other vertices of the i-th triangular patch associated with vertex v. Each triangular patch Tvi has a unit normal vector computed from its vertices; the normal vector Nv at the vertex v is estimated from the unit normal vectors of the associated triangular patches, and the normal curvature kv at the vertex v is then calculated from the curvature formula for an arbitrary point on a triangular mesh.
The curvature of every track point of the massage track is obtained in this way and a threshold TH is set; when the normal-curvature deviation between adjacent track points on a track line is greater than the threshold TH, several points are interpolated between the two track points. Increasing the number of track points where the curvature changes sharply reduces the change of the normal vector between adjacent track points, so that the change of the robot's end pose is smoother.
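One common way to realize this step is to estimate the vertex normal as an average of the unit normals of the adjacent triangular patches and to use the angle between the normals of neighbouring track points as the curvature-deviation measure, interpolating extra points where that angle exceeds TH. The sketch below does exactly that; the unweighted normal average, the angle-based deviation and the fixed number of inserted points are assumptions rather than the patent's exact formulas.

    # Hypothetical sketch of step S4 (assumed formulas): estimate the normal at a
    # mesh vertex from its adjacent triangular patches, then densify a track
    # wherever the normal deviation between neighbouring track points exceeds TH.
    import numpy as np

    def vertex_normal(v, ring):
        """ring: list of (p, q) vertex pairs, one per triangle adjacent to v,
        ordered so that (p - v) x (q - v) points outward from the face."""
        v = np.asarray(v, float)
        n = np.zeros(3)
        for p, q in ring:
            face_n = np.cross(np.asarray(p, float) - v, np.asarray(q, float) - v)
            norm = np.linalg.norm(face_n)
            if norm > 0.0:
                n += face_n / norm                 # accumulate unit normals
        length = np.linalg.norm(n)
        return n / length if length > 0.0 else n

    def densify_track(points, normals, t_h, n_insert=2):
        """Insert n_insert evenly spaced points between neighbours whose normal
        deviation (angle between estimated normals, in radians) exceeds t_h."""
        out = [np.asarray(points[0], float)]
        for p0, n0, p1, n1 in zip(points, normals, points[1:], normals[1:]):
            p0, p1 = np.asarray(p0, float), np.asarray(p1, float)
            deviation = np.arccos(np.clip(np.dot(n0, n1), -1.0, 1.0))
            if deviation > t_h:
                for k in range(1, n_insert + 1):
                    t = k / (n_insert + 1)
                    out.append((1.0 - t) * p0 + t * p1)
            out.append(p1)
        return out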
The simplified robot facial operation track shown in FIG. 5b is analyzed in combination with step S4: the black arrows in the figure indicate the direction of the operation tracks; the yellow points are track points, and more track points appear where the curvature is larger; the forehead operation track is th, the operation tracks of the three parts of the left cheek are tlf1, tlf2 and tlf3, and the operation tracks of the three parts of the right cheek are trf1, trf2 and trf3.
S5: the robot facial operation track is planned based on the nearest-neighbor algorithm; the optimal facial operation starting point and track execution order are selected to obtain the shortest robot facial massage path, giving the simplified robot facial operation track shown in FIG. 6.
For a facial operation robot based on facial and skin features that implements the track planning method, as shown in FIG. 6, the robot takes the forehead-region track point H1 as the starting point of the facial massage, performs the lifting massage of the forehead area while moving to track point H2, and thus completes the 1st segment of the facial massage; it then moves to track point CL11 and follows track tlf1 to track point CL12, completing the 2nd segment of the facial massage; by analogy, it completes the massage of the areas of tracks tlf2, tlf3, trf1, trf2 and trf3 in sequence, i.e. the 3rd to 7th segments of the facial massage.
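The ordering in step S5 can be reproduced with a greedy nearest-neighbour pass over the track segments: from the current end position, pick the unvisited track whose nearer endpoint is closest, enter it at that endpoint and leave it at the other. The sketch below assumes each track is reduced to its two endpoints; the data layout and the function name are assumptions for illustration only.

    # Hypothetical sketch of step S5 (assumed data layout): order the per-region
    # tracks with a greedy nearest-neighbour rule so that the robot's idle moves
    # between consecutive tracks stay short.
    import numpy as np

    def nearest_neighbour_order(tracks, start):
        """tracks: dict name -> (endpoint_a, endpoint_b); start: initial robot
        position. Returns the visiting order as (name, entry, exit) tuples."""
        pos = np.asarray(start, float)
        remaining = {name: (np.asarray(a, float), np.asarray(b, float))
                     for name, (a, b) in tracks.items()}
        order = []
        while remaining:
            # pick the track whose nearer endpoint is closest to the current position
            name, (a, b) = min(
                remaining.items(),
                key=lambda item: min(np.linalg.norm(pos - item[1][0]),
                                     np.linalg.norm(pos - item[1][1])))
            entry, exit_ = (a, b) if np.linalg.norm(pos - a) <= np.linalg.norm(pos - b) else (b, a)
            order.append((name, entry, exit_))
            pos = exit_
            del remaining[name]
        return order

With track endpoints that reflect the geometry of FIG. 5b and H1 as the starting point, such a greedy pass produces an ordering of the same kind as the one described above.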
The above examples merely illustrate the invention clearly and do not limit its embodiments. It will be apparent to those skilled in the art that other variations and modifications can be made on the basis of the foregoing description, and it is neither necessary nor possible to enumerate all embodiments exhaustively here. Any modification, equivalent replacement or improvement made within the spirit and principle of the invention shall fall within the protection scope of the claims of the invention.

Claims (4)

1. A facial operation robot track planning method based on facial and skin features, characterized in that the track planning of the facial operation robot is based on skin features comprising a facial safe-operation area and skin anisotropy, so that the robot avoids the key parts of the eyes, nose and mouth during operation, the operation stability is improved, impact is reduced and the facial-area operation is completed efficiently and coherently, the method specifically comprising the following steps:
(1) dividing the face into operation and non-operation areas based on adult head and face dimensions:
the method for dividing the size data of the female head and face items into face operation areas comprises the following steps: the distance from the head vertex V to the glabellar point G, namely the distance from the head vertex V to the glabellar point G; the height of the head and the ear, namely the distance between the anterior auricular point N and the vertex V of the head; mouth width, i.e., the lateral distance between the left mouth corner point M1 and the right mouth corner point M2; extracting key part feature points of the face, and defining the areas where the three key parts of the eyes, the nose and the mouth are located as non-operation areas and the rest areas of the face as operation areas including forehead areas and left and right cheek areas by taking an eyebrow center point EB and a nose tip point S as reference points and through the dimension of transverse width and mouth width and the dimension of the distance from a head vertex V of longitudinal height to an glabellar point G;
(2) comprehensively analyzing the two skin tension lines of the facial skin, the Langer's lines and the wrinkle lines, and designing robot facial operation tracks for the facial operation areas at different positions;
(3) obtaining the track curve of the facial operation robot by combining an iso-planar method with a bisection method;
the distance between the intersection points obtained by the iso-planar method alone is not uniform, and if the intersection points on each section line are simply connected in sequence the track bends sharply; the intersection-point data are therefore processed with a successive bisection method to improve the bending of the resulting operation track, specifically as follows:
the forehead and the left and right cheek areas are each sectioned by the iso-planar method to obtain section lines: a set of section planes S = {S1, S2, …, Si, …, Sm} parallel to the X-Z plane is defined along the Y direction, where m is the total number of section planes and the offset between adjacent section planes is the line spacing L; assuming that a section plane Si intersects the curved facial skin surface and that the resulting intersection line contains n intersection points, all triangular patches of the facial model that intersect the section plane S are traversed, the positional relationship between each triangular patch and the section plane S is judged, and the different intersection cases are analyzed to obtain the intersection points PC of the operation area;
the obtained intersection points are connected in bubble-sorted order to obtain the operation-area section lines C, which conform to the operation direction:
C = {C1, C2, …, Ci, …, Cm} (i = 1, 2, …, m);
to improve the bending of the operation track, the intersection-point data are processed by successive bisection: for a section line Ci, all its intersection points form a closed interval Bi1; the midpoint of Bi1 is taken as a boundary value of a new interval Bi2, whose midpoint is computed in turn; the midpoint of each interval is calculated cyclically in this way until the interval Bij is reached (j being the total number of bisection iterations), giving the midpoint value corresponding to each intersection-point interval Bij; the intersection points at corresponding positions on every section line are then connected in sequence according to Bij, and where an interval contains no intersection point at that position, its midpoint value is connected instead;
(4) Optimizing the operation track of the robot based on the curvature of the facial skin;
(5) selecting the optimal track-point sequence of the facial operation areas based on a nearest-neighbor algorithm, realizing the robot track planning, and obtaining the operation track of the facial operation robot.
2. The facial operation robot track planning method based on facial and skin features according to claim 1, characterized in that: in step (4), for the cheekbone and cheek regions, where the normal curvature of the skin changes greatly and the end-effector normal vector changes greatly during robot operation, the operation tracks of the different facial areas are optimized based on the facial curvature, track points being interpolated where the curvature changes sharply so as to increase the number of track points.
3. The facial operation robot track planning method based on facial and skin features according to claim 1, characterized in that: the order in which the massage operations of the different facial areas are performed in step (5) directly affects the total length of the robot's facial massage path; selecting the optimal track-point sequence of the facial operation areas with the nearest-neighbor algorithm realizes the robot track planning, reduces the robot's idle travel and improves the operation efficiency.
4. A facial operation robot based on facial and skin features, implementing the track planning method according to any one of claims 1 to 3.
CN202010457760.0A 2020-05-26 2020-05-26 Facial operation robot track planning method based on facial and skin features Active CN111571611B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010457760.0A CN111571611B (en) 2020-05-26 2020-05-26 Facial operation robot track planning method based on facial and skin features

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010457760.0A CN111571611B (en) 2020-05-26 2020-05-26 Facial operation robot track planning method based on facial and skin features

Publications (2)

Publication Number Publication Date
CN111571611A CN111571611A (en) 2020-08-25
CN111571611B (en) 2021-09-21

Family

ID=72117852

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010457760.0A Active CN111571611B (en) 2020-05-26 2020-05-26 Facial operation robot track planning method based on facial and skin features

Country Status (1)

Country Link
CN (1) CN111571611B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115741732A (en) * 2022-11-15 2023-03-07 福州大学 Interactive path planning and motion control method of massage robot
CN115847449A (en) * 2023-02-22 2023-03-28 深圳市德壹医疗科技有限公司 Intelligent massage method, device and equipment based on path planning and storage medium

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8270693B2 (en) * 2007-04-03 2012-09-18 M2S Anatomical visualization and measurement system
US9607347B1 (en) * 2015-09-04 2017-03-28 Qiang Li Systems and methods of 3D scanning and robotic application of cosmetics to human
CN105740781B (en) * 2016-01-25 2020-05-19 北京眼神智能科技有限公司 Three-dimensional human face living body detection method and device
US10022192B1 (en) * 2017-06-23 2018-07-17 Auris Health, Inc. Automatically-initialized robotic systems for navigation of luminal networks
US10434658B2 (en) * 2017-11-29 2019-10-08 Midea Group Co., Ltd. Massage robot using machine vision

Patent Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2008041457A1 (en) * 2006-09-29 2008-04-10 Waseda University Massage robot, control program therefor, and robot for specifying portion of human body
JP2015186568A (en) * 2014-03-13 2015-10-29 パナソニックIpマネジメント株式会社 massage device and massage method
CN105574484A (en) * 2014-11-04 2016-05-11 三星电子株式会社 Electronic device, and method for analyzing face information in electronic device
CN204700888U (en) * 2015-06-12 2015-10-14 唐志伟 A kind of novel feeding robot
CN105913416A (en) * 2016-04-06 2016-08-31 中南大学 Method for automatically segmenting three-dimensional human face model area
FR3067957A1 (en) * 2017-06-26 2018-12-28 Capsix ROBOT DISPLACEMENT MANAGEMENT DEVICE AND ASSOCIATED CARE ROBOT
CN209221348U (en) * 2018-06-28 2019-08-09 诺思科技有限公司 Artificial intelligence robot for skin treating
KR101950148B1 (en) * 2018-09-13 2019-02-19 주식회사 바디프랜드 Method and apparatus for providing massage for stimulating physeal plate for promoting growth
CN110900597A (en) * 2018-09-14 2020-03-24 上海沃迪智能装备股份有限公司 Jumping motion track planning method with settable vertical height and corner height
CN109940626A (en) * 2019-01-23 2019-06-28 浙江大学城市学院 A kind of thrush robot system and its control method based on robot vision
CN109938842A (en) * 2019-04-18 2019-06-28 王小丽 Facial surgical placement air navigation aid and device
CN110472605A (en) * 2019-08-21 2019-11-19 广州纳丽生物科技有限公司 A kind of skin problem diagnostic method based on deep learning face subregion

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Research and implementation of a robot virtual simulation and remote control system; 翟敬梅; Computer Engineering and Applications (《计算机工程与应用》); 2016-07-31; full text *

Also Published As

Publication number Publication date
CN111571611A (en) 2020-08-25

Similar Documents

Publication Publication Date Title
CN111571611B (en) Facial operation robot track planning method based on facial and skin features
He et al. A wireless BCI and BMI system for wearable robots
KR102006019B1 (en) Method, system and non-transitory computer-readable recording medium for providing result information about a procedure
Muret et al. Beyond body maps: Information content of specific body parts is distributed across the somatosensory homunculus
CN203723608U (en) Comb
Pun et al. Brain-computer interaction research at the Computer Vision and Multimedia Laboratory, University of Geneva
US20230200908A1 (en) Computing platform for improved aesthetic outcomes and patient safety in medical and surgical cosmetic procedures
CN110782528A (en) Free deformation human face shaping simulation method, system and storage medium
CN113362924A (en) Medical big data-based facial paralysis rehabilitation task auxiliary generation method and system
CN114842522A (en) Artificial intelligence auxiliary evaluation method applied to beauty treatment
JP3138305U (en) Facial mask
Sudarsanan et al. Controlling a robot using brain waves
CN204377989U (en) A kind of perpendicular oval mouth mask
CN112329640A (en) Facial nerve palsy disease rehabilitation detection system based on eye muscle movement analysis
CN113221958A (en) Method, device and system for matching massage track with massage area and storage medium
HELFRICH Human-like trajectory planning for a motorized upper-limb exoskeleton
Wendel et al. Measuring tissue thicknesses of the human head using centralized and normalized trajectories
US20230200907A1 (en) Computing platform for improved aesthetic outcomes and patient safety in medical and surgical cosmetic procedures
JP3138269U (en) Facial mask
US11497418B2 (en) System and method for neuroactivity detection in infants
WO2022173056A1 (en) Skin state inference method, device, program, system, trained model generation method, and trained model
Liu et al. Single-trial discrimination of EEG signals for stroke patients: a general multi-way analysis
CN117636446B (en) Face acupoint positioning method, acupuncture robot and storage medium
CN203416918U (en) Comb
CN214340712U (en) Hairline designer applied to line hair work

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant