CN114581413A - Image processing working method and system applied to hair planting - Google Patents
- Publication number
- CN114581413A (application number CN202210217598.4A)
- Authority
- CN
- China
- Prior art keywords
- hair follicle
- hair
- result
- identification
- image
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0012—Biomedical image inspection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/21—Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
- G06F18/214—Generating training patterns; Bootstrap methods, e.g. bagging or boosting
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/24—Classification techniques
- G06F18/243—Classification techniques relating to the number of classes
- G06F18/24317—Piecewise classification, i.e. whereby each classification requires several discriminant rules
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/90—Dynamic range modification of images or parts thereof
- G06T5/92—Dynamic range modification of images or parts thereof based on global image properties
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20081—Training; Learning
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30004—Biomedical image processing
- G06T2207/30088—Skin; Dermal
Abstract
The invention discloses an image processing method and system applied to hair transplantation. The method comprises: obtaining a first predetermined target sampling area; performing multi-angle image acquisition of the first predetermined target sampling area to obtain a first image set; performing image preprocessing on the first image set to obtain a first image preprocessing result, and performing boundary feature identification planning on the first image preprocessing result to obtain first boundary information; performing hair follicle feature identification on the first image preprocessing result to obtain a first hair follicle density grading identification result; inputting the first image preprocessing result into a hair follicle state evaluation model to obtain a first hair follicle state distribution identification result; and performing hair transplantation processing according to the first boundary information, the first hair follicle density grading identification result, and the first hair follicle state distribution identification result. This solves the technical problem that, in the prior art, image acquisition and analysis of hair follicles is not performed accurately enough to support intelligent hair transplantation assistance.
Description
Technical Field
The invention relates to the field of electric digital data processing, and in particular to an image processing method and system applied to hair transplantation.
Background
With the development of society, the pace of life has accelerated and academic and work pressures have increased, so that hair loss has become a trouble for many people. Survey data show that the number of people suffering from hair loss in China exceeds 250 million, meaning that on average one in six people is affected. Many people are distressed by hair loss, and hair transplantation has gradually become a new concept entering people's minds. Hair transplantation refers to the process of cutting the tissue around a hair follicle, separating the follicle from its original position on the scalp, and transplanting it to the required position.
However, in the course of implementing the technical solution of the present invention, the inventors found that the prior art has at least the following technical problem:
the prior art lacks accurate image acquisition and analysis of hair follicles, and therefore cannot provide intelligent hair transplantation assistance.
Disclosure of Invention
The present application solves the technical problem that, in the prior art, image acquisition and analysis of hair follicles is not performed accurately and intelligent hair transplantation assistance is therefore lacking. It achieves the technical effect of performing image acquisition and analysis of hair follicles, accurately identifying hair follicle states, and intelligently assisting hair transplantation.
In view of the above problems, the present application provides an image processing method and system applied to hair transplantation.
In a first aspect, the present application provides an image processing method applied to hair transplantation, the method is applied to an intelligent hair transplantation area segmentation system, the intelligent hair transplantation area segmentation system is in communication connection with an image acquisition device, and the method includes: obtaining a first predetermined target sampling area; acquiring a multi-angle image of the first preset target sampling area through the image acquisition device to obtain a first image set; performing image preprocessing on the first image set to obtain a first image preprocessing result, and performing boundary feature identification planning on the first image preprocessing result to obtain first boundary information; performing hair follicle characteristic identification through the first image preprocessing result to obtain a first hair follicle density grading identification result; inputting the first image preprocessing result into a hair follicle state evaluation model to obtain a first hair follicle state distribution identification result; and performing hair planting treatment according to the first boundary information, the first hair follicle density grading identification result and the first hair follicle state distribution identification result.
In a second aspect, the present application provides an image processing system applied to hair transplantation, the system comprising: a first obtaining unit for obtaining a first predetermined target sampling area; a second obtaining unit, configured to perform multi-angle image acquisition of the first predetermined target sampling area through an image acquisition device to obtain a first image set; a third obtaining unit, configured to perform image preprocessing on the first image set to obtain a first image preprocessing result, and to perform boundary feature identification planning on the first image preprocessing result to obtain first boundary information; a fourth obtaining unit, configured to perform hair follicle feature identification on the first image preprocessing result to obtain a first hair follicle density grading identification result; a fifth obtaining unit, configured to input the first image preprocessing result into a hair follicle state evaluation model to obtain a first hair follicle state distribution identification result; and a first processing unit, configured to perform hair transplantation processing according to the first boundary information, the first hair follicle density grading identification result, and the first hair follicle state distribution identification result.
In a third aspect, the present invention provides an electronic device, comprising a memory, a processor and a computer program stored in the memory and executable on the processor, wherein the processor implements the steps of the method according to any one of the first aspect when executing the program.
In a fourth aspect, the present application provides a computer program product comprising a computer program and/or instructions which, when executed by a processor, performs the steps of the method of any one of the first aspect.
One or more technical solutions provided in the present application have at least the following technical effects or advantages:
the image acquisition device is used to acquire multi-angle images of a first predetermined target sampling area. The acquired images are preprocessed, including position association, sharpness processing, and brightness and contrast adjustment, to obtain a first image preprocessing result, and boundary feature identification planning is performed on the first image preprocessing result to obtain first boundary information. Hair follicle feature identification is performed on the first image preprocessing result to obtain a first hair follicle density grading identification result. The first image preprocessing result is input into a hair follicle state evaluation model to obtain a first hair follicle state distribution identification result. Hair transplantation processing is then performed according to the first boundary information, the first hair follicle density grading identification result, and the first hair follicle state distribution identification result, so that the overall evaluation and identification of the target object is more accurate. This achieves the technical effect of acquiring and analyzing hair follicle images, accurately identifying hair follicle states, and intelligently assisting hair transplantation.
The foregoing is only an overview of the technical solutions of the present application. To make the technical means of the present application clearer, so that it can be implemented according to the content of the description, and to make the above and other objects, features, and advantages of the present application more comprehensible, a detailed description of the present application is given below.
Drawings
Fig. 1 is a schematic flowchart of an image processing method applied to hair transplantation according to the present application;
fig. 2 is a schematic flowchart of obtaining second boundary information in the image processing method applied to hair transplantation according to the present application;
fig. 3 is a schematic flowchart of obtaining third boundary information in the image processing method applied to hair transplantation according to the present application;
fig. 4 is a schematic flowchart of constructing a hair follicle state evaluation model in the image processing method applied to hair transplantation according to the present application;
FIG. 5 is a schematic structural diagram of an image processing system applied to hair transplantation according to the present application;
fig. 6 is a schematic structural diagram of an electronic device according to the present application.
Description of reference numerals: a first obtaining unit 11, a second obtaining unit 12, a third obtaining unit 13, a fourth obtaining unit 14, a fifth obtaining unit 15, a first processing unit 16, an electronic device 50, a processor 51, a memory 52, an input device 53, an output device 54.
Detailed Description
The present application solves the technical problem that, in the prior art, image acquisition and analysis of hair follicles is not performed accurately and intelligent hair transplantation assistance is therefore lacking, achieving the technical effect of acquiring and analyzing hair follicle images, accurately identifying hair follicle states, and intelligently assisting hair transplantation. Embodiments of the present application are described below with reference to the accompanying drawings. As can be appreciated by those skilled in the art, with the development of technology and the emergence of new scenarios, the technical solutions provided in the present application are also applicable to similar technical problems.
The terms "first," "second," and the like in the description and in the claims of the present application and in the above-described drawings are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the terms so used are interchangeable under appropriate circumstances and are merely descriptive of the various embodiments of the application and how objects of the same nature can be distinguished. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of elements is not necessarily limited to those elements, but may include other elements not expressly listed or inherent to such process, method, article, or apparatus.
Summary of the application
Hair transplantation is a process of taking out part of the healthy hair follicle tissue using microsurgery techniques, carefully screening, matching, processing, and culturing it, and then artistically transplanting it to the bald areas of the patient in the natural hair growth direction. The prior art lacks accurate image acquisition and analysis of hair follicles and therefore cannot provide intelligent hair transplantation assistance; this is the technical problem addressed here.
In view of the above technical problems, the technical solution provided by the present application has the following general idea:
the application provides an image processing working method applied to hair planting, the method is applied to an intelligent hair planting region segmentation system, the intelligent hair planting region segmentation system is in communication connection with an image acquisition device, and the method comprises the following steps: obtaining a first predetermined target sampling area; acquiring a multi-angle image of the first preset target sampling area through the image acquisition device to obtain a first image set; performing image preprocessing on the first image set to obtain a first image preprocessing result, and performing boundary feature identification planning on the first image preprocessing result to obtain first boundary information; performing hair follicle characteristic identification through the first image preprocessing result to obtain a first hair follicle density grading identification result; inputting the first image preprocessing result into a hair follicle state evaluation model to obtain a first hair follicle state distribution identification result; and performing hair planting treatment according to the first boundary information, the first hair follicle density grading identification result and the first hair follicle state distribution identification result.
Having thus described the general principles of the present application, various non-limiting embodiments thereof will now be described in detail with reference to the accompanying drawings.
Example one
As shown in fig. 1, the present application provides an image processing method applied to hair transplantation, where the method is applied to an intelligent hair transplantation area segmentation system, the intelligent hair transplantation area segmentation system is in communication connection with an image acquisition device, and the method includes:
step S100: obtaining a first predetermined target sampling area;
step S200: acquiring a multi-angle image of the first preset target sampling area through the image acquisition device to obtain a first image set;
Specifically, the intelligent hair transplantation area segmentation system is a system that performs intelligent image acquisition, analysis, area division, and evaluation identification for a patient before hair transplantation. The image acquisition device is camera equipment capable of high-definition, multi-angle image acquisition, generating an image set; the intelligent hair transplantation area segmentation system is in communication connection with the image acquisition device, so that mutual information transmission and interaction can be performed. The first predetermined target sampling area is the sampling area where the user needs hair transplantation, namely the target transplantation area and its related areas, generally concentrated in the hairline area, the temple areas, and the front-top area of the head.
Further, after the first predetermined target sampling area is selected, the image acquisition device is initialized and configured. After parameter initialization is completed, the acquisition position points of the image acquisition device are distributed according to the area characteristics of the selected first target sampling area, and multi-angle image acquisition of the first predetermined target sampling area is performed based on the distribution result of the acquisition position points to obtain the first image set. Positioning marks are set among the images acquired from multiple angles so that the images can be mutually associated and analyzed as a set. The first image set is obtained according to the acquisition result. By selecting the first predetermined target sampling area and performing multi-angle image acquisition through the image acquisition device, data support is provided for subsequent accurate hair transplantation analysis.
Step S300: performing image preprocessing on the first image set to obtain a first image preprocessing result, and performing boundary feature identification planning on the first image preprocessing result to obtain first boundary information;
specifically, image preprocessing is a process of eliminating information in an image that is irrelevant to the target and recovering real, useful information; it includes enhancing the detectability of the target information and simplifying the data, thereby improving the reliability of feature extraction, image segmentation, matching, and identification. In particular, the image preprocessing includes brightness adjustment, contrast optimization, and position association matching of the images. After the images in the first image set are preprocessed, the images are associated with one another and the hair follicle features are optimized. Feature recognition of hair follicles is then performed on the preprocessed images, and the boundary of the user's existing hair follicles is defined according to the identified number, quality, and density of the follicles, yielding the first boundary information. The first boundary information serves as reference identification data for subsequent hair transplantation; obtaining it by analyzing and processing the images enables accurate image identification later and provides accurate data support for the hair transplantation processing.
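The patent does not specify concrete preprocessing algorithms. As a minimal illustrative sketch, a percentile-based contrast stretch can stand in for the described brightness and contrast optimization; the function name and the percentile thresholds are assumptions, not part of the disclosure:

```python
import numpy as np

def preprocess(image, low_pct=2.0, high_pct=98.0):
    """Percentile contrast stretch: map the [low_pct, high_pct] percentile
    range of the input intensities to the full 0-255 range, clipping outliers."""
    lo, hi = np.percentile(image, [low_pct, high_pct])
    if hi <= lo:  # flat image: nothing to stretch
        return image.astype(np.uint8)
    stretched = (image.astype(np.float64) - lo) / (hi - lo)
    return (np.clip(stretched, 0.0, 1.0) * 255).astype(np.uint8)
```

A real system would likely combine such a step with denoising and geometric registration of the multi-angle views before follicle feature recognition.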
Step S400: performing hair follicle characteristic identification through the first image preprocessing result to obtain a first hair follicle density grading identification result;
step S500: inputting the first image preprocessing result into a hair follicle state evaluation model to obtain a first hair follicle state distribution identification result;
specifically, feature recognition of hair follicles is performed according to the first image preprocessing result, and the first hair follicle density grading identification result is obtained from the recognition result. Feature recognition of hair follicles covers follicles in all states, including both healthy and atrophied follicles; closed follicles are outside the scope of the feature recognition. Regional hair follicle density grading identification is performed based on the number of follicles per unit area, and the first hair follicle density grading identification result is obtained from the overall density grading result.
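The text grades density by follicle count per unit area but gives no concrete rule. A sketch under the assumption that detected follicle coordinates (in millimetres) are counted on a grid of square patches, with purely illustrative grade thresholds:

```python
import numpy as np

def density_grades(follicle_xy, region_size, cell=10.0):
    """Count follicles per (cell x cell) mm patch and grade each patch.
    Illustrative grades, in follicles per cm^2: 0 = sparse (< 40),
    1 = normal (40-70), 2 = dense (> 70)."""
    n = int(np.ceil(region_size / cell))
    counts = np.zeros((n, n), dtype=int)
    for x, y in follicle_xy:
        i = min(int(y // cell), n - 1)  # row index of the patch
        j = min(int(x // cell), n - 1)  # column index of the patch
        counts[i, j] += 1
    per_cm2 = counts / (cell * cell / 100.0)  # convert mm^2 patch to cm^2
    grades = np.digitize(per_cm2, bins=[40, 70])
    return per_cm2, grades
```

The patch size and thresholds would in practice come from the system's calibration, not from this sketch.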
Further, the hair follicle state evaluation model is a model for analyzing and evaluating the state of hair follicles. Its evaluation parameters include the diameter of the follicle, the number of hairs growing from the follicle, and the health state of the follicle (such as closure, atrophy, and oil condition). The first image preprocessing result is input into the hair follicle state evaluation model to obtain the state evaluation result of each follicle in the user's first predetermined target sampling area, namely the first hair follicle state distribution identification result. The evaluation and acquisition of the first hair follicle density grading identification result and the first hair follicle state distribution identification result provide accurate data support for the subsequent accurate hair transplantation processing.
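The evaluation model itself is characterized only by its input parameters (diameter, hairs per follicle, health indicators). A rule-based stand-in, with thresholds that are purely illustrative assumptions, might classify a single follicle as follows:

```python
def follicle_state(diameter_mm, hairs, closed, atrophied):
    """Toy stand-in for the evaluation model: map the parameters the text
    lists onto one of three state labels. All thresholds are illustrative."""
    if closed:
        return "closed"
    if atrophied or hairs == 0 or diameter_mm < 0.05:
        return "sub-healthy"
    return "healthy"
```

The disclosed model is presumably learned from training data rather than hand-coded rules; this merely illustrates the input/output shape.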
Step S600: and performing hair planting treatment according to the first boundary information, the first hair follicle density grading identification result and the first hair follicle state distribution identification result.
Specifically, the first hair follicle density grading identification result and the first hair follicle state distribution identification result are integrated, that is, the distribution density of the hair follicles and the current state of each follicle are combined. Closed follicles in a sub-healthy state are given a grading identification. Density analysis grading of the follicles in a healthy state yields a first grading parameter, and density analysis grading of the follicles in healthy and sub-healthy states together yields a second grading parameter. Priority hair transplantation identification for the user is performed based on the first grading parameter, the second grading parameter, and the first boundary information, and the subsequent hair transplantation processing is then carried out. By obtaining the first boundary information, the first hair follicle density grading identification result, and the first hair follicle state distribution identification result, and optimizing the evaluation through the grading parameters, the reference data for hair transplantation processing becomes more intelligent and accurate, achieving a better technical effect of assisting hair transplantation.
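The two grading parameters can be read as two densities per region: healthy follicles only, and healthy plus sub-healthy. A sketch of how regions might then be ranked for transplantation priority (sparsest by the second parameter first); the dictionary field names are hypothetical:

```python
def prioritize_regions(regions):
    """regions: {name: {'healthy': count, 'sub_healthy': count, 'area_cm2': a}}.
    Rank regions for transplantation, sparsest first: primarily by the second
    grading parameter (healthy + sub-healthy density), breaking ties by the
    first grading parameter (healthy density only)."""
    def second_param(r):
        return (r["healthy"] + r["sub_healthy"]) / r["area_cm2"]
    def first_param(r):
        return r["healthy"] / r["area_cm2"]
    ranked = sorted(regions.items(),
                    key=lambda kv: (second_param(kv[1]), first_param(kv[1])))
    return [name for name, _ in ranked]
```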
Further, as shown in fig. 2, step S600 of the present application further includes:
step S610: acquiring a second preset target sampling area, and acquiring multi-angle images of the second preset target sampling area through the image acquisition device to acquire a second image set;
step S620: obtaining a second hair follicle density grading identification result and a second hair follicle state distribution identification result according to the second image set;
step S630: obtaining a transplant hair follicle identification result according to the second hair follicle density grading identification result and the second hair follicle state distribution identification result;
step S640: correcting the first boundary information according to the transplanted hair follicle identification result and the first hair follicle density grading identification result to obtain second boundary information;
step S650: and performing hair planting processing according to the second boundary information.
Specifically, the second predetermined target sampling area is the area used for hair follicle extraction, generally the occipital region. The sampling area with abundant hair follicles, that is, the second predetermined target sampling area, is determined according to an estimate of the user's actual hair follicle distribution. Multi-angle image acquisition of the second predetermined target sampling area is performed through the image acquisition device to obtain the second image set, adaptive image enhancement and contrast adjustment are applied to the second image set, and the second hair follicle density grading identification result and the second hair follicle state distribution identification result are obtained according to the adjustment result.
Further, the second hair follicle density grading identification result and the second hair follicle state distribution identification result are evaluated in the same manner as their first counterparts and are not expanded on here. The transplantable follicles in the second predetermined target sampling area are evaluated and identified according to the obtained second hair follicle density grading identification result and second hair follicle state distribution identification result, yielding the transplantation hair follicle identification result. The first boundary information is corrected through the transplantation hair follicle identification result and the density grading identification of the existing follicles, namely the first hair follicle density grading identification result; the actually attainable boundary of the user is re-determined, the contour boundary under the expected transplantation density is estimated, and the user's hair transplantation processing is performed through the second boundary. By acquiring images of the user's donor region, the information used for the transplantation evaluation and boundary determination becomes more accurate, laying a solid foundation for subsequent accurate hair transplantation assistance.
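One way to read the boundary correction: the number of transplantable grafts from the donor region, together with the density already present inside the recipient boundary, bounds the area that can be filled to a target density. A hedged sketch of that arithmetic (the formula is an interpretation, not the disclosed method):

```python
def achievable_area_cm2(transplantable_count, target_density_per_cm2,
                        existing_density_per_cm2=0.0):
    """Estimate the recipient area that can reach the target density, given
    the donor yield and the density of follicles already inside the boundary."""
    deficit = target_density_per_cm2 - existing_density_per_cm2
    if deficit <= 0:  # already at or above target: grafts impose no limit
        return float("inf")
    return transplantable_count / deficit
```

For example, 2750 transplantable grafts at a target of 55 follicles/cm² over bare skin would cover about 50 cm², so the corrected second boundary would enclose at most that area.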
Further, as shown in fig. 3, step S650 of the present application further includes:
step S651: obtaining first expected boundary information of a user, wherein the first expected boundary information has a first weight;
step S652: obtaining first expected density information of the user, wherein the first expected density information has a second weight;
step S653: judging the first expected boundary information and the first expected density information simultaneously according to the transplanted hair follicle identification result and the first hair follicle density grading identification result to obtain a first judgment result;
step S654: when the first judgment result is that the first expected boundary information and the first expected density information cannot be simultaneously realized, adjusting the first expected boundary information according to the first weight and the second weight to obtain third boundary information;
step S655: and performing hair planting processing through the third boundary information.
Specifically, the first expected boundary information is the user's expected hair-transplant boundary, that is, the boundary position of the hairline the user plans according to his or her actual face shape and hair volume. The first expected density information is the expected hair follicle density required for different areas; generally speaking, the densities at the vertex, the two sides, and the back of the head are not the same. The density at a normal hairline position is generally below 70 follicles/cm², and the conventional planting density is generally 55-60 follicles/cm².
Further, the first expected density may vary from region to region. For a typical edge region, that is, the outermost region of the hairline, the expected density is generally 50 follicles/cm². The middle zone is the region between the primary zone and the edge zone, with an expected density of generally 55 follicles/cm². The transition zone, where planted hair blends into naturally growing hair, generally has an expected density of 55-60 follicles/cm². The first expected boundary information and the first expected density information each carry their own weight value, and these weights are the basis for weighing, selecting, and adjusting density and boundary.
Whether the first expected boundary information and the first expected density information can be realized simultaneously is judged according to the transplanted hair follicle identification result and the first hair follicle density grading identification result; that is, whether the user's transplantable follicles can satisfy both expectations at once. When the first judgment result is that both can be realized simultaneously, the user's expected density and expected boundary are kept, and hair-transplant processing is performed after fine adjustment. When the first judgment result is that the first expected boundary information and the first expected density information cannot be realized simultaneously, the boundary position and planting density are adaptively adjusted according to the distribution of the user's first weight and second weight, the third boundary information is obtained from the adjustment result, and hair-transplant processing is performed based on the third boundary information. By obtaining the user's expected boundary and expected density and combining them with the user's weight values, the reference boundary for hair transplantation better fits the user's actual needs, achieving the technical effect of more intelligent and accurate reference data for subsequent hair-transplant assistance.
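One way to read the weighted adjustment of steps S653-S654 is as a constrained trade-off: if the expected boundary area times the expected density exceeds the harvestable follicle count, the lower-weighted goal gives way more. The scheme below is a hypothetical illustration only — the patent does not disclose the actual adjustment formula, and all names and numbers are invented.

```python
def adjust_plan(desired_area_cm2, desired_density, available_follicles,
                boundary_weight, density_weight):
    """If area * density exceeds the harvestable follicle budget, shrink
    area and density in inverse proportion to their weights so the
    adjusted plan exactly fits the budget (hypothetical scheme)."""
    needed = desired_area_cm2 * desired_density
    if needed <= available_follicles:
        return desired_area_cm2, desired_density   # both goals achievable
    overrun = needed / available_follicles         # e.g. 1.25 = 25% over budget
    total = boundary_weight + density_weight
    # area_factor * density_factor == 1 / overrun, so the new plan fits;
    # the higher-weighted quantity is scaled back less
    area_factor = overrun ** (-(density_weight / total))
    density_factor = overrun ** (-(boundary_weight / total))
    return desired_area_cm2 * area_factor, desired_density * density_factor

# boundary weighted 3:1 over density -> the boundary gives way less
area, density = adjust_plan(40, 60, 1920, boundary_weight=3, density_weight=1)
```

With 40 cm² at 60 follicles/cm² desired but only 1920 follicles harvestable, the adjusted plan keeps the area close to 40 cm² and lowers the density instead, reflecting the user's weights.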
Further, as shown in fig. 4, step S500 of the present application further includes:
step S510: acquiring hair follicle images through the big data to obtain a third image set;
step S520: performing hair follicle state identification on the third image set to obtain a third image identification set;
step S530: constructing the hair follicle state evaluation model by taking the third image set as input data in training data and taking the third image identification set as supervision data;
step S540: and when the output test result of the hair follicle state evaluation model meets a first preset threshold value, finishing the construction of the hair follicle state evaluation model.
Specifically, hair follicle images are collected by means of big data, and the third image set, that is, a set of follicle images, is obtained from licensed data streams. Image identification is then performed on the third image set: the follicle state is labeled by machine equipment combined with manual annotation, yielding the third image identification set. The labeled content includes the follicle health state, follicle diameter, number of hairs, hair quality parameters, and the like. Taking the third image set as training data and the labeled content as supervision data, the hair follicle state evaluation model is constructed based on a neural network. A first predetermined threshold is set as the expected threshold of the hair follicle state evaluation model, that is, a constraint parameter on its output. A certain amount of test data is input into the model, and when the output of the hair follicle state evaluation model satisfies the first predetermined threshold, training is finished and construction of the hair follicle state evaluation model is complete. Through this training and construction, the judgment of the user's follicle state is more accurate, laying the foundation for accurate image-processing identification and for providing accurate auxiliary hair-transplant parameters.
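As a toy stand-in for the supervised training loop of steps S530-S540, the sketch below trains a perceptron on a tiny hand-labeled follicle dataset and stops once accuracy on the supervised set reaches the predetermined threshold. The features, labels, function name, and threshold value are all hypothetical; the patent's actual neural-network architecture is not disclosed.

```python
def train_follicle_classifier(samples, labels, threshold=0.9, epochs=500):
    """Train a perceptron on supervised (features, label) pairs; stop as
    soon as accuracy on the supervised set meets the preset threshold,
    mirroring the 'first predetermined threshold' stopping rule."""
    w = [0.0, 0.0]
    b = 0.0
    lr = 0.5

    def predict(x):
        return 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0

    acc = 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            err = y - predict(x)        # perceptron update on mistakes only
            w[0] += lr * err * x[0]
            w[1] += lr * err * x[1]
            b += lr * err
        acc = sum(predict(x) == y for x, y in zip(samples, labels)) / len(labels)
        if acc >= threshold:            # output satisfies the threshold: stop
            break
    return predict, acc

# hypothetical features: (follicle diameter in mm, hairs per follicle);
# label 1 = healthy, 0 = withered/closed
samples = [(0.08, 3), (0.07, 2), (0.03, 1), (0.02, 1)]
labels = [1, 1, 0, 0]
predict, accuracy = train_follicle_classifier(samples, labels)
```

The data here is linearly separable, so the perceptron is guaranteed to reach the threshold; a real follicle-state model would of course need a far richer network and dataset.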
Further, step S440 of the present application further includes:
step S441: obtaining a first predetermined separation interval;
step S442: performing density threshold setting distribution on the first preset separation interval to obtain a first density threshold setting distribution result;
step S443: performing hair follicle characteristic identification through the first image preprocessing result to obtain a first hair follicle actual quantity distribution result;
step S444: and comparing and identifying the actual number distribution result of the first hair follicle with the first density threshold value setting distribution result to obtain a first hair follicle density grading identification result.
Specifically, the first predetermined divided sections are preset according to the differences between areas, and different divided sections have different density thresholds. For example, if the vertex has a first density threshold, the hairline area a second density threshold, and the temple corner a third density threshold, the first density threshold setting distribution result is obtained based on the first, second, and third density thresholds.
Features of hair follicles in different states are extracted to obtain follicle characteristic parameters; feature identification is performed on the first image preprocessing result based on these parameters, yielding the user's actual follicle number distribution, that is, the first hair follicle actual quantity distribution result. This result is compared against the first density threshold setting distribution result: the larger the density gap, the higher the density grading level, and the first hair follicle density grading identification result is obtained from the set of comparison results. Grading the follicle density of each position area provides data support for subsequent judgment of transplantation priority and for matching a more suitable follicle distribution scheme.
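Steps S442-S444 amount to comparing a measured per-region density against a preset per-region threshold and grading by the size of the gap. A minimal sketch, with hypothetical threshold values and grade bands (the patent does not give concrete numbers):

```python
# Hypothetical per-region density thresholds (follicles per cm^2)
DENSITY_THRESHOLDS = {"vertex": 65, "hairline": 55, "temple": 50}

def grade_density(region, actual_density):
    """Grade by how far the measured density falls below the region's
    threshold: the larger the gap, the higher the grading level."""
    gap = DENSITY_THRESHOLDS[region] - actual_density
    if gap <= 0:
        return 0    # at or above threshold: no transplant priority
    if gap <= 10:
        return 1    # mild thinning
    if gap <= 25:
        return 2    # moderate
    return 3        # severe

grades = [grade_density("vertex", 70), grade_density("vertex", 45),
          grade_density("hairline", 50), grade_density("temple", 20)]
```

Regions with higher grades would then be given higher priority when the transplantation scheme is matched.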
Further, step S630 of the present application further includes:
step S631: obtaining hair follicle diameter distribution parameters according to the first preset separation interval;
step S632: and obtaining a transplant hair follicle identification result according to the hair follicle diameter distribution parameter, the second hair follicle density grading identification result and the second hair follicle state distribution identification result.
Specifically, the differences between the first predetermined divided sections define the follicle-diameter distribution required in each section: generally speaking, the closer a section is to the hairline border, the smaller the required follicle diameter, while sections close to the normal growth area require larger diameters. According to the required diameter distribution parameters and the corresponding quantities, the second hair follicle density grading identification result is identified, that is, the follicles in the follicle sampling area are marked: the corresponding donor follicles from the occipital region are matched to their corresponding transplantation positions, and the transplant hair follicle identification result is obtained from the identification result. By constraining the number and thickness of follicles required in the area to be transplanted, and then selecting and marking correspondingly from the harvestable area, follicle transplantation gains stronger correspondence and pertinence, making subsequent hair transplantation more natural and scientific.
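The selection logic described above — finer occipital follicles assigned to the hairline edge, thicker ones to zones near normally growing hair — can be sketched as a greedy match. The zone names, diameter limits, and counts below are hypothetical illustration values:

```python
def match_donor_follicles(required_zones, donor_diameters):
    """Greedily assign harvested donor follicles to target zones,
    filling the zones with the tightest diameter limits first so the
    finest follicles end up at the hairline edge (illustrative scheme)."""
    zones = sorted(required_zones, key=lambda z: z["max_diameter"])
    pool = sorted(donor_diameters)   # finest follicles first
    plan = {}
    for zone in zones:
        take = [d for d in pool if d <= zone["max_diameter"]][: zone["count"]]
        plan[zone["name"]] = take
        for d in take:
            pool.remove(d)           # each follicle is used only once
    return plan

zones = [
    {"name": "edge",   "max_diameter": 0.06, "count": 2},  # hairline border
    {"name": "middle", "max_diameter": 0.09, "count": 2},  # nearer normal growth
]
donors = [0.05, 0.08, 0.06, 0.09]    # harvested occipital diameters, mm
plan = match_donor_follicles(zones, donors)
```

The edge zone receives the two finest follicles and the middle zone the thicker remainder, matching the correspondence the text describes.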
Further, performing hair follicle state identification on the third image set to obtain a third image identification set, step S520 of the present application, further includes:
step S521: acquiring diameter identification parameters of hair follicles, and performing hair follicle diameter identification of the third image set through the diameter identification parameters to acquire a first identification result;
step S522: acquiring a health identification parameter of a hair follicle, and performing hair follicle health identification of the third image set through the health identification parameter to acquire a second identification result;
step S523: obtaining hair quality analysis parameters, and performing hair quality damage identification on the third image set through the hair quality analysis parameters to obtain a third identification result;
step S524: and obtaining the third image identification set according to the first identification result, the second identification result and the third identification result.
Specifically, diameter identification is performed on the images in the third image set, and the true follicle diameter is restored according to each image's shooting distance and shooting magnification, giving the diameter identification result of the follicles in the third image set, that is, the first identification result. According to the follicle health indicators reflected in the images, such as withered, closed, or blocked follicles, health-state identification is performed to obtain the second identification result. The hair quality in the third image set is analyzed, and damaged hair is identified according to the number, thickness, and hardness of hairs from the same follicle, giving the third identification result. The third image identification set is obtained from the first, second, and third identification results, providing data support for a subsequently accurate hair follicle state evaluation model.
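The "diameter restoration" step — recovering a follicle's true diameter from its measured pixel width given the shooting magnification — reduces to a scale conversion. A minimal sketch under assumed calibration values (the sensor scale and magnification figures are hypothetical):

```python
def restore_diameter(pixel_width, pixels_per_mm_at_1x, magnification):
    """Convert a measured pixel width back to a physical diameter in mm,
    undoing the optical magnification applied at capture time."""
    return pixel_width / (pixels_per_mm_at_1x * magnification)

# A follicle imaged 48 px wide at 10x, on a sensor that resolves
# 60 px per mm at 1x magnification
diameter_mm = restore_diameter(48, 60, 10)
```

In practice the base scale `pixels_per_mm_at_1x` would come from calibrating the image acquisition device at a known shooting distance.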
In summary, the image processing method and system applied to hair transplantation provided by the present application have the following technical effects:
1. The image acquisition device performs multi-angle image acquisition on the first predetermined target sampling area; the acquisition result is preprocessed, including image position association, sharpness processing, and brightness and contrast adjustment, to obtain the first image preprocessing result, and boundary feature identification planning is performed on it to obtain the first boundary information. Follicle feature identification is performed through the first image preprocessing result to obtain the first hair follicle density grading identification result; the first image preprocessing result is input into the hair follicle state evaluation model to obtain the first hair follicle state distribution identification result. Hair-transplant processing is performed through the first boundary information, the first hair follicle density grading identification result, and the first hair follicle state distribution identification result, so that the overall evaluation and identification of the target object is more accurate, achieving image acquisition and analysis of hair follicles, accurate identification of follicle states, and the technical effect of intelligent hair-transplant assistance.
2. By acquiring images of the user's transplantation region, the user's hair-transplant evaluation uses more accurate boundary information, laying a solid foundation for subsequent accurate hair-transplant assistance.
3. By obtaining the user's expected boundary and expected density and combining them with the user's weight values, the reference boundary for hair transplantation better fits the user's actual needs, achieving the technical effect of more intelligent and accurate reference data for subsequent hair-transplant assistance.
4. Through the training and construction of the hair follicle state evaluation model, the judgment of the user's follicle state is more accurate, laying the foundation for accurate image-processing identification and accurate auxiliary hair-transplant parameters.
5. Through the grading identification of follicle density in each position area, the obtained grading results provide data support for subsequent judgment of transplantation priority and for matching a more suitable follicle distribution scheme.
Example two
Based on the same inventive concept as the image processing working method applied to hair transplantation in the foregoing embodiment, the present invention further provides an image processing working system applied to hair transplantation, as shown in fig. 5, the system includes:
a first obtaining unit 11, wherein the first obtaining unit 11 is used for obtaining a first predetermined target sampling area;
a second obtaining unit 12, wherein the second obtaining unit 12 is configured to perform multi-angle image acquisition on the first predetermined target sampling area through an image acquisition device to obtain a first image set;
a third obtaining unit 13, where the third obtaining unit 13 is configured to perform image preprocessing on the first image set to obtain a first image preprocessing result, and perform boundary feature identification planning on the first image preprocessing result to obtain first boundary information;
a fourth obtaining unit 14, where the fourth obtaining unit 14 is configured to perform feature identification on hair follicles through the first image preprocessing result, and obtain a first hair follicle density classification identification result;
a fifth obtaining unit 15, where the fifth obtaining unit 15 is configured to input the first image preprocessing result into a hair follicle state evaluation model, and obtain a first hair follicle state distribution identification result;
a first processing unit 16, wherein the first processing unit 16 is configured to perform hair transplantation processing according to the first boundary information, the first hair follicle density classification identification result, and the first hair follicle state distribution identification result.
Further, the system further comprises:
a sixth obtaining unit, configured to obtain a second predetermined target sampling area, and perform multi-angle image acquisition on the second predetermined target sampling area through the image acquisition device to obtain a second image set;
a seventh obtaining unit, configured to obtain a second hair follicle density classification identification result and a second hair follicle state distribution identification result according to the second image set;
an eighth obtaining unit, configured to obtain a transplant hair follicle identification result according to the second hair follicle density classification identification result and the second hair follicle state distribution identification result;
a ninth obtaining unit, configured to correct the first boundary information according to the transplanted hair follicle identification result and the first hair follicle density classification identification result, and obtain second boundary information;
and the second processing unit is used for carrying out hair planting processing based on the second boundary information.
Further, the system further comprises:
a tenth obtaining unit, configured to obtain first expected boundary information of a user, wherein the first expected boundary information has a first weight;
an eleventh obtaining unit configured to obtain first desired density information of the user, wherein the first desired density information has a second weight;
a twelfth obtaining unit, configured to judge, according to the transplanted hair follicle identification result and the first hair follicle density classification identification result, whether the first expected boundary information and the first expected density information can be simultaneously realized, and obtain a first judgment result;
a thirteenth obtaining unit, configured to, when the first determination result is that the first expected boundary information and the first expected density information cannot be simultaneously implemented, adjust the first expected boundary information according to the first weight and the second weight, and obtain third boundary information;
and the third processing unit is used for carrying out hair planting processing through the third boundary information.
Further, the system further comprises:
a fifteenth obtaining unit, configured to perform hair follicle image acquisition through the big data to obtain a third image set;
a sixteenth obtaining unit, configured to perform hair follicle status identification on the third image set to obtain a third image identification set;
a first constructing unit, configured to construct the hair follicle state assessment model by using the third image set as input data in training data and using the third image identification set as supervision data;
a seventeenth obtaining unit, configured to complete construction of the hair follicle state assessment model when an output test result of the hair follicle state assessment model satisfies a first predetermined threshold.
Further, the system further comprises:
an eighteenth obtaining unit for obtaining a first predetermined divided section;
a nineteenth obtaining unit, configured to perform density threshold setting distribution on the first predetermined partition interval, and obtain a first density threshold setting distribution result;
a twentieth obtaining unit, configured to perform feature identification on hair follicles through the first image preprocessing result, and obtain a first hair follicle actual number distribution result;
a twenty-first obtaining unit, configured to compare and identify the actual number distribution result of the first hair follicle with the first density threshold setting distribution result, and obtain a first hair follicle density classification identification result.
Further, the system further comprises:
a twenty-second obtaining unit for obtaining hair follicle diameter distribution parameters from the first predetermined separation zone;
a twenty-third obtaining unit, configured to obtain a transplant hair follicle identification result according to the hair follicle diameter distribution parameter, a second hair follicle density classification identification result, and the second hair follicle state distribution identification result.
Further, the system further comprises:
a twenty-fourth obtaining unit, configured to obtain diameter identification parameters of hair follicles, and perform diameter identification on the hair follicles of the third image set according to the diameter identification parameters, to obtain a first identification result;
a twenty-fifth obtaining unit, configured to obtain a health identification parameter of a hair follicle, perform hair follicle health identification on the third image set according to the health identification parameter, and obtain a second identification result;
a twenty-sixth obtaining unit, configured to obtain a hair quality analysis parameter, perform hair quality damaged identification on the third image set according to the hair quality analysis parameter, and obtain a third identification result;
a twenty-seventh obtaining unit, configured to obtain the third image identifier set according to the first identifier result, the second identifier result, and the third identifier result.
Various changes and specific examples of the image processing method applied to hair transplantation in the first embodiment of fig. 1 are also applicable to the image processing system applied to hair transplantation in the present embodiment, and through the foregoing detailed description of the image processing method applied to hair transplantation, those skilled in the art can clearly know the implementation method of the image processing system applied to hair transplantation in the present embodiment, so for the brevity of the description, detailed descriptions are omitted here.
Exemplary electronic device
The electronic device of the present application is described below with reference to fig. 6.
Fig. 6 illustrates a schematic structural diagram of an electronic device according to the present application.
Based on the inventive concept of the image processing working method applied to hair transplantation in the foregoing embodiment, the present invention also provides an electronic device, described below with reference to fig. 6. The electronic device may be the mobile device itself or a stand-alone device independent of it, on which a computer program is stored which, when executed by a processor, carries out the steps of any of the methods described above.
As shown in fig. 6, the electronic device 50 includes one or more processors 51 and a memory 52.
The processor 51 may be a Central Processing Unit (CPU) or other form of processing unit having data processing capabilities and/or instruction execution capabilities, and may control other components in the electronic device 50 to perform desired functions.
The memory 52 may include one or more computer program products that may include various forms of computer-readable storage media, such as volatile memory and/or non-volatile memory. The volatile memory may include, for example, Random Access Memory (RAM), cache memory (cache), and/or the like. The non-volatile memory may include, for example, Read Only Memory (ROM), hard disk, flash memory, etc. One or more computer program instructions may be stored on the computer-readable storage medium and executed by the processor 51 to implement the methods of the various embodiments of the application described above and/or other desired functions.
In one example, the electronic device 50 may further include: an input device 53 and an output device 54, which are interconnected by a bus system and/or other form of connection mechanism (not shown).
The embodiment of the invention provides an image processing working method applied to hair transplantation, applied to an intelligent hair-transplant region segmentation system in communication connection with an image acquisition device, the method comprising: obtaining a first predetermined target sampling area; performing multi-angle image acquisition on the first predetermined target sampling area through the image acquisition device to obtain a first image set; performing image preprocessing on the first image set to obtain a first image preprocessing result, and performing boundary feature identification planning on the first image preprocessing result to obtain first boundary information; performing hair follicle feature identification through the first image preprocessing result to obtain a first hair follicle density grading identification result; inputting the first image preprocessing result into a hair follicle state evaluation model to obtain a first hair follicle state distribution identification result; and performing hair-transplant processing according to the first boundary information, the first hair follicle density grading identification result, and the first hair follicle state distribution identification result. This solves the technical problem that the prior art lacks accurate image acquisition and analysis processing of hair follicles for intelligent hair-transplant assistance, and achieves the technical effects of acquiring and analyzing follicle images, accurately identifying follicle states, and providing intelligent hair-transplant assistance.
Through the above description of the embodiments, those skilled in the art will clearly understand that the present application can be implemented by software plus necessary general-purpose hardware, and certainly can also be implemented by special-purpose hardware including special-purpose integrated circuits, special-purpose CPUs, special-purpose memories, special-purpose components and the like. Generally, functions performed by computer programs can be easily implemented by corresponding hardware, and specific hardware structures for implementing the same functions may be various, such as analog circuits, digital circuits, or dedicated circuits. However, for the present application, the implementation of a software program is more preferable. Based on such understanding, the technical solutions of the present application may be substantially embodied in the form of a software product, which is stored in a readable storage medium, such as a floppy disk, a usb disk, a removable hard disk, a ROM, a RAM, a magnetic disk, or an optical disk of a computer, and includes several instructions for causing a computer device to execute the method according to the embodiments of the present application.
In the above embodiments, all or part of the implementation may be realized by software, hardware, firmware, or any combination thereof. When implemented in software, may be implemented in whole or in part in the form of a computer program product.
The computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, the procedures or functions described in accordance with the present application are generated, in whole or in part. The computer may be a general purpose computer, a special purpose computer, a network of computers, or other programmable device. The computer instructions may be stored on or transmitted from a computer-readable storage medium to another computer-readable storage medium, which may be magnetic (e.g., floppy disks, hard disks, tapes), optical (e.g., DVDs), or semiconductor (e.g., Solid State Disks (SSDs)), among others.
It should be appreciated that reference throughout this specification to "one embodiment" or "an embodiment" means that a particular feature, structure or characteristic described in connection with the embodiment is included in at least one embodiment of the present application. Thus, the appearances of the phrases "in one embodiment" or "in an embodiment" in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. It should be understood that, in the various embodiments of the present application, the sequence numbers of the above-mentioned processes do not mean the execution sequence, and the execution sequence of the processes should be determined by the functions and the inherent logic, and should not constitute any limitation to the implementation process of the present application.
Additionally, the terms "system" and "network" are often used interchangeably herein. The term "and/or" herein is merely an association describing an associated object, meaning that three relationships may exist, e.g., a and/or B, may mean: a exists alone, A and B exist simultaneously, and B exists alone. In addition, the character "/" herein generally indicates that the former and latter related objects are in an "or" relationship.
It should be understood that in this application, "B corresponding to A" means that B is associated with A, from which B can be determined. It should also be understood that determining B from a does not mean determining B from a alone, but may be determined from a and/or other information.
Those of ordinary skill in the art will appreciate that the various illustrative components and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or combinations of both, and that the components and steps of the various examples have been described above generally in terms of their functionality in order to clearly illustrate this interchangeability of hardware and software. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In short, the above description is only a preferred embodiment of the present disclosure, and is not intended to limit the scope of the present disclosure. Any modification, equivalent replacement, improvement and the like made within the spirit and principle of the present application shall be included in the protection scope of the present application.
Claims (10)
1. An image processing working method applied to hair planting is characterized in that the method is applied to an intelligent hair planting region segmentation system, the intelligent hair planting region segmentation system is in communication connection with an image acquisition device, and the method comprises the following steps:
obtaining a first predetermined target sampling area;
acquiring a multi-angle image of the first preset target sampling area through the image acquisition device to obtain a first image set;
performing image preprocessing on the first image set to obtain a first image preprocessing result, and performing boundary feature identification planning on the first image preprocessing result to obtain first boundary information;
performing hair follicle feature identification on the first image preprocessing result to obtain a first hair follicle density grading identification result;
inputting the first image preprocessing result into a hair follicle state evaluation model to obtain a first hair follicle state distribution identification result;
and performing hair planting processing according to the first boundary information, the first hair follicle density grading identification result and the first hair follicle state distribution identification result.
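As a concrete illustration of the claimed flow, the steps of claim 1 can be sketched in Python. This is a minimal, hypothetical sketch only: the claim fixes no concrete preprocessing, boundary-planning, or grading algorithm, so every function body below (pixel averaging, fixed thresholds, the placeholder state-model output) is an assumption for illustration, not the patent's method.

```python
from dataclasses import dataclass

@dataclass
class PlantingPlan:
    boundary: list            # first boundary information
    density_grade: str        # first hair follicle density grading result
    state_distribution: dict  # first hair follicle state distribution result

def preprocess(images):
    # Stand-in for image preprocessing: fuse the multi-angle shots
    # by averaging them pixel-wise into one map.
    n = len(images)
    rows, cols = len(images[0]), len(images[0][0])
    return [[sum(img[r][c] for img in images) / n for c in range(cols)]
            for r in range(rows)]

def boundary_info(pre):
    # Stand-in for boundary feature identification planning:
    # collect the coordinates of "follicle" pixels above a fixed cut.
    return [(r, c) for r, row in enumerate(pre)
            for c, v in enumerate(row) if v > 0.5]

def density_grade(pre):
    # Stand-in for density grading: fraction of follicle pixels.
    flat = [v for row in pre for v in row]
    ratio = sum(v > 0.5 for v in flat) / len(flat)
    return "dense" if ratio > 0.3 else "sparse"

def plan(images):
    pre = preprocess(images)
    # The hair follicle state evaluation model of the claim is
    # replaced here by a fixed placeholder distribution.
    return PlantingPlan(boundary_info(pre), density_grade(pre),
                        {"healthy": 1.0})

# Two toy 2x2 "angles" of the first predetermined target sampling area.
angle_1 = [[0.9, 0.1], [0.8, 0.2]]
angle_2 = [[0.7, 0.3], [0.6, 0.0]]
result = plan([angle_1, angle_2])
```

The point of the sketch is the data flow (multi-angle capture, one shared preprocessing result feeding boundary, density, and state branches), not the stand-in arithmetic.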
2. The method of claim 1, wherein the method further comprises:
obtaining a second predetermined target sampling area, and performing multi-angle image acquisition of the second predetermined target sampling area through the image acquisition device to obtain a second image set;
obtaining a second hair follicle density grading identification result and a second hair follicle state distribution identification result according to the second image set;
obtaining a transplant hair follicle identification result according to the second hair follicle density grading identification result and the second hair follicle state distribution identification result;
correcting the first boundary information according to the transplanted hair follicle identification result and the first hair follicle density grading identification result to obtain second boundary information;
and performing hair planting processing based on the second boundary information.
3. The method of claim 2, wherein the method further comprises:
obtaining first expected boundary information of a user, wherein the first expected boundary information has a first weight;
obtaining first expected density information of the user, wherein the first expected density information has a second weight;
judging, according to the transplanted hair follicle identification result and the first hair follicle density grading identification result, whether the first expected boundary information and the first expected density information can be simultaneously realized, to obtain a first judgment result;
when the first judgment result is that the first expected boundary information and the first expected density information cannot be simultaneously realized, adjusting the first expected boundary information according to the first weight and the second weight to obtain third boundary information;
and performing hair planting processing through the third boundary information.
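The weight-based adjustment of claim 3 can be illustrated with a small sketch. The function `reconcile`, its feasibility test (expected area times expected density against the available donor supply), and all parameter names are hypothetical; the claim only requires that, when both expectations cannot be realized at once, the expectation is adjusted according to the two weights, and this is just one plausible policy.

```python
def reconcile(expected_area, expected_density, available_follicles,
              boundary_weight, density_weight):
    # Feasibility test (assumed): the plan needs area * density grafts,
    # which must not exceed the transplantable follicles identified.
    needed = expected_area * expected_density
    if needed <= available_follicles:
        return expected_area, expected_density  # both expectations hold
    if boundary_weight >= density_weight:
        # Boundary expectation dominates: keep the area, relax density.
        return expected_area, available_follicles / expected_area
    # Density expectation dominates: keep density, shrink the boundary
    # area -- the shrunken area plays the role of the "third boundary".
    return available_follicles / expected_density, expected_density
```

For example, with 4,500 transplantable follicles, a 100 cm2 expected area at 60 follicles/cm2 is infeasible; with weights 0.4 (boundary) and 0.6 (density) the sketch shrinks the area to 75 cm2 while keeping the expected density.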
4. The method of claim 1, wherein the method further comprises:
acquiring hair follicle images through big data to obtain a third image set;
performing hair follicle state identification on the third image set to obtain a third image identification set;
constructing the hair follicle state evaluation model by taking the third image set as input data in training data and taking the third image identification set as supervision data;
and when the output test result of the hair follicle state evaluation model meets a first preset threshold value, completing the construction of the hair follicle state evaluation model.
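The model construction of claim 4 — train on the third image set with the third image identification set as supervision data, and stop once a held-out metric meets the first preset threshold — can be sketched with a deliberately toy "model" (a learned brightness cutoff). The real evaluation model is not specified in the claim, so everything below, including the data split and the accuracy metric, is an illustrative assumption.

```python
def fit_and_validate(images, labels, threshold=0.75):
    # Toy "evaluation model": call a follicle image healthy (label 1)
    # when its mean brightness exceeds a cutoff learned from the
    # supervised half of the data.
    mean = lambda xs: sum(xs) / len(xs)
    split = len(images) // 2
    train = list(zip(images[:split], labels[:split]))
    test = list(zip(images[split:], labels[split:]))
    pos = [mean(x) for x, y in train if y == 1]
    neg = [mean(x) for x, y in train if y == 0]
    cut = (mean(pos) + mean(neg)) / 2
    acc = mean([float((mean(x) > cut) == bool(y)) for x, y in test])
    # Construction is "finished" once accuracy meets the preset threshold.
    return cut, acc, acc >= threshold

# Toy data: each "image" is a list of pixel brightnesses, label 1 = healthy.
images = [[0.9, 0.8], [0.85, 0.9], [0.2, 0.1],
          [0.15, 0.2], [0.95, 0.9], [0.1, 0.05]]
labels = [1, 1, 0, 0, 1, 0]
cut, acc, finished = fit_and_validate(images, labels)
```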
5. The method of claim 2, wherein the method further comprises:
obtaining a first predetermined separation interval;
performing density threshold setting and distribution over the first predetermined separation interval to obtain a first density threshold distribution result;
performing hair follicle feature identification on the first image preprocessing result to obtain a first actual hair follicle quantity distribution result;
and comparing the first actual hair follicle quantity distribution result with the first density threshold distribution result to obtain the first hair follicle density grading identification result.
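The comparison step of claim 5 amounts to grading each separation sub-interval by comparing its actual follicle count with a preset density threshold. A minimal sketch follows; the grade names and ratio cut-offs are assumptions, not values from the patent.

```python
def grade_density(actual_counts, thresholds):
    # Compare each sub-interval's actual follicle count with its
    # preset density threshold and emit a grade per interval.
    grades = []
    for actual, limit in zip(actual_counts, thresholds):
        ratio = actual / limit
        if ratio >= 1.0:
            grades.append("high")
        elif ratio >= 0.5:
            grades.append("medium")
        else:
            grades.append("low")
    return grades
```

For instance, `grade_density([80, 45, 10], [60, 60, 60])` grades three sub-intervals of the separation interval as high, medium, and low density respectively.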
6. The method of claim 5, wherein the method further comprises:
obtaining hair follicle diameter distribution parameters according to the first predetermined separation interval;
and obtaining a transplant hair follicle identification result according to the hair follicle diameter distribution parameters, the second hair follicle density grading identification result and the second hair follicle state distribution identification result.
7. The method of claim 4, wherein performing hair follicle state identification on the third image set to obtain the third image identification set further comprises:
obtaining diameter identification parameters of hair follicles, and performing hair follicle diameter identification on the third image set through the diameter identification parameters to obtain a first identification result;
obtaining health identification parameters of hair follicles, and performing hair follicle health identification on the third image set through the health identification parameters to obtain a second identification result;
obtaining hair quality analysis parameters, and performing hair quality damage identification on the third image set through the hair quality analysis parameters to obtain a third identification result;
and obtaining the third image identification set according to the first identification result, the second identification result and the third identification result.
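Claim 7 merges three per-follicle identification results (diameter, health, hair-quality damage) into the entries of the third image identification set. A minimal sketch of one way to combine them; the diameter range and score thresholds are assumptions for illustration, not values from the patent.

```python
def combine_identifications(diameter_mm, health_score, damage_score):
    # Merge the first (diameter), second (health) and third
    # (hair-quality damage) identification results into one label.
    ok_diameter = 0.05 <= diameter_mm <= 0.12  # assumed plausible range
    ok_health = health_score >= 0.6            # assumed cutoff
    ok_quality = damage_score <= 0.3           # assumed cutoff
    return "usable" if (ok_diameter and ok_health and ok_quality) else "excluded"
```

A follicle fails the combined identification as soon as any one of the three results falls outside its assumed acceptable range.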
8. An image processing working system applied to hair planting, characterized in that the system comprises:
a first obtaining unit for obtaining a first predetermined target sampling region;
a second obtaining unit, configured to perform multi-angle image acquisition of the first predetermined target sampling area through an image acquisition device, to obtain a first image set;
a third obtaining unit, configured to perform image preprocessing on the first image set to obtain a first image preprocessing result, and perform boundary feature identification planning on the first image preprocessing result to obtain first boundary information;
a fourth obtaining unit, configured to perform hair follicle feature identification through the first image preprocessing result to obtain a first hair follicle density grading identification result;
a fifth obtaining unit, configured to input the first image preprocessing result into a hair follicle state evaluation model, and obtain a first hair follicle state distribution identification result;
a first processing unit, configured to perform hair planting processing according to the first boundary information, the first hair follicle density grading identification result and the first hair follicle state distribution identification result.
9. An electronic device, comprising a processor and a memory, wherein the memory is used for storing a computer program, and the processor is used for calling and executing the computer program to implement the method of any one of claims 1 to 7.
10. A computer program product comprising a computer program and/or instructions, characterized in that the computer program and/or instructions, when executed by a processor, implement the steps of the method of any one of claims 1 to 7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210217598.4A CN114581413A (en) | 2022-03-07 | 2022-03-07 | Image processing working method and system applied to hair planting |
Publications (1)
Publication Number | Publication Date |
---|---|
CN114581413A true CN114581413A (en) | 2022-06-03 |
Family
ID=81774260
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202210217598.4A Pending CN114581413A (en) | 2022-03-07 | 2022-03-07 | Image processing working method and system applied to hair planting |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114581413A (en) |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114972307A (en) * | 2022-06-20 | 2022-08-30 | 重庆智泊特机器人有限公司 | Hair follicle automatic identification method and system based on deep learning and hair transplanting robot |
CN115082475A (en) * | 2022-08-22 | 2022-09-20 | 张家港大裕橡胶制品有限公司 | Pollution detection method and system in rubber glove production process |
CN115590584A (en) * | 2022-09-06 | 2023-01-13 | 汕头大学 | Hair follicle hair taking control method and system based on mechanical arm |
CN116747019A (en) * | 2023-08-11 | 2023-09-15 | 北京碧莲盛不剃发植发医疗美容门诊部有限责任公司 | Automatic hair taking controller and method without shaving and planting hair |
CN117934582A (en) * | 2024-01-24 | 2024-04-26 | 大麦毛发医疗(深圳)集团股份有限公司 | Method and system for calculating area of hair follicle planting pen planting area |
Citations (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102693352A (en) * | 2006-10-05 | 2012-09-26 | 修复型机器人公司 | Follicular unit transplantation planner and methods of its use |
CN103517687A (en) * | 2011-05-18 | 2014-01-15 | 修复型机器人公司 | Systems and methods for selecting a desired quantity of follicular units |
US20160253799A1 (en) * | 2013-11-01 | 2016-09-01 | The Florida International University Board Of Trustees | Context Based Algorithmic Framework for Identifying and Classifying Embedded Images of Follicle Units |
CN109452959A (en) * | 2018-11-27 | 2019-03-12 | 王鹏君 | A kind of method and device of seamless Multi-layer technology |
CN109938844A (en) * | 2014-07-31 | 2019-06-28 | 修复型机器人公司 | System and method for generating hair transplantation position |
CN111755097A (en) * | 2020-07-06 | 2020-10-09 | 南方医科大学南方医院 | Method and system for automatically calculating hair follicle planting amount |
US20210059754A1 (en) * | 2019-08-28 | 2021-03-04 | TrichoLAB GmbH | Hair transplant planning system |
WO2021059279A1 (en) * | 2019-09-26 | 2021-04-01 | Spider Medical Ltd. | An automated system and a method for performing hair restoration |
CN112914516A (en) * | 2021-03-25 | 2021-06-08 | 王宏鑫 | Intelligent detection method for head hair planting area and head auxiliary detection system |
CN113081262A (en) * | 2021-06-09 | 2021-07-09 | 南京新生医疗科技有限公司 | Method and system for intelligently planning hair transplanting area at fixed point |
CN113100937A (en) * | 2021-06-16 | 2021-07-13 | 南京新生医疗科技有限公司 | Hair transplant density determination method and system based on intelligent comparison |
CN113627425A (en) * | 2021-07-16 | 2021-11-09 | 汕头大学 | Hair follicle identification and extraction method and system based on neural network model |
CN215227528U (en) * | 2021-03-25 | 2021-12-21 | 王宏鑫 | Auxiliary detection system for hair transplanting area of head and hair transplanting lens |
Non-Patent Citations (1)
Title |
---|
朱世伟 (Zhu Shiwei): "Analysis of follicular unit and single-follicle transplantation for repairing hairline defects", 《中国医疗美容》 (China Medical Cosmetology) * |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN114581413A (en) | Image processing working method and system applied to hair planting | |
CN113081262B (en) | Method and system for intelligently planning hair transplanting area at fixed point | |
CN103077529B (en) | Based on the plant leaf blade characteristic analysis system of image scanning | |
CN101506825B (en) | System and method for classifying follicular units | |
Singh et al. | Detection and classification of plant leaf diseases in image processing using MATLAB | |
US11471218B2 (en) | Hair transplant planning system | |
CN111524080A (en) | Face skin feature identification method, terminal and computer equipment | |
US20150379350A1 (en) | Method and system for dividing plant organ point cloud | |
JP5102302B2 (en) | Data processing method, apparatus and computer program | |
CN111931811A (en) | Calculation method based on super-pixel image similarity | |
CN110660070A (en) | Rice vein image extraction method and device | |
CN105844534A (en) | Automatic cow body condition scoring method and scoring device | |
CN112200154A (en) | Face recognition method and device for mask, electronic equipment and storage medium | |
CN110021019B (en) | AI-assisted hair thickness distribution analysis method for AGA clinical image | |
CN110033448B (en) | AI-assisted male baldness Hamilton grading prediction analysis method for AGA clinical image | |
DE102014224656A1 (en) | Method and device for segmenting a medical examination subject with quantitative MR imaging methods | |
CN103440672A (en) | Flowering plant flower image division extracting method | |
Lai et al. | Effective segmentation for dental X-ray images using texture-based fuzzy inference system | |
CN109409182A (en) | Embryo's automatic identifying method based on image procossing | |
Tang et al. | Leaf extraction from complicated background | |
Hitimana et al. | Automatic estimation of live coffee leaf infection based on image processing techniques | |
WO2012013186A2 (en) | Method and device for determining eye torsion | |
CN113658129A (en) | Position extraction method combining visual saliency and line segment strength | |
CN110415246A (en) | A kind of analysis method of stomach fat ingredient | |
CN114862799B (en) | Full-automatic brain volume segmentation method for FLAIR-MRI sequence |
Legal Events

Date | Code | Title | Description
---|---|---|---
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
2023-05-31 | TA01 | Transfer of patent application right | Address after: Unit C1, 1st Floor, Building B4-3, No. 60 Huayuangang Road, Huangpu District, Shanghai, 200001; Applicant after: Shanghai Chenxi Medical Beauty Clinic Co., Ltd. Address before: 346 Zhongshan North Road, Gulou District, Nanjing, Jiangsu, 210000; Applicant before: NANJING XINSHENG MEDICAL TECHNOLOGY Co., Ltd.
2023-08-11 | AD01 | Patent right deemed abandoned | Effective date of abandoning: 2023-08-11