CN115969511B - Dehairing instrument control method, device, equipment and storage medium based on identity recognition - Google Patents
- Publication number
- CN115969511B CN115969511B CN202310108312.3A CN202310108312A CN115969511B CN 115969511 B CN115969511 B CN 115969511B CN 202310108312 A CN202310108312 A CN 202310108312A CN 115969511 B CN115969511 B CN 115969511B
- Authority
- CN
- China
- Prior art keywords
- analysis result
- image
- picture
- skin color
- image analysis
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Landscapes
- Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
- Measuring And Recording Apparatus For Diagnosis (AREA)
- Image Analysis (AREA)
Abstract
The invention relates to the field of data processing and discloses an epilator control method, apparatus, device, and storage medium based on identity recognition, which are used to improve safety when the epilator is controlled and used. The method comprises the following steps: when the epilator enters a powered-on state, capturing an image of the current user through an image acquisition device preset in the epilator to obtain a corresponding picture to be processed; preprocessing the picture to be processed to obtain a picture to be identified; inputting the picture to be identified into a preset image analysis model for image analysis to obtain a corresponding image analysis result; performing skin color analysis on the current user through a spectrum analysis device preset in the epilator and determining a corresponding skin color analysis result; and performing authority analysis on the current user according to the image analysis result and the skin color analysis result, determining an authority analysis result, and controlling the epilator according to the authority analysis result.
Description
Technical Field
The present invention relates to the field of data processing, and in particular, to a method, apparatus, device, and storage medium for controlling an epilator based on identity recognition.
Background
With the development of the cosmetic industry, more and more cosmetic instruments are being developed and optimized, and epilators are devices developed for performing the related cosmetic procedures. In the prior art, the general working principle of an epilator is laser hair removal preceded by a cold compress.
However, existing epilators cannot accurately identify the skin, so the user's skin color cannot be accurately detected. Meanwhile, some specific users cannot use an epilator for physical reasons, yet the identity of the user currently cannot be recognized, so the epilator cannot be controlled according to the user's identity. As a result, user safety when controlling and using an epilator is currently low.
Disclosure of Invention
The invention provides an epilator control method, apparatus, device, and storage medium based on identity recognition, which are used to improve safety when the epilator is controlled and used.
The invention provides an epilator control method based on identity recognition, which comprises the following steps: when the epilator enters a powered-on state, capturing an image of the current user through an image acquisition device preset in the epilator to obtain a corresponding picture to be processed; preprocessing the picture to be processed to obtain a picture to be identified; inputting the picture to be identified into a preset image analysis model for image analysis to obtain a corresponding image analysis result; performing skin color analysis on the current user through a spectrum analysis device preset in the epilator and determining a corresponding skin color analysis result; and performing authority analysis on the current user according to the image analysis result and the skin color analysis result, determining an authority analysis result, and controlling the epilator according to the authority analysis result.
In the invention, preprocessing the picture to be processed to obtain the picture to be identified comprises the following steps:
performing binarization processing on the picture to be processed to obtain a corresponding binarized picture; performing pixel position analysis on the binarized picture and determining the position information corresponding to each pixel in the binarized picture; and performing semantic segmentation on the picture to be processed based on the position information corresponding to each pixel in the binarized picture to obtain the picture to be identified.
In the invention, inputting the picture to be identified into a preset image analysis model for image analysis to obtain a corresponding image analysis result comprises the following steps: inputting the picture to be identified into the image analysis model for feature extraction and determining a corresponding image feature vector; performing gray feature conversion on the image feature vector to obtain a corresponding gray feature vector; performing feature fusion processing on the image feature vector and the gray feature vector to obtain a target fusion vector; and inputting the target fusion vector into the image analysis model for image analysis to obtain a corresponding image analysis result.
In the invention, inputting the target fusion vector into the image analysis model for image analysis to obtain a corresponding image analysis result comprises the following steps: inputting the target fusion vector into the image analysis model and performing Laplacian filtering on the target fusion vector through the image analysis model to obtain an ambiguity evaluation value corresponding to the target fusion vector; and analyzing the ambiguity evaluation value through a preset evaluation value mapping function to obtain a corresponding image analysis result.
In the invention, performing skin color analysis on the current user through a spectrum analysis device preset in the epilator and determining a corresponding skin color analysis result comprises the following steps: emitting light toward the current user through the spectrum analysis device and collecting the reflected light corresponding to the emitted light; performing spectrum analysis on the reflected light to determine a corresponding skin color mapping value; and performing skin color type matching on the skin color mapping value and determining a corresponding skin color analysis result.
In the invention, performing authority analysis on the current user according to the image analysis result and the skin color analysis result, determining the authority analysis result, and controlling the epilator according to the authority analysis result comprises the following steps: performing weight parameter analysis on the image analysis result and the skin color analysis result respectively to obtain a first weight parameter corresponding to the image analysis result and a second weight parameter corresponding to the skin color analysis result; performing fusion score calculation on the image analysis result and the skin color analysis result through the first weight parameter and the second weight parameter and determining a corresponding target fusion score; and performing authority analysis on the current user based on the target fusion score, determining the authority analysis result, and controlling the epilator through the authority analysis result.
In the invention, performing fusion score calculation on the image analysis result and the skin color analysis result through the first weight parameter and the second weight parameter and determining the corresponding target fusion score comprises the following steps: performing first score mapping on the image analysis result to obtain a corresponding first score; performing second score mapping on the skin color analysis result to obtain a corresponding second score; and performing weighted calculation on the first score and the second score through the first weight parameter and the second weight parameter to obtain a corresponding target fusion score.
The invention also provides an epilator control device based on identity recognition, which comprises:
the image acquisition module is used for acquiring an image of a current user through an image acquisition device preset in the dehairing instrument when the dehairing instrument enters a starting state, so as to obtain a corresponding picture to be processed;
the image processing module is used for preprocessing the picture to be processed to obtain a picture to be identified;
the first analysis module is used for inputting the picture to be identified into a preset image analysis model to perform image analysis to obtain a corresponding image analysis result;
the second analysis module is used for carrying out skin color analysis on the current user through a spectrum analysis device preset in the dehairing instrument and determining a corresponding skin color analysis result;
and the permission analysis module is used for performing permission analysis on the current user according to the image analysis result and the skin color analysis result, determining the permission analysis result and controlling the dehairing instrument according to the permission analysis result.
A third aspect of the present invention provides an identification-based depilatory control apparatus, comprising: a memory and at least one processor, the memory having instructions stored therein; the at least one processor invokes the instructions in the memory to cause the identification-based depilatory control apparatus to perform the identification-based depilatory control method described above.
A fourth aspect of the invention provides a computer readable storage medium having instructions stored therein which, when run on a computer, cause the computer to perform the identification-based depilatory control method described above.
According to the technical scheme provided by the invention, when the epilator enters a powered-on state, an image of the current user is captured through the image acquisition device preset in the epilator to obtain a corresponding picture to be processed; the picture to be processed is preprocessed to obtain a picture to be identified; the picture to be identified is input into a preset image analysis model for image analysis to obtain a corresponding image analysis result; skin color analysis is performed on the current user through a spectrum analysis device preset in the epilator and a corresponding skin color analysis result is determined; and authority analysis is performed on the current user according to the image analysis result and the skin color analysis result, the authority analysis result is determined, and the epilator is controlled according to the authority analysis result. Because the authority analysis of the current user uses both the image analysis result and the skin color analysis result, the attributes and authority of the current user can be accurately analyzed, the accuracy of identifying unauthorized users is improved, and the safety of the epilator during control and use is further improved.
Drawings
FIG. 1 is a schematic diagram of an embodiment of a dehairing instrument control method based on identification in an embodiment of the present invention;
FIG. 2 is a flowchart of inputting a picture to be identified into a preset image analysis model for image analysis in an embodiment of the present invention;
FIG. 3 is a flowchart of performing authority analysis on a current user according to an embodiment of the present invention;
FIG. 4 is a schematic view of an embodiment of an identification-based epilator control apparatus in accordance with an embodiment of the present invention;
FIG. 5 is a schematic view of an embodiment of the epilator control apparatus based on identification in an embodiment of the invention.
Detailed Description
The embodiment of the invention provides an epilator control method, apparatus, device, and storage medium based on identity recognition, which are used to improve safety when the epilator is controlled and used. The terms "first," "second," "third," "fourth," and the like in the description, in the claims, and in the above drawings, if any, are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate, such that the embodiments described herein may be implemented in sequences other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed or inherent to such process, method, article, or apparatus.
For ease of understanding, a specific flow of an embodiment of the present invention is described below with reference to fig. 1, where an embodiment of a method for controlling an epilator based on identification in an embodiment of the present invention includes:
s101, when the dehairing instrument enters a starting state, acquiring an image of a current user through an image acquisition device preset in the dehairing instrument to obtain a corresponding picture to be processed;
it will be appreciated that the executing body of the present invention may be an epilator control apparatus based on identification, and may also be a terminal or a server, which is not limited herein. The embodiment of the invention is described by taking a server as an execution main body as an example.
Specifically, when the epilator enters a powered-on state, an image of the current user is captured through the image acquisition device preset in the epilator to obtain a corresponding picture to be processed. Here, the server captures the image of the current user at the position to be treated by controlling the image acquisition device. It should be noted that, to ensure the privacy of the user, the captured image needs to be encrypted during image acquisition so as to further guarantee the security and privacy of the image. Meanwhile, the picture to be processed is a picture of the area the current user intends to depilate.
S102, preprocessing a picture to be processed to obtain a picture to be identified;
the server performs binarization processing on the picture to be processed to obtain a corresponding binarized picture; the server analyzes pixel points of the binarized picture and determines position information corresponding to each pixel point in the binarized picture; and the server performs semantic segmentation on the picture to be processed based on the position information corresponding to each pixel point in the binarized picture to obtain the picture to be identified.
S103, inputting the picture to be identified into a preset image analysis model for image analysis to obtain a corresponding image analysis result;
Specifically, the server inputs the picture to be identified into the image analysis model for image analysis. During image analysis, the server performs feature extraction on the picture to be identified through the feature extraction network of the image analysis model to obtain an image feature vector; the server then performs gray feature conversion on the image feature vector to generate a corresponding gray feature vector; finally, the server performs image analysis through the image analysis model based on the image feature vector and the gray feature vector to generate an image analysis result.
S104, performing skin color analysis on the current user through a spectrum analysis device preset in the dehairing instrument, and determining a corresponding skin color analysis result;
Specifically, skin color analysis is performed on the current user through the spectrum analysis device preset in the epilator: the server emits light toward the current user through the spectrum analysis device while collecting the reflected light, and then performs skin color analysis on the current user based on the reflected light to determine the skin color analysis result.
S105, performing authority analysis on the current user according to the image analysis result and the skin color analysis result, determining the authority analysis result, and controlling the dehairing instrument according to the authority analysis result.
Specifically, the server performs score calculation according to the image analysis result and the skin color analysis result, determines a corresponding target fusion score, then performs authority analysis on the current user through the target fusion score, and determines a corresponding authority analysis result.
In the embodiment of the invention, when the epilator enters a powered-on state, an image of the current user is captured through the image acquisition device preset in the epilator to obtain the corresponding picture to be processed; the picture to be processed is preprocessed to obtain a picture to be identified; the picture to be identified is input into a preset image analysis model for image analysis to obtain a corresponding image analysis result; skin color analysis is performed on the current user through a spectrum analysis device preset in the epilator and a corresponding skin color analysis result is determined; and authority analysis is performed on the current user according to the image analysis result and the skin color analysis result, the authority analysis result is determined, and the epilator is controlled according to the authority analysis result. Because the authority analysis of the current user uses both the image analysis result and the skin color analysis result, the attributes and authority of the current user can be accurately analyzed, the accuracy of identifying unauthorized users is improved, and the safety of the epilator during control and use is further improved.
In an embodiment of the present invention, the process of executing step S102 may specifically include the following steps:
(1) Performing binarization processing on the picture to be processed to obtain a corresponding binarized picture;
(2) Carrying out pixel point location analysis on the binarized picture, and determining the position information corresponding to each pixel point in the binarized picture;
(3) And carrying out semantic segmentation on the picture to be processed based on the position information corresponding to each pixel point in the binarized picture to obtain the picture to be identified.
Specifically, the server performs binarization processing on the picture to be processed through a preset binarization algorithm to obtain a corresponding binarized picture together with the confidence and position information of each pixel in the binarized picture, and determines the attribute information of each pixel based on that confidence and position information; semantic segmentation is then performed on the binarized picture based on the attribute information of each pixel to finally obtain the picture to be identified. The binarization algorithm handles the various conditions that are unfavorable to binarization, so that the semantically segmented binarized picture retains as much information as possible and information loss is reduced.
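The preprocessing chain described above can be sketched as follows. This is a minimal illustration, not the patented algorithm: a fixed threshold stands in for the unspecified preset binarization algorithm, and "semantic segmentation" is reduced to cropping the bounding box of the foreground pixel positions.

```python
# Illustrative sketch only: fixed threshold and bounding-box cropping
# are assumptions standing in for the unspecified patented details.

def binarize(image, threshold=128):
    """Map each pixel of a grayscale image (list of rows) to 0 or 1."""
    return [[1 if px >= threshold else 0 for px in row] for row in image]

def foreground_positions(binary):
    """Collect the (row, col) position of every foreground pixel."""
    return [(r, c)
            for r, row in enumerate(binary)
            for c, px in enumerate(row) if px == 1]

def crop_to_foreground(image, positions):
    """Segment the original image down to the foreground bounding box."""
    if not positions:
        return []
    rows = [r for r, _ in positions]
    cols = [c for _, c in positions]
    return [row[min(cols):max(cols) + 1]
            for row in image[min(rows):max(rows) + 1]]

image = [
    [10,  10,  10,  10],
    [10, 200, 210,  10],
    [10, 220, 230,  10],
    [10,  10,  10,  10],
]
binary = binarize(image)
positions = foreground_positions(binary)
segment = crop_to_foreground(image, positions)
```

The per-pixel position list corresponds to the "position information corresponding to each pixel" of the embodiment; a real implementation would feed those positions into a segmentation model rather than a simple crop.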
In a specific embodiment, the process of executing step S104 may specifically include the following steps:
(1) Transmitting light rays to a current user through a spectrum analysis device, and collecting reflected light rays corresponding to the transmitted light rays;
(2) Performing spectrum analysis on the reflected light rays to determine corresponding skin color mapping values;
(3) And performing skin color type matching on the skin color mapping value, and determining a corresponding skin color analysis result.
Specifically, light is emitted toward the current user through the spectrum analysis device, the reflected light corresponding to the emitted light is collected, spectrum analysis is performed on the reflected light, and the corresponding skin color mapping value is determined. It should be noted that, before performing spectrum analysis on the reflected light, a plurality of sample spectrum data also need to be collected: the server collects the plurality of sample spectrum data, then processes and screens them, and finally determines a corresponding spectrum analysis model; spectrum analysis is then performed on the reflected light through this spectrum analysis model and the corresponding skin color mapping value is determined.
When performing spectrum analysis on the reflected light and determining the skin color mapping value, the server first determines the skin color region in the picture to be processed and determines the shadow probability map and the brightness mapping map corresponding to the skin color region, where the shadow probability map comprises the shadow probability value corresponding to each pixel in the skin color region and the brightness mapping map comprises the brightness mapping value of each pixel in the skin color region. The server then determines the shadow distribution map corresponding to the skin color region according to the shadow probability map and the brightness mapping map, determines the corresponding skin color mapping value, performs skin color type matching on the skin color mapping value, and determines the corresponding skin color analysis result.
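The mapping-and-matching step can be illustrated with a deliberately simplified sketch. The patent does not disclose the mapping formula or the type taxonomy; here a mean reflectance in [0, 1] serves as the hypothetical skin color mapping value and is matched against assumed Fitzpatrick-style bins.

```python
# Hypothetical sketch: both the mean-reflectance mapping and the bin
# boundaries below are illustrative assumptions, not patent values.

SKIN_TYPE_BINS = [  # (lower bound of mean reflectance, type label)
    (0.70, "I"), (0.55, "II"), (0.40, "III"),
    (0.25, "IV"), (0.10, "V"), (0.00, "VI"),
]

def skin_tone_mapping(reflectance_samples):
    """Collapse reflected-light samples into a single mapping value."""
    return sum(reflectance_samples) / len(reflectance_samples)

def match_skin_type(mapping_value):
    """Match the mapping value against the bins (highest bin wins)."""
    for lower, label in SKIN_TYPE_BINS:
        if mapping_value >= lower:
            return label
    return "VI"

samples = [0.42, 0.47, 0.45, 0.44]   # reflectance of collected light
value = skin_tone_mapping(samples)
skin_type = match_skin_type(value)
```

In the embodiment the mapping value would instead come from the trained spectrum analysis model; the bin lookup plays the role of the "skin color type matching" step.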
In a specific embodiment, as shown in fig. 2, the process of performing step S103 may specifically include the following steps:
S201, inputting the picture to be identified into the image analysis model for feature extraction, and determining a corresponding image feature vector;
S202, performing gray feature conversion on the image feature vector to obtain a corresponding gray feature vector;
S203, performing feature fusion processing on the image feature vector and the gray feature vector to obtain a target fusion vector;
S204, inputting the target fusion vector into the image analysis model for image analysis to obtain a corresponding image analysis result.
Specifically, the picture to be identified is input into the image analysis model for feature extraction and the corresponding image feature vector is determined. Here, the server extracts the effective-area image and the pixel color value data of the picture to be identified and performs color block segmentation; it acquires the image data of the effective-area image and subdivides the effective-area image into multiple equal parts to obtain the subdivision areas of the effective-area image; it performs connected-domain confirmation, line segment identification, and line length measurement on the subdivision areas to obtain the image feature data of the subdivision areas; and it performs statistics and combination processing on the image feature data of the subdivision areas to obtain the image feature vector. The server then performs gray feature conversion on the image feature vector to obtain the corresponding gray feature vector, performs feature fusion processing on the image feature vector and the gray feature vector to obtain the target fusion vector, and inputs the target fusion vector into the image analysis model for image analysis, thereby obtaining the corresponding image analysis result.
When the server performs gray feature conversion on the image feature vector, the server extracts the edge of each mark point in the image feature vector and fits the edge into a closed region, namely the center circle of the mark point. Taking the center circle of the mark point as the center, the server determines a plurality of concentric rings within the ring and calculates the characteristic gray value of each concentric ring; the server then performs gray feature conversion according to the characteristic gray values to obtain the corresponding gray feature vector.
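A much-simplified sketch of the gray feature conversion and feature fusion steps (S202 and S203): the mark-point and concentric-ring details are omitted, RGB-like feature triples are converted to gray values with the standard BT.601 luminance weights, and fusion is plain concatenation. All of these simplifications are illustrative assumptions.

```python
# Simplified sketch: BT.601 luma weights and concatenation fusion are
# assumed stand-ins for the patented ring-based conversion and fusion.

def gray_feature(rgb_features):
    """Convert (r, g, b) feature triples to single gray values."""
    return [round(0.299 * r + 0.587 * g + 0.114 * b, 3)
            for r, g, b in rgb_features]

def fuse(image_vec, gray_vec):
    """Feature fusion by concatenating into one target fusion vector."""
    return [v for triple in image_vec for v in triple] + gray_vec

image_vec = [(100, 150, 200), (50, 60, 70)]  # toy image feature vector
gray_vec = gray_feature(image_vec)           # gray feature vector
target = fuse(image_vec, gray_vec)           # target fusion vector
```

Concatenation is the simplest fusion strategy; the embodiment's fusion could just as well be weighted averaging or learned, which the patent leaves unspecified.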
In a specific embodiment, the process of executing step S204 may specifically include the following steps:
(1) Inputting the target fusion vector into the image analysis model, and performing Laplacian filtering on the target fusion vector through the image analysis model to obtain an ambiguity evaluation value corresponding to the target fusion vector;
(2) And analyzing the ambiguity evaluation value through a preset evaluation value mapping function to obtain a corresponding image analysis result.
Specifically, the target fusion vector is input into the image analysis model, and Laplacian filtering is performed on the target fusion vector through the image analysis model to obtain the ambiguity evaluation value corresponding to the target fusion vector. Here, the server constructs a Laplacian pyramid using iterative adaptive filtering and performs iterative adaptive filtering on the target fusion vector input into the Laplacian pyramid: the iterative adaptive filtering performs local frequency estimation on the input image and retains the dominant frequency part of the image, and coherence-based filtering is then performed on the remaining part of the image to obtain the ambiguity evaluation value corresponding to the target fusion vector. Finally, the ambiguity evaluation value is analyzed through a preset evaluation value mapping function to obtain the corresponding image analysis result.
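The ambiguity (blur) evaluation can be illustrated with the common variance-of-Laplacian sharpness measure. This is a stand-in sketch: it does not reproduce the iterative adaptive filtering pyramid of the embodiment, only the idea that a Laplacian response quantifies how much high-frequency detail an image contains.

```python
# Sketch of a blur-evaluation value: variance of the 4-neighbour
# Laplacian response (a common sharpness proxy, assumed here - the
# embodiment's pyramid-based filtering is not reproduced).

def laplacian(image):
    """Apply the 4-neighbour Laplacian kernel to interior pixels."""
    h, w = len(image), len(image[0])
    return [image[r - 1][c] + image[r + 1][c] + image[r][c - 1]
            + image[r][c + 1] - 4 * image[r][c]
            for r in range(1, h - 1) for c in range(1, w - 1)]

def blur_score(image):
    """Variance of the Laplacian response: lower means blurrier."""
    resp = laplacian(image)
    mean = sum(resp) / len(resp)
    return sum((v - mean) ** 2 for v in resp) / len(resp)

sharp = [[0, 0, 255, 255]] * 4   # strong vertical edge
flat = [[5, 5, 5, 5]] * 4        # no detail at all

sharp_score = blur_score(sharp)
flat_score = blur_score(flat)
```

An evaluation-value mapping function would then translate such scores into the final image analysis result, e.g. rejecting frames below a sharpness threshold.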
In a specific embodiment, as shown in fig. 3, the process of executing step S105 may specifically include the following steps:
S301, performing weight parameter analysis on the image analysis result and the skin color analysis result respectively to obtain a first weight parameter corresponding to the image analysis result and a second weight parameter corresponding to the skin color analysis result;
S302, performing fusion score calculation on the image analysis result and the skin color analysis result through the first weight parameter and the second weight parameter, and determining a corresponding target fusion score;
S303, performing authority analysis on the current user based on the target fusion score, determining an authority analysis result, and controlling the epilator through the authority analysis result.
Specifically, the server performs weight parameter analysis on the image analysis result and the skin color analysis result to obtain the first weight parameter corresponding to the image analysis result and the second weight parameter corresponding to the skin color analysis result. Here, the server obtains the image analysis result and the skin color analysis result, in which the sample data carry class labels, and obtains initial weight parameters that include class proxy parameters. The server constructs a similarity loss function over the image analysis result, the skin color analysis result, and the initial weight parameters, where the similarity loss function comprises a decreasing part and an increasing part, both defined in terms of the similarity between the class proxy parameters and the target sample data. The initial weight parameters are continuously adjusted through back propagation until the similarity loss function is minimized, yielding the first weight parameter corresponding to the image analysis result and the second weight parameter corresponding to the skin color analysis result. The server then performs fusion score calculation on the image analysis result and the skin color analysis result through the first weight parameter and the second weight parameter and determines the corresponding target fusion score, performs authority analysis on the current user based on the target fusion score to determine the authority analysis result, and controls the epilator through the authority analysis result.
In a specific embodiment, the process of executing step S302 may specifically include the following steps:
(1) Performing first score mapping on the image analysis result to obtain a corresponding first score;
(2) Performing second score mapping on the skin color analysis result to obtain a corresponding second score;
(3) And carrying out weighted calculation on the first score and the second score through the first weight parameter and the second weight parameter to obtain a corresponding target fusion score.
Specifically, the image analysis result and the skin color analysis result to be mapped are obtained; a first score corresponding to the image analysis result and a second score corresponding to the skin color analysis result are calculated from them according to a first calculation strategy and a second calculation strategy, respectively; and the first score and the second score are then weighted through the first weight parameter and the second weight parameter to obtain the corresponding target fusion score.
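The weighted fusion and permission decision can be sketched as below. The weight values, the score mappings, and the 0.6 permission threshold are illustrative assumptions, not values from the patent.

```python
# Hedged sketch of S302-S303: weights and threshold are assumed values.

def fusion_score(image_score, skin_score, w_image, w_skin):
    """Weighted combination of the two mapped analysis scores."""
    return w_image * image_score + w_skin * skin_score

def permission(image_score, skin_score, w_image=0.6, w_skin=0.4,
               threshold=0.6):
    """Grant use of the epilator only above the fusion threshold."""
    score = fusion_score(image_score, skin_score, w_image, w_skin)
    return "allow" if score >= threshold else "deny"

granted = permission(0.9, 0.7)   # 0.6*0.9 + 0.4*0.7 = 0.82
denied = permission(0.3, 0.5)    # 0.6*0.3 + 0.4*0.5 = 0.38
```

In the embodiment the two weights would come from the back-propagation procedure above rather than being fixed, and the authority result would map to a concrete device control action (enable or lock the epilator).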
The depilatory control method based on identity recognition in the embodiment of the present invention is described above, and the depilatory control device based on identity recognition in the embodiment of the present invention is described below, referring to fig. 4, an embodiment of the depilatory control device based on identity recognition in the embodiment of the present invention includes:
the image acquisition module 401 is configured to acquire an image of a current user through an image acquisition device preset in the dehairing instrument when the dehairing instrument enters a power-on state, so as to obtain a corresponding picture to be processed;
the image processing module 402 is configured to pre-process the to-be-processed picture to obtain a to-be-identified picture;
the first analysis module 403 is configured to input the picture to be identified into a preset image analysis model for image analysis, so as to obtain a corresponding image analysis result;
a second analysis module 404, configured to perform skin color analysis on the current user by using a spectral analysis device preset in the epilator, and determine a corresponding skin color analysis result;
and the permission analysis module 405 is configured to perform permission analysis on the current user according to the image analysis result and the skin color analysis result, determine a permission analysis result, and control the dehairing instrument according to the permission analysis result.
Optionally, the image processing module 402 is specifically configured to: performing binarization processing on the picture to be processed to obtain a corresponding binarized picture; carrying out pixel point location analysis on the binarized picture, and determining the position information corresponding to each pixel point in the binarized picture; and carrying out semantic segmentation on the picture to be processed based on the position information corresponding to each pixel point in the binarized picture to obtain the picture to be identified.
Optionally, the first analysis module 403 further includes:
the feature extraction unit is used for inputting the picture to be identified into the image analysis model to perform feature extraction and determining a corresponding image feature vector;
the feature conversion unit is used for carrying out gray feature conversion on the image feature vector to obtain a corresponding gray feature vector;
the fusion processing unit is used for carrying out feature fusion processing on the image feature vector and the gray feature vector to obtain a target fusion vector;
and the image analysis unit is used for inputting the target fusion vector into the image analysis model for image analysis to obtain a corresponding image analysis result.
Optionally, the image analysis unit is specifically configured to: inputting the target fusion vector into the image analysis model, and carrying out Laplace filtering processing on the target fusion vector through the image analysis model to obtain an ambiguity evaluation value corresponding to the target fusion vector; and analyzing the ambiguity evaluation value through a preset evaluation value mapping function to obtain a corresponding image analysis result.
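One common realization of a Laplace-filter-based ambiguity evaluation is the variance-of-Laplacian blur metric: convolve the input with the 4-neighbour Laplacian kernel and treat low response variance as blur. The sketch below assumes this interpretation; the cut-off in the evaluation value mapping function is an arbitrary illustrative value, not a parameter disclosed in the patent.

```python
# Hedged sketch of the ambiguity evaluation step: 4-neighbour Laplacian
# response over interior pixels, variance as the ambiguity evaluation
# value, and an assumed mapping function to a sharp/blurred label.

def laplacian_response(img):
    """Apply the 3x3 Laplacian kernel to interior pixels of a 2D grid."""
    h, w = len(img), len(img[0])
    out = []
    for r in range(1, h - 1):
        for c in range(1, w - 1):
            out.append(img[r - 1][c] + img[r + 1][c] +
                       img[r][c - 1] + img[r][c + 1] - 4 * img[r][c])
    return out

def ambiguity_score(img):
    """Variance of the Laplacian response: low variance suggests blur."""
    resp = laplacian_response(img)
    mean = sum(resp) / len(resp)
    return sum((v - mean) ** 2 for v in resp) / len(resp)

def evaluation_mapping(score, cutoff=100.0):
    """Assumed evaluation value mapping function (cutoff is illustrative)."""
    return "sharp" if score >= cutoff else "blurred"

edge_img = [[0, 0, 0, 0], [0, 255, 0, 0], [0, 0, 0, 0], [0, 0, 0, 0]]
flat_img = [[5, 5, 5], [5, 5, 5], [5, 5, 5]]
print(evaluation_mapping(ambiguity_score(edge_img)))  # -> sharp
print(evaluation_mapping(ambiguity_score(flat_img)))  # -> blurred
```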
Optionally, the second analysis module 404 is specifically configured to: transmitting light rays to the current user through the spectrum analysis device, and collecting reflected light rays corresponding to the transmitted light rays; performing spectrum analysis on the reflected light rays to determine corresponding skin color mapping values; and performing skin color type matching on the skin color mapping value, and determining a corresponding skin color analysis result.
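The skin color analysis steps above (emit light, measure the reflected light, compute a skin color mapping value, match it to a type) can be sketched as follows. The reflectance-ratio mapping, the band boundaries, and the Fitzpatrick-style type labels are assumptions chosen for illustration; the patent does not specify them.

```python
# Illustrative sketch of the skin colour analysis stage: convert a
# reflectance measurement into a skin colour mapping value, then match
# it against an assumed type table (loosely Fitzpatrick-style bands).

SKIN_TYPE_BANDS = [  # (upper bound of mapping value, type label) - assumed
    (0.20, "type VI"),
    (0.35, "type V"),
    (0.50, "type IV"),
    (0.65, "type III"),
    (0.80, "type II"),
    (1.01, "type I"),
]

def skin_colour_mapping(emitted_power, reflected_power):
    """Mapping value = fraction of the emitted light reflected back."""
    return reflected_power / emitted_power

def match_skin_type(mapping_value):
    """Return the first band whose upper bound exceeds the mapping value."""
    for upper, label in SKIN_TYPE_BANDS:
        if mapping_value < upper:
            return label
    return "unknown"

result = match_skin_type(skin_colour_mapping(100.0, 72.0))
print(result)  # -> type II
```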
Optionally, the rights analysis module 405 further includes:
the parameter analysis unit is used for respectively carrying out weight parameter analysis on the image analysis result and the skin color analysis result to obtain a first weight parameter corresponding to the image analysis result and a second weight parameter corresponding to the skin color analysis result;
the score calculating unit is used for calculating the fusion score of the image analysis result and the skin color analysis result through the first weight parameter and the second weight parameter, and determining a corresponding target fusion score;
and the permission analysis unit is used for carrying out permission analysis on the current user based on the target fusion score, determining the permission analysis result and controlling the dehairing instrument through the permission analysis result.
Optionally, the score calculating unit is specifically configured to: performing first score mapping on the image analysis result to obtain a corresponding first score; performing second score mapping on the skin color analysis result to obtain a corresponding second score; and carrying out weighted calculation on the first score and the second score through the first weight parameter and the second weight parameter to obtain a corresponding target fusion score.
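The weighted fusion described above (first and second score mappings, weighted calculation, permission decision on the target fusion score) reduces to a small computation. The score tables, the weights, and the pass threshold below are assumed values for illustration; only the weighted-sum structure comes from the text.

```python
# Minimal sketch of the permission analysis stage: map each analysis
# result to a score, fuse the scores with the two weight parameters,
# and gate the device on an assumed threshold.

IMAGE_SCORES = {"sharp": 1.0, "blurred": 0.2}       # assumed first mapping
SKIN_SCORES  = {"registered": 1.0, "unknown": 0.0}  # assumed second mapping

def fusion_score(image_result, skin_result, w1=0.6, w2=0.4):
    """Target fusion score: weighted sum of the two mapped scores (w1 + w2 == 1)."""
    return w1 * IMAGE_SCORES[image_result] + w2 * SKIN_SCORES[skin_result]

def permission(image_result, skin_result, threshold=0.7):
    """Authorize use only when the target fusion score clears the threshold."""
    score = fusion_score(image_result, skin_result)
    return "authorized" if score >= threshold else "locked"

print(permission("sharp", "registered"))  # -> authorized
print(permission("blurred", "unknown"))   # -> locked
```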
Through the cooperation of the components, when the dehairing instrument enters a starting state, an image acquisition device preset in the dehairing instrument is used for acquiring an image of a current user, so that a corresponding picture to be processed is obtained; preprocessing the picture to be processed to obtain a picture to be identified; inputting the picture to be identified into a preset image analysis model for image analysis to obtain a corresponding image analysis result; performing skin color analysis on the current user through a spectrum analysis device preset in the dehairing instrument, and determining a corresponding skin color analysis result; performing authority analysis on the current user according to the image analysis result and the skin color analysis result, determining the authority analysis result, and controlling the dehairing instrument through the authority analysis result. Because the authority analysis is carried out on the current user according to both the image analysis result and the skin color analysis result, the attribute and the authority of the current user can be accurately analyzed, the accuracy of identifying unauthorized users is improved, and the safety of the dehairing instrument in control and use is further improved.
Fig. 5 is a schematic structural diagram of an identification-based depilatory control device according to an embodiment of the present invention, where the identification-based depilatory control device 500 may vary considerably in configuration or performance, and may include one or more processors (central processing units, CPUs) 510 (e.g., one or more processors) and memory 520, and one or more storage media 530 (e.g., one or more mass storage devices) storing applications 533 or data 532. The memory 520 and the storage medium 530 may be transitory or persistent storage. The program stored on the storage medium 530 may include one or more modules (not shown), each of which may include a series of instruction operations on the identification-based depilatory control device 500. Still further, the processor 510 may be configured to communicate with the storage medium 530 to execute a series of instruction operations in the storage medium 530 on the identification-based depilatory control device 500.
The identification-based depilatory control device 500 may also include one or more power supplies 540, one or more wired or wireless network interfaces 550, one or more input/output interfaces 560, and/or one or more operating systems 531, such as Windows Server, Mac OS X, Unix, Linux, FreeBSD, and the like. It will be appreciated by those skilled in the art that the identification-based depilatory control device configuration shown in fig. 5 does not limit the identification-based depilatory control device, which may include more or fewer components than illustrated, combine certain components, or arrange the components differently.
The invention also provides a depilatory control device based on identity recognition, which comprises a memory and a processor, wherein the memory stores computer readable instructions which, when executed by the processor, cause the processor to execute the steps of the depilatory control method based on identity recognition in the above embodiments.
The present invention also provides a computer readable storage medium, which may be a non-volatile computer readable storage medium or a volatile computer readable storage medium, and which stores instructions that, when executed on a computer, cause the computer to perform the steps of the identification-based dehairing instrument control method.
It will be clear to those skilled in the art that, for convenience and brevity of description, specific working procedures of the above-described systems, apparatuses and units may refer to corresponding procedures in the foregoing method embodiments, which are not repeated herein.
The integrated units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied, in essence or in whole or in part, in the form of a software product stored in a storage medium, including instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to perform all or part of the steps of the method according to the embodiments of the present invention. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
The above embodiments are only for illustrating the technical solution of the present invention, and not for limiting the same; although the invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present invention.
Claims (8)
1. A dehairing instrument control method based on identity recognition, characterized by comprising the following steps of:
when the dehairing instrument enters a starting state, an image acquisition device preset in the dehairing instrument is used for acquiring an image of a current user, so that a corresponding picture to be processed is obtained;
preprocessing the picture to be processed to obtain a picture to be identified;
inputting the picture to be identified into a preset image analysis model for image analysis to obtain a corresponding image analysis result;
performing skin color analysis on the current user through a spectrum analysis device preset in the dehairing instrument, and determining a corresponding skin color analysis result;
performing authority analysis on the current user according to the image analysis result and the skin color analysis result, determining an authority analysis result, and controlling the dehairing instrument through the authority analysis result; specifically, respectively carrying out weight parameter analysis on the image analysis result and the skin color analysis result to obtain a first weight parameter corresponding to the image analysis result and a second weight parameter corresponding to the skin color analysis result; performing fusion score calculation on the image analysis result and the skin color analysis result through the first weight parameter and the second weight parameter, and determining a corresponding target fusion score; performing authority analysis on the current user based on the target fusion score, determining an authority analysis result, and controlling the dehairing instrument through the authority analysis result; wherein the image analysis result is subjected to first score mapping to obtain a corresponding first score; performing second score mapping on the skin color analysis result to obtain a corresponding second score; and carrying out weighted calculation on the first score and the second score through the first weight parameter and the second weight parameter to obtain a corresponding target fusion score.
2. The identity-based dehairing instrument control method according to claim 1, wherein the preprocessing the to-be-processed picture to obtain the to-be-identified picture includes:
performing binarization processing on the picture to be processed to obtain a corresponding binarized picture;
carrying out pixel point location analysis on the binarized picture, and determining the position information corresponding to each pixel point in the binarized picture;
and carrying out semantic segmentation on the picture to be processed based on the position information corresponding to each pixel point in the binarized picture to obtain the picture to be identified.
3. The identity-based dehairing instrument control method according to claim 1, wherein the inputting the picture to be identified into a preset image analysis model for image analysis to obtain a corresponding image analysis result comprises:
inputting the picture to be identified into the image analysis model for feature extraction, and determining corresponding image feature vectors;
performing gray feature conversion on the image feature vector to obtain a corresponding gray feature vector;
performing feature fusion processing on the image feature vector and the gray feature vector to obtain a target fusion vector;
and inputting the target fusion vector into the image analysis model to perform image analysis, so as to obtain a corresponding image analysis result.
4. The identity-based dehairing instrument control method according to claim 3, wherein the inputting the target fusion vector into the image analysis model for image analysis, obtaining a corresponding image analysis result, includes:
inputting the target fusion vector into the image analysis model, and carrying out Laplace filtering processing on the target fusion vector through the image analysis model to obtain an ambiguity evaluation value corresponding to the target fusion vector;
and analyzing the ambiguity evaluation value through a preset evaluation value mapping function to obtain a corresponding image analysis result.
5. The identity-based depilatory control method of claim 1, wherein said performing skin color analysis on the current user by a spectral analysis device preset in the depilatory device, determining a corresponding skin color analysis result, comprises:
transmitting light rays to the current user through the spectrum analysis device, and collecting reflected light rays corresponding to the transmitted light rays;
performing spectrum analysis on the reflected light rays to determine corresponding skin color mapping values;
and performing skin color type matching on the skin color mapping value, and determining a corresponding skin color analysis result.
6. An identification-based depilator control device, which is characterized in that the identification-based depilator control device comprises:
the image acquisition module is used for acquiring an image of a current user through an image acquisition device preset in the dehairing instrument when the dehairing instrument enters a starting state, so as to obtain a corresponding picture to be processed;
the image processing module is used for preprocessing the picture to be processed to obtain a picture to be identified;
the first analysis module is used for inputting the picture to be identified into a preset image analysis model to perform image analysis to obtain a corresponding image analysis result;
the second analysis module is used for carrying out skin color analysis on the current user through a spectrum analysis device preset in the dehairing instrument and determining a corresponding skin color analysis result;
the permission analysis module is used for performing permission analysis on the current user according to the image analysis result and the skin color analysis result, determining a permission analysis result and controlling the dehairing instrument through the permission analysis result; specifically, respectively carrying out weight parameter analysis on the image analysis result and the skin color analysis result to obtain a first weight parameter corresponding to the image analysis result and a second weight parameter corresponding to the skin color analysis result; performing fusion score calculation on the image analysis result and the skin color analysis result through the first weight parameter and the second weight parameter, and determining a corresponding target fusion score; performing authority analysis on the current user based on the target fusion score, determining an authority analysis result, and controlling the dehairing instrument through the authority analysis result; wherein the image analysis result is subjected to first score mapping to obtain a corresponding first score; performing second score mapping on the skin color analysis result to obtain a corresponding second score; and carrying out weighted calculation on the first score and the second score through the first weight parameter and the second weight parameter to obtain a corresponding target fusion score.
7. An identification-based depilator control device, comprising: a memory and at least one processor, the memory having instructions stored therein;
the at least one processor invokes the instructions in the memory to cause the identification-based depilatory control apparatus to perform the identification-based depilatory control method of any of claims 1-5.
8. A computer readable storage medium having instructions stored thereon, wherein the instructions when executed by a processor implement the identification-based depilatory control method of any of claims 1-5.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310108312.3A CN115969511B (en) | 2023-02-14 | 2023-02-14 | Dehairing instrument control method, device, equipment and storage medium based on identity recognition |
Publications (2)
Publication Number | Publication Date |
---|---|
CN115969511A CN115969511A (en) | 2023-04-18 |
CN115969511B true CN115969511B (en) | 2023-05-30 |
Family
ID=85959771
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202310108312.3A Active CN115969511B (en) | 2023-02-14 | 2023-02-14 | Dehairing instrument control method, device, equipment and storage medium based on identity recognition |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN115969511B (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN118177962B (en) * | 2024-05-15 | 2024-07-26 | 深圳市美莱雅智能科技有限公司 | Output energy control method, device, medium and dehairing instrument based on skin color identification |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108846359A (en) * | 2018-06-13 | 2018-11-20 | 新疆大学科学技术学院 | It is a kind of to divide the gesture identification method blended with machine learning algorithm and its application based on skin-coloured regions |
Family Cites Families (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20040052418A1 (en) * | 2002-04-05 | 2004-03-18 | Bruno Delean | Method and apparatus for probabilistic image analysis |
US10299871B2 (en) * | 2005-09-30 | 2019-05-28 | Restoration Robotics, Inc. | Automated system and method for hair removal |
WO2010093503A2 (en) * | 2007-01-05 | 2010-08-19 | Myskin, Inc. | Skin analysis methods |
US8951266B2 (en) * | 2011-01-07 | 2015-02-10 | Restoration Robotics, Inc. | Methods and systems for modifying a parameter of an automated procedure |
CN106345066B (en) * | 2016-10-11 | 2019-05-21 | 深圳可思美科技有限公司 | IPL depilatory apparatus |
CN106570909B (en) * | 2016-11-02 | 2020-01-17 | 华为技术有限公司 | Skin color detection method, device and terminal |
JP2020087351A (en) * | 2018-11-30 | 2020-06-04 | キヤノン株式会社 | Information processing system, information processing method and program |
EP3838339A1 (en) * | 2019-12-20 | 2021-06-23 | Koninklijke Philips N.V. | Treatment device and method |
US20220405930A1 (en) * | 2020-12-31 | 2022-12-22 | Lumenis Be Ltd. | Apparatus and method for sensing and analyzing skin condition |
CN114521956B (en) * | 2021-12-31 | 2023-12-19 | 广州星际悦动股份有限公司 | Method, device and medium for controlling output energy based on skin color and dehairing instrument |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||