CN117517595B - Cosmetic testing method and device, electronic equipment and storage medium - Google Patents

Cosmetic testing method and device, electronic equipment and storage medium

Info

Publication number
CN117517595B
CN117517595B
Authority
CN
China
Prior art keywords
data
user
test
subjective
objective
Prior art date
Legal status
Active
Application number
CN202410009064.1A
Other languages
Chinese (zh)
Other versions
CN117517595A (en)
Inventor
邱显荣
孙丽丽
李静
冯法晴
梁超
倪芳
张晓洁
Current Assignee
Sanlihuiping Beijing Testing Technology Co ltd
Original Assignee
Sanlihuiping Beijing Testing Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Sanlihuiping Beijing Testing Technology Co ltd filed Critical Sanlihuiping Beijing Testing Technology Co ltd
Priority to CN202410009064.1A priority Critical patent/CN117517595B/en
Publication of CN117517595A publication Critical patent/CN117517595A/en
Application granted granted Critical
Publication of CN117517595B publication Critical patent/CN117517595B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N33/00 Investigating or analysing materials by specific methods not covered by groups G01N1/00 - G01N31/00
    • G01N33/0001 Investigating or analysing materials by specific methods not covered by groups G01N1/00 - G01N31/00 by organoleptic means
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/24 Classification techniques
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00 Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/30 Computing systems specially adapted for manufacturing

Landscapes

  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • General Physics & Mathematics (AREA)
  • Chemical & Material Sciences (AREA)
  • Theoretical Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Medicinal Chemistry (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Biochemistry (AREA)
  • Immunology (AREA)
  • Pathology (AREA)
  • Analytical Chemistry (AREA)
  • Artificial Intelligence (AREA)
  • General Health & Medical Sciences (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • General Engineering & Computer Science (AREA)
  • Food Science & Technology (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

The application provides a cosmetic testing method, a device, electronic equipment and a storage medium, wherein the method comprises the following steps: testing the product to be tested according to a preset test flow through a target crowd, and acquiring test data of each user in the target crowd; the test data comprises subjective data and objective data; respectively carrying out feature extraction processing on subjective data and objective data, and carrying out splicing processing on a first extraction result aiming at the subjective data and a second extraction result aiming at the objective data to obtain target feature data; classifying the target characteristic data to obtain a test result of each user, and obtaining the product efficacy of the product to be tested according to the test result of each user; the method and the device can combine subjective data and objective data to test the product efficacy of the cosmetics.

Description

Cosmetic testing method and device, electronic equipment and storage medium
Technical Field
The present disclosure relates to the field of data processing technologies, and in particular, to a cosmetic testing method, a device, an electronic apparatus, and a storage medium.
Background
In recent years, the proportion of people with sensitive skin (SS) has gradually increased, and cosmetics with soothing efficacy have drawn growing attention in the cosmetics market. Sensitive skin is defined as a hyper-reactive state of the skin that occurs under physiological or pathological conditions, mainly on the face; it is characterized by the skin readily developing subjective symptoms such as burning, stinging, itching and tightness when stimulated by physical, chemical, psychological and other factors, with or without objective signs such as erythema, scaling and telangiectasia.
At present, cosmetics are mainly tested either subjectively or objectively. Subjective tests often give inaccurate results because they are affected by factors such as individual tolerance, psychology and environment; objective tests are usually accurate because they acquire objective skin data, but since cosmetics are ultimately used by people, the user's actual perception also needs to be taken into account. The prior art lacks a comprehensive evaluation scheme that fuses subjective tests and objective tests.
Disclosure of Invention
In view of the foregoing, embodiments of the present application provide a method, apparatus, electronic device, and storage medium for testing a cosmetic product, which can combine subjective data and objective data to test the product efficacy of the cosmetic product.
The technical scheme of the embodiment of the application is realized as follows:
in a first aspect, embodiments of the present application provide a cosmetic testing method, the method comprising:
testing a product to be tested according to a preset test flow through a target crowd, and acquiring test data of each user in the target crowd; wherein the test data comprises subjective data and objective data;
respectively carrying out feature extraction processing on the subjective data and the objective data, and carrying out splicing processing on a first extraction result aiming at the subjective data and a second extraction result aiming at the objective data to obtain target feature data;
And classifying the target characteristic data to obtain a test result of each user, and obtaining the product efficacy of the product to be tested according to the test result of each user.
In a second aspect, embodiments of the present application further provide a cosmetic testing device, the device comprising:
the acquisition module is used for testing the product to be tested according to a preset test flow through a target crowd and acquiring test data of each user in the target crowd; wherein the test data comprises subjective data and objective data;
the extraction module is used for respectively carrying out feature extraction processing on the subjective data and the objective data, and carrying out splicing processing on a first extraction result aiming at the subjective data and a second extraction result aiming at the objective data to obtain target feature data;
and the classification module is used for carrying out classification processing on the target characteristic data to obtain a test result of each user, and obtaining the product efficacy of the product to be tested according to the test result of each user.
In a third aspect, embodiments of the present application further provide an electronic device, including: a processor, a storage medium, and a bus, the storage medium storing machine-readable instructions executable by the processor, the processor in communication with the storage medium via the bus when the electronic device is operating, the processor executing the machine-readable instructions to perform the cosmetic testing method of any one of the first aspects.
In a fourth aspect, embodiments of the present application also provide a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, performs the cosmetic testing method of any one of the first aspects.
The embodiment of the application has the following beneficial effects:
testing a product to be tested according to a preset test flow through a target crowd, and acquiring test data of each user in the target crowd, wherein the test data comprises subjective data and objective data; then, respectively carrying out feature extraction processing on the subjective data and the objective data, and carrying out splicing processing on a first extraction result aiming at the subjective data and a second extraction result aiming at the objective data to obtain target feature data, so that the objective data and the subjective data with different data forms can be spliced in a feature form, and the data format is unified; and then classifying the target characteristic data to obtain a test result of each user, and obtaining the product efficacy of the product to be tested according to the test result of each user, wherein the obtained product efficacy considers both subjective data and objective data, and compared with a single test mode, the obtained product efficacy has higher association degree with the user.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are needed in the embodiments will be briefly described below, it being understood that the following drawings only illustrate some embodiments of the present application and therefore should not be considered limiting the scope, and that other related drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a schematic flow chart of steps S101-S103 provided in an embodiment of the present application;
fig. 2 is a schematic flow chart of steps S201-S202 provided in the embodiment of the present application;
fig. 3 is a schematic flow chart of steps S301 to S303 provided in the embodiment of the present application;
fig. 4 is a schematic flow chart of steps S401 to S405 provided in the embodiment of the present application;
fig. 5 is a schematic flow chart of steps S501-S502 provided in the embodiment of the present application;
FIG. 6 is a schematic diagram of feature extraction provided by an embodiment of the present application;
fig. 7 is a schematic structural view of a cosmetic testing device according to an embodiment of the present application;
fig. 8 is a schematic diagram of a composition structure of an electronic device according to an embodiment of the present application.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the embodiments of the present application more clear, the technical solutions of the embodiments of the present application will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present application, and it should be understood that the accompanying drawings in the present application are only for the purpose of illustration and description, and are not intended to limit the protection scope of the present application. In addition, it should be understood that the schematic drawings are not drawn to scale. A flowchart, as used in this application, illustrates operations implemented according to some embodiments of the present application. It should be understood that the operations of the flow diagrams may be implemented out of order and that steps without logical context may be performed in reverse order or concurrently. Moreover, one or more other operations may be added to the flow diagrams and one or more operations may be removed from the flow diagrams as directed by those skilled in the art.
In the following description, reference is made to "some embodiments" which describe a subset of all possible embodiments, but it is to be understood that "some embodiments" can be the same subset or different subsets of all possible embodiments and can be combined with one another without conflict.
In addition, the described embodiments are only some, but not all, of the embodiments of the present application. The components of the embodiments of the present application, which are generally described and illustrated in the figures herein, may be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the present application, as provided in the accompanying drawings, is not intended to limit the scope of the application, as claimed, but is merely representative of selected embodiments of the application. All other embodiments, which can be made by those skilled in the art based on the embodiments of the present application without making any inventive effort, are intended to be within the scope of the present application.
In the following description, the terms "first", "second", "third" and the like are merely used to distinguish similar objects and do not represent a particular ordering of the objects, it being understood that the "first", "second", "third" may be interchanged with a particular order or sequence, as permitted, to enable embodiments of the application described herein to be practiced otherwise than as illustrated or described herein.
It should be noted that the term "comprising" will be used in the embodiments of the present application to indicate the presence of the features stated hereinafter, but not to exclude the addition of other features.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs. The terminology used herein is for the purpose of describing embodiments of the present application and is not intended to be limiting of the present application.
Referring to fig. 1, fig. 1 is a schematic flow chart of steps S101 to S103 of the cosmetic testing method according to the embodiment of the present application, and will be described with reference to steps S101 to S103 shown in fig. 1.
Step S101, testing products to be tested according to a preset test flow through a target crowd, and obtaining test data of each user in the target crowd; wherein the test data comprises subjective data and objective data;
step S102, respectively carrying out feature extraction processing on the subjective data and the objective data, and carrying out splicing processing on a first extraction result aiming at the subjective data and a second extraction result aiming at the objective data to obtain target feature data;
and step 103, classifying the target characteristic data to obtain a test result of each user, and obtaining the product efficacy of the product to be tested according to the test result of each user.
According to the cosmetic testing method, products to be tested are tested according to a preset testing flow through target groups, and testing data of each user in the target groups are obtained, wherein the testing data comprise subjective data and objective data; then, respectively carrying out feature extraction processing on the subjective data and the objective data, and carrying out splicing processing on a first extraction result aiming at the subjective data and a second extraction result aiming at the objective data to obtain target feature data, so that the objective data and the subjective data with different data forms can be spliced in a feature form, and the data format is unified; and then classifying the target characteristic data to obtain a test result of each user, and obtaining the product efficacy of the product to be tested according to the test result of each user, wherein the obtained product efficacy considers both subjective data and objective data, and compared with a single test mode, the obtained product efficacy has higher association degree with the user.
The following describes the above exemplary steps of the embodiments of the present application, respectively.
In step S101, testing a product to be tested according to a preset test flow by a target crowd, and obtaining test data of each user in the target crowd; wherein the test data includes subjective data and objective data.
Here, the subjects of the target population are preferably between 18 and 45 years of age (excluding pregnant and lactating women). In addition, the subject cannot have any of the following:
(1) Subjects who, within the last month, have undergone procedures that affect skin hemoglobin concentration or skin nerve sensitivity;
(2) Subjects who have used antihistamines in the last week or immunosuppressants in the last month;
(3) Subjects to whom any anti-inflammatory agent has been applied at the test site in the last two months;
(4) Subjects suffering from a clinically unhealed inflammatory skin condition;
(5) Insulin-dependent diabetes mellitus patients;
(6) Patients with asthma or another chronic respiratory disease who are undergoing treatment;
(7) Patients who have received anti-cancer treatment in the last six months;
(8) Patients with immunodeficiency or autoimmune disease;
(9) Women who are lactating or pregnant;
(10) Subjects who have undergone bilateral mastectomy and bilateral axillary lymph node dissection;
(11) Subjects whose test results at the skin test site would be affected by scars, atrophy or other damage;
(12) Subjects participating in other clinical trials;
(13) Non-volunteers or subjects unable to complete the prescribed procedures as required by the experiment.
The subject needs to meet the following requirements during the experiment:
(1) Refrain from oral or topical use of other products that claim, or make similar claims, to affect cutaneous nerve sensitivity;
(2) The test site must not receive cosmetic surgery or other cosmetic procedures that may affect cutaneous nerve sensitivity;
(3) Subjects should mainly stay indoors and avoid prolonged exposure to outdoor sunlight.
After the subjects are classified according to a subjective evaluation method (for example, a sensitive skin questionnaire such as the Huaxi questionnaire), the skin index values of each subject are measured.
(1) Preferably, the skin index values of the normal-skin group and the sensitive-skin group are obtained at an ambient temperature of 20-22 °C and a humidity of 40-60%, with real-time dynamic monitoring; the test conditions, such as the tester, the site and the instrument, should be kept consistent throughout the test.
(2) Preferably, under the supervision of a tester, the subject's face is washed with clean water and patted dry with a lint-free absorbent tissue; the subject then sits still in a test environment at 20-22 °C and 40-60% humidity, refrains from drinking water or other beverages, keeps the nasolabial fold area exposed, remains relaxed and avoids touching the nasolabial folds.
(3) Preferably, the nasolabial folds are used as the test site, and the area of the test site is preferably at least 2 cm × 2 cm.
(4) Preferably, the subject sits still for at least 30 min, or even at least 45 min.
In some embodiments, referring to fig. 2, fig. 2 is a flowchart of steps S201 to S202 provided in the embodiments of the present application, where the obtaining test data of each user in the target crowd includes: the subjective data of each user is obtained through steps S201-S202, which will be described in connection with specific steps.
In step S201, a questionnaire is determined based on the product to be tested.
In step S202, the questionnaire filled in by each user is acquired; wherein the subjective data of each user is included in the questionnaire filled in by each user.
Here, the subjective data is acquired in the form of a questionnaire. The questionnaire may be determined according to the product to be tested; for example, for a cosmetic product with a soothing effect, the corresponding questionnaire may contain questions such as: "Do you feel that this cosmetic has a soothing effect?" and "How relaxed do you feel after applying this cosmetic? (0-100)". Existing medical questionnaires, such as the Huaxi questionnaire, may also be used.
Illustratively, the Huaxi questionnaire includes objective questions (a subjective-assessment questionnaire about the subject) plus subjective questions (the subject's self-reported items).
A mature questionnaire that is generally accepted in the industry is selected. The questions are scored as follows: options A-D are assigned 1-4 points respectively; the items on allergy history, family allergy history and facial disease history are answered "yes" or "no", with "no" scored as 1 point and "yes" as 2 points. A total score of 12-17 indicates a tolerant (non-sensitive) skin type, 18-23 mild sensitivity, 24-32 moderate sensitivity, and 33-42 severe sensitivity.
Example questions include: Do symptoms such as erythema, flushing, papules, itching, tightness, desquamation or stinging of unknown cause appear on your face? Do such symptoms appear on your face when the ambient temperature changes, or in air-conditioned or windy rooms? Do such symptoms appear on your face at seasonal changes? And so on.
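As a concrete illustration of the scoring rule above, the following is a minimal Python sketch; the function name, input format and example answers are hypothetical, and only the point values and score ranges come from the questionnaire description.

```python
def score_questionnaire(graded_answers, history_answers):
    """graded_answers: 'A'-'D' choices (1-4 points each);
    history_answers: booleans for allergy history, family allergy history and
    facial disease history (True = yes = 2 points, False = no = 1 point)."""
    total = sum("ABCD".index(choice) + 1 for choice in graded_answers)
    total += sum(2 if yes else 1 for yes in history_answers)
    if total <= 17:
        skin_type = "tolerant (non-sensitive)"   # 12-17
    elif total <= 23:
        skin_type = "mild sensitivity"           # 18-23
    elif total <= 32:
        skin_type = "moderate sensitivity"       # 24-32
    else:
        skin_type = "severe sensitivity"         # 33-42
    return total, skin_type

# Example: nine graded answers plus the three history items.
print(score_questionnaire(list("ABCDABCDA"), [False, True, False]))  # (25, 'moderate sensitivity')
```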
In this embodiment of the application, 20 subjects are selected, and the corresponding questionnaire results are shown in Table 1:
TABLE 1
In step S102, feature extraction processing is performed on the subjective data and the objective data, and a first extraction result for the subjective data and a second extraction result for the objective data are spliced to obtain target feature data.
In some embodiments, the objective data includes image data, index data, and experimental data, and the obtaining test data for each user in the target population includes:
the objective data for each user is obtained by:
acquiring image data of each user through image acquisition equipment, wherein the image data comprises a left face shot, a front face shot and a right face shot;
and obtaining index data of each user through a first detection device, wherein the index data comprises percutaneous water loss, skin color value, skin heme index EI, and skin sensory current threshold;
and acquiring experimental data of each user through a second detection device, wherein the experimental data comprise lactic acid stinging experimental data, DMSO (dimethyl sulfoxide) stimulation experimental data and capsaicin experimental data.
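For orientation, the per-user test data listed above can be grouped as in the following Python sketch; the class and field names are descriptive placeholders chosen here, not identifiers from the patent.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class IndexData:
    """Measurements from the first detection device."""
    percutaneous_water_loss: float
    skin_color_value: float
    skin_heme_index_ei: float
    sensory_current_threshold: float

@dataclass
class ExperimentalData:
    """Measurements from the second detection device."""
    lactic_acid_stinging: float
    dmso_stimulation: float
    capsaicin: float

@dataclass
class UserTestData:
    subjective_scores: List[float]                         # questionnaire scores
    subjective_text: str                                   # free-text questionnaire answers
    face_images: List[str] = field(default_factory=list)  # paths to left/front/right face shots
    index_data: Optional[IndexData] = None
    experimental_data: Optional[ExperimentalData] = None
```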
The lactic acid stinging data are shown in table 2:
TABLE 2
The percutaneous water loss data are shown in table 3:
TABLE 3
DMSO stimulation test data are shown in table 4:
TABLE 4
The skin color value experimental data are shown in table 5:
TABLE 5
Skin heme index EI data are shown in Table 6:
TABLE 6
Capsaicin experimental data are shown in table 7:
TABLE 7
Skin sensory current threshold data are shown in table 8:
TABLE 8
In some embodiments, referring to fig. 3, fig. 3 is a schematic flow chart of steps S301 to S303 provided in the embodiment of the present application, and the subjective data is subjected to feature extraction processing through steps S301 to S303, and the description will be made with reference to each step.
In step S301, the subjective data is input into a BERT language model to obtain text data corresponding to the subjective data, and normalization processing is performed on numerical data in the subjective data, so that the numerical data is limited within a preset range.
In step S302, the text data and the numerical data are spliced, and the splicing result is input into a matrix composed of random numbers, so as to obtain a first matrix with a preset format.
In step S303, the first matrix is back-propagated through a dynamic gate, and the back-propagated result is input to a codec.
For example, referring to fig. 6, fig. 6 is a schematic diagram of feature extraction provided by an embodiment of the present application. As shown in fig. 6, for the text branch the BERT model is used for training. (1) After the input sentence passes through BERT, the word vectors of the input sentence are obtained. (2) The numerical data, here the questionnaire scores, are normalized using max-min normalization so that the processed data are limited to a fixed range, reducing the influence of outliers on subsequent experiments. (3) The normalized questionnaire scores are concatenated with the BERT-pretrained text features. (4) The concatenation is input into a matrix of random numbers so that the output is 1 by 128. (5) The result passes through the dynamic gate and is then input to the codec.
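A minimal sketch of this subjective-data branch (steps S301-S303), assuming PyTorch and the Hugging Face transformers library; the checkpoint name, the four example scores, the 1x128 random projection and the sigmoid stand-in for the dynamic gate are simplifying assumptions not fixed by the text.

```python
import torch
import torch.nn as nn
from transformers import BertModel, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-chinese")  # assumed checkpoint
bert = BertModel.from_pretrained("bert-base-chinese")
gate = nn.Linear(128, 128)                                      # stand-in for the dynamic gate
projection = torch.randn(768 + 4, 128)                          # random-number matrix (4 example scores)

def min_max_normalize(x: torch.Tensor) -> torch.Tensor:
    """Max-min normalization that limits the scores to [0, 1]."""
    return (x - x.min()) / (x.max() - x.min() + 1e-8)

def subjective_features(answer_text: str, scores: torch.Tensor) -> torch.Tensor:
    # (1) Text branch: BERT sentence vector (pooled [CLS] representation, 768-dim).
    inputs = tokenizer(answer_text, return_tensors="pt", truncation=True)
    with torch.no_grad():
        text_vec = bert(**inputs).pooler_output               # shape (1, 768)
    # (2) Numerical branch: max-min normalized questionnaire scores.
    score_vec = min_max_normalize(scores).unsqueeze(0)         # shape (1, n_scores)
    # (3)-(4) Concatenate and project through a random matrix to a 1x128 feature.
    feat = torch.cat([text_vec, score_vec], dim=-1) @ projection
    # (5) Pass through the dynamic gate before handing the feature to the codec.
    return feat * torch.sigmoid(gate(feat))

# Example call with four hypothetical questionnaire scores.
feat = subjective_features("使用后面部紧绷感明显减轻", torch.tensor([3.0, 2.0, 4.0, 1.0]))
```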
In some embodiments, referring to fig. 4, fig. 4 is a schematic flow chart of steps S401 to S405 provided in the embodiment of the present application, and the image data in the objective data is subjected to feature extraction processing through steps S401 to S405, which will be described in connection with each step.
In step S401, the left face shot, the front face shot and the right face shot are preprocessed to obtain an analyzed left face shot, an analyzed front face shot and an analyzed right face shot.
In step S402, the left face shot, the front face shot, the right face shot, the left face shot after analysis, the front face shot after analysis and the right face shot after analysis are input into an image classification model, so as to perform block processing on a two-dimensional image, flatten each image block into a one-dimensional vector, perform linear projection transformation on each vector, introduce position codes, and add position information of a sequence.
In step S403, the vector of each of the segmented pictures is input into a matrix composed of random numbers so that the sizes of the vectors are the same.
In step S404, all vectors are stitched into a second matrix of a preset format.
In step S405, the second matrix is back-propagated through a dynamic gate, and the back-propagated result is input to a codec.
For example, referring to fig. 6, for the image analysis we preprocess the left-face, right-face and front-face photographs together with the corresponding left-face, right-face and front-face analysis images produced by the facial analysis software.
(1) The images are processed with Vision Transformer technology: each two-dimensional image is divided into blocks, each image block is flattened into a one-dimensional vector, a linear projection transformation is applied to each vector, and position encodings are introduced to add the positional information of the sequence. (2) The vector of each segmented picture is then input into a matrix of random numbers so that the vectors have the same size. (3) All vectors are stitched to form a 1-by-128 matrix. (4) This is input into the dynamic gate for back propagation. (5) The back-propagated result is input to the codec, which here uses a 12-layer Transformer.
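A minimal sketch of this ViT-style image branch (steps S401-S405), assuming PyTorch and 224x224 RGB inputs with 16x16 patches; the patch size, embedding width and the sigmoid gate are illustrative choices, not values fixed by the text.

```python
import torch
import torch.nn as nn

class PatchBranch(nn.Module):
    def __init__(self, img_size=224, patch=16, dim=128):
        super().__init__()
        n_patches = (img_size // patch) ** 2
        self.patch = patch
        self.proj = nn.Linear(patch * patch * 3, dim)             # linear projection of flattened patches
        self.pos = nn.Parameter(torch.zeros(1, n_patches, dim))   # position encodings
        self.rand = nn.Parameter(torch.randn(n_patches * dim, 128), requires_grad=False)  # random-number matrix
        self.gate = nn.Linear(128, 128)                           # stand-in for the dynamic gate

    def forward(self, img: torch.Tensor) -> torch.Tensor:         # img: (1, 3, H, W)
        p = self.patch
        # Split the image into patches and flatten each into a one-dimensional vector.
        patches = img.unfold(2, p, p).unfold(3, p, p)              # (1, 3, H/p, W/p, p, p)
        patches = patches.permute(0, 2, 3, 1, 4, 5).reshape(1, -1, 3 * p * p)
        tokens = self.proj(patches) + self.pos                     # add positional information
        feat = tokens.reshape(1, -1) @ self.rand                   # map to a 1x128 feature
        return feat * torch.sigmoid(self.gate(feat))               # gated output fed to the codec

feat = PatchBranch()(torch.randn(1, 3, 224, 224))  # example call on a random image tensor
```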
In some embodiments, referring to fig. 5, fig. 5 is a schematic flow chart of steps S501-S502 provided in the embodiments of the present application, and the characteristic extraction process is performed on the index data or the experimental data in the objective data through steps S501-S502, which will be described in connection with each step.
In step S501, the index data or the experimental data is normalized and then spliced and input into a matrix composed of random numbers, so as to obtain a third matrix in a preset format.
In step S502, the third matrix is back-propagated through a dynamic gate, and the back-propagated result is input to a codec.
For example, please continue to refer to fig. 6, the lactic acid stinging test, the DMSO stimulus test, and the capsaicin test are spliced and input into a matrix composed of random numbers after the data normalization process, so that the output result is 1 by 128, and then input into the codec after passing through the dynamic gate.
The percutaneous water loss, skin color value, skin heme index EI and skin sensory current threshold are likewise normalized; the four experimental results are then spliced and input into a matrix consisting of random numbers so that the output result is 1 by 128, and the result passes through the dynamic gate and into the codec.
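A minimal sketch of this index/experimental branch (steps S501-S502), assuming PyTorch; normalization is done within the single measurement vector here only for brevity (in practice it would typically be done per feature across subjects), and the random projection and gate mirror the other branches. The example readings are hypothetical.

```python
import torch
import torch.nn as nn

gate = nn.Linear(128, 128)        # stand-in for the dynamic gate
projection = torch.randn(4, 128)  # random-number matrix for four measurements

def tabular_features(values: torch.Tensor) -> torch.Tensor:
    """values: e.g. tensor([water_loss, skin_color_value, ei, current_threshold])."""
    x = (values - values.min()) / (values.max() - values.min() + 1e-8)  # normalization
    feat = x.unsqueeze(0) @ projection                                   # 1x128 third matrix
    return feat * torch.sigmoid(gate(feat))                              # gated, then fed to the codec

feat = tabular_features(torch.tensor([12.5, 8.3, 310.0, 0.21]))  # hypothetical readings
```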
In step S103, the target feature data is classified to obtain a test result of each user, and the product efficacy of the product to be tested is obtained according to the test result of each user.
After the experiment is finished, the four 1-by-128 matrices are spliced using a Concat function to form a 4-by-128 matrix, which is input into a fully connected layer; the output is then passed through a Sigmoid function for a three-class classification task, giving the skin sensitivity type of the subject.
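A minimal sketch of this fusion and classification step, assuming PyTorch; the 4x128 input, the fully connected layer, the Sigmoid and the three classes come from the description above, while the flattening to a 512-dim input, the absence of hidden layers and the argmax over the Sigmoid outputs are assumptions.

```python
import torch
import torch.nn as nn

class FusionClassifier(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(4 * 128, 3)  # fully connected layer mapping the fused features to 3 classes

    def forward(self, subj, img, exp, idx):                # each branch feature: shape (1, 128)
        fused = torch.cat([subj, img, exp, idx], dim=0)    # Concat -> (4, 128)
        logits = self.fc(fused.reshape(1, -1))             # flatten to (1, 512) for the FC layer
        probs = torch.sigmoid(logits)                      # Sigmoid, as described in the text
        return probs.argmax(dim=-1)                        # index of the predicted sensitivity type

skin_type = FusionClassifier()(torch.randn(1, 128), torch.randn(1, 128),
                               torch.randn(1, 128), torch.randn(1, 128))  # example call
```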
In some embodiments, the classifying the target feature data to obtain a test result of each user includes:
and classifying the target feature data based on a preset data weight, wherein the data weight of the first extraction result in the target feature data is lower than the data weight of the second extraction result in the target feature data.
Here, the first extraction result corresponding to the subjective data and the second extraction result corresponding to the objective data may each be given a corresponding data weight; for example, the subjective data may have a weight of 40% and the objective data a weight of 60%. In general, the weight of the objective data is greater than that of the subjective data, so as to ensure the accuracy of the detection result.
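One possible reading of the weighting, sketched below, is to scale the branch features by the preset weights before concatenation; the 40%/60% split is the example above, and the mechanism itself is an assumption since the text does not fix it.

```python
import torch

def weighted_fuse(subj_feat, img_feat, exp_feat, idx_feat, w_subj=0.4, w_obj=0.6):
    """Each argument is a (1, 128) branch feature; returns the weighted 4x128 target feature."""
    return torch.cat([w_subj * subj_feat,
                      w_obj * img_feat,
                      w_obj * exp_feat,
                      w_obj * idx_feat], dim=0)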
The final output results are shown in table 9:
TABLE 9
In summary, the embodiment of the application has the following beneficial effects:
Testing a product to be tested according to a preset test flow through a target crowd, and acquiring test data of each user in the target crowd, wherein the test data comprises subjective data and objective data; then, respectively carrying out feature extraction processing on the subjective data and the objective data, and carrying out splicing processing on a first extraction result aiming at the subjective data and a second extraction result aiming at the objective data to obtain target feature data, so that the objective data and the subjective data with different data forms can be spliced in a feature form, and the data format is unified; and then classifying the target characteristic data to obtain a test result of each user, and obtaining the product efficacy of the product to be tested according to the test result of each user, wherein the obtained product efficacy considers both subjective data and objective data, and compared with a single test mode, the obtained product efficacy has higher association degree with the user.
Based on the same inventive concept, the embodiment of the present application further provides a cosmetic testing device corresponding to the cosmetic testing method in the first embodiment, and since the principle of solving the problem of the device in the embodiment of the present application is similar to that of the cosmetic testing method described above, the implementation of the device may refer to the implementation of the method, and the repetition is omitted.
As shown in fig. 7, fig. 7 is a schematic structural view of a cosmetic testing device 700 according to an embodiment of the present application. Cosmetic testing device 700 includes:
the obtaining module 701 is configured to test a product to be tested according to a preset test flow by a target crowd, and obtain test data of each user in the target crowd; wherein the test data comprises subjective data and objective data;
the extracting module 702 is configured to perform feature extraction processing on the subjective data and the objective data, and perform splicing processing on a first extraction result for the subjective data and a second extraction result for the objective data, so as to obtain target feature data;
and the classification module 703 is configured to perform classification processing on the target feature data to obtain a test result of each user, and obtain a product efficacy of the product to be tested according to the test result of each user.
It will be appreciated by those skilled in the art that the function of each unit in the cosmetic testing device 700 shown in fig. 7 can be understood with reference to the foregoing description of the cosmetic testing method. The functions of the units in the cosmetic testing device 700 shown in fig. 7 may be implemented by a program running on a processor, or by a specific logic circuit.
In a possible implementation manner, the acquiring module 701 acquires test data of each user in the target crowd, including:
the subjective data for each user is obtained by:
determining a questionnaire based on the product to be tested;
acquiring the questionnaire filled in by each user; wherein the subjective data of each user is included in the questionnaire filled in by each user.
In one possible implementation manner, the objective data includes image data, index data, and experimental data, and the obtaining module 701 obtains test data of each user in the target crowd, including:
the objective data for each user is obtained by:
acquiring image data of each user through image acquisition equipment, wherein the image data comprises a left face shot, a front face shot and a right face shot;
and obtaining index data of each user through a first detection device, wherein the index data comprises percutaneous water loss, skin color value, skin heme index EI, and skin sensory current threshold;
and acquiring experimental data of each user through a second detection device, wherein the experimental data comprise lactic acid stinging experimental data, DMSO (dimethyl sulfoxide) stimulation experimental data and capsaicin experimental data.
In one possible implementation, the extraction module 702 performs the feature extraction process of subjective data by:
inputting the subjective data into a BERT language model to obtain text data corresponding to the subjective data, and carrying out normalization processing on numerical data in the subjective data so as to limit the numerical data within a preset range;
splicing the text data and the numerical data, and inputting a splicing result into a matrix formed by random numbers to obtain a first matrix in a preset format;
and carrying out back propagation on the first matrix through a dynamic gate, and inputting the back propagation result into a coder-decoder.
In one possible implementation, the extraction module 702 performs the feature extraction process of the image data in the objective data by:
preprocessing the left face shot, the front face shot and the right face shot to obtain an analyzed left face shot, an analyzed front face shot and an analyzed right face shot;
inputting the left side face shot, the front face shot, the right side face shot, the analyzed left side face shot, the analyzed front face shot and the analyzed right side face shot into an image classification model to carry out block processing on a two-dimensional image, flattening each image block into a one-dimensional vector, carrying out linear projection transformation on each vector, introducing position codes, and adding position information of a sequence;
Inputting the vector of each segmented picture into a matrix composed of random numbers so that the sizes of the vectors are the same;
splicing all vectors into a second matrix with a preset format;
and carrying out back propagation on the second matrix through a dynamic gate, and inputting the back propagation result into a coder-decoder.
In one possible implementation, the extraction module 702 performs the feature extraction process of the index data or experimental data in the objective data by:
the index data or the experimental data are subjected to normalization processing and then spliced and input into a matrix composed of random numbers, so that a third matrix in a preset format is obtained;
and carrying out back propagation on the third matrix through a dynamic gate, and inputting the back propagation result into a coder-decoder.
In a possible implementation manner, the classifying module 703 performs a classification process on the target feature data to obtain a test result of each user, where the method includes:
and classifying the target feature data based on a preset data weight, wherein the data weight of the first extraction result in the target feature data is lower than the data weight of the second extraction result in the target feature data.
The cosmetic testing device tests the product to be tested according to a preset testing flow through a target crowd, and obtains testing data of each user in the target crowd, wherein the testing data comprises subjective data and objective data; then, respectively carrying out feature extraction processing on the subjective data and the objective data, and carrying out splicing processing on a first extraction result aiming at the subjective data and a second extraction result aiming at the objective data to obtain target feature data, so that the objective data and the subjective data with different data forms can be spliced in a feature form, and the data format is unified; and then classifying the target characteristic data to obtain a test result of each user, and obtaining the product efficacy of the product to be tested according to the test result of each user, wherein the obtained product efficacy considers both subjective data and objective data, and compared with a single test mode, the obtained product efficacy has higher association degree with the user.
As shown in fig. 8, fig. 8 is a schematic diagram of a composition structure of an electronic device 800 provided in an embodiment of the present application, where the electronic device 800 includes:
a processor 801, a storage medium 802, and a bus 803, the storage medium 802 storing machine readable instructions executable by the processor 801, the processor 801 and the storage medium 802 communicating through the bus 803 when the electronic device 800 is running, the processor 801 executing the machine readable instructions to perform the steps of the cosmetic testing method described in the embodiments of the present application.
In actual use, the various components of the electronic device 800 are coupled together via the bus 803. It is understood that the bus 803 is used to enable connected communication between these components. In addition to the data bus, the bus 803 includes a power bus, a control bus and a status signal bus. However, for clarity of illustration, the various buses are all labeled as bus 803 in fig. 8.
The electronic equipment tests products to be tested according to a preset test flow through a target crowd, and test data of each user in the target crowd are obtained, wherein the test data comprise subjective data and objective data; then, respectively carrying out feature extraction processing on the subjective data and the objective data, and carrying out splicing processing on a first extraction result aiming at the subjective data and a second extraction result aiming at the objective data to obtain target feature data, so that the objective data and the subjective data with different data forms can be spliced in a feature form, and the data format is unified; and then classifying the target characteristic data to obtain a test result of each user, and obtaining the product efficacy of the product to be tested according to the test result of each user, wherein the obtained product efficacy considers both subjective data and objective data, and compared with a single test mode, the obtained product efficacy has higher association degree with the user.
The present embodiments also provide a computer readable storage medium storing executable instructions that, when executed by the at least one processor 801, implement the cosmetic testing method described in the embodiments of the present application.
In some embodiments, the storage medium may be a ferromagnetic random access memory (FRAM), read-only memory (ROM), programmable read-only memory (PROM), erasable programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM), flash memory, magnetic surface memory, optical disk, or compact disc read-only memory (CD-ROM), or the like; it may also be a variety of devices including one or any combination of the above memories.
In some embodiments, the executable instructions may be in the form of programs, software modules, scripts, or code, written in any form of programming language (including compiled or interpreted languages, or declarative or procedural languages), and they may be deployed in any form, including as stand-alone programs or as modules, components, subroutines, or other units suitable for use in a computing environment.
As an example, the executable instructions may, but need not, correspond to files in a file system, and may be stored as part of a file that holds other programs or data, for example, in one or more scripts in a Hypertext Markup Language (HTML) document, in a single file dedicated to the program in question, or in multiple coordinated files (e.g., files that store one or more modules, sub-programs, or portions of code).
As an example, executable instructions may be deployed to be executed on one computing device or on multiple computing devices located at one site or, alternatively, distributed across multiple sites and interconnected by a communication network.
The computer readable storage medium tests products to be tested according to a preset test flow through a target crowd, and obtains test data of each user in the target crowd, wherein the test data comprises subjective data and objective data; then, respectively carrying out feature extraction processing on the subjective data and the objective data, and carrying out splicing processing on a first extraction result aiming at the subjective data and a second extraction result aiming at the objective data to obtain target feature data, so that the objective data and the subjective data with different data forms can be spliced in a feature form, and the data format is unified; and then classifying the target characteristic data to obtain a test result of each user, and obtaining the product efficacy of the product to be tested according to the test result of each user, wherein the obtained product efficacy considers both subjective data and objective data, and compared with a single test mode, the obtained product efficacy has higher association degree with the user.
In several embodiments provided in the present application, it should be understood that the disclosed method and electronic device may be implemented in other manners. The above described device embodiments are only illustrative, e.g. the division of the units is only one logical function division, and there may be other divisions in practice, such as: multiple units or components may be combined or may be integrated into another system, or some features may be omitted, or not performed. In addition, the coupling, direct coupling or communicative connection between the components shown or discussed may be realized through some interfaces, and the indirect coupling or communicative connection between devices or units may be electrical, mechanical, or in other forms.
The modules described as separate components may or may not be physically separate, and components shown as modules may or may not be physical units, may be located in one place, or may be distributed over multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in each embodiment of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a non-volatile computer readable storage medium executable by a processor. Based on such understanding, the technical solution of the present application may be embodied essentially or in a part contributing to the prior art or in a part of the technical solution, in the form of a software product stored in a storage medium, including several instructions for causing a computer device (which may be a personal computer, a platform server, or a network device, etc.) to perform all or part of the steps of the methods described in the embodiments of the present application. And the aforementioned storage medium includes: a usb disk, a removable hard disk, a ROM, a RAM, a magnetic disk, or an optical disk, etc.
The foregoing is merely a specific embodiment of the present application, but the protection scope of the present application is not limited thereto, and any person skilled in the art can easily think about changes or substitutions within the technical scope of the present application, and the changes or substitutions are covered in the protection scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (8)

1. A cosmetic testing method, the method comprising:
testing a product to be tested according to a preset test flow through a target crowd, and acquiring test data of each user in the target crowd; wherein the test data comprises subjective data and objective data;
respectively carrying out feature extraction processing on the subjective data and the objective data, and carrying out splicing processing on a first extraction result aiming at the subjective data and a second extraction result aiming at the objective data to obtain target feature data;
classifying the target characteristic data to obtain a test result of each user, and obtaining the product efficacy of the product to be tested according to the test result of each user;
the objective data includes image data, index data and experimental data, and the obtaining test data of each user in the target crowd includes:
the objective data for each user is obtained by:
acquiring image data of each user through image acquisition equipment, wherein the image data comprises a left face shot, a front face shot and a right face shot;
And obtaining index data of each user through a first detection device, wherein the index data comprises percutaneous water loss, skin color value, skin heme index EI, and skin sensory current threshold;
the experimental data of each user are obtained through second detection equipment, wherein the experimental data comprise lactic acid stinging experimental data, DMSO (dimethyl sulfoxide) stimulation experimental data and capsaicin experimental data;
the index data or experimental data in the objective data is subjected to feature extraction processing by the following means:
the index data or the experimental data are subjected to normalization processing and then spliced and input into a matrix composed of random numbers, so that a third matrix in a preset format is obtained;
and carrying out back propagation on the third matrix through a dynamic gate, and inputting the back propagation result into a coder-decoder.
2. The method of claim 1, wherein the obtaining test data for each user in the target group comprises:
the subjective data for each user is obtained by:
determining a questionnaire based on the product to be tested;
acquiring the questionnaire filled in by each user; wherein the subjective data of each user is included in the questionnaire filled in by each user.
3. The method according to claim 1, wherein the subjective data is subjected to a feature extraction process by:
inputting the subjective data into a BERT language model to obtain text data corresponding to the subjective data, and carrying out normalization processing on numerical data in the subjective data so as to limit the numerical data within a preset range;
splicing the text data and the numerical data, and inputting a splicing result into a matrix formed by random numbers to obtain a first matrix in a preset format;
and carrying out back propagation on the first matrix through a dynamic gate, and inputting the back propagation result into a coder-decoder.
4. The method according to claim 1, wherein the image data in the objective data is subjected to a feature extraction process by:
preprocessing the left face shot, the front face shot and the right face shot to obtain an analyzed left face shot, an analyzed front face shot and an analyzed right face shot;
inputting the left side face shot, the front face shot, the right side face shot, the analyzed left side face shot, the analyzed front face shot and the analyzed right side face shot into an image classification model to carry out block processing on a two-dimensional image, flattening each image block into a one-dimensional vector, carrying out linear projection transformation on each vector, introducing position codes, and adding position information of a sequence;
Inputting the vector of each segmented picture into a matrix composed of random numbers so that the sizes of the vectors are the same;
splicing all vectors into a second matrix with a preset format;
and carrying out back propagation on the second matrix through a dynamic gate, and inputting the back propagation result into a coder-decoder.
5. The method according to claim 1, wherein classifying the target feature data to obtain the test result of each user comprises:
and classifying the target feature data based on a preset data weight, wherein the data weight of the first extraction result in the target feature data is lower than the data weight of the second extraction result in the target feature data.
6. A cosmetic product testing device, the device comprising:
an acquisition module for testing a product to be tested according to a preset test flow through a target crowd, and acquiring test data of each user in the target crowd; wherein the test data comprises subjective data and objective data; the objective data includes image data, index data and experimental data, and the obtaining test data of each user in the target crowd includes: the objective data for each user is obtained by: acquiring image data of each user through image acquisition equipment, wherein the image data comprises a left face shot, a front face shot and a right face shot; and obtaining index data of each user through a first detection device, wherein the index data comprises percutaneous water loss, skin color value, skin heme index EI, and skin sensory current threshold; the experimental data of each user are obtained through second detection equipment, wherein the experimental data comprise lactic acid stinging experimental data, DMSO (dimethyl sulfoxide) stimulation experimental data and capsaicin experimental data;
the extraction module is used for respectively carrying out feature extraction processing on the subjective data and the objective data, and carrying out splicing processing on a first extraction result aiming at the subjective data and a second extraction result aiming at the objective data to obtain target feature data; the index data or experimental data in the objective data is subjected to feature extraction processing by the following means: the index data or the experimental data are subjected to normalization processing and then spliced and input into a matrix composed of random numbers, so that a third matrix in a preset format is obtained; the third matrix is back-propagated through a dynamic gate, and the back-propagated result is input into a coder-decoder;
and the classification module is used for carrying out classification processing on the target characteristic data to obtain a test result of each user, and obtaining the product efficacy of the product to be tested according to the test result of each user.
7. An electronic device, comprising: a processor, a storage medium, and a bus, the storage medium storing machine-readable instructions executable by the processor, the processor in communication with the storage medium via the bus when the electronic device is operating, the processor executing the machine-readable instructions to perform the cosmetic testing method of any one of claims 1 to 5.
8. A computer-readable storage medium, characterized in that the computer-readable storage medium has stored thereon a computer program which, when executed by a processor, performs the cosmetic testing method according to any one of claims 1 to 5.
CN202410009064.1A 2024-01-04 2024-01-04 Cosmetic testing method and device, electronic equipment and storage medium Active CN117517595B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202410009064.1A CN117517595B (en) 2024-01-04 2024-01-04 Cosmetic testing method and device, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202410009064.1A CN117517595B (en) 2024-01-04 2024-01-04 Cosmetic testing method and device, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN117517595A CN117517595A (en) 2024-02-06
CN117517595B true CN117517595B (en) 2024-03-26

Family

ID=89749790

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202410009064.1A Active CN117517595B (en) 2024-01-04 2024-01-04 Cosmetic testing method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN117517595B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5888879A (en) * 1996-06-13 1999-03-30 Pola Chemical Industries Inc. Method for evaluating make-up cosmetic product and powder composition
CN108985873A (en) * 2017-05-30 2018-12-11 株式会社途伟尼 Cosmetics recommended method, the recording medium for being stored with program, the computer program to realize it and cosmetics recommender system
CN109726944A (en) * 2019-03-06 2019-05-07 义乌市悦美科技有限公司 A kind of cosmetic industry masses evaluation system Internet-based
CN113181531A (en) * 2021-04-26 2021-07-30 上海应用技术大学 Human body healthy skin irritation model and establishment method thereof
CN113392236A (en) * 2021-01-04 2021-09-14 腾讯科技(深圳)有限公司 Data classification method, computer equipment and readable storage medium
CN115908388A (en) * 2022-12-24 2023-04-04 广东药科大学 Sensitive skin full-period detection method and system


Also Published As

Publication number Publication date
CN117517595A (en) 2024-02-06

Similar Documents

Publication Publication Date Title
Wootton Remote cognitive–behavior therapy for obsessive–compulsive symptoms: A meta-analysis
Williams et al. The roles of self-efficacy, outcome expectancies and social support in the self-care behaviours of diabetics
Jacques et al. Long-term nutrient intake and 5-year change in nuclear lens opacities
Keane Validity and reliability of the critical care pain observation tool: a replication study
Hurley The health belief model: evaluation of a diabetes scale
Koelmeyer et al. Prospective surveillance model in the home for breast cancer-related lymphoedema: a feasibility study
AU2003245948A1 (en) Method and system for detecting and analyzing clinical pictures and the causes thereof and for determining proposals for appropriate therapy
Abusharha et al. Analysis of basal and reflex human tear osmolarity in normal subjects: assessment of tear osmolarity
Al-Ghabeesh et al. Unidimentional and multidimensional breathlessness specific instruments for adult population: Literature review
Fasugba et al. The Fitzpatrick skin type scale: a reliability and validity study in women undergoing radiation therapy for breast cancer
CN117517595B (en) Cosmetic testing method and device, electronic equipment and storage medium
Liu et al. An optimal method for quantifying the facial sebum level and characterizing facial sebum features
Dai et al. Investigation of allergic sensitizations in children with allergic rhinitis and/or asthma
Chantelau A novel diagnostic test for end-stage sensory failure associated with diabetic foot ulceration: Proof-of-principle study
Hemström et al. Visual search for complex objects: Set-size effects for faces, words and cars
US20230351596A1 (en) Establishing method of wound grade assessment model, wound care assessment system and wound grade assessment method
Draelos Noxious sensory perceptions in patients with mild to moderate rosacea treated with azelaic acid 15% gel
Nozawa et al. Distress and impacts on daily life from appearance changes due to cancer treatment: A survey of 1,034 patients in Japan
Martins et al. Cross Cultural adaptation into Brazilian Portuguese language of Derriford Appearance Scale 24 (DAS-24) for people living with HIV/AIDS
Agoons et al. Effect of topical capsaicin on painful sensory peripheral neuropathy in patients with type 2 diabetes: a double-blind placebo-controlled randomised clinical trial
Kulakçi et al. Impact of nursing care services on self-efficacy perceptions and healthy lifestyle behaviors of nursing home residents
de Menezes et al. The efficacy and safety of mannitol challenge in a workplace setting for assessing asthma prevalence
Pinhasov et al. Reducing lower extremity hospital-acquired pressure injuries: a multidisciplinary clinical team approach
Rasmussen et al. Therapist-assisted rehabilitation of visual function and hemianopia after brain injury: intervention study on the effect of the neuro vision technology rehabilitation program
Feldman Self‐esteem, types of attributional style and sensation and distress pain ratings in males

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant