CN110245590B - Product recommendation method and system based on skin image detection - Google Patents

Product recommendation method and system based on skin image detection

Info

Publication number
CN110245590B
CN110245590B
Authority
CN
China
Prior art keywords
skin
image
data
wrinkle
information
Prior art date
Legal status
Active
Application number
CN201910461579.4A
Other languages
Chinese (zh)
Other versions
CN110245590A (en)
Inventor
詹瑾
郑伟俊
叶丁荣
谢桂园
凌宏勋
Current Assignee
Foshan Haixie Technology Co ltd
Zhiyan Future Shenzhen Information Technology Co ltd
Original Assignee
Guangdong Polytechnic Normal University
Priority date
Filing date
Publication date
Application filed by Guangdong Polytechnic Normal University
Priority to CN201910461579.4A
Publication of CN110245590A
Application granted
Publication of CN110245590B

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06Q - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00 - Commerce
    • G06Q30/06 - Buying, selling or leasing transactions
    • G06Q30/0601 - Electronic shopping [e-shopping]
    • G06Q30/0631 - Item recommendations
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16 - Human faces, e.g. facial parts, sketches or expressions
    • G06V40/161 - Detection; Localisation; Normalisation
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16 - Human faces, e.g. facial parts, sketches or expressions
    • G06V40/168 - Feature extraction; Face representation
    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P - CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00 - Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/30 - Computing systems specially adapted for manufacturing

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Business, Economics & Management (AREA)
  • Physics & Mathematics (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Accounting & Taxation (AREA)
  • Finance (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • General Health & Medical Sciences (AREA)
  • Economics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Business, Economics & Management (AREA)
  • Strategic Management (AREA)
  • Marketing (AREA)
  • Development Economics (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

The invention provides a product recommendation method and system based on skin image detection, wherein the method comprises the following steps: the server receives skin image data sent by the client based on the server communication module; the server processes the skin image data based on an analysis module and extracts skin information from the skin image data; the server builds a filter based on the skin information; the server screens the product library based on the screener to obtain recommended product data; the server sends the recommended product data to the client based on the server communication module. The method integrates the three steps of obtaining the skin image, analyzing the data of the skin image and deriving the recommended product according to the data analysis result of the skin image, and has good practicability and convenience.

Description

Product recommendation method and system based on skin image detection
Technical Field
The invention relates to the field of skin images, in particular to a product recommendation method and system based on skin image detection.
Background
When purchasing a skin care product, a consumer needs to choose according to his or her own skin condition. Because the industry currently has no clear standards for skin detection, consumers find it difficult to select products suited to their skin; in addition, the variability of skin conditions further increases the difficulty of choosing, which is detrimental to the purchase of skin care products.
Disclosure of Invention
In view of the above problems, the invention provides a product recommendation method and system based on skin image detection. The method integrates three steps, namely obtaining a skin image, analyzing the skin image data, and deriving recommended products from the analysis results, and therefore has good practicability and convenience.
Correspondingly, the invention provides a product recommendation method based on skin image detection, which comprises the following steps:
the server receives skin image data sent by the client based on the server communication module;
the server processes the skin image data based on an analysis module and extracts skin information from the skin image data;
the server builds a filter based on the skin information;
the server screens the product library based on the screener to obtain recommended product data;
the server sends the recommended product data to the client based on the server communication module.
In an alternative embodiment, the skin image data includes client photo data captured based on a camera detection module.
In an alternative embodiment, the server processes the skin image data based on an analysis module, and extracting skin information from the skin image data comprises the steps of:
and extracting skin images from the client photo data.
In an alternative embodiment, the extracting the skin image from the client photo data includes the steps of:
generating a skin color discrimination model for the YCrCb color space;
converting the client photo data into a YCrCb color space to obtain primary skin color processing image data;
processing the primary skin color processing image data based on the skin color discrimination model to obtain skin discrimination image data;
and processing the client photo data based on the skin judgment image data to obtain a skin image.
In an alternative embodiment, the skin information comprises wrinkled skin information;
the server processes the skin image data based on an analysis module, and extracts skin information from the skin image data further comprises the steps of:
the wrinkled skin information is extracted based on the skin image.
In an alternative embodiment, the extracting the wrinkle skin information based on the skin image includes the steps of:
performing Gaussian smoothing processing on the skin image to obtain a primary wrinkle processed image;
performing face alignment on the primary wrinkle processed image based on an ASM algorithm;
generating a feature face based on CAAE depth network structure prediction;
performing two-dimensional discrete wavelet transformation on the primary wrinkle treatment image to obtain a low-frequency subgraph of the primary wrinkle treatment image and high-frequency subgraphs in three directions;
performing two-dimensional discrete wavelet transformation on the characteristic face to obtain a low-frequency subgraph of the characteristic face and high-frequency subgraphs in three directions;
performing high-pass filtering processing on the low-frequency subgraph of the primary wrinkle processing image;
and performing aging synthesis of the primary wrinkle treatment image based on wavelet reconstruction to obtain a secondary wrinkle treatment image.
And comparing the similarity degree of the skin image and the secondary wrinkle treatment image to obtain wrinkle skin information.
In an alternative embodiment, the comparing the similarity degree of the skin image and the secondary wrinkle treatment image to obtain wrinkle skin information includes the steps of:
graying the skin image and the secondary wrinkle treatment image to obtain a skin graying image and a secondary wrinkle treatment graying image;
generating a skin image histogram and a secondary wrinkle treatment image histogram based on the skin graying image and the secondary wrinkle treatment graying image, respectively;
calculating normalized correlation coefficients of a skin image histogram and a secondary wrinkle treatment image histogram, and obtaining the similarity degree of the skin graying image and the secondary wrinkle treatment graying image;
and obtaining wrinkle skin information based on the similarity degree of the skin graying image and the secondary wrinkle treatment graying image.
In an alternative embodiment, the skin information further comprises acne skin information;
the server processes the skin image data based on an analysis module, and extracts skin information from the skin image data further comprises the steps of:
extracting the acne skin information based on the skin image.
Correspondingly, the invention also provides a product recommendation system based on the skin image detection, which comprises a client and a server, wherein the server comprises
Server communication module: used for receiving the skin image data sent by the client and sending the recommended product data to the client;
Analysis module: used for extracting skin information from the skin image data;
Product library: used for storing product data, which is screened by the screener to obtain recommended product data.
In an alternative embodiment, the client includes a camera detection module.
The invention provides a product recommendation method and system based on skin image detection, which integrate three steps, namely skin image acquisition, skin image data analysis, and derivation of recommended products from the analysis results, and therefore have good practicability and convenience.
Drawings
In order to more clearly illustrate the embodiments of the invention or the technical solutions in the prior art, the drawings that are required in the embodiments or the description of the prior art will be briefly described, it being obvious that the drawings in the following description are only some embodiments of the invention, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 shows a schematic flow chart of a product recommendation method based on skin image detection according to an embodiment of the present invention;
fig. 2 is a flowchart of a method for extracting a skin image from client photo data according to an embodiment of the present invention;
fig. 3 is a schematic flow chart of a method for extracting wrinkle skin information according to an embodiment of the present invention;
FIG. 4 is a flow chart of a method for comparing the similarity of skin images and secondary wrinkle treatment images according to an embodiment of the present invention;
fig. 5 is a flowchart of a method for acquiring acne skin information according to an embodiment of the present invention;
fig. 6 shows a schematic structural diagram of a product recommendation system based on skin image detection according to an embodiment of the present invention.
Detailed Description
The following description of the embodiments of the present invention will be made clearly and completely with reference to the accompanying drawings, in which it is apparent that the embodiments described are only some embodiments of the present invention, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
Embodiment one:
fig. 1 shows a schematic flow chart of a product recommendation method based on skin image detection according to an embodiment of the present invention. The embodiment of the invention provides a product recommendation method based on skin image detection, which comprises the following steps of:
s101: the server receives skin image data sent by the client based on the server communication module;
the types of data sent by the client vary and are mainly determined by the type of product used by the client. The client includes two kinds of detection modules, namely simple detection modules and complex detection modules, and the skin detection data accordingly includes simple detection data and complex detection data.
The simple detection module is a detection module that can obtain a detection result relying only on the client hardware, typically one that obtains the result through circuit measurement, such as a skin moisture detection module or a skin oil detection module; correspondingly, skin information can be derived directly from the simple detection module, and the simple detection data it produces already contains the corresponding skin information. Since the simple detection module can derive the skin information result directly, this content is not further described in the embodiments of the invention.
The complex detection module is a detection module that cannot obtain a detection result relying only on the client hardware, typically one that obtains the result through computer processing, such as a wrinkle detection module or an acne detection module, which use image analysis techniques; correspondingly, skin information cannot be derived directly from the complex detection module. The corresponding skin information is obtained only after the detection data is uploaded to the server for analysis, so the complex detection data produced by the complex detection module contains no skin information, only raw data that can be analyzed and processed.
Optionally, the client detection modules in the embodiment of the present invention include a skin moisture detection module, a skin oil detection module, a wrinkle detection module and an acne detection module. The wrinkle detection module and the acne detection module obtain their detection data through analysis of skin photographs, so they are essentially a set of camera detection modules, and the complex detection data obtained by the camera detection module is a photographed image of the skin concerned.
Thus, in an embodiment of the present invention, the skin image data includes client photo data taken based on a camera detection module.
S102: the server processes the skin image data based on an analysis module and extracts skin information from the skin image data;
aiming at the client photo data, different processing is required according to the type of skin information to be analyzed. In general, the client photo data is a self-portrait of the user, in which the skin image is mixed with facial features, hair, background and other content; the skin image therefore has to be extracted first and then processed to obtain the required skin information. Optionally, in an embodiment of the present invention, the skin information includes wrinkle skin information and acne skin information.
It should be noted that, for the consumer, specific skin measurement data is on the one hand too obscure to be understood; on the other hand, there is at present no industry-wide quantification standard for skin information. The skin information in the embodiment of the invention is therefore skin evaluation information: the specific skin data is graded according to self-defined standards to form skin evaluation information, which provides a reference for the user.
S103: the server builds a filter based on the skin information;
for example, in an embodiment of the present invention, the skin information includes wrinkle skin information, and its evaluation criteria are divided into three levels: lower than the average level, close to the average level, and higher than the average level. Optionally, the wrinkle skin information is set as a screening item A, in which the grade for a wrinkle level lower than the average is 0, close to the average is 1, and higher than the average is 2.
Specifically, skin oil data, skin moisture data, acne skin information, and the like also need to be incorporated into the screener.
Based on the above manner, the finally formed screener takes the form A=a_B=b_C=c_D=d..., where the capital letters A, B, C, D, E ... denote the screening items and the lowercase letters a, b, c, d ... denote the evaluation grade of the corresponding screening item.
S104: the server screens the product library based on the screener to obtain recommended product data;
the product library stores data for a plurality of products. When a product is registered into the library, it must be associated with the relevant screening items according to its own attributes; at the same time, the targeted grade in each relevant screening item must be set according to the product's target user group.
The product library is screened with the screener determined in step S103, and the products meeting the requirements of the screener are selected as the recommended products.
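As an illustration of steps S103 and S104, the following sketch (Python; the item names, grade values and product record format are hypothetical and not taken from the patent) builds a screener of the form described above and keeps only the products whose registered target grades match every screening item of the user:

```python
# Hypothetical sketch of steps S103/S104: build a screener string from graded
# skin information and use it to screen a product library. Item names, grades
# and the product record format are illustrative assumptions.

def build_screener(skin_grades):
    """skin_grades: dict mapping a screening item (e.g. 'wrinkle') to a grade 0/1/2."""
    # Serialize as "A=a_B=b_..." in a fixed item order so the string is stable.
    return "_".join(f"{item}={grade}" for item, grade in sorted(skin_grades.items()))

def parse_screener(screener):
    return dict(pair.split("=") for pair in screener.split("_"))

def screen_products(products, screener):
    """products: list of dicts with a 'targets' dict of item -> set of target grades."""
    wanted = parse_screener(screener)
    recommended = []
    for product in products:
        targets = product["targets"]
        # A product matches if, for every screening item it is registered under,
        # the user's grade is one of the grades the product targets.
        if all(wanted.get(item) in grades for item, grades in targets.items()):
            recommended.append(product)
    return recommended

if __name__ == "__main__":
    screener = build_screener({"wrinkle": 2, "acne": 1, "moisture": 0})
    products = [
        {"name": "anti-wrinkle cream", "targets": {"wrinkle": {"1", "2"}}},
        {"name": "oil-control gel", "targets": {"oil": {"2"}}},
    ]
    print([p["name"] for p in screen_products(products, screener)])  # ['anti-wrinkle cream']
```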
S105: the server sends the recommended product data to the client based on the server communication module.
The server sends the corresponding recommended product data to the client through the server communication module for the user's reference; this completes one round of the product recommendation method based on skin images.
Embodiment two:
fig. 2 shows a flow diagram of a method of extracting skin images from client photo data. In step S102, in the embodiment of the present invention, skin color judgment is used as a preprocessing means for skin image extraction. Specifically, statistics show that skin colors tend to be concentrated in one aggregation area in the CbCr plane of the YCbCr color space, so that whether the pixel points are skin pixel points can be determined by judging whether the pixel points in the original skin photo fall in the corresponding aggregation area, thereby extracting a skin image from the client photo data.
Specifically, the method for extracting skin images from client photo data comprises the following steps:
s201: generating a skin color discrimination model for the YCrCb color space;
in the embodiment of the invention, the contour of the skin color discrimination model is the ellipse

(x - ec_x)^2 / a^2 + (y - ec_y)^2 / b^2 = 1

where

x = cos(θ)·(C'_b - c_x) + sin(θ)·(C'_r - c_y)
y = -sin(θ)·(C'_b - c_x) + cos(θ)·(C'_r - c_y)

Skin color aggregation statistics over the MIT face library give c_x = 109.38, c_y = 152.02, θ = 2.53 rad, ec_x = 1.60, ec_y = 2.41, a = 25.39, b = 14.03; C'_b and C'_r are the Cb and Cr values of the pixel point. From these parameter values and the pixel's Cb and Cr values, x and y can be calculated and substituted into the left-hand side of the ellipse equation; by comparing the result with 1, it can be determined whether the pixel point falls within the skin color discrimination model.
It should be noted that, as the MIT face library expands, part of the data will change over time, and the cloud server may update the relevant parameter values in real time to obtain more accurate skin color aggregation statistics. In addition, the relevant parameter values change correspondingly depending on the face database used. Alternatively, the face database may be the Yale, ORL, CMU PIE, FERET, MIT, BANCA or CAS-PEAL face database, or the like.
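For illustration, a minimal sketch of the elliptical skin color discrimination model described above (Python with NumPy; the parameter values are the ones quoted in this embodiment, while the rotation convention and the function names are our assumptions):

```python
import numpy as np

# Parameters of the elliptical skin model in the CbCr plane, as quoted above
# (obtained from skin color statistics over the MIT face library).
CX, CY = 109.38, 152.02
THETA = 2.53            # rad
ECX, ECY = 1.60, 2.41
A, B = 25.39, 14.03

def skin_ellipse_value(cb, cr):
    """Left-hand side of the ellipse equation for a pixel's (Cb, Cr) values.
    Results <= 1 mean the pixel falls inside (or on) the skin color model."""
    dx, dy = cb - CX, cr - CY
    x = np.cos(THETA) * dx + np.sin(THETA) * dy
    y = -np.sin(THETA) * dx + np.cos(THETA) * dy
    return (x - ECX) ** 2 / A ** 2 + (y - ECY) ** 2 / B ** 2

def is_skin(cb, cr):
    return skin_ellipse_value(cb, cr) <= 1.0

# Example: a typical skin tone chroma pair
print(is_skin(110.0, 150.0))   # True
```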
S202: converting the client photo data into a YCrCb color space to obtain primary skin color processing image data;
in the embodiment of the invention, the client photo data is converted into the YCrCb color space to obtain the primary skin color processing image data, using the standard conversion from RGB to YCrCb, in which Y, Cb and Cr are computed as linear combinations of the R, G and B components (a concrete sketch is given below).
It should be noted that, in the embodiment of the present invention, the client photo data is RGB color space data, so each pixel point has three color components, R (red), G (green) and B (blue); similarly, after the client photo data is converted from the RGB color space to the YCrCb color space, each pixel point has three components, Y (luminance), Cb (blue chroma offset) and Cr (red chroma offset). In other embodiments of the invention, the client photo data may also be converted from the HSV color space to the YCrCb color space, and the subsequent processing is performed after the primary skin color processing image data is obtained.
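A small sketch of the color space conversion (Python with NumPy); whether the embodiment uses exactly these coefficients is an assumption, but they are the widely used JPEG/BT.601 full-range RGB to YCbCr conversion:

```python
import numpy as np

def rgb_to_ycbcr(image_rgb):
    """Convert an HxWx3 uint8 RGB image to Y, Cb, Cr planes (BT.601 full range)."""
    rgb = image_rgb.astype(np.float64)
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    y  =  0.299 * r + 0.587 * g + 0.114 * b
    cb = -0.1687 * r - 0.3313 * g + 0.5 * b + 128.0
    cr =  0.5 * r - 0.4187 * g - 0.0813 * b + 128.0
    return y, cb, cr

# Usage: y, cb, cr = rgb_to_ycbcr(photo)   # photo: NumPy array of the client photo
```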
S203: processing the primary skin color processing image data based on the skin color discrimination model to obtain skin discrimination image data;
specifically, in the embodiment of the present invention, the pixel point data (C'_b, C'_r) in the primary skin color processing image data is extracted by traversal, and it is determined whether (C'_b, C'_r) falls within or on the contour of the skin color discrimination model. If the pixel point data (C'_b, C'_r) falls within the skin color discrimination model or on its contour, the skin judgment data of the pixel point is 0; if it falls outside the skin color discrimination model, the skin judgment data of the pixel point is 1. By traversing all pixel points in the primary skin color processing image data, skin judgment image data with a resolution consistent with the primary skin color processing image data is obtained; in the skin judgment image data, the value of any pixel point is 0 or 1.
In the skin judgment image data, because the judgment is not perfectly accurate, some pixel points will be judged incorrectly, so that pixels inside the facial skin region are mistakenly marked as non-skin, and the skin image obtained in the subsequent step will be pitted with scattered black dots. This is embodied as a small number of pixel points with value 1 mixed into an image region whose value is 0, and this small number of points can be understood as noise. Therefore, considering the objective fact that skin pixels should be continuous, after the skin judgment image data is obtained, more reasonable skin judgment image data can be obtained by denoising it.
Optionally, the noise reduction method includes a plurality of noise reduction methods such as median filtering, maximum filtering, minimum filtering, mean filtering, and the like, which are not described in the embodiments of the present invention.
S204: processing the client photo data based on the skin judgment image data to obtain a skin image;
the pixel points in the client photo data are traversed and, using the value of the pixel at the same position in the skin judgment image data as the judging condition, the color parameters of the pixel point are either retained or replaced by a constant value.
Specifically, a pixel point is extracted from the client photo data; when the value of that pixel point in the skin judgment image data is 0, the color parameters of the pixel point in the client photo data are retained; when the value is 1, the color parameters of the pixel point in the client photo data are painted over with a constant value. Optionally, to avoid interference with the skin tone image, for the yellow-skinned user population the constant value may be set to (0, 0, 0) in the RGB color space, i.e., black.
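Putting steps S201 to S204 together, a hedged sketch (Python with NumPy and OpenCV; it reuses the rgb_to_ycbcr and skin_ellipse_value helpers sketched above, uses median filtering for the denoising step, and follows the 0 = skin / 1 = non-skin convention of this embodiment):

```python
import cv2
import numpy as np

def extract_skin_image(photo_rgb):
    """Black out non-skin pixels of an RGB photo using the elliptical CbCr model.

    Convention as above: 0 marks a skin pixel, 1 a non-skin pixel."""
    _, cb, cr = rgb_to_ycbcr(photo_rgb)                                   # step S202
    skin_map = np.where(skin_ellipse_value(cb, cr) <= 1.0, 0, 1).astype(np.uint8)  # S203
    # Denoise the binary skin map; median filtering removes isolated misjudged pixels.
    skin_map = cv2.medianBlur(skin_map, 5)
    skin_image = photo_rgb.copy()
    skin_image[skin_map == 1] = (0, 0, 0)                                 # S204: paint non-skin black
    return skin_image
```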
By the method, the skin image is extracted from the client photo data, and the skin image is further analyzed and processed according to the skin information to be detected.
In the embodiment of the present invention, the skin information to be obtained on the server side includes wrinkle skin information and acne skin information; the steps for extracting each type of skin information are described separately below.
Embodiment III:
fig. 3 shows a flow chart of a method for extracting wrinkle skin information, and the extraction process of the wrinkle skin information according to the embodiment of the invention comprises the following steps:
s301: performing Gaussian smoothing processing on the skin image to obtain a primary wrinkle processed image;
the Gaussian blur processing of the skin image aims at eliminating original texture information in the skin image, and only retaining the contour features of the skin and the skin color of the skin;
s302: performing face alignment on the primary wrinkle processed image based on an ASM algorithm;
ASM (Active Shape Model) is an extraction method based on a point distribution model (Point Distribution Model, PDM): the geometry of structures such as the eyes, ears, mouth, nose and eyebrows can be represented by concatenating, in order, the coordinates of several key feature points (landmarks) to form a shape vector. In the embodiment of the invention, the facial features in the primary wrinkle-processed image have been covered with black (step S204), so the facial contour information is simple to identify; the positions of the feature points (usually facial contour points) can be marked by the ASM algorithm, and the current position of the face image can be confirmed from these feature point positions, so that face alignment can be realized, which facilitates the subsequent image processing.
S303: generating a feature face based on CAAE depth network structure prediction;
The deep network structure based on CAAE (Conditional Adversarial AutoEncoder) can learn faces of different ages and predict, for any one input face image, its face image across the whole age range; the feature face in step S303 is such a predicted face image.
S304: performing two-dimensional discrete wavelet transformation on the primary wrinkle treatment image to obtain a low-frequency subgraph of the primary wrinkle treatment image and high-frequency subgraphs in three directions;
the two-dimensional discrete wavelet transform of the primary wrinkle-processed image can be written as L_OF, H_OF, V_OF, D_OF = DWT(face1), where L_OF is the low-frequency sub-image of the primary wrinkle-processed image, H_OF its horizontal high-frequency sub-image, V_OF its vertical high-frequency sub-image, and D_OF its diagonal high-frequency sub-image;
s305: performing two-dimensional discrete wavelet transformation on the characteristic face to obtain a low-frequency subgraph of the characteristic face and high-frequency subgraphs in three directions;
the two-dimensional discrete wavelet transform of the feature face can be written as L_TF, H_TF, V_TF, D_TF = DWT(face2), where L_TF is the low-frequency sub-image of the feature face, H_TF its horizontal high-frequency sub-image, V_TF its vertical high-frequency sub-image, and D_TF its diagonal high-frequency sub-image;
s306: performing high-pass filtering processing on the low-frequency subgraph of the primary wrinkle processing image;
the processing method is HL_TF = hpfilterGauss(L_TF, sigma), i.e., a Gaussian high-pass filter with parameter sigma is applied.
S307: and performing aging synthesis of the primary wrinkle treatment image based on wavelet reconstruction to obtain a secondary wrinkle treatment image.
The high-frequency components of the feature face, which carry the age-related texture information, are transplanted onto the corresponding wavelet components of the primary wrinkle-processed image by sub-image replacement; the synthesis formula is aged = IDWT(HL_TF + L_OF, H_TF, V_TF, D_TF), where aged is the secondary wrinkle-processed image.
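A sketch of the wavelet-based aging synthesis of steps S301 and S304 to S307 (Python with OpenCV and PyWavelets; the Haar wavelet, the Gaussian sigma and realising the high-pass filter as "sub-image minus its Gaussian blur" are assumptions, and the aligned skin image and the CAAE-predicted feature face are taken as given inputs):

```python
import cv2
import numpy as np
import pywt

def aging_synthesis(skin_image_gray, feature_face_gray, sigma=3):
    """Synthesize the secondary wrinkle-processed image from an aligned skin image
    and a CAAE-predicted feature face (both single-channel, same size)."""
    # S301: Gaussian smoothing of the skin image -> primary wrinkle-processed image.
    face1 = cv2.GaussianBlur(skin_image_gray.astype(np.float64), (0, 0), sigma)
    face2 = feature_face_gray.astype(np.float64)

    # S304/S305: one-level 2D DWT -> low-frequency sub-image plus three high-frequency ones.
    L_OF, (H_OF, V_OF, D_OF) = pywt.dwt2(face1, "haar")   # H_OF/V_OF/D_OF unused below
    L_TF, (H_TF, V_TF, D_TF) = pywt.dwt2(face2, "haar")

    # S306: high-pass filter a low-frequency sub-image (here: subtract its Gaussian blur).
    HL_TF = L_TF - cv2.GaussianBlur(L_TF, (0, 0), sigma)

    # S307: wavelet reconstruction following aged = IDWT(HL_TF + L_OF, H_TF, V_TF, D_TF).
    aged = pywt.idwt2((HL_TF + L_OF, (H_TF, V_TF, D_TF)), "haar")
    return np.clip(aged, 0, 255).astype(np.uint8)
```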
S308: comparing the similarity degree of the skin image and the secondary wrinkle treatment image to obtain wrinkle skin information;
in the embodiment of the invention, the similarity degree of the skin image and the secondary wrinkle-processed image (that is, the predicted wrinkle image) is compared to obtain the wrinkle information evaluation of the skin image; by comparing the skin image with the predicted wrinkle image for the current age range, it is judged whether the positions and number of wrinkles are reasonable, and thus whether a wrinkle-removal product is needed.
Embodiment four:
Fig. 4 shows a flow chart of a method of comparing the similarity of a skin image and a secondary wrinkle-processed image. Optionally, in step S308, the method for comparing the similarity degree between the skin image and the secondary wrinkle-processed image includes the steps of:
s401: graying the skin image and the secondary wrinkle treatment image to obtain a skin graying image and a secondary wrinkle treatment graying image;
after the skin image and the secondary wrinkle treatment image are subjected to gray treatment, the skin color is weakened by selecting a proper gray conversion value, and the image texture (namely wrinkle sign) is highlighted; because the skin color of the secondary wrinkle treatment image is converted by Gaussian blur, only the texture characteristics of the image are reserved after graying; the skin image also comprises fine textures among the skins, and the fine textures can be eliminated and the wrinkle characteristics can be reserved by reasonably selecting a graying threshold value.
S402: generating a skin image histogram and a secondary wrinkle treatment image histogram based on the skin graying image and the secondary wrinkle treatment graying image, respectively;
the invention adopts the histogram matching method to compare the similarity degree of the skin image and the secondary wrinkle treatment image, and the histogram matching method has the advantages of small calculated amount and the like, and has the defect that the histogram matching method only compares the histogram difference to neglect the image information in the original image, for example, an upper black-and-lower white image and an upper white-and-lower black image, and the similarity degree obtained by comparing the histogram similarity is 100%.
In the embodiment of the invention, the secondary wrinkling treatment image is obtained by skin image treatment, and the basic structure of the secondary wrinkling treatment image is kept the same as that of the skin image, so that the defect of a histogram similarity comparison method can be overcome; therefore, the invention adopts a histogram similarity comparison method to carry out similarity comparison on the skin graying image and the secondary wrinkle treatment graying image;
s403: and calculating normalized correlation coefficients of the skin image histogram and the secondary wrinkle treatment image histogram, and obtaining the similarity degree of the skin graying image and the secondary wrinkle treatment graying image.
Alternatively, instead of the normalized correlation coefficient, the Bhattacharyya distance, the histogram intersection distance, or a similar measure may be used.
S404: and evaluating wrinkle information based on the degree of similarity of the skin-graying image and the secondary wrinkle-treated graying image.
Theoretically, the value range of the similarity degree is (0, 1). In the embodiment of the invention, since the secondary wrinkle-processed image is obtained from the skin image itself, for a common 720p-sized image the similarity degree between the skin grayscale image and the secondary wrinkle-processed grayscale image obtained by histogram matching lies in the range [0.95, 1]. In general, a similarity in the interval [0.95, 0.96) can be taken to mean that the wrinkle level is better than the average for the corresponding age range and no wrinkle removal is required; a similarity in the interval [0.96, 0.97) means the wrinkle level is close to the average of the corresponding age group; and a similarity in the interval [0.97, 1] means the wrinkle level is higher than the average of the corresponding age group.
It should be noted that the secondary wrinkle-processed image is the predicted wrinkled skin image corresponding to the skin image; it can simply be understood as a reference object, and comparing the actual skin image with this reference yields their degree of similarity, which essentially reflects how similar the wrinkles are. In general, a similarity in the interval [0.95, 0.96) indicates a large difference, i.e., the skin image has few wrinkles; a similarity in the interval [0.96, 0.97) indicates a smaller difference, i.e., the wrinkles of the skin image are closer to those of the reference; when the similarity of the skin image and the secondary wrinkle-processed image is in the interval [0.97, 1], two cases are possible, namely that the skin image has slightly fewer or slightly more wrinkles than the reference, and both cases indicate an excessive number of wrinkles in the skin image.
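A sketch of steps S401 to S404 (Python with OpenCV; the correlation method of cv2.compareHist is used as the normalized correlation coefficient, the grade thresholds are the intervals quoted above, and the bin count and the exact coefficient used by the embodiment are assumptions):

```python
import cv2

def wrinkle_grade(skin_image_bgr, aged_image_bgr):
    """Return the wrinkle grade 0/1/2 from the histogram similarity of the two images."""
    # S401: graying
    gray_skin = cv2.cvtColor(skin_image_bgr, cv2.COLOR_BGR2GRAY)
    gray_aged = cv2.cvtColor(aged_image_bgr, cv2.COLOR_BGR2GRAY)
    # S402: 256-bin grayscale histograms
    hist_skin = cv2.calcHist([gray_skin], [0], None, [256], [0, 256])
    hist_aged = cv2.calcHist([gray_aged], [0], None, [256], [0, 256])
    # S403: normalized correlation coefficient of the two histograms
    similarity = cv2.compareHist(hist_skin, hist_aged, cv2.HISTCMP_CORREL)
    # S404: map the similarity onto the grades described above
    if similarity < 0.96:
        return 0    # better than the average wrinkle level for the age group
    if similarity < 0.97:
        return 1    # close to the average level
    return 2        # above the average level
```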
Through the steps described in S301 to S308 and S401 to S404, the wrinkle skin information can be obtained.
Fifth embodiment:
fig. 5 shows a flowchart of a method for obtaining acne skin information according to an embodiment of the present invention. The acne skin information acquisition method of the embodiment of the invention comprises the following steps:
s501: graying the skin image to obtain a skin gray image;
the graying of the skin image reduces the number of color levels and the computation load, weakens the contribution of the skin color, eliminates some of the fine textures, and strengthens the outline features of acne in the skin image;
s502: extracting circle and ellipse information of the skin gray image based on a Hough transform algorithm;
the Hough transform is a feature detection algorithm widely used in image analysis, computer vision and digital image processing. It is used to identify simple shapes in objects, such as lines, circles and ellipses. The flow of the Hough transform algorithm is roughly as follows: given a picture and the shape to be detected, the algorithm votes in the parameter space to determine the shape of the object, the result being given by the local maxima in the accumulator space. From the shape of acne, it may appear circular or elliptical depending on the angle of observation, so the circle and ellipse information in the skin grayscale image can be determined by the Hough transform algorithm. Specifically, in the embodiment of the present invention, the circle and ellipse information includes the number of circles and ellipses.
S503: extracting acne skin information based on the circle and ellipse information;
in an embodiment of the invention, the number of acnes is obtained from the circle and ellipse information, the skin acne grade is determined from the number of acnes, and the acne skin information is this skin acne grade. Optionally, 1 to 3 acnes correspond to grade 0, 4 to 6 acnes to grade 1, and 7 or more acnes to grade 2.
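A sketch of steps S501 to S503 (Python with OpenCV; cv2.HoughCircles covers only the circular case, detecting ellipses would additionally require e.g. contour fitting, and all detector parameters are illustrative assumptions):

```python
import cv2

def acne_grade(skin_image_bgr):
    """Estimate the skin acne grade 0/1/2 by counting roughly circular blobs."""
    # S501: graying (plus a light median blur to suppress fine texture)
    gray = cv2.cvtColor(skin_image_bgr, cv2.COLOR_BGR2GRAY)
    gray = cv2.medianBlur(gray, 5)
    # S502: circle information via the Hough transform
    circles = cv2.HoughCircles(gray, cv2.HOUGH_GRADIENT, dp=1, minDist=20,
                               param1=80, param2=25, minRadius=3, maxRadius=30)
    count = 0 if circles is None else circles.shape[1]
    # S503: map the acne count onto the grades described above (0 acnes also maps to grade 0)
    if count <= 3:
        return 0
    if count <= 6:
        return 1
    return 2
```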
Fig. 6 shows a schematic diagram of a product recommendation system based on skin image detection according to an embodiment of the present invention. Correspondingly, the embodiment of the invention also provides a product recommendation system based on the skin image, which comprises a client and a server, wherein the server comprises
Server communication module: used for receiving the skin image data sent by the client and sending the recommended product data to the client;
Analysis module: used for extracting skin information from the skin image data;
Product library: used for storing product data, which is screened by the screener to obtain recommended product data.
Optionally, the client includes a camera detection module.
The embodiment of the invention provides a product recommending method and system based on skin image detection, which integrate three steps of obtaining skin images, analyzing data of the skin images and deriving recommended products according to the data analysis result of the skin images, reduce the difficulty of selecting skin care products for consumers and have good practicability and convenience.
The above description is made in detail on a product recommendation method and system based on skin image detection provided by the embodiment of the present invention, and specific examples are applied herein to illustrate the principles and embodiments of the present invention, and the above description of the embodiment is only used to help understand the method and core idea of the present invention; meanwhile, as those skilled in the art will have variations in the specific embodiments and application scope in accordance with the ideas of the present invention, the present description should not be construed as limiting the present invention in view of the above.

Claims (8)

1. A product recommendation method based on skin image detection, characterized in that the product recommendation method based on skin image detection comprises the following steps:
the server receives skin image data sent by the client based on the server communication module;
the server processes the skin image data based on an analysis module and extracts skin information from the skin image data;
the server builds a filter based on the skin information;
the server screens the product library based on the screener to obtain recommended product data;
the server sends the recommended product data to the client based on the server communication module;
the skin information includes wrinkled skin information;
the server processes the skin image data based on an analysis module, and extracts skin information from the skin image data further comprises the steps of:
extracting the wrinkled skin information based on the skin image;
the extracting the wrinkled skin information based on the skin image includes the steps of:
performing Gaussian smoothing processing on the skin image to obtain a primary wrinkle processed image;
performing face alignment on the primary wrinkle processed image based on an ASM algorithm;
generating a feature face based on CAAE depth network structure prediction;
performing two-dimensional discrete wavelet transformation on the primary wrinkle treatment image to obtain a low-frequency subgraph of the primary wrinkle treatment image and high-frequency subgraphs in three directions;
performing two-dimensional discrete wavelet transformation on the characteristic face to obtain a low-frequency subgraph of the characteristic face and high-frequency subgraphs in three directions;
performing high-pass filtering processing on the low-frequency subgraph of the primary wrinkle processing image;
performing aging synthesis of the primary wrinkle treatment image based on wavelet reconstruction to obtain a secondary wrinkle treatment image;
and comparing the similarity degree of the skin image and the secondary wrinkle treatment image to obtain wrinkle skin information.
2. The skin image detection based product recommendation method of claim 1, wherein the skin image data comprises client photo data taken based on a camera detection module.
3. The skin image detection-based product recommendation method according to claim 2, wherein the server processes the skin image data based on an analysis module, and extracting skin information from the skin image data comprises the steps of:
and extracting skin images from the client photo data.
4. The skin image detection-based product recommendation method according to claim 3, wherein said extracting skin images from said client photo data comprises the steps of:
generating a skin color discrimination model for the YCrCb color space;
converting the client photo data into a YCrCb color space to obtain primary skin color processing image data;
processing the primary skin color processing image data based on the skin color discrimination model to obtain skin discrimination image data;
and processing the client photo data based on the skin judgment image data to obtain a skin image.
5. The skin image detection-based product recommendation method according to claim 1, wherein said comparing the degree of similarity of the skin image and the secondary wrinkle treatment image to obtain wrinkle skin information comprises the steps of:
graying the skin image and the secondary wrinkle treatment image to obtain a skin graying image and a secondary wrinkle treatment graying image;
generating a skin image histogram and a secondary wrinkle treatment image histogram based on the skin graying image and the secondary wrinkle treatment graying image, respectively;
calculating normalized correlation coefficients of a skin image histogram and a secondary wrinkle treatment image histogram, and obtaining the similarity degree of the skin graying image and the secondary wrinkle treatment graying image;
and obtaining wrinkle skin information based on the similarity degree of the skin graying image and the secondary wrinkle treatment graying image.
6. The skin image detection based product recommendation method according to claim 4, wherein said skin information further comprises acne skin information;
the server processes the skin image data based on an analysis module, and extracts skin information from the skin image data further comprises the steps of:
extracting the acne skin information based on the skin image.
7. A product recommendation system based on skin image detection, which is characterized by comprising a client and a server, wherein the server comprises
Server communication module: used for receiving skin image data sent by the client and sending recommended product data to the client;
Analysis module: used for extracting skin information from the skin image data;
Product library: used for storing product data, which is screened by the screener to obtain recommended product data;
the skin information includes wrinkled skin information;
the server processes the skin image data based on an analysis module, and extracts skin information from the skin image data further comprises the steps of:
extracting the wrinkled skin information based on the skin image;
the extracting the wrinkled skin information based on the skin image includes the steps of:
performing Gaussian smoothing processing on the skin image to obtain a primary wrinkle processed image;
performing face alignment on the primary wrinkle processed image based on an ASM algorithm;
generating a feature face based on CAAE depth network structure prediction;
performing two-dimensional discrete wavelet transformation on the primary wrinkle treatment image to obtain a low-frequency subgraph of the primary wrinkle treatment image and high-frequency subgraphs in three directions;
performing two-dimensional discrete wavelet transformation on the characteristic face to obtain a low-frequency subgraph of the characteristic face and high-frequency subgraphs in three directions;
performing high-pass filtering processing on the low-frequency subgraph of the primary wrinkle processing image;
performing aging synthesis of the primary wrinkle treatment image based on wavelet reconstruction to obtain a secondary wrinkle treatment image;
and comparing the similarity degree of the skin image and the secondary wrinkle treatment image to obtain wrinkle skin information.
8. The skin image detection based product recommendation system of claim 7, wherein the client comprises a camera detection module.
CN201910461579.4A 2019-05-29 2019-05-29 Product recommendation method and system based on skin image detection Active CN110245590B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910461579.4A CN110245590B (en) 2019-05-29 2019-05-29 Product recommendation method and system based on skin image detection

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910461579.4A CN110245590B (en) 2019-05-29 2019-05-29 Product recommendation method and system based on skin image detection

Publications (2)

Publication Number Publication Date
CN110245590A CN110245590A (en) 2019-09-17
CN110245590B true CN110245590B (en) 2023-04-28

Family

ID=67885316

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910461579.4A Active CN110245590B (en) 2019-05-29 2019-05-29 Product recommendation method and system based on skin image detection

Country Status (1)

Country Link
CN (1) CN110245590B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113450340B (en) * 2021-07-13 2024-03-19 北京美医医学技术研究院有限公司 Skin texture detecting system

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101236599A (en) * 2007-12-29 2008-08-06 浙江工业大学 Human face recognition detection device based on multi- video camera information integration
CN107123027A (en) * 2017-04-28 2017-09-01 广东工业大学 A kind of cosmetics based on deep learning recommend method and system
TW201802735A (en) * 2016-07-06 2018-01-16 南臺科技大學 Cosmetics recommendation system and method
CN108701217A (en) * 2017-11-23 2018-10-23 深圳和而泰智能控制股份有限公司 A kind of face complexion recognition methods, device and intelligent terminal
CN109784281A (en) * 2019-01-18 2019-05-21 深圳壹账通智能科技有限公司 Products Show method, apparatus and computer equipment based on face characteristic

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101236599A (en) * 2007-12-29 2008-08-06 浙江工业大学 Human face recognition detection device based on multi- video camera information integration
TW201802735A (en) * 2016-07-06 2018-01-16 南臺科技大學 Cosmetics recommendation system and method
CN107123027A (en) * 2017-04-28 2017-09-01 广东工业大学 A kind of cosmetics based on deep learning recommend method and system
CN108701217A (en) * 2017-11-23 2018-10-23 深圳和而泰智能控制股份有限公司 A kind of face complexion recognition methods, device and intelligent terminal
CN109784281A (en) * 2019-01-18 2019-05-21 深圳壹账通智能科技有限公司 Products Show method, apparatus and computer equipment based on face characteristic

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Construction of a face aging model based on regression analysis (基于回归分析的人脸老化模型构建); 胡伟平 et al.; 广西科技大学学报 (Journal of Guangxi University of Science and Technology); 2016-09-30; pp. 39-46 *

Also Published As

Publication number Publication date
CN110245590A (en) 2019-09-17

Similar Documents

Publication Publication Date Title
CN103632132B (en) Face detection and recognition method based on skin color segmentation and template matching
EP1693782B1 (en) Method for facial features detection
US8385638B2 (en) Detecting skin tone in images
CN109086718A (en) Biopsy method, device, computer equipment and storage medium
CN104732200B (en) A kind of recognition methods of skin type and skin problem
CN106951869B (en) A kind of living body verification method and equipment
CN107507144B (en) Skin color enhancement processing method and device and image processing device
CN104966285B (en) A kind of detection method of salient region
CN106650606A (en) Matching and processing method of face image and face image model construction system
CN111507426A (en) No-reference image quality grading evaluation method and device based on visual fusion characteristics
Lee et al. Color image enhancement using histogram equalization method without changing hue and saturation
Gritzman et al. Comparison of colour transforms used in lip segmentation algorithms
JP4658532B2 (en) Method for detecting face and device for detecting face in image
CN111832464A (en) Living body detection method and device based on near-infrared camera
CN108875623A (en) A kind of face identification method based on multi-features correlation technique
KR20220078231A (en) Skin condition measuring apparatus, skin condition measring system and method thereof
Lionnie et al. A comparison of human skin color detection for biometrie identification
CN111709305A (en) Face age identification method based on local image block
CN109948570B (en) Real-time detection method for unmanned aerial vehicle in dynamic environment
CN110245590B (en) Product recommendation method and system based on skin image detection
Yusuf et al. Human face detection using skin color segmentation and watershed algorithm
CN116958880A (en) Video flame foreground segmentation preprocessing method, device, equipment and storage medium
CN111310703A (en) Identity recognition method, device, equipment and medium based on convolutional neural network
JP6851246B2 (en) Object detector
Parente et al. Assessing facial image accordance to ISO/ICAO requirements

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20231017

Address after: 518000 1405, Building F, Zhigu R&D Building, Shuguang Community, Xili Street, Nanshan District, Shenzhen, Guangdong

Patentee after: Zhiyan Future (Shenzhen) Information Technology Co.,Ltd.

Address before: Room 1706-07, 17 / F, Glory International Financial Center, 25 Ronghe Road, Guicheng Street, Nanhai District, Foshan City, Guangdong Province

Patentee before: Foshan Haixie Technology Co.,Ltd.

Effective date of registration: 20231017

Address after: Room 1706-07, 17 / F, Glory International Financial Center, 25 Ronghe Road, Guicheng Street, Nanhai District, Foshan City, Guangdong Province

Patentee after: Foshan Haixie Technology Co.,Ltd.

Address before: No. 293, Zhongshan Avenue, Shipai, Tianhe District, Guangzhou, Guangdong 510630

Patentee before: GUANGDONG POLYTECHNIC NORMAL University

TR01 Transfer of patent right