CN111429535A - Method, system, device and medium for evaluating difference degree between clothes and background in image - Google Patents

Method, system, device and medium for evaluating difference degree between clothes and background in image

Info

Publication number
CN111429535A
CN111429535A (application CN202010176413.0A)
Authority
CN
China
Prior art keywords
image
area
color
clothes
background
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010176413.0A
Other languages
Chinese (zh)
Other versions
CN111429535B (en)
Inventor
丁凡
罗天煦
姜永胜
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Emperor Technology Co Ltd
Original Assignee
Shenzhen Emperor Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Emperor Technology Co Ltd filed Critical Shenzhen Emperor Technology Co Ltd
Priority to CN202010176413.0A priority Critical patent/CN111429535B/en
Publication of CN111429535A publication Critical patent/CN111429535A/en
Application granted granted Critical
Publication of CN111429535B publication Critical patent/CN111429535B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS; G06 COMPUTING, CALCULATING OR COUNTING; G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/90 Image analysis; Determination of colour characteristics
    • G06T 5/70
    • G06T 7/0002 Image analysis; Inspection of images, e.g. flaw detection
    • G06T 7/11 Segmentation; Edge detection; Region-based segmentation
    • G06T 7/194 Segmentation; Edge detection involving foreground-background segmentation
    • G06T 7/62 Analysis of geometric attributes of area, perimeter, diameter or volume
    • G06T 2207/10004 Image acquisition modality; Still image; Photographic image
    • G06T 2207/20032 Special algorithmic details; Filtering details; Median filtering
    • Y02P 90/30 Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation; Computing systems specially adapted for manufacturing

Abstract

The invention relates to the technical field of image processing, in particular to a method, a system, equipment and a storage medium for evaluating the difference between clothes and a background in an image.

Description

Method, system, device and medium for evaluating difference degree between clothes and background in image
Technical Field
The invention relates to the technical field of image processing, and in particular to a method for evaluating the degree of difference between clothes and the background in an image, a system for evaluating this difference, a device incorporating the system, and a storage medium storing the method.
Background
With the rapid development of society, citizens use more and more types of certificates, and many of these certificates must carry an ID photo, so the demand for ID-photo shooting keeps growing. To make it convenient for citizens to take ID photos, self-service photographing devices have appeared on the market, and their use has brought great convenience to people who need such photos. However, the relevant state authorities impose certain regulations on ID photos, one of which requires that the color of the clothes worn by the person in the photo must not be the same as the background color.
Existing self-service photographing equipment does not provide a function for comparing the user's clothing color with the background color, and ordinary users often find it difficult to judge whether the color of their clothes is suitable. As a result, the clothing in the captured certificate photo may be too similar to the background color, the generated photo fails to meet the requirements of the relevant state departments, the user experience suffers, and the further popularization of self-service photographing devices in the market is hindered.
Disclosure of Invention
To overcome the above drawbacks, the present invention provides a method, a system, a device, and a storage medium storing the method, for evaluating the color difference between the clothes and the background in an image captured by a self-service photographing device.
The purpose of the invention is realized by the following technical scheme:
the invention discloses a method for evaluating the difference degree between clothes and background in an image, which comprises the following steps:
acquiring an image to be detected, and separating a foreground area and a background area of the image from the image to be detected;
converting the image to be detected from an RGB model to a YCrCb model to obtain a converted image; detecting the converted image by using a preset skin color detection model to determine a skin area in the image to be detected;
separating the skin area from the foreground area to obtain a clothes area in the image to be detected;
converting the image to be detected from an RGB model to an LAB model, and respectively calculating color values of the clothes area and the background area;
and calculating the color distance between the clothes area and the background area, judging whether the color distance is within a preset range, and if the color distance is within the preset range, judging that the color distance does not meet the imaging requirement.
In the present invention, the determining that the imaging requirement is not met includes:
and prompting that the imaging requirements are not met.
In the present invention, the calculating color values of the clothing region and the background region respectively further includes:
the clothing region is divided into a plurality of sub-regions, and the area of each sub-region is calculated.
In the present invention, the calculating a color distance between the clothing region and the background region, and determining whether the color distance is within a predetermined range includes:
calculating the color distance between the background area and each sub-area in the clothes area, carrying out normalization processing on the color distances, and finding through statistics the minimum distance sub-area with the minimum color distance; and judging whether the color distance of the minimum distance sub-area is within a preset range.
In the present invention, before the determining whether the color distance of the minimum distance sub-region is within the predetermined range, the method includes:
and calculating the area ratio of the minimum distance sub-area in the clothes area, judging whether the area ratio reaches a preset area ratio, and if so, judging whether the color distance of the minimum distance sub-area is within a preset range.
In the present invention, before acquiring the image to be detected, the method includes:
the method comprises the steps of acquiring an original image through a camera, and selecting an image to be detected from the original image according to a positioning frame of preset parameters.
In the present invention, the determining whether the color distance is within a predetermined range further includes:
if the color distance is not within the preset range, judging that the imaging requirement is met;
and generating a certificate photo according to the original image.
The invention relates to a system for evaluating the difference degree between clothes and background in an image, which comprises:
the image acquisition module is used for converting an original image into an image to be detected;
the foreground and background separation module is connected with the image acquisition module and is used for separating a foreground area and a background area of an image from the image to be detected;
the skin area determining module is connected with the image acquiring module and is used for converting the image to be detected from an RGB model to a YCrCb model to obtain a converted image; detecting the converted image by using a preset skin color detection model to determine a skin area in the image to be detected;
the clothes area determining module is respectively connected with the skin area determining module and the foreground and background separating module and is used for separating the skin area from the foreground area to obtain a clothes area in the image to be detected;
the color value calculation module is respectively connected with the clothes area determination module, the foreground and background separation module and the image acquisition module, and is used for converting the to-be-detected image from an RGB model to an LAB model and respectively calculating color values of the clothes area and the background area;
the color distance judging module is connected with the color value calculating module and is used for calculating the color distance between the clothes area and the background area and judging whether the color distance is within a preset range, and if the color distance is within the preset range, the color distance is judged not to meet the requirement of imaging; and if the color distance is not within the preset range, judging that the imaging requirement is met.
The invention relates to a self-service photographing device, which comprises: the system for evaluating the difference degree between the clothes and the background in the image, the camera, the prompter and the certificate photo generator are adopted;
the camera is connected with the image acquisition module and used for generating an original image;
the prompter is connected with the color distance judging module and is used for prompting that the imaging requirement is not met when the image does not meet the imaging requirement;
the certificate photo generator is respectively connected with the color distance judging module and the camera and used for generating the certificate photo according to the original image when the requirement of imaging is met.
The present invention is a computer readable program storage medium storing computer program instructions which, when executed by a computer, cause the computer to perform the method as described above.
The method and the device can calculate the color distance between the clothing color and the background color in an image, making it possible for the device to automatically remind the user about the clothing color, so that the generated certificate photo meets the standards of the national departments. This effectively improves the certificate-photo pass rate of the self-service photographing device and the user experience, and helps the further popularization of self-service photographing devices in the market.
Drawings
For the purpose of easy explanation, the present invention will be described in detail with reference to the following preferred embodiments and the accompanying drawings.
FIG. 1 is a schematic view of a workflow of an embodiment of a method for evaluating a degree of difference between clothing and a background in an image according to the present invention;
FIG. 2 is a schematic view of a workflow of another embodiment of the method for evaluating the degree of difference between clothes and background in an image according to the present invention;
FIG. 3 is a schematic diagram illustrating a logic structure of an embodiment of a system for evaluating a degree of difference between clothing and a background in an image according to the present invention;
fig. 4 is a schematic diagram of a logic structure of the self-service photographing device of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is further described in detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
In the description of the present invention, it is to be understood that the terms "center", "longitudinal", "lateral", "length", "width", "thickness", "upper", "lower", "front", "rear", "left", "right", "vertical", "horizontal", "top", "bottom", "inner", "outer", "clockwise", "counterclockwise", and the like indicate orientations and positional relationships based on those shown in the drawings and are used only for convenience and simplicity of description; they do not indicate or imply that the device or element referred to must have a particular orientation or be constructed and operated in a particular orientation, and thus should not be considered as limiting the present invention. Furthermore, the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, features defined as "first" or "second" may explicitly or implicitly include one or more of the described features. In the description of the present invention, "a plurality" means two or more unless specifically defined otherwise.
In the description of the present invention, it should be noted that, unless otherwise explicitly stated or limited, the terms "mounted", "connected", and "coupled" are to be construed broadly: a connection may be fixed, detachable, or integral; it may be mechanical or electrical; and it may be direct, indirect through an intermediate medium, or an internal communication between two elements. The specific meanings of the above terms in the present invention can be understood by those skilled in the art according to the specific situation.
An embodiment of the method for evaluating the degree of difference between clothes and background in an image according to the present invention is described in detail below with reference to fig. 1, which includes:
s101, separating a foreground area from a background area
Acquiring an image to be detected, and separating a foreground area and a background area of the image from the image to be detected; in the present embodiment, the foreground region refers to the region of the body part of the photographed person in the photograph, which includes the skin region and the clothing region; and the other areas belong to the background area;
sorting the original image to obtain an image I to be detectedoAn image I to be detectedoFiltering the image, and adopting a median filtering method to treat the image I to be detectedoSmoothing the median filter to obtain a filtered image Iv(ii) a For the filtered image IvThen, a Sobel method is adopted to carry out gradient extraction in the x direction and the y direction to obtain Itx、ItyAdding the gradient images to obtain a gradient image It(ii) a The Sobel operator is a method for detecting the edge by adding the weighted difference of the gray values of the upper, lower, left and right fields of each pixel in the image to reach an extreme value at the edge. For the gradient image ItAdding 1 to each pixel point; the gradient image ItSelecting 4-8 lines for zero clearing treatment, then randomly selecting a certain point of one line as a seed point, carrying out treatment by a prior method of filling the overflowing water, taking the value of each pixel point of the filled region mark as 1, and finally carrying out binarization treatment by taking 1 as a threshold value and respectively according to the value of more than 1 or less than 2 to obtain a foreground region I in the image to be detectedfgBackground region Ibg
S102, determining a skin area in an image to be detected
Converting the image to be detected from an RGB model to a YCrCb model to obtain a converted image; where YCrCb is YUV, where "Y" represents brightness, i.e., a gray scale value; the "U" and "V" represent the chromaticity, which is used to describe the color and saturation of the image for specifying the color of the pixel. "luminance" is established through the RGB input signals by superimposing specific parts of the RGB signals together. "chroma" defines two aspects of color-hue and saturation, represented by Cr and Cb, respectively. Wherein Cr reflects a difference between a red portion of the RGB input signal and a luminance value of the RGB signal; and Cb reflects the difference between the blue part of the RGB input signal and the luminance value of the RGB signal. Detecting the converted image by using a preset skin color detection model to determine a skin area in the image to be detected;
the method comprises the following steps: to-be-detected image IoConversion from RGB space to YCrCb color space to obtain image IoyTraining an elliptical skin color model of skin color in a color space YCrCb according to the existing photo sample, and detecting an image I by using the skin color modeloySkin color region, obtaining skin region Iskin
S103, obtaining a clothes area in the image to be detected
Separating the skin area from the foreground area to obtain a clothes area in the image to be detected;
the method comprises the following steps: using foreground regions IfgMinus the skin area IskinObtaining a foreground area I without skinfgo(ii) a Then taking the lower boundary of the face positioning frame as a relative boundary line, and taking the foreground area I with the character at the chinfgoIs divided to obtain a clothing region Iclo
S104, calculating color values of the clothes area and the background area
Converting the image to be detected from an RGB model to an LAB model. An LAB color model consists of three components: one is lightness (L), and a and b are two color channels, where the a channel runs from dark green through gray to bright pink-red and the b channel runs from bright blue through gray to yellow. The color values of the clothes area and the background area are then calculated respectively;
the method comprises the following steps: to-be-detected image IoConversion from RGB space to L AB color space yields L AB image Ilab(ii) a With a background region IbgAs template constraints, image I is rendered at L ABlabObtaining the color value Vbg of the background arealabIn a clothing region IcloAs template constraints, image I is rendered at L ABlabObtaining the color value of the clothing region.
S105, judging whether the color distance is within a preset range
Calculating the color distance l_t between the clothing region and the background region, and judging whether the color distance l_t is within a predetermined range, where the predetermined range is set empirically.
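As an illustration, the colour distance can be taken as the Euclidean (CIE76-style) distance between the two mean LAB colours; the threshold below is an assumed example value, since the patent only states that the range is set empirically.

```python
import numpy as np

def too_similar(v_clo, v_bg, max_distance=30.0):
    """True if the clothing colour falls inside the 'too close to the background' range."""
    l_t = float(np.linalg.norm(np.asarray(v_clo) - np.asarray(v_bg)))
    return l_t <= max_distance   # within the predetermined range -> fails the imaging requirement
```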
S106, defining as not meeting the imaging requirement
If the color distance is within a preset range, the background color is similar to the color of the clothes, and the imaging requirement is judged not to be met; so that the device automatically reminds the user in the subsequent steps according to the judgment result.
The method for evaluating the degree of difference between the clothes and the background in an image can be applied to self-service photographing equipment, so that users whose clothing color does not meet the requirements are reminded before the certificate photo is produced; it can also be applied to a certification device that automatically screens whether an electronic photo provided by a user meets the clothing-color requirement.
In the following, a method for evaluating a difference between clothes and a background in an image according to an embodiment of a self-service photographing device is described in detail, referring to fig. 2, which includes:
s201, selecting an image to be detected from an original image
The method comprises the steps of acquiring an original image through a camera, and selecting an image to be detected from the original image according to a positioning frame of preset parameters.
The method comprises the following steps: a positioning frame (x_lftp, y_lftp, width, height) with predetermined parameters is set empirically, where x_lftp is the horizontal coordinate of the upper-left corner of the frame, y_lftp is the vertical coordinate of the upper-left corner, width is the width of the frame, and height is the height of the frame. Given the original image size (width_s, height_s), the positioning frame (x_o, y_o, width_o, height_o) of the image to be detected I_o on the original image is calculated by the following formulas; if a calculated value exceeds the range of the original image, it is limited by the actual edge of the original image.
x_o = x_lftp - width
y_o = y_lftp - height
width_o = x_lftp + width - x_o
height_o = height_s
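A small sketch of these formulas, with results clamped to the original image edges as described above; all names are illustrative and the exact clamping policy is an assumption.

```python
def detection_box(x_lftp, y_lftp, width, height, width_s, height_s):
    """Positioning frame of the image to be detected on the original image."""
    x_o = max(0, x_lftp - width)                        # clamp to the left edge
    y_o = max(0, y_lftp - height)                       # clamp to the top edge
    width_o = min(width_s - x_o, x_lftp + width - x_o)  # clamp to the right edge
    height_o = height_s                                 # full height of the original image
    return x_o, y_o, width_o, height_o
```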
S202, separating the foreground area from the background area
Acquiring an image to be detected, and separating a foreground area and a background area of the image from the image to be detected; in the present embodiment, the foreground region refers to the region of the body part of the photographed person in the photograph, which includes the skin region and the clothing region; and the other areas belong to the background area;
sorting the original image to obtain an image I to be detectedoAn image I to be detectedoFiltering the image, and adopting a median filtering method to treat the image I to be detectedoSmoothing the median filter to obtain a filtered image Iv(ii) a For the filtered image IvThen, a Sobel method is adopted to carry out gradient extraction in the x direction and the y direction to obtain Itx、ItyAdding the gradient images to obtain a gradient image It(ii) a The Sobel operator is a method for detecting the edge by adding the weighted difference of the gray values of the upper, lower, left and right fields of each pixel in the image to reach an extreme value at the edge. For the gradient image ItAdding 1 to each pixel point; the gradient image ItSelecting 4-8 lines for zero clearing treatment, then randomly selecting a certain point of one line as a seed point, carrying out treatment by a prior method of filling the overflowing water, taking the value of each pixel point of the filled region mark as 1, and finally carrying out binarization treatment by taking 1 as a threshold value and respectively according to the value of more than 1 or less than 2 to obtain the to-be-detected to be binary-treatedForeground region I in survey imagefgBackground region Ibg
S203, determining the skin area in the image to be detected
Converting the image to be detected from an RGB model to a YCrCb model to obtain a converted image; where YCrCb is YUV, where "Y" represents brightness, i.e., a gray scale value; the "U" and "V" represent the chromaticity, which is used to describe the color and saturation of the image for specifying the color of the pixel. "luminance" is established through the RGB input signals by superimposing specific parts of the RGB signals together. "chroma" defines two aspects of color-hue and saturation, represented by Cr and Cb, respectively. Wherein Cr reflects a difference between a red portion of the RGB input signal and a luminance value of the RGB signal; and Cb reflects the difference between the blue part of the RGB input signal and the luminance value of the RGB signal. Detecting the converted image by using a preset skin color detection model to determine a skin area in the image to be detected;
the method comprises the following steps: to-be-detected image IoConversion from RGB space to YCrCb color space to obtain image IoyTraining an elliptical skin color model of skin color in a color space YCrCb according to the existing photo sample, and detecting an image I by using the skin color modeloySkin color region, obtaining skin region Iskin
S204, obtaining the clothes area in the image to be detected
Separating the skin area from the foreground area to obtain a clothes area in the image to be detected;
the method comprises the following steps: using foreground regions IfgMinus the skin area IskinObtaining a foreground area I without skinfgo(ii) a Then taking the lower boundary of the face positioning frame as a relative boundary line, and taking the foreground area I with the character at the chinfgoIs divided to obtain a clothing region Iclo
S205, calculating color values of background areas
Converting the image to be detected from an RGB model to an LAB model. An LAB color model consists of three components: one is lightness (L), and a and b are two color channels, where the a channel runs from dark green through gray to bright pink-red and the b channel runs from bright blue through gray to yellow. The color value of the background area is then calculated;
the method comprises the following steps: to-be-detected image IoConversion from RGB space to L AB color space yields L AB image Ilab(ii) a With a background region IbgAs template constraints, image I is rendered at L ABlabObtaining the color value Vbg of the background arealab
S206, color values and areas of all sub-regions of the clothes are calculated
Dividing the clothes area into a plurality of sub-areas, and calculating the area and color value of each sub-area;
the method comprises the following steps: by the clothing region IcloAs a template constraint, a K-means method (K-means clustering algorithm) is used to perform clustering processing on each pixel point in the region, where K is 8 in this embodiment, each pixel point in the clothes region is clustered and divided into K sub-regions, and the area size { a ] of each sub-region is counted1,A2,...,AkThen, taking each subregion as a template constraint, and counting L AB color mean value { V ] of each subregion1,V2,...,VkComparing L Euclidean distance between AB colors in pairs among all sub-regions, and if the distance is smaller than a set threshold value TaThen the two sub-regions are merged into a new clustering sub-region, and the area size a of the sub-region is recalculatedtSize of color mean Vt. Repeatedly traversing, comparing and merging the sub-regions until the color distances among the new sub-regions are all larger than the threshold value TaSo far, new k' sub-regions are formed, and the new sub-area size is { A1,A2,...,Ak′Size of color mean { V }1,V2,...,Vk′}。
S207, acquiring the color distance and the area ratio of the minimum distance sub-region
The color distance l_t between the background area and each sub-area of the clothes area is calculated and normalized as l_t = l_t / 255; the minimum distance sub-area with the smallest color distance is found, and its color distance is denoted l_ret. The area ratio of this minimum distance sub-area within the clothes area is then calculated as P_ret = A_ret / A_total, where A_total is the area of the entire clothing region.
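A minimal sketch of this selection, reusing the per-sub-region areas and LAB mean colours from the previous step; the division by 255 follows the text, and A_total is taken as the sum of the sub-region areas on the assumption that they cover the whole clothing region.

```python
import numpy as np

def min_distance_subregion(v_bg, areas, means):
    """Normalised colour distance l_ret and area ratio P_ret of the closest sub-region."""
    dists = [np.linalg.norm(np.asarray(v) - np.asarray(v_bg)) / 255.0 for v in means]
    idx = int(np.argmin(dists))
    l_ret = dists[idx]                             # colour distance of the minimum distance sub-region
    p_ret = areas[idx] / float(sum(areas))         # A_ret / A_total
    return l_ret, p_ret
```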
S208, judging whether the area ratio reaches a preset area ratio or not
It is judged whether the area ratio P_ret reaches the preset area ratio; if it does, the method proceeds to step S209 and judges whether the color distance of the minimum distance sub-region is within the preset range. In the present embodiment, a distance threshold T_lret and an area ratio threshold T_pret are set together and compared with the color distance l_ret and the area ratio P_ret of the sub-region to judge the similarity result: the comparison is meaningless if the area of the sub-region does not reach a certain proportion, which is why T_pret is set in this embodiment.
S209, judging whether the color distance of the minimum distance sub-area is within a preset range
Judging whether the color distance of the minimum distance sub-area is within the preset range: if it is, it is judged that the image does not meet the imaging requirement, and step S210 prompts that the imaging requirement is not met; if the color distance is not within the preset range, it is judged that the image meets the imaging requirement, and step S211 generates the certificate photo from the original image.
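Putting steps S208 and S209 together, the final decision can be sketched as below; both thresholds are assumed example values, since the patent sets the area ratio threshold T_pret and the distance threshold T_lret empirically.

```python
def meets_imaging_requirement(l_ret, p_ret, t_pret=0.2, t_lret=0.12):
    """Final decision of steps S208-S209."""
    if p_ret < t_pret:       # the closest sub-region is too small to matter
        return True
    return l_ret > t_lret    # far enough from the background colour -> requirement met
```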
S210, prompting the non-conforming imaging requirements
The user is prompted by sound, an indicator light or an on-screen display that the clothing color does not meet the imaging requirement, so that the user can change clothes in time.
S211, generating a certificate photo according to the original image
The certificate photo is generated from the original image; the generation modes include printing the certificate photo, uploading it to a network, and the like. Because in this embodiment a certificate photo is generated only when the imaging requirement is met, the production of unqualified certificate photos is effectively reduced, the pass rate of certificate photos is effectively improved, and the user experience is improved.
Referring to fig. 3, the present invention is a system for evaluating the difference between clothes and background in an image, comprising:
the image acquisition module 101 is used for converting an original image into an image to be detected;
the foreground and background separation module 102, the foreground and background separation module 102 is connected to the image acquisition module 101, and is configured to separate a foreground region and a background region of an image from an image to be detected; the foreground area refers to an area of a body part of a photographed person in a picture, and comprises a skin area and a clothes area; and the other areas belong to the background area;
the skin region determining module 103 is connected to the image obtaining module 101, and is configured to convert the image to be detected from an RGB model to a YCrCb model, where YCrCb is YUV, where "Y" represents brightness, i.e., a gray level value; the "U" and "V" represent the chromaticity, which is used to describe the color and saturation of the image for specifying the color of the pixel. "luminance" is established through the RGB input signals by superimposing specific parts of the RGB signals together. "chroma" defines two aspects of color-hue and saturation, represented by Cr and Cb, respectively. Wherein Cr reflects a difference between a red portion of the RGB input signal and a luminance value of the RGB signal; and Cb reflects the difference between the blue part of the RGB input signal and the luminance value of the RGB signal; obtaining a conversion image; detecting the converted image by using a preset skin color detection model to determine a skin area in the image to be detected;
a clothing region determining module 104, where the clothing region determining module 104 is respectively connected to the skin region determining module 103 and the foreground-background separating module 102, and is configured to separate the skin region from the foreground region to obtain a clothing region in the image to be detected;
the color value calculation module 105, where the color value calculation module 105 is connected to the clothing region determination module 104, the foreground-background separation module 102, and the image acquisition module 101, respectively, and is configured to convert the to-be-detected image from an RGB model to an LAB model, and calculate color values of the clothing region and the background region, respectively;
the color distance judging module 106 is connected to the color value calculating module 105, and is configured to calculate a color distance between the clothing region and the background region, and judge whether the color distance is within a predetermined range, if the color distance is within the predetermined range, it indicates that the background color is similar to the clothing color, and it is judged that the background color is not in accordance with the imaging requirement; and if the color distance is not within the preset range, judging that the imaging requirement is met.
The system for evaluating the degree of difference between the clothes and the background in an image can be applied to self-service photographing equipment, so that users whose clothing color does not meet the requirements can be reminded before the certificate photo is produced; it can also be applied to a certification device that automatically screens whether an electronic photo provided by a user meets the clothing-color requirement.
Referring to fig. 4, the present invention is a self-help photographing apparatus, including: the system 100 for evaluating the difference degree between clothes and background in the image, the camera 200, the prompter 300 and the certificate photo generator 400;
the camera 200 is connected to an image acquisition module in the evaluation system 100, and is used for generating an original image;
the prompter 300 is connected to a color distance determination module in the evaluation system 100, and is configured to prompt that the imaging requirement is not met when the imaging requirement is not met; among them, the prompter 300 may include: loudspeakers, displays, indicator lights, etc.;
the certificate photo generator 400 is respectively connected to the color distance determination module and the camera 200 in the evaluation system 100, and is configured to generate a certificate photo according to the original image when the requirement of imaging is met. Wherein, its mode of generation includes: printing the certificate photo, uploading the certificate photo to a network, and the like.
The present invention includes a computer readable storage medium having stored thereon a program product capable of implementing the above-described method of the present specification. In some possible embodiments, aspects of the invention may also be implemented in the form of a program product comprising program code means for causing a terminal device to carry out the steps according to various exemplary embodiments of the invention described in the above section "exemplary methods" of the present description, when said program product is run on the terminal device.
The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
A computer readable signal medium may include a propagated data signal with readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A readable signal medium may also be any readable medium that is not a readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on the above readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
In the description of the present specification, reference to the description of the terms "one embodiment", "some embodiments", "an illustrative embodiment", "an example", "a specific example", or "some examples", etc., means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the present invention. In this specification, schematic representations of the above terms do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents and improvements made within the spirit and principle of the present invention are intended to be included within the scope of the present invention.

Claims (10)

1. A method for evaluating the difference degree between clothes and background in an image is characterized by comprising the following steps:
acquiring an image to be detected, and separating a foreground area and a background area of the image from the image to be detected;
converting the image to be detected from an RGB model to a YCrCb model to obtain a converted image; detecting the converted image by using a preset skin color detection model to determine a skin area in the image to be detected;
separating the skin area from the foreground area to obtain a clothes area in the image to be detected;
converting the image to be detected from an RGB model to an LAB model, and respectively calculating color values of the clothes area and the background area;
and calculating the color distance between the clothes area and the background area, judging whether the color distance is within a preset range, and if the color distance is within the preset range, judging that the color distance does not meet the imaging requirement.
2. The method for evaluating the degree of difference between clothes and a background in an image according to claim 1, wherein the determining that the image does not meet the requirement for imaging comprises:
and prompting that the imaging requirements are not met.
3. The method for evaluating the degree of difference between clothes and background in an image according to claim 2, wherein the calculating the color values of the clothes area and the background area respectively further comprises:
the clothing region is divided into a plurality of sub-regions, and the area of each sub-region is calculated.
4. The method for evaluating the degree of difference between clothes and background in an image according to claim 3, wherein the calculating the color distance between the clothes area and the background area and the judging whether the color distance is within a predetermined range comprises:
calculating color distances between the background area and each sub-area in the clothes area according to the color distances, carrying out normalization processing on the color distances, and carrying out statistics to obtain a minimum distance sub-area with the minimum color distance; and judging whether the color distance of the minimum distance sub-area is within a preset range.
5. The method for evaluating the degree of difference between clothes and background in an image according to claim 4, wherein the determining whether the color distance of the minimum distance sub-area is within a predetermined range comprises:
and calculating the area ratio of the minimum distance sub-area in the clothes area, judging whether the area ratio reaches a preset area ratio, and if so, judging whether the color distance of the minimum distance sub-area is within a preset range.
6. The method for evaluating the degree of difference between clothes and background in an image according to claim 5, wherein the step of obtaining the image to be detected comprises:
the method comprises the steps of acquiring an original image through a camera, and selecting an image to be detected from the original image according to a positioning frame of preset parameters.
7. The method for evaluating the degree of difference between clothes and background in an image according to claim 6, wherein said determining whether the color distance is within a predetermined range further comprises:
if the color distance calculation result is not within a preset range, judging that the color distance calculation result meets the imaging requirement;
and generating a certificate photo according to the original image.
8. A system for evaluating the degree of difference between clothing and background in an image, comprising:
the image acquisition module is used for converting an original image into an image to be detected;
the foreground and background separation module is connected with the image acquisition module and is used for separating a foreground area and a background area of an image from the image to be detected;
the skin area determining module is connected with the image acquiring module and is used for converting the image to be detected from an RGB model to a YCrCb model to obtain a converted image; detecting the converted image by using a preset skin color detection model to determine a skin area in the image to be detected;
the clothes area determining module is respectively connected with the skin area determining module and the foreground and background separating module and is used for separating the skin area from the foreground area to obtain a clothes area in the image to be detected;
the color value calculation module is respectively connected with the clothes area determination module, the foreground and background separation module and the image acquisition module, and is used for converting the to-be-detected image from an RGB model to an LAB model and respectively calculating color values of the clothes area and the background area;
the color distance judging module is connected with the color value calculating module and is used for calculating the color distance between the clothes area and the background area and judging whether the color distance is within a preset range, and if the color distance is within the preset range, the color distance is judged not to meet the requirement of imaging; and if the color distance is not within the preset range, judging that the imaging requirement is met.
9. A self-service photographing device, comprising: the system for evaluating the degree of difference between clothes and background in an image according to claim 8, and a camera, a prompter and a certificate photo generator;
the camera is connected with the image acquisition module and used for generating an original image;
the prompter is connected with the color distance judging module and is used for prompting that the imaging requirement is not met when the image does not meet the imaging requirement;
the certificate photo generator is respectively connected with the color distance judging module and the camera and used for generating the certificate photo according to the original image when the requirement of imaging is met.
10. A computer-readable program storage medium storing computer program instructions which, when executed by a computer, cause the computer to perform the method according to any one of claims 1 to 7.
CN202010176413.0A 2020-03-13 2020-03-13 Method, system, equipment and medium for evaluating difference degree between clothes and background in image Active CN111429535B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010176413.0A CN111429535B (en) 2020-03-13 2020-03-13 Method, system, equipment and medium for evaluating difference degree between clothes and background in image


Publications (2)

Publication Number Publication Date
CN111429535A true CN111429535A (en) 2020-07-17
CN111429535B CN111429535B (en) 2023-09-08

Family

ID=71547897

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010176413.0A Active CN111429535B (en) 2020-03-13 2020-03-13 Method, system, equipment and medium for evaluating difference degree between clothes and background in image

Country Status (1)

Country Link
CN (1) CN111429535B (en)


Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140177955A1 (en) * 2012-12-21 2014-06-26 Sadagopan Srinivasan System and method for adaptive skin tone detection
CN103473780A (en) * 2013-09-22 2013-12-25 广州市幸福网络技术有限公司 Portrait background cutout method
CN105825161A (en) * 2015-01-07 2016-08-03 阿里巴巴集团控股有限公司 Image skin color detection method and system thereof
WO2017092431A1 (en) * 2015-12-01 2017-06-08 乐视控股(北京)有限公司 Human hand detection method and device based on skin colour
CN106558046A (en) * 2016-10-31 2017-04-05 深圳市飘飘宝贝有限公司 A kind of quality determining method and detection means of certificate photo
CN110222555A (en) * 2019-04-18 2019-09-10 江苏图云智能科技发展有限公司 The detection method and device of area of skin color

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
CHEN QIAN et al.: "Research on and improvement of the color feature extraction algorithm in a content-based clothing retrieval system", Laser Journal (激光杂志) *

Also Published As

Publication number Publication date
CN111429535B (en) 2023-09-08


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant