CN113204663A - Information processing method and device for clothing matching - Google Patents

Information processing method and device for clothing matching

Info

Publication number
CN113204663A
CN113204663A
Authority
CN
China
Prior art keywords
image
clothing
human body
evaluated
skin color
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110443950.1A
Other languages
Chinese (zh)
Inventor
蒋昀
张黛
阳文娟
Current Assignee
Guangzhou Future Yishou Network Technology Co ltd
Original Assignee
Guangzhou Future Yishou Network Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Guangzhou Future Yishou Network Technology Co ltd
Priority to CN202110443950.1A
Publication of CN113204663A
Legal status: Pending

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00: Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/50: Information retrieval of still image data
    • G06F 16/53: Querying
    • G06F 16/535: Filtering based on additional data, e.g. user or group profiles
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06Q: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 30/00: Commerce
    • G06Q 30/06: Buying, selling or leasing transactions
    • G06Q 30/0601: Electronic shopping [e-shopping]
    • G06Q 30/0631: Item recommendations

Abstract

Embodiments of the present disclosure disclose an information processing method and device for clothing matching. The method comprises the following steps: first, after a human body image of a target user is obtained, preprocessing the human body image to obtain a human face image and a human body three-dimensional modeling image; inputting the human face image into a pre-trained recognition model to recognize the skin color of the face image; then matching preset clothing styles to the target user based on the skin color and the three-dimensional modeling image to obtain clothing styles to be evaluated that fit the target user; and finally, sending the clothing styles to be evaluated and the face image to a preset expert group to obtain scores for how well the skin color and face image match each clothing style to be evaluated. This improves matching accuracy and, in turn, purchasing efficiency and transaction rate, solving the technical problem of poor clothing-user matching in the related art.

Description

Information processing method and device for clothing matching
Technical Field
The disclosure relates to the technical field of clothing data processing, in particular to an information processing method and device for clothing matching.
Background
For a consumer, it is difficult to pick a well-matched garment out of a large number of garments; this is generally done by manually searching and trying garments on offline. Alternatively, garments can be tried on via online 3D fitting, but this approach requires a three-dimensional animation to be produced manually for each garment; the quality of those animations is uneven, the style of the garment cannot be truly reflected, and the fitting effect on a given person is poor.
Disclosure of Invention
The main objective of the present disclosure is to provide an information processing method and apparatus for clothing matching, so as to solve the technical problem that garments matched to a user in the related art fit the user poorly.
In order to achieve the above object, according to a first aspect of the present disclosure, there is provided an information processing method for clothing matching, including: after a human body image of a target user is obtained, preprocessing the human body image to obtain a human face image and a human body three-dimensional modeling image; inputting the human face image into a pre-trained recognition model to recognize the skin color of the face image; matching preset clothing styles to the target user based on the skin color and the three-dimensional modeling image to obtain clothing styles to be evaluated that fit the target user; and sending the clothing styles to be evaluated and the face image to a preset expert group to obtain scores for how well the skin color and face image match each clothing style to be evaluated.
Optionally, the method further comprises pre-training the recognition model, including: acquiring a large number of face images as training samples; inputting the training samples into a pre-established recognition model to obtain, for each face image, predicted probabilities of its chroma, and determining the chroma with the maximum predicted probability as the chroma of that face image; determining chroma division thresholds based on the chromas of the face images; and determining the skin color types from the intervals delimited by the chroma division thresholds.
Optionally, sending the clothing styles to be evaluated and the face image to a preset expert group to obtain scores for how well the skin color and face image match each clothing style to be evaluated comprises: combining each clothing style to be evaluated with the face image; sending the combined images to the preset expert group so that the expert group scores them; and determining the combination with the highest score as the clothing style matched to the target user.
Optionally, the method further comprises: mining human dressing images from preset locations using big-data mining techniques; determining the garment colors suited to different skin colors using big-data analysis techniques; and/or segmenting the clothing styles in the human dressing images to obtain the clothing styles of different body parts, and storing the sizes of the body parts in each human dressing image in correspondence with the clothing styles of those parts.
According to a second aspect of the present disclosure, there is provided an information processing apparatus for clothing matching, including: a preprocessing unit configured to, after a human body image of a target user is obtained, preprocess the human body image to obtain a human face image and a human body three-dimensional modeling image; a recognition unit configured to input the human face image into a pre-trained recognition model to recognize the skin color of the face image; a matching unit configured to match preset clothing styles to the target user based on the skin color and the three-dimensional modeling image, obtaining clothing styles to be evaluated that fit the target user; and a scoring unit configured to send the clothing styles to be evaluated and the face image to a preset expert group to obtain scores for how well the skin color and face image match each clothing style to be evaluated.
Optionally, the apparatus further comprises a training unit configured to: acquire a large number of face images as training samples; input the training samples into a pre-established recognition model to obtain, for each face image, predicted probabilities of its chroma, and determine the chroma with the maximum predicted probability as the chroma of that face image; determine chroma division thresholds based on the chromas of the face images; and determine the skin color types from the intervals delimited by the chroma division thresholds.
Optionally, the scoring unit is further configured to: combine each clothing style to be evaluated with the face image; send the combined images to the preset expert group so that the expert group scores them; and determine the combination with the highest score as the clothing style matched to the target user.
Optionally, the apparatus further comprises: a big-data mining unit configured to mine human dressing images from preset locations using big-data mining techniques; a big-data analysis unit configured to determine the garment colors suited to different skin colors using big-data analysis techniques; and/or an image segmentation unit configured to segment the clothing styles in the human dressing images to obtain the clothing styles of different body parts; and a storage unit configured to store the sizes of the body parts in each human dressing image in correspondence with the clothing styles of those parts.
According to a third aspect of the present disclosure, there is provided a computer-readable storage medium storing computer instructions for causing a computer to execute the information processing method for clothing matching according to any one of the embodiments of the first aspect.
According to a fourth aspect of the present disclosure, there is provided an electronic device comprising: at least one processor; and a memory communicatively coupled to the at least one processor; wherein the memory stores a computer program executable by the at least one processor, the computer program being executable by the at least one processor to cause the at least one processor to perform the information processing method for clothing matching as described in any one of the embodiments of the first aspect.
In the embodiments of the present disclosure, first, after a human body image of a target user is obtained, the human body image is preprocessed to obtain a human face image and a human body three-dimensional modeling image; the human face image is input into a pre-trained recognition model to recognize the skin color of the face image; then preset clothing styles are matched to the target user based on the skin color and the three-dimensional modeling image, obtaining clothing styles to be evaluated that fit the target user; and finally, the clothing styles to be evaluated and the face image are sent to a preset expert group to obtain scores for how well the skin color and face image match each clothing style to be evaluated. By first recognizing the skin color and modeling the human body in three dimensions, then determining the clothing styles matched to the recognized skin color and the three-dimensional model, and finally ranking those styles by expert-group scoring, the garments that best match the target user can be obtained. This improves matching accuracy and, in turn, purchasing efficiency and transaction rate, solving the poor matching of the related art.
Drawings
In order to illustrate the embodiments of the present disclosure or the technical solutions in the prior art more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present disclosure, and those skilled in the art can obtain other drawings from them without creative effort.
Fig. 1 is a flowchart of an information processing method for clothing matching according to an embodiment of the present disclosure;
fig. 2 is a schematic structural diagram of an information processing apparatus for clothing matching according to an embodiment of the present disclosure;
fig. 3 is a schematic diagram of an electronic device according to an embodiment of the disclosure.
Detailed Description
In order to make the technical solutions of the present disclosure better understood by those skilled in the art, the technical solutions of the embodiments of the present disclosure will be clearly and completely described below with reference to the drawings in the embodiments of the present disclosure, and it is obvious that the described embodiments are only some embodiments of the present disclosure, not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments disclosed herein without making any creative effort, shall fall within the protection scope of the present disclosure.
It should be noted that the terms "first," "second," and the like in the description and claims of the present disclosure and in the above-described drawings are used to distinguish between similar elements and not necessarily to describe a particular sequential or chronological order. It should be understood that the data so used may be interchanged under appropriate circumstances, so that the embodiments of the present disclosure described herein can be practiced in sequences other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such a process, method, article, or apparatus.
It should be noted that, in the present disclosure, the embodiments and features of the embodiments may be combined with each other without conflict. The present disclosure will be described in detail below with reference to the accompanying drawings in conjunction with embodiments.
According to an embodiment of the present disclosure, there is provided an information processing method for clothing matching, as shown in fig. 1, the method includes the following steps 101 to 104:
step 101: after the human body image of the target user is obtained, preprocessing is carried out on the human body image, and a human body face image and a human body three-dimensional modeling image are obtained respectively.
In this embodiment, after the human body image of the target user is captured in real time or obtained from the target user's gallery, the image is segmented to obtain a face image to be processed, which is then denoised and has highlights removed to produce the human face image. The modeling dimensions of the human body image are determined by image recognition, and the human body three-dimensional model is built from those dimensions. It should be understood that, based on the recognized modeling dimensions, fine-tuning corrections from the target user may also be received, with the three-dimensional modeling performed on the corrected dimensions. The image model obtained by three-dimensional modeling can reflect the sizes of the target user's body parts, such as neck length, head circumference, arm length, shoulder width, shoulder thickness, crotch width, hip circumference, leg length, leg thickness, ankle size, and the like. The three-dimensional image can then be used to fit clothing styles; for example, for a figure whose calf circumference exceeds a certain value, the matched skirt styles should not fully expose the calf but can instead reach the ankle, or the figure can be matched to straight-leg jeans and the like.
It should be understood that the human body image, i.e., an image containing all parts of the human body, is used for both three-dimensional modeling and skin color recognition.
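The preprocessing in step 101 (segmenting the face out of the body image, then denoising it) can be sketched with a deliberately simplified, numpy-only stand-in; the fixed crop box and the 3x3 mean filter below are illustrative placeholders for a real face detector and denoiser, not the patent's implementation:

```python
import numpy as np

def crop_face(body_img, box):
    """box = (top, left, height, width): an assumed face bounding box.

    A real system would obtain this box from a face detector rather
    than a fixed coordinate; the crop itself is the 'segmentation'."""
    t, l, h, w = box
    return body_img[t:t + h, l:l + w]

def denoise(img):
    """Toy denoiser: a 3x3 mean filter over an edge-padded image.

    This stands in for the denoising and highlight-removal step; it
    smooths pixel noise but performs no highlight removal."""
    padded = np.pad(img, 1, mode="edge")
    out = np.zeros_like(img, dtype=float)
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            out += padded[1 + dy: 1 + dy + img.shape[0],
                          1 + dx: 1 + dx + img.shape[1]]
    return out / 9.0

body = np.arange(100.0).reshape(10, 10)   # stand-in "human body image"
face = denoise(crop_face(body, (2, 2, 4, 4)))
print(face.shape)  # → (4, 4)
```

A production pipeline would substitute a learned face detector and an edge-preserving denoiser with genuine highlight removal for these two placeholders.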
Step 102: input the human face image into a pre-trained recognition model and recognize the skin color of the face image.
In this embodiment, the human face image obtained in step 101 is used as the input of the recognition model, which outputs the skin color type of the face image; based on chromaticity division, the type may be black, yellow, neutral, or white.
As an optional implementation of this embodiment, the method further includes training the recognition model in advance: acquiring a large number of face images as training samples; inputting the training samples into a pre-established recognition model to obtain, for each face image, predicted probabilities of its chroma, and determining the chroma with the maximum predicted probability as the chroma of that face image; determining chroma division thresholds based on the chromas of the face images; and determining the skin color types from the intervals delimited by the chroma division thresholds.
In this optional implementation, an image recognition model can be trained in advance to recognize the skin color of face images. The training samples can be taken from a face image library; because captured images are affected by shooting conditions, they are first denoised and have highlights removed, yielding images that reflect the natural state of the skin. The large set of face images is then fed into a deep neural network model to obtain, for each face image, a predicted probability for each chroma. The chromas can be the 36 skin tones of the von Luschan skin-color scale, and after recognition the model outputs a probability value for each chroma. Thresholds are then applied over the maximum probability values to divide the types; for example, above 99.99% is determined to be black, between 80% and 99% white, and the rest yellow. Different skin colors place very strong requirements on garment color (for example, yellow skin paired with orange clothing is a very poor match), so once the skin color is recognized, the clothing styles adapted to it can be determined in advance.
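The post-processing of the recognition model's output can be illustrated with a small sketch: the model is assumed to emit one probability per chroma of the 36-tone von Luschan scale, the chroma with the maximum predicted probability is taken as the face's chroma, and interval thresholds over the chroma indices map it to a coarse skin color type. The interval boundaries below are illustrative assumptions, not the patent's values:

```python
# Illustrative chroma intervals mapping von Luschan indices (1-36) to
# coarse skin color types; boundaries are assumed, not from the patent.
VON_LUSCHAN_INTERVALS = [
    (range(1, 16), "white"),
    (range(16, 22), "neutral"),
    (range(22, 29), "yellow"),
    (range(29, 37), "black"),
]

def classify_skin_color(chroma_probs):
    """chroma_probs: dict {chroma_index: predicted probability}."""
    # The chroma with the maximum predicted probability is the face's chroma.
    chroma = max(chroma_probs, key=chroma_probs.get)
    for interval, label in VON_LUSCHAN_INTERVALS:
        if chroma in interval:
            return chroma, label
    raise ValueError(f"chroma {chroma} outside the 1-36 von Luschan range")

probs = {i: 0.01 for i in range(1, 37)}
probs[24] = 0.65  # the model is most confident in chroma 24
print(classify_skin_color(probs))  # → (24, 'yellow')
```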
Step 103: and matching a preset clothing style for the target user based on the skin color and the three-dimensional modeling image to obtain a clothing style to be evaluated, which is matched with the target user.
In this embodiment, after the skin color is recognized and the human body image is obtained by three-dimensional modeling, the colors of the preset clothing styles stored in the database that suit the recognized skin color can be determined. The database stores images of the garments in the store; each image corresponds to a stored cut (which can be manually labeled content, such as an A-line dress, straight-leg trousers, jeans, or a mid-length suit), sizes (the measurements of each part of each garment), and garment color. For example, if the store's garment colors suited to yellow skin include milky white and apricot, the milky-white and apricot garments can be screened out first. The clothing styles matched to the three-dimensional modeling image are then determined among the styles obtained by this preliminary screening. The matching process can follow preset matching rules, matching each part of the human body and combining the results into fitted styles. For example, for a person with a wide crotch, the fitted cuts are A-line and straight; so if the three-dimensional modeling image shows a wide crotch among the body measurements, clothing styles with A-line or straight elements are matched. Then, based on the specific three-dimensional dimensions, further rules are applied: if the neck is short, the matching rule screens out high-collar garments, removing high-collar styles from the A-line and straight candidates and leaving element styles such as A-line, straight, half-high-collar, and low-collar; and if the user has a long waist, low-waist styles are screened out, with high-waist styles preferentially determined among the remaining element styles.
The final result fits the target user; for example, a high-waist A-line dress, a high-waist A-line skirt, or a low-collar sweater may serve as the final recommendation. The recommended garments are presented to the target user, who can try them on, improving both matching accuracy and matching efficiency.
Specifically, after the initial recommendation result is determined, the initial recommendation result may be sent to the expert group as a clothing style to be evaluated.
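The two-stage matching of step 103 (filter by skin-color-adapted garment colors, then prune by body-measurement rules) can be sketched as follows; the color table, rule thresholds, and garment records are hypothetical examples, not data from the patent:

```python
# Assumed "skin color -> adapted garment colors" table (illustrative).
COLORS_FOR_SKIN = {"yellow": {"milky white", "apricot"}}

def match_styles(skin_color, measurements, garments):
    # Stage 1: preliminary screening by garment color.
    allowed = COLORS_FOR_SKIN.get(skin_color, set())
    candidates = [g for g in garments if g["color"] in allowed]
    # Stage 2: body-measurement rules (thresholds are made up).
    # Example rule: a wide crotch is adapted to A-line and straight cuts.
    if measurements.get("crotch_width_cm", 0) > 36:
        candidates = [g for g in candidates if g["cut"] in {"A-line", "straight"}]
    # Example rule: a short neck screens out high-collar styles.
    if measurements.get("neck_length_cm", 99) < 9:
        candidates = [g for g in candidates if g["collar"] != "high"]
    return candidates

garments = [
    {"name": "apricot A-line dress", "color": "apricot",
     "cut": "A-line", "collar": "low"},
    {"name": "orange blouse", "color": "orange",
     "cut": "straight", "collar": "high"},
]
picked = match_styles("yellow",
                      {"crotch_width_cm": 38, "neck_length_cm": 8}, garments)
print([g["name"] for g in picked])  # → ['apricot A-line dress']
```

Each rule only narrows the candidate set, so rules can be added independently as more body-part adaptations are mined.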
As an optional implementation of this embodiment, human dressing images are mined from preset locations using big-data mining techniques; the garment colors suited to different skin colors are determined using big-data analysis techniques; and/or the clothing styles in the human dressing images are segmented to obtain the clothing styles of different body parts, and the sizes of the body parts in each human dressing image are stored in correspondence with the clothing styles of those parts.
In this optional implementation, the preset matching rules can be established using big-data techniques: after pictures are legally crawled from fashion-show galleries or websites, they are analyzed and processed to determine the garment colors adapted to each skin color and the garment design details adapted to each part of the human body, and the adapted design details are labeled in an automated manner; for example, a short neck is adapted to half-high and low collars.
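As a rough illustration of how the mined corpus could yield the stored "skin color to adapted garment colors" table, one possible (entirely hypothetical) aggregation counts color co-occurrences across the mined dressing images and keeps the most frequent garment colors per skin type; the records below are made up:

```python
from collections import Counter, defaultdict

def build_color_table(mined_images, top_k=2):
    """Count garment-color occurrences per skin type across mined images
    and keep the top_k most frequent colors for each type."""
    counts = defaultdict(Counter)
    for rec in mined_images:  # each rec: {"skin": ..., "garment_color": ...}
        counts[rec["skin"]][rec["garment_color"]] += 1
    return {skin: [color for color, _ in ctr.most_common(top_k)]
            for skin, ctr in counts.items()}

mined = [
    {"skin": "yellow", "garment_color": "apricot"},
    {"skin": "yellow", "garment_color": "apricot"},
    {"skin": "yellow", "garment_color": "milky white"},
    {"skin": "white", "garment_color": "navy"},
]
print(build_color_table(mined))
# → {'yellow': ['apricot', 'milky white'], 'white': ['navy']}
```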
Step 104: send the clothing styles to be evaluated and the face image to a preset expert group to obtain scores for how well the skin color and face image match each clothing style to be evaluated.
In this embodiment, after the clothing styles to be evaluated are obtained, the face image and each clothing style to be evaluated are scored by the expert group.
As an optional implementation manner of this embodiment, the clothing style to be evaluated is combined with the facial image; sending the combined image to the preset expert group so that the preset expert group scores the combined image; and determining the combination with the highest score as the matched clothing style matched with the target user.
In this optional implementation, the face image of the target user (which can stand in for the target user's appearance) is combined with each clothing style to be evaluated: the physical garment is fitted onto the target user's image by image fusion, and the chromaticity of the face image is adjusted to the recognized skin color type. The expert group then scores each outfit, and the N outfits with the highest scores are sent to the target user as the recommended clothing styles, so that the target user can try them on offline.
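The expert-group ranking just described can be sketched as follows; the outfit names, the scores, and the use of the mean as the aggregate are illustrative assumptions:

```python
from statistics import mean

def top_recommendations(expert_scores, n=3):
    """expert_scores: dict {outfit_name: [one score per expert]}.

    Rank outfits by mean expert score and return the top n names."""
    ranked = sorted(expert_scores.items(),
                    key=lambda kv: mean(kv[1]), reverse=True)
    return [name for name, _ in ranked[:n]]

scores = {
    "high-waist A-line dress": [9.1, 8.8, 9.4],
    "low-collar sweater": [8.2, 8.5, 8.0],
    "straight-leg jeans": [7.9, 8.1, 8.3],
}
print(top_recommendations(scores, n=2))
# → ['high-waist A-line dress', 'low-collar sweater']
```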
Expert-group scoring improves the matching accuracy between the garments and the target user and yields the most aesthetically pleasing result.
From the above description, it can be seen that the present disclosure achieves the following technical effects: the garments with the best aesthetics and fit can be matched to the target user, further improving the purchase transaction rate and purchasing efficiency.
It should be noted that the steps illustrated in the flowcharts of the figures may be performed in a computer system such as a set of computer-executable instructions and that, although a logical order is illustrated in the flowcharts, in some cases, the steps illustrated or described may be performed in an order different than presented herein.
According to an embodiment of the present disclosure, there is also provided an apparatus for implementing the above information processing method for clothing matching, as shown in fig. 2. The apparatus includes: a preprocessing unit 201 configured to, after a human body image of a target user is obtained, preprocess the human body image to obtain a human face image and a human body three-dimensional modeling image; a recognition unit 202 configured to input the human face image into a pre-trained recognition model to recognize the skin color of the face image; a matching unit 203 configured to match preset clothing styles to the target user based on the skin color and the three-dimensional modeling image, obtaining clothing styles to be evaluated that fit the target user; and a scoring unit 204 configured to send the clothing styles to be evaluated and the face image to a preset expert group to obtain scores for how well the skin color and face image match each clothing style to be evaluated.
As an optional implementation of this embodiment, the apparatus further includes a training unit configured to: acquire a large number of face images as training samples; input the training samples into a pre-established recognition model to obtain, for each face image, predicted probabilities of its chroma, and determine the chroma with the maximum predicted probability as the chroma of that face image; determine chroma division thresholds based on the chromas of the face images; and determine the skin color types from the intervals delimited by the chroma division thresholds.
As an optional implementation manner of this embodiment, the scoring unit 204 is further configured to: combining the clothing style to be evaluated with the face image; sending the combined image to the preset expert group so that the preset expert group scores the combined image; and determining the combination with the highest score as the matched clothing style matched with the target user.
As an optional implementation manner of this embodiment, the apparatus further includes: a big data mining unit configured to mine a human body dressing image from a preset position using a big data mining technique; a big data analysis unit configured to determine different skin-color adapted garment colors using big data analysis techniques; and/or the image segmentation unit is configured to segment the clothing style of the human body clothing image to obtain clothing styles of different parts; and the storage unit is configured to correspondingly store the sizes of all parts of the human body in the human body dressing image and the clothes styles of different parts.
The embodiment of the present disclosure provides an electronic device, as shown in fig. 3, the electronic device includes one or more processors 31 and a memory 32, where one processor 31 is taken as an example in fig. 3.
The controller may further include: an input device 33 and an output device 34.
The processor 31, the memory 32, the input device 33 and the output device 34 may be connected by a bus or other means, and fig. 3 illustrates the connection by a bus as an example.
The processor 31 may be a Central Processing Unit (CPU). The processor 31 may also be other general purpose processors, Digital Signal Processors (DSPs), Application Specific Integrated Circuits (ASICs), Field Programmable Gate Arrays (FPGAs) or other programmable logic devices, discrete gate or transistor logic devices, discrete hardware components, or combinations thereof. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
The memory 32, which is a non-transitory computer readable storage medium, may be used to store non-transitory software programs, non-transitory computer executable programs, and modules, such as program instructions/modules corresponding to the control methods in the embodiments of the present disclosure. The processor 31 executes various functional applications of the server and data processing by running the non-transitory software programs, instructions and modules stored in the memory 32, namely, implements the information processing method for clothing matching of the above-described method embodiment.
The memory 32 may include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required for at least one function; the storage data area may store data created according to use of a processing device operated by the server, and the like. Further, the memory 32 may include high speed random access memory, and may also include non-transitory memory, such as at least one magnetic disk storage device, flash memory device, or other non-transitory solid state storage device. In some embodiments, the memory 32 may optionally include memory located remotely from the processor 31, which may be connected to a network connection device via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The input device 33 may receive input numeric or character information and generate key signal inputs related to user settings and function control of the processing device of the server. The output device 34 may include a display device such as a display screen.
One or more modules are stored in the memory 32, which when executed by the one or more processors 31 perform the method as shown in fig. 1.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by a computer program instructing the related hardware. The program can be stored in a computer-readable storage medium and, when executed, can include the processes of the method embodiments described above. The storage medium may be a magnetic disk, an optical disk, a read-only memory (ROM), a random access memory (RAM), a flash memory, a hard disk drive (HDD), or a solid-state drive (SSD); the storage medium may also comprise a combination of such memories.
Although the embodiments of the present disclosure have been described in conjunction with the accompanying drawings, those skilled in the art may make various modifications and variations without departing from the spirit and scope of the present disclosure, and such modifications and variations fall within the scope defined by the appended claims.

Claims (10)

1. An information processing method for clothing matching, characterized by comprising:
after a human body image of a target user is obtained, preprocessing the human body image to respectively obtain a human face image and a human body three-dimensional modeling image;
inputting the human face image into a pre-trained recognition model to recognize the skin color of the human face image;
matching preset clothing styles for the target user based on the skin color and the three-dimensional modeling image to obtain a clothing style to be evaluated that matches the target user;
and sending the clothing style to be evaluated and the face image to a preset expert group to obtain a score for the match among the skin color, the face image, and the clothing style to be evaluated.
2. The information processing method for clothing matching according to claim 1, further comprising pre-training a recognition model, comprising:
acquiring a large number of face images as training samples;
inputting the training samples into a pre-established recognition model to obtain predicted probability values for the chroma corresponding to each face image, and determining the chroma with the maximum predicted probability as the chroma of that face image;
determining a chroma division threshold based on the chroma values corresponding to the face images;
and determining skin color types based on the intervals defined by the chroma division threshold.
3. The information processing method for clothing matching according to claim 1, wherein sending the skin color, the clothing style to be evaluated, and the face image to a preset expert group to obtain a score for the match among the skin color, the face image, and the clothing style to be evaluated comprises:
combining the clothing style to be evaluated with the face image;
sending the combined image to the preset expert group so that the preset expert group scores the combined image;
and determining the highest-scoring combination as the clothing style matched with the target user.
4. The information processing method for clothing matching according to claim 1, further comprising:
mining human body dressing images from a preset location using big data mining techniques;
determining clothing colors suited to different skin colors using big data analysis techniques; and/or
segmenting the clothing styles in the human body dressing images to obtain clothing styles for different body parts;
and storing, in correspondence, the sizes of the body parts in the human body dressing images and the clothing styles of the different parts.
5. An information processing apparatus for clothing matching, characterized by comprising:
a preprocessing unit configured to, after a human body image of a target user is obtained, preprocess the human body image to respectively obtain a human face image and a human body three-dimensional modeling image;
a recognition unit configured to input the human face image into a pre-trained recognition model to recognize the skin color of the human face image;
a matching unit configured to match preset clothing styles for the target user based on the skin color and the three-dimensional modeling image to obtain a clothing style to be evaluated that matches the target user;
and a scoring unit configured to send the clothing style to be evaluated and the face image to a preset expert group to obtain a score for the match among the skin color, the face image, and the clothing style to be evaluated.
6. The information processing apparatus for clothing matching according to claim 5, wherein the apparatus further comprises a training unit configured to:
acquire a large number of face images as training samples;
input the training samples into a pre-established recognition model to obtain predicted probability values for the chroma corresponding to each face image, and determine the chroma with the maximum predicted probability as the chroma of that face image;
determine a chroma division threshold based on the chroma values corresponding to the face images;
and determine skin color types based on the intervals defined by the chroma division threshold.
7. The information processing apparatus for clothing matching according to claim 5, wherein the scoring unit is further configured to:
combine the clothing style to be evaluated with the face image;
send the combined image to the preset expert group so that the preset expert group scores the combined image;
and determine the highest-scoring combination as the clothing style matched with the target user.
8. The information processing apparatus for clothing matching according to claim 5, the apparatus further comprising:
a big data mining unit configured to mine human body dressing images from a preset location using big data mining techniques;
a big data analysis unit configured to determine clothing colors suited to different skin colors using big data analysis techniques; and/or
an image segmentation unit configured to segment the clothing styles in the human body dressing images to obtain clothing styles for different body parts;
and a storage unit configured to store, in correspondence, the sizes of the body parts in the human body dressing images and the clothing styles of the different parts.
9. A computer-readable storage medium storing computer instructions for causing a computer to execute the information processing method for clothing matching according to any one of claims 1 to 4.
10. An electronic device, comprising: at least one processor; and a memory communicatively coupled to the at least one processor; wherein the memory stores a computer program executable by the at least one processor, and the computer program, when executed by the at least one processor, causes the at least one processor to perform the information processing method for clothing matching according to any one of claims 1 to 4.
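The skin color recognition described in claims 1 and 2 can be sketched as follows. This is a minimal illustration under assumed interfaces, not the patent's implementation: the function name, the chroma candidates, the threshold values, and the type labels are all hypothetical, since the patent does not specify them. The sketch follows the claimed steps: the recognition model yields a predicted probability for each candidate chroma, the chroma with the maximum predicted probability is selected, and the chroma division thresholds partition the chroma range into intervals that determine the skin color type.

```python
import bisect

def classify_skin_color(chroma_probs, thresholds, labels):
    """Hypothetical sketch of the claimed pipeline: pick the chroma with the
    highest predicted probability, then map it to a skin color type using
    the chroma division thresholds (intervals)."""
    # chroma_probs: mapping from candidate chroma value -> predicted probability
    best_chroma = max(chroma_probs, key=chroma_probs.get)
    # thresholds split the chroma axis into len(thresholds) + 1 intervals,
    # one label per interval
    idx = bisect.bisect_right(thresholds, best_chroma)
    return best_chroma, labels[idx]

# Illustrative values (not from the patent): three chroma candidates with
# model probabilities, and thresholds dividing chroma into three types.
probs = {0.2: 0.1, 0.5: 0.7, 0.8: 0.2}
chroma, skin_type = classify_skin_color(
    probs, thresholds=[0.35, 0.65], labels=["light", "medium", "dark"]
)
# chroma is 0.5 (maximum probability); 0.5 falls in the middle interval,
# so skin_type is "medium"
```

In this reading, claim 2's "chroma division threshold" acts as a set of interval boundaries, so classification reduces to an argmax over model outputs followed by a bucket lookup.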
CN202110443950.1A 2021-04-23 2021-04-23 Information processing method and device for clothing matching Pending CN113204663A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110443950.1A CN113204663A (en) 2021-04-23 2021-04-23 Information processing method and device for clothing matching

Publications (1)

Publication Number Publication Date
CN113204663A (en) 2021-08-03

Family

ID=77028338

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110443950.1A Pending CN113204663A (en) 2021-04-23 2021-04-23 Information processing method and device for clothing matching

Country Status (1)

Country Link
CN (1) CN113204663A (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101251898A (en) * 2008-03-25 2008-08-27 腾讯科技(深圳)有限公司 Skin color detection method and apparatus
CN105808774A (en) * 2016-03-28 2016-07-27 北京小米移动软件有限公司 Information providing method and device
CN109801380A (en) * 2018-12-14 2019-05-24 深圳壹账通智能科技有限公司 A kind of method, apparatus of virtual fitting, storage medium and computer equipment
CN110264299A (en) * 2019-05-07 2019-09-20 平安科技(深圳)有限公司 Clothes recommended method, device and computer equipment based on recognition of face
CN111508079A (en) * 2020-04-22 2020-08-07 深圳追一科技有限公司 Virtual clothing fitting method and device, terminal equipment and storage medium
CN111612584A (en) * 2020-05-22 2020-09-01 杭州智珺智能科技有限公司 AI intelligent clothing recommendation method based on wearing and putting-on theory

Similar Documents

Publication Publication Date Title
US11869194B2 (en) Image processing method and apparatus, computer-readable storage medium
CN110309706B (en) Face key point detection method and device, computer equipment and storage medium
CN109685013B (en) Method and device for detecting head key points in human body posture recognition
US20210287091A1 (en) Neural network training method and image matching method and apparatus
CN110895702B (en) Clothing attribute identification detection method and device
WO2017008435A1 (en) Method for recognizing picture, method and apparatus for labelling picture, and storage medium
CN109325398A (en) A kind of face character analysis method based on transfer learning
CN110110715A (en) Text detection model training method, text filed, content determine method and apparatus
WO2019090769A1 (en) Human face shape recognition method and apparatus, and intelligent terminal
WO2018228448A1 (en) Method and apparatus for recommending matching clothing, electronic device and storage medium
CN112232117A (en) Face recognition method, face recognition device and storage medium
CN108022251B (en) Method and system for extracting central line of tubular structure
CN112970047A (en) System and method for automatically generating three-dimensional virtual garment models using product descriptions
CN106599785B (en) Method and equipment for establishing human body 3D characteristic identity information base
CN110069983A (en) Vivo identification method, device, terminal and readable medium based on display medium
JP2010262425A (en) Computer execution method for recognizing and classifying clothes
CN112330383A (en) Apparatus and method for visual element-based item recommendation
Ma et al. Kinematic skeleton extraction from 3D articulated models
CN112905889A (en) Clothing searching method and device, electronic equipment and medium
KR20210090456A (en) Image-based Posture Preservation Virtual Fitting System Supporting Multi-Poses
CN110175974A (en) Image significance detection method, device, computer equipment and storage medium
CN108230297B (en) Color collocation assessment method based on garment replacement
WO2013160663A2 (en) A system and method for image analysis
CN108596094B (en) Character style detection system, method, terminal and medium
CN111145242A (en) Method, smart device, and computer-readable storage medium for predicting popularity trend

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination