CN113159876B - Clothing collocation recommendation device, method and storage medium - Google Patents

Clothing collocation recommendation device, method and storage medium

Info

Publication number
CN113159876B
Authority
CN
China
Prior art keywords
clothing
user
image
garment
matching
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010072538.9A
Other languages
Chinese (zh)
Other versions
CN113159876A (en)
Inventor
陈维强
李广琴
黄利
孙锦
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hisense Co Ltd
Original Assignee
Hisense Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hisense Co Ltd
Priority to CN202010072538.9A
Publication of CN113159876A
Application granted
Publication of CN113159876B
Legal status: Active
Anticipated expiration


Classifications

    • G06Q 30/0643: Commerce; buying, selling or leasing transactions; electronic shopping [e-shopping]; shopping interfaces; graphical representation of items or shoppers
    • G06Q 30/0631: Commerce; buying, selling or leasing transactions; electronic shopping [e-shopping]; item recommendations
    • G06T 7/0002: Image analysis; inspection of images, e.g. flaw detection
    • G06T 7/187: Image analysis; segmentation; edge detection involving region growing, region merging or connected component labelling
    • G06T 2207/10004: Indexing scheme for image analysis or enhancement; image acquisition modality; still image; photographic image
    • G06T 2207/20081: Indexing scheme for image analysis or enhancement; special algorithmic details; training; learning
    • G06T 2207/20182: Indexing scheme for image analysis or enhancement; image enhancement details; noise reduction or smoothing in the temporal domain; spatio-temporal filtering
    • G06T 2207/20192: Indexing scheme for image analysis or enhancement; image enhancement details; edge enhancement; edge preservation
    • Y02P 90/30: Climate change mitigation technologies in the production or processing of goods; enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation; computing systems specially adapted for manufacturing

Landscapes

  • Engineering & Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Physics & Mathematics (AREA)
  • Accounting & Taxation (AREA)
  • Finance (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Development Economics (AREA)
  • General Business, Economics & Management (AREA)
  • Strategic Management (AREA)
  • Marketing (AREA)
  • Economics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Quality & Reliability (AREA)
  • Image Analysis (AREA)

Abstract

The application provides a clothing collocation recommendation device, method, and storage medium. The device comprises a display panel for displaying images; a camera for capturing an image containing clothing in front of the display panel; and a controller for obtaining the clothing category of each clothing target in the image through a clothing classification algorithm, obtaining a clothing image of each clothing target through a clothing segmentation algorithm, generating clothing data from the clothing categories and clothing images, generating a clothing collocation scheme through a clothing collocation algorithm according to the clothing data and the user body type received from an intelligent body measuring instrument, and sending the collocation scheme to the display panel. The intelligent body measuring instrument comprises a photoelectric identification module for acquiring the user's body measurement parameters and deriving the user body type from them, and a communication module for sending the user body type to the controller. Because the collocation scheme is generated from both the user body type and the clothing data, personalized collocation for different users is achieved and collocation flexibility is improved.

Description

Clothing collocation recommendation device, method and storage medium
Technical Field
The present application relates to the technical field of deep learning, and in particular to a clothing collocation recommendation device, a clothing collocation recommendation method, and a storage medium.
Background
Clothing is not only a daily necessity for covering the body but also an important way for people to express identity, attitude toward life, and personal charm. As living standards keep improving, consumer demand for clothing grows ever stronger, and the rapid development of Internet shopping has, to some extent, intensified the impulse to buy clothes. When choosing which garments to wear together, people often struggle to remember which garments are available, so having too many clothes becomes a source of inconvenience in daily life.
In the related art, a clothing collocation recommendation device stores a user's existing garments as clothing images, builds a matching model with a deep learning algorithm, matches the clothing images using that model, and recommends a collocation scheme to the user, thereby easing the difficulty of putting outfits together. However, such a device builds its matching model only from garment attributes such as color and length. When different users adopt the recommended scheme, the collocation effect varies widely, and for some users the recommended combination may even make their figure flaws more prominent.
Disclosure of Invention
The present application provides a clothing collocation recommendation device, method, and storage medium to solve the problem that personalized collocation cannot be achieved.
In a first aspect, the present application provides a clothing collocation recommendation device, including:
a display panel for displaying an image;
the camera is used for collecting an image containing clothing in front of the display panel;
the controller is used for obtaining the clothing category of each clothing target in the image through a clothing classification algorithm; obtaining a clothing image of each clothing target through a clothing segmentation algorithm; generating clothing data according to the clothing categories and the clothing images; generating a clothing collocation scheme through a clothing collocation algorithm according to the clothing data and the user body type from the intelligent body measuring instrument; and sending the clothing collocation scheme to the display panel;
the intelligent body measuring instrument comprises a photoelectric identification module and a communication module, wherein the photoelectric identification module is used for acquiring body measuring parameters of a user; and obtaining the user body type according to the body measurement parameters, wherein the communication module is used for sending the user body type to the controller.
In a second aspect, the present application provides a clothing collocation recommendation method applied to a clothing collocation recommendation device that includes a camera, a controller, an intelligent body measuring instrument, and a display panel, the method comprising:
collecting an image in front of a display panel through a camera;
obtaining the clothing category of the clothing object in the image through a clothing classification algorithm;
obtaining a clothing image of each clothing target through a clothing segmentation algorithm;
generating clothing data according to the clothing category and the clothing image;
generating a clothing matching scheme through a clothing matching algorithm according to the user body type from the intelligent body measuring instrument and the clothing data;
and controlling the display panel to display the clothing collocation scheme.
In a third aspect, the present application provides a computer-readable storage medium on which a computer program is stored, the computer program, when executed by a controller, implementing the clothing collocation recommendation method according to the second aspect.
The clothing collocation recommendation device, method, and storage medium provided by the application have the following beneficial effects:
According to the application, the user's garments are classified by the clothing classification algorithm, a clothing image of each garment is obtained by the clothing segmentation algorithm, and clothing data is generated from the clothing categories and clothing images so that the clothing images can be retrieved by category during collocation. The user body type is obtained through the intelligent body measuring instrument, and the clothing collocation scheme is generated by the clothing collocation algorithm based on the user body type and the clothing data, thereby achieving personalized collocation for different users and improving collocation flexibility.
Drawings
In order to more clearly illustrate the technical solution of the present application, the drawings needed in the embodiments are briefly described below; it will be obvious to those skilled in the art that other drawings can be derived from these drawings without inventive effort.
Fig. 1 is a schematic structural diagram of a clothing matching recommendation device according to an embodiment of the present application;
fig. 2 is a schematic diagram of a user body type according to an embodiment of the present application;
fig. 3 is an interaction schematic diagram of a clothing collocation recommendation device according to an embodiment of the present application;
fig. 4 is a flowchart of a clothing collocation recommendation method according to an embodiment of the present application;
fig. 5 is a schematic flow chart of a clothing classification method according to an embodiment of the present application;
fig. 6 is a schematic flow chart of a clothing segmentation method according to an embodiment of the present application;
fig. 7 is a schematic diagram of clothing classification management according to an embodiment of the present application;
fig. 8 is a schematic diagram of a clothing collocation scheme according to an embodiment of the present application.
Detailed Description
In order to make the technical solution of the present application better understood by those skilled in the art, the technical solution of the present application will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present application, and it is apparent that the described embodiments are only some embodiments of the present application, but not all embodiments of the present application. All other embodiments, which can be made by those skilled in the art based on the embodiments of the present application without making any inventive effort, shall fall within the scope of the present application.
In a first aspect, the present application provides a clothing collocation recommendation device for performing clothing collocation recommendation. The clothing collocation recommendation device may be an intelligent dressing mirror; as shown in fig. 1, it may include a mirror body 100, a controller 200, a camera 300, a multimedia module 400, a display panel 500, a voice recognition module 600 and a communication interface 700 disposed on the mirror body 100, and an intelligent body measuring instrument 800 communicatively connected to the communication interface 700.
The multimedia module 400 may include a speaker capable of playing music. The camera 300 may be used to capture images in front of the mirror body 100. The display panel 500 may display the images shot by the camera 300 and may be provided with a touch screen through which information can be input to the display panel 500. The voice recognition module 600 may be used to recognize the user's voice instructions, and the controller 200 then implements the functions corresponding to those instructions.
The communication interface 700 may include various interfaces, such as a WIFI interface, a USB interface, a network interface and Bluetooth, to facilitate connection with various external devices and enrich the functions of the clothing collocation recommendation device. The clothing collocation recommendation device may also communicate with its server through the communication interface 700.
The intelligent body measuring instrument 800 is provided with a photoelectric identification module and a communication module. The photoelectric identification module can collect the user's body measurement parameters, which may include shoulder width, chest circumference, waist circumference, hip circumference and other data, and obtain the user body type from them; as shown in fig. 2, the user body types from left to right may be banana, apple, pear and hourglass. The communication module, which may include a Bluetooth module, can be connected to the communication interface 700, and the intelligent body measuring instrument 800 can send the user body type to the communication interface 700 through the communication module.
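The patent does not spell out how the four body types are derived from the measurements, so the following is only a minimal sketch under assumed ratio thresholds; the function name and cut-off values are hypothetical.

```python
# Minimal sketch only: the ratios and thresholds below are assumptions,
# not values disclosed in the patent.
def classify_body_type(shoulder_cm: float, bust_cm: float,
                       waist_cm: float, hip_cm: float) -> str:
    """Map measured parameters to one of the four body types shown in fig. 2."""
    # shoulder_cm is collected by the instrument too, but this simplified
    # rule set only uses bust, waist and hip.
    waist_to_hip = waist_cm / hip_cm
    if waist_to_hip < 0.75 and abs(bust_cm - hip_cm) <= 5:
        return "hourglass"   # defined waist, bust and hip roughly balanced
    if hip_cm - bust_cm > 5:
        return "pear"        # hips clearly wider than bust
    if waist_cm >= bust_cm or waist_to_hip > 0.9:
        return "apple"       # fullest around the middle
    return "banana"          # straight silhouette, little waist definition


if __name__ == "__main__":
    print(classify_body_type(shoulder_cm=38, bust_cm=90, waist_cm=62, hip_cm=92))
```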
The controller 200 may include multiple processors, such as a CPU and a GPU, to increase its data computing capability. The controller 200 is connected to the camera 300, the multimedia module 400, the display panel 500, the voice recognition module 600 and the communication interface 700. The CPU may run an operating system, such as the Android operating system, to control each module connected to the controller 200, obtain the user body type collected by the intelligent body measuring instrument 800 through the communication interface 700, and recommend a clothing collocation scheme to the user according to the user body type.
The controller 200 may determine a clothing collocation scheme from the user's own garments, may perform online clothing customization according to the user's body measurement parameters and determine a collocation scheme from the customized garments, or may determine a scheme by combining customized garments with the user's own garments. In addition, the controller 200 may communicate with the server of the clothing collocation recommendation device, sending the user's garments and the user body type to the server; the server performs the matching, generates a clothing collocation scheme and pushes it to the controller 200 over the network, and the controller 200 then controls the display panel 500 to display the scheme.
As shown in fig. 3, the controller 200 may control the display panel 500 to display a front-end interactive interface through which the user can enter garments, perform online clothing customization and view clothing collocation schemes.
The front-end interactive interface displayed on the display panel 500 may include a plurality of controls, which may be named clothing entry, clothing management, clothing collocation, intelligent body measurement, clothing customization and family member information. Background logic is provided in the controller 200 or in the server of the clothing collocation recommendation device, and background data management is performed for each control according to this logic; when the background logic resides in the server, the controller 200 sends the trigger signal of each control to the server through the communication interface 700 over the network for processing.
The clothing entry control is triggered when the touch screen receives a touch instruction at the control's position. In response, it starts the camera 300 to shoot an image in front of the mirror body 100 and displays the shot image on the display panel 500; according to a user input instruction, the camera 300 freezes the frame to generate an entry photo, which is stored in the controller 200 or sent to the server over the network. The user input instruction may be the word "shoot" received by the voice recognition module 600. After receiving the entry photo, the controller 200 or the server may call an algorithm interface to obtain the core algorithms stored there, such as the clothing segmentation algorithm and the clothing classification algorithm, extract the clothing image from the entry photo with the clothing segmentation algorithm, identify the clothing category of the garment in the entry photo with the clothing classification algorithm, and generate user clothing data from the clothing image and the clothing category.
The clothing management control is triggered when the touch screen receives a touch instruction at the control's position. In response, it controls the display panel 500 to present the user clothing data.
The clothing collocation control is triggered when the touch screen receives a touch instruction at the control's position. In response, it obtains the clothing data, calls the algorithm interface to obtain the clothing collocation algorithm stored in the controller 200 or the server, matches the clothing images in the clothing data with the collocation algorithm and generates collocation data. The clothing data may include user clothing data and/or custom-made clothing data, where the custom-made clothing data is obtained when the clothing customization control is triggered.
The intelligent body measurement control is triggered when the touch screen receives a touch instruction at the control's position. In response, it starts the intelligent body measuring instrument 800 to measure the user, obtains the body measurement parameters, generates the user body type from them and stores the parameters and the user body type as body data.
The clothing customization control is triggered when the touch screen receives a touch instruction at the control's position. In response, it obtains the body measurement parameters from the body data and sends them to a preset server, namely the server of a clothing customization brand that cooperates with the clothing collocation recommendation device; it then receives the custom-made clothing data generated by the clothing customization brand from those parameters and displays the custom-made clothing data and a purchase control on the display panel 500. When the purchase control is triggered, the controller 200 generates order information and goes to a payment page; after the user pays, the clothing customization brand can produce the purchased custom-made garments according to the order information and mail them to the user.
The family member information control is triggered when the touch screen receives a touch instruction at the control's position. In response, it displays an add-member control and the added-member controls. The add-member control, when triggered, creates a user account and generates user identity information. Under the newly created user account, the user can operate the front-end interactive interface, for example the clothing entry control and the intelligent body measurement control, to generate the clothing data, body data and so on of that account. The added-member controls may include user account controls for the established user accounts, and the user can switch accounts by operating different user account controls. One clothing collocation recommendation device can thus establish multiple user accounts for different members of a family.
To further describe the background logic of the clothing collocation recommendation device, a second aspect of the present application provides a clothing collocation recommendation method which, as shown in fig. 4, includes the following steps:
step S110: and acquiring an image in front of the display panel through a camera.
On the front-end interactive interface displayed on the display panel, the user can trigger the family member information control, then trigger the added-member control of the corresponding user, and then return to the front-end interactive interface and trigger the clothing entry control.
After being triggered, the clothing entry control starts shooting images in front of the mirror body and displays them on the display panel. The user can stand in front of the mirror body, check the displayed image of the garment being worn, and say "shoot"; the voice recognition module then generates a voice signal and sends it to the controller, the controller triggers a user input instruction according to the voice signal, and the user input instruction controls the camera to freeze the frame and generate the entry photo.
Further, the user's skin tone information can be analyzed from the entry photo. For example, RGB value ranges corresponding to different skin tone types can be preset, face recognition can be performed on the entry photo to obtain the face region, and the user's skin tone type can be obtained by analyzing the RGB values of that region. Alternatively, the skin tone information can be entered by the user: different skin tone types and example images corresponding to them can be displayed on the display panel for the user to select from.
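As a rough illustration of this step, the sketch below detects a face with OpenCV's bundled Haar cascade and maps the mean RGB of the face region to skin-tone bins; the bin boundaries and labels are hypothetical, since the patent only states that RGB ranges are preset without giving values.

```python
import cv2

def estimate_skin_tone(photo_path: str) -> str:
    """Return a coarse skin-tone label from the face region of the entry photo."""
    image = cv2.imread(photo_path)
    if image is None:
        raise FileNotFoundError(photo_path)

    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return "unknown"  # fall back to letting the user pick a type on the panel

    x, y, w, h = faces[0]
    face_rgb = cv2.cvtColor(image[y:y + h, x:x + w], cv2.COLOR_BGR2RGB)
    mean_r, mean_g, _ = face_rgb.reshape(-1, 3).mean(axis=0)

    # Hypothetical preset ranges keyed on the mean red/green channels.
    if mean_r > 200 and mean_g > 170:
        return "fair"
    if mean_r > 160:
        return "medium"
    return "deep"
```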
Step S111: obtain the clothing category of the clothing target in the image through the clothing classification algorithm.
The flow of the clothing classification algorithm is shown in fig. 5, and includes steps S210-S212:
step S210: and extracting the position information of the clothing target and the characteristic information of the clothing target.
The position information and the feature information of the clothing target can be extracted with a pre-trained deep residual network. The deep residual network can be trained on a large number of clothing training samples; by labeling the positions and categories of the clothing targets in these samples, the network automatically learns position features and category features. After the image shot by the camera is fed into the deep residual network, the network analyzes the image and extracts the position information and the feature information of the clothing target.
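For orientation only, the sketch below uses torchvision's ImageNet-pretrained ResNet-50 as a stand-in backbone to extract a feature vector from a garment photo; the patent's network is trained on labelled garment samples and also learns position features, which this sketch omits.

```python
import torch
import torchvision
from torchvision import transforms
from PIL import Image

# ImageNet preprocessing for the stand-in ResNet-50 backbone.
preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

backbone = torchvision.models.resnet50(weights="DEFAULT")
backbone.fc = torch.nn.Identity()   # drop the classifier head, keep 2048-d features
backbone.eval()

def garment_features(photo_path: str) -> torch.Tensor:
    """Return a 2048-dimensional feature vector for one garment photo."""
    image = Image.open(photo_path).convert("RGB")
    with torch.no_grad():
        return backbone(preprocess(image).unsqueeze(0)).squeeze(0)
```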
Step S211: the location of the clothing target and the estimated class of the clothing target are predicted.
After the position information and the feature information of the clothing target have been extracted by the deep residual network, the position of the garment and its category can be predicted based on the multi-scale feature maps of YOLOv3. Because garments of similar categories, such as short skirts and long skirts, shorts and trousers, or short coats and long coats, are easily misrecognized, the category obtained in this step is called the estimated category, and the clothing category is further refined in the next step.
Step S212: perform similar-category analysis on the clothing target according to the estimated category to obtain the accurate category of the clothing target.
The similar-category analysis can use a size-threshold method: a size threshold range is determined for each of the similar categories, and the range into which the clothing target falls determines its accurate category.
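A minimal sketch of this refinement step is shown below, assuming the detector already supplies an estimated category plus bounding boxes for the garment and the person; the confusable-category pairs come from the description above, but the relative-length thresholds are made-up values for illustration.

```python
from typing import Dict, Tuple

# Hypothetical threshold ranges on relative garment length
# (garment box height divided by person box height).
SIZE_THRESHOLDS: Dict[str, Tuple[float, float]] = {
    "short_skirt": (0.0, 0.30), "long_skirt": (0.30, 1.0),
    "shorts":      (0.0, 0.25), "trousers":   (0.25, 1.0),
    "short_coat":  (0.0, 0.35), "long_coat":  (0.35, 1.0),
}

# Easily confused counterparts named in the description.
SIMILAR_CLASS = {
    "short_skirt": "long_skirt", "long_skirt": "short_skirt",
    "shorts": "trousers", "trousers": "shorts",
    "short_coat": "long_coat", "long_coat": "short_coat",
}

def refine_category(estimated: str, garment_height: float,
                    person_height: float) -> str:
    """Resolve confusable classes by checking which size threshold range
    the garment's relative length falls into."""
    if estimated not in SIMILAR_CLASS:
        return estimated              # no confusable counterpart, keep the estimate
    low, high = SIZE_THRESHOLDS[estimated]
    ratio = garment_height / person_height
    if low <= ratio < high:
        return estimated
    return SIMILAR_CLASS[estimated]   # length falls in the counterpart's range

print(refine_category("short_skirt", garment_height=320, person_height=700))  # long_skirt
```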
After the clothing classification algorithm shown in fig. 5 has been invoked, the clothing category of the clothing target in the image captured by the camera is obtained.
After the clothing category is obtained, it can be displayed beside the clothing target. When the clothing category is selected, a clothing attribute editing area can be displayed in which the user may modify the category, so that occasional recognition errors can be corrected. Because the collocation scheme is determined according to the clothing category, the clothing attribute editing area can display recommended categories of the clothing target for the user to choose from; the recommended categories may be the currently determined accurate category and its similar categories, or all clothing categories.
In addition to the clothing category, the user may also enter other attribute information of the clothing target in the clothing attribute editing area, such as the clothing fabric and the clothing color; this attribute information may likewise be offered as selectable options for the user to set.
Step S112: obtain a clothing image of each clothing target through the clothing segmentation algorithm.
The flow of the clothing segmentation algorithm is shown in fig. 6, and includes steps S310-S314:
step S310: and extracting the clothing image through the segmentation network model.
The clothing targets may be extracted with a pre-trained segmentation network model. The segmentation network model can be built on the Mask-RCNN algorithm and trained on a large number of clothing training samples, so that it can distinguish the boundary between the clothing image and its background; the background image is then removed from the clothing target and only the image of the clothing region, referred to as the clothing image, is retained. However, for reasons such as the clothing being close in color to its background, the extracted clothing image may contain flaws, so it can be further processed in steps S311-S313 to improve the segmentation result.
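Purely to illustrate the cut-out step, the sketch below runs torchvision's COCO-pretrained Mask R-CNN and zeroes out everything outside each predicted mask; the patent's model is instead trained on garment samples, and the 0.7 score threshold is an assumption.

```python
import torch
import torchvision
from torchvision.transforms.functional import to_tensor
from PIL import Image

def extract_cutouts(photo_path: str, score_threshold: float = 0.7):
    """Return background-free cutouts for each confidently detected object."""
    model = torchvision.models.detection.maskrcnn_resnet50_fpn(weights="DEFAULT")
    model.eval()

    tensor = to_tensor(Image.open(photo_path).convert("RGB"))
    with torch.no_grad():
        prediction = model([tensor])[0]

    cutouts = []
    for mask, score in zip(prediction["masks"], prediction["scores"]):
        if score < score_threshold:
            continue
        binary = (mask[0] > 0.5).float()   # soft mask -> 0/1 mask
        cutouts.append(tensor * binary)    # zero out background pixels
    return cutouts
```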
Step S311: detect the connected region of the clothing image.
The clothing image is converted into a single-channel grayscale image in which each pixel has a gray value in the range [0, 255], where 0 represents black and 255 represents white. Black represents the background, and the remaining non-black areas form connected regions that represent the extracted clothing image.
The connected region, i.e. the boundary of the clothing image, is obtained from the gray value of each pixel. Within that boundary, it is checked whether there are pixels whose values differ greatly from the surrounding pixel values; if such pixels exist, the corresponding area is regarded as a defective area.
Step S312: repairing the cavity in the communication area.
The gray value of the pixels in the defective area detected in step S311 is higher than that of the other pixels in the connected region, so the defective area can be called a hole; the hole can be repaired by adjusting the gray values of its pixels.
Step S313: the edges of the garment image are smoothed.
Performing a convolution operation and a binarization operation on the connected region improves the smoothness of the clothing image edge. After the clothing segmentation algorithm shown in fig. 6 has been invoked, the clothing image of the clothing target in the image captured by the camera is obtained.
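The following OpenCV sketch walks through steps S311-S313 on a saved cutout with a black background; the kernel sizes and thresholds are illustrative assumptions rather than values from the patent.

```python
import cv2
import numpy as np

def clean_garment_mask(cutout_path: str) -> np.ndarray:
    """Detect the garment's connected region, fill holes, and smooth its edge."""
    gray = cv2.imread(cutout_path, cv2.IMREAD_GRAYSCALE)
    if gray is None:
        raise FileNotFoundError(cutout_path)

    # S311: binarize and label connected regions; non-black blocks are the garment.
    _, mask = cv2.threshold(gray, 10, 255, cv2.THRESH_BINARY)
    num_labels, labels, stats, _ = cv2.connectedComponentsWithStats(mask)
    if num_labels < 2:
        return mask                   # nothing but background was found
    largest = 1 + int(np.argmax(stats[1:, cv2.CC_STAT_AREA]))  # skip label 0 (background)
    mask = np.where(labels == largest, 255, 0).astype(np.uint8)

    # S312: repair holes inside the region with a morphological closing.
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (15, 15))
    mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, kernel)

    # S313: smooth the edge with a blur (convolution) followed by re-binarization.
    mask = cv2.GaussianBlur(mask, (9, 9), 0)
    _, mask = cv2.threshold(mask, 127, 255, cv2.THRESH_BINARY)
    return mask
```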
Further, after the clothing image is obtained, it may be displayed beside the clothing target. When the clothing image is selected, the clothing attribute editing area can be displayed, in which the user may set a clothing storage location. For example, the editing area may provide a wardrobe shooting control: the user can move the intelligent dressing mirror in front of the wardrobe, and the control, when triggered by touch, makes the camera shoot a wardrobe image, which is then displayed in the clothing attribute editing area. The user can divide the wardrobe image into several named storage areas and set the storage location of a clothing image by selecting one of them. Multiple wardrobe images can be configured, and when the user selects a clothing image, one of the wardrobe images can be chosen to determine the corresponding clothing storage location.
Step S113: user garment data is generated from the garment categories and the garment images.
After the garment categories and garment images are obtained, user garment data may be generated. Further, in addition to the clothing category and clothing image, the user clothing data may include information of clothing category, clothing image, storage area, etc. input by the user.
Steps S110-S113 may be performed repeatedly to enter multiple garments of the user, and the controller may store and manage the user's garments grouped by clothing category so that the user can browse the user clothing data by category. As shown in fig. 7, when the user triggers the clothing management control, the garments of different categories, such as T-shirts and shorts, can be viewed separately.
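A tiny sketch of how such category-grouped storage could be kept in memory is given below; the dictionary schema and field names are assumptions for illustration, not the device's actual data model.

```python
from collections import defaultdict
from typing import Dict, List

def group_by_category(garments: List[dict]) -> Dict[str, List[dict]]:
    """Group entered garments by clothing category for category-wise browsing."""
    wardrobe: Dict[str, List[dict]] = defaultdict(list)
    for garment in garments:
        wardrobe[garment["category"]].append(garment)
    return dict(wardrobe)

entries = [
    {"category": "T-shirt", "image": "tshirt_01.png", "storage": "left shelf"},
    {"category": "shorts",  "image": "shorts_03.png", "storage": "top drawer"},
    {"category": "T-shirt", "image": "tshirt_02.png", "storage": "left shelf"},
]
print({category: len(items) for category, items in group_by_category(entries).items()})
```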
Step S114: acquire the body measurement parameters through the intelligent body measuring instrument.
After the user triggers the intelligent body measurement control, the controller can start the intelligent body measuring instrument to measure the user's body measurement parameters and generate the user body type from them. The intelligent body measuring instrument is connected to the controller via Bluetooth, through which the controller obtains the body measurement parameters and the user body type.
Step S115: the body parameters are sent to the custom-made brand of clothing.
After the controller has acquired the body measurement parameters and the user body type, it can send the parameters to the server of the clothing customization brand when the user triggers the clothing customization control. After obtaining the parameters from the controller, the brand's server can generate custom-made clothing data from them and send the custom-made clothing information back to the controller.
Step S116: custom-made garment data from custom-made brands of garments is presented.
After receiving the custom-made clothing data from the clothing customization brand, the controller controls the display panel to display the image of the custom-made garment together with the purchase control.
Step S117: and generating a clothing matching scheme through a clothing matching algorithm according to the user body type and clothing data from the intelligent body measuring instrument.
The clothing collocation algorithm is obtained through collocation training on the DeepFashion public clothing dataset based on the FashionAI deep learning algorithm.
Further, during collocation training the scheme can be determined by also considering collocation influencing factors, which may include the user's body type, skin tone, age, air temperature and so on. In actual use, the user body type can be obtained from the body measurement parameters of the intelligent body measuring instrument, the skin tone and age can be entered by the user or analyzed from the entered photos, and the air temperature can be obtained by retrieving online the weather information for the current geographic location. Based on these factors and on the user's garments or custom-made garments, multiple clothing collocation schemes can be computed for the user; the garments in a scheme may be only user garments, only custom-made garments, or a combination of both.
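As a hedged illustration of how such influencing factors could enter the ranking, the sketch below scores top/bottom pairs with a few hand-written rules; the real device uses a deep model trained on the DeepFashion dataset, so the attributes, weights and thresholds here are invented for the example.

```python
from dataclasses import dataclass, field
from itertools import product
from typing import List, Set, Tuple

@dataclass
class Garment:
    name: str
    warmth: int                                             # 1 (light) .. 5 (heavy), hypothetical attribute
    flattering_for: Set[str] = field(default_factory=set)   # suited body types, hypothetical attribute

def rank_outfits(tops: List[Garment], bottoms: List[Garment],
                 body_type: str, temperature_c: float) -> List[Tuple[float, str, str]]:
    """Score every top/bottom pair against body type and air temperature."""
    target_warmth = 4 if temperature_c < 10 else 2
    scored = []
    for top, bottom in product(tops, bottoms):
        score = 0.0
        score += 1.0 if body_type in top.flattering_for else 0.0
        score += 1.0 if body_type in bottom.flattering_for else 0.0
        score -= 0.5 * abs((top.warmth + bottom.warmth) / 2 - target_warmth)
        scored.append((score, top.name, bottom.name))
    return sorted(scored, reverse=True)

tops = [Garment("white T-shirt", 1, {"pear", "hourglass"}),
        Garment("long coat", 5, {"apple", "banana"})]
bottoms = [Garment("half-length skirt", 2, {"pear"}),
           Garment("trousers", 3, {"banana", "apple"})]
print(rank_outfits(tops, bottoms, body_type="pear", temperature_c=22)[:2])
```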
Step S118: and controlling the display panel to display the clothing collocation scheme.
After calculating the clothing collocation schemes, the controller may control the display panel to display several of them for the user to choose from. As shown in fig. 8, recommended collocation 1 and recommended collocation 2 each include a T-shirt and a half-length skirt; the T-shirt and one of the half-length skirts are the user's own garments, while the other half-length skirt is a custom-made garment.
Further, in response to the user's selection, the clothing collocation scheme on the display panel can display clothing information for the garments, such as the storage locations of the user's garments and the purchase controls of custom-made garments, so that the user can find or purchase the clothing.
In the method and device of the present application, the user's garments are classified by the clothing classification algorithm, a clothing image of each garment is obtained by the clothing segmentation algorithm, and clothing data is generated from the clothing categories and clothing images so that the clothing images can be retrieved by category during collocation. The user body type is obtained through the intelligent body measuring instrument, and the clothing collocation scheme is generated by the clothing collocation algorithm based on the user body type and the clothing data, thereby achieving personalized collocation for different users and improving collocation flexibility.
Since the foregoing embodiments refer to one another, identical or similar parts among the embodiments in this specification may be cross-referenced and are not described in detail again here.
It should be noted that in this specification, relational terms such as "first" and "second" are used solely to distinguish one entity or action from another and do not necessarily require or imply any actual relationship or order between such entities or actions. Moreover, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a circuit structure, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such circuit structure, article, or apparatus. Without further limitation, an element preceded by "comprises a …" does not exclude the presence of additional identical elements in the circuit structure, article, or apparatus that comprises the element.
Other embodiments of the application will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure of the application herein. This application is intended to cover any variations, uses, or adaptations of the application following, in general, the principles of the application and including such departures from the present disclosure as come within known or customary practice within the art to which the application pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the application being indicated by the following claims.
The embodiments of the present application described above do not limit the scope of the present application.

Claims (12)

1. A garment matching recommendation device, comprising:
a display panel for displaying an image;
the camera is used for collecting a first image containing clothing in front of the display panel and collecting a second image of a user wardrobe;
the controller is used for receiving an operation of a user setting a storage area on the second image and generating a clothing storage position corresponding to the storage area; predicting an estimated category of the clothing target in the first image through a clothing classification algorithm, comparing the size of the clothing target with the size threshold range of the estimated category and the size threshold range of a preset similar category of the estimated category, and determining the clothing category of the clothing target according to the comparison result; obtaining a clothing image of each clothing target through a clothing segmentation algorithm; generating clothing data according to the clothing category, the clothing image and the clothing storage position entered by the user; analyzing the RGB values of a face area in the first image to obtain the skin color type of the user, or obtaining skin color information entered by the user; generating a clothing collocation scheme through a clothing collocation algorithm according to the skin color type, the user body type from the intelligent body measuring instrument and the entered clothing data; and transmitting the clothing collocation scheme to the display panel;
the intelligent body measuring instrument comprises a photoelectric identification module and a communication module, wherein the photoelectric identification module is used for acquiring body measuring parameters of a user; and obtaining the user body type according to the body measurement parameters, wherein the communication module is used for sending the user body type to the controller.
2. The garment matching recommendation device of claim 1, wherein the controller is further configured to send the body measurement parameters to a preset server for generating custom-made garments from the body measurement parameters.
3. The clothing matching recommendation device of claim 1, wherein the display panel includes a touch screen configured to obtain the clothing fabric and clothing color entered by a user and to send the clothing fabric and the clothing color to the controller; and the controller is configured to generate the clothing data from the clothing fabric, clothing color, clothing category, clothing image, and clothing storage position.
4. The clothing matching recommendation device of claim 1, wherein the display panel includes a touch screen configured to obtain user identity information entered by a user; transmitting the user identity information to the controller; the controller is further configured to establish a correspondence between the user identity information and the apparel data.
5. The clothing collocation recommendation device of claim 1, further comprising a communication interface, wherein the communication interface comprises any one or more of a WIFI interface, a USB interface, a network port, and bluetooth, and the communication module of the intelligent body measuring instrument is connected with the controller through the communication interface.
6. The clothing collocation recommendation device according to claim 1, wherein the clothing classification algorithm is obtained by training a deep residual network model, the clothing segmentation algorithm comprises a Mask-RCNN segmentation algorithm, and the clothing collocation algorithm is obtained by training on the DeepFashion public clothing dataset based on the FashionAI deep learning algorithm.
7. A garment matching recommendation method, the method comprising:
collecting a second image of the user wardrobe;
receiving an operation of setting a storage area on the second image by a user, and generating a clothing storage position corresponding to the storage area;
acquiring a first image containing a garment;
analyzing the RGB values of the face area in the first image to obtain the skin color type of the user, or obtaining skin color information entered by the user;
obtaining an estimated class of the clothing target in the first image through a clothing classification algorithm, comparing the size of the clothing target with a size threshold range of the estimated class and a size threshold range of a preset similar class of the estimated class, and determining the clothing class of the clothing target according to a comparison result;
obtaining a clothing image of each clothing target through a clothing segmentation algorithm;
receiving a clothing storage position of the clothing target input by a user;
generating garment data according to the garment category, the garment image and the garment storage position;
generating a clothing matching scheme through a clothing matching algorithm according to the skin color type, the user body type obtained based on the body measurement parameters, and the entered clothing data;
and displaying the clothing collocation scheme.
8. The garment matching recommendation method of claim 7, further comprising:
and generating custom-made clothes according to the volume parameters.
9. The clothing collocation recommendation method of claim 7, wherein the generating clothing data from the clothing category, clothing image, and clothing storage location entered by the user comprises:
acquiring clothing fabric and clothing color input by a user;
and generating clothing data according to the clothing fabric, the clothing color, the clothing category, the clothing image and the clothing storage position.
10. The garment matching recommendation method of claim 9, further comprising:
acquiring user identity information input by a user;
and establishing a correspondence between the user identity information and the clothing data.
11. The clothing collocation recommendation method according to claim 7, wherein the clothing classification algorithm is obtained by training a deep residual network model, the clothing segmentation algorithm comprises a Mask-RCNN segmentation algorithm, and the clothing collocation algorithm is obtained by training on the DeepFashion public clothing dataset based on the FashionAI deep learning algorithm.
12. A computer readable storage medium, wherein a computer program is stored on the computer readable storage medium, which when executed by a controller, implements the garment matching recommendation method of claim 7.
CN202010072538.9A 2020-01-21 2020-01-21 Clothing collocation recommendation device, method and storage medium Active CN113159876B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010072538.9A CN113159876B (en) 2020-01-21 2020-01-21 Clothing collocation recommendation device, method and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010072538.9A CN113159876B (en) 2020-01-21 2020-01-21 Clothing collocation recommendation device, method and storage medium

Publications (2)

Publication Number Publication Date
CN113159876A (en) 2021-07-23
CN113159876B (en) 2023-08-22

Family

ID=76882403

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010072538.9A Active CN113159876B (en) 2020-01-21 2020-01-21 Clothing collocation recommendation device, method and storage medium

Country Status (1)

Country Link
CN (1) CN113159876B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116883119B (en) * 2023-07-27 2024-03-19 深圳市慧聚数字软件运营有限公司 Clothing recommendation method, device, terminal equipment and computer readable storage medium
CN117707409B (en) * 2024-02-04 2024-05-24 深圳市爱科贝电子有限公司 Earphone information display method, device, system and equipment

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105812754A (en) * 2016-05-31 2016-07-27 珠海市魅族科技有限公司 Garment fitting method and garment fitting device
CN105989617A (en) * 2014-08-08 2016-10-05 株式会社东芝 Virtual try-on apparatus and virtual try-on method
CN107943955A (en) * 2017-11-24 2018-04-20 谭云 A kind of clothing information collecting device, information management system and method
CN109949116A (en) * 2017-12-20 2019-06-28 广东欧珀移动通信有限公司 Clothing matching recommended method, device, storage medium and mobile terminal
CN110021061A (en) * 2018-01-08 2019-07-16 广东欧珀移动通信有限公司 Collocation model building method, dress ornament recommended method, device, medium and terminal
CN110264574A (en) * 2019-05-21 2019-09-20 深圳市博克时代科技开发有限公司 Virtual fit method, device and intelligent terminal, storage medium

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140180864A1 (en) * 2012-12-20 2014-06-26 Ebay Inc. Personalized clothing recommendation system and method


Also Published As

Publication number Publication date
CN113159876A (en) 2021-07-23

Similar Documents

Publication Publication Date Title
CN106776619B (en) Method and device for determining attribute information of target object
CN107784282B (en) Object attribute identification method, device and system
KR101140533B1 (en) Method and system for recommending a product based upon skin color estimated from an image
JP2021101384A (en) Image processing apparatus, image processing method and program
CN108933925A (en) Information processing unit, information processing method and storage medium
CN107133576A (en) Age of user recognition methods and device
KR102045223B1 (en) Apparatus, method and computer program for analyzing bone age
CN113159876B (en) Clothing collocation recommendation device, method and storage medium
CN108647625A (en) A kind of expression recognition method and device
CN109829418B (en) Card punching method, device and system based on shadow features
KR102437199B1 (en) Method And Apparatus for Recommending Clothing by Using Body Shape Information
JP2018084890A (en) Information processing unit, information processing method, and program
CN109858552A (en) A kind of object detection method and equipment for fine grit classification
CN108447061A (en) Merchandise information processing method, device, computer equipment and storage medium
CN106951448A (en) A kind of personalization, which is worn, takes recommendation method and system
KR20200079721A (en) Smart clothes management system and method thereof
JP2022553884A (en) Shoe authentication device and authentication process
JP2019192082A (en) Server for learning, image collection assisting system for insufficient learning, and image estimation program for insufficient learning
KR20140124087A (en) System and method for recommending hair based on face and style recognition
CN114201681A (en) Method and device for recommending clothes
CN111126179A (en) Information acquisition method and device, storage medium and electronic device
JP2009266166A (en) Harmony determination apparatus, harmony determination method and harmony determination program
US20200065631A1 (en) Produce Assessment System
CN106649300A (en) Intelligent clothing matching recommendation method and system based on cloud platform
CN112418273B (en) Clothing popularity evaluation method and device, intelligent terminal and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant