CN109978720A - Dressing scoring method, apparatus, smart device and storage medium - Google Patents

Dressing scoring method, apparatus, smart device and storage medium Download PDF

Info

Publication number
CN109978720A
CN109978720A CN201711466285.8A
Authority
CN
China
Prior art keywords
clothes
area image
score
scene
user
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201711466285.8A
Other languages
Chinese (zh)
Inventor
Xiong Youjun (熊友军)
Zhu Zhigang (朱志刚)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ubtech Robotics Corp
Original Assignee
Ubtech Robotics Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ubtech Robotics Corp filed Critical Ubtech Robotics Corp
Priority to CN201711466285.8A priority Critical patent/CN109978720A/en
Publication of CN109978720A publication Critical patent/CN109978720A/en
Pending legal-status Critical Current

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06Q — INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00 — Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/10 — Services
    • G06T — IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 — Image analysis
    • G06T7/10 — Segmentation; Edge detection
    • G06T7/13 — Edge detection
    • G06T7/60 — Analysis of geometric attributes
    • G06T7/90 — Determination of colour characteristics
    • G06T2207/00 — Indexing scheme for image analysis or image enhancement
    • G06T2207/10 — Image acquisition modality
    • G06T2207/10004 — Still image; Photographic image
    • G06T2207/30 — Subject of image; Context of image processing
    • G06T2207/30196 — Human being; Person

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Business, Economics & Management (AREA)
  • Tourism & Hospitality (AREA)
  • Health & Medical Sciences (AREA)
  • Geometry (AREA)
  • Economics (AREA)
  • General Health & Medical Sciences (AREA)
  • Human Resources & Organizations (AREA)
  • Marketing (AREA)
  • Primary Health Care (AREA)
  • Strategic Management (AREA)
  • General Business, Economics & Management (AREA)
  • Image Analysis (AREA)

Abstract

The present invention is applicable to the technical field of intelligent devices and provides a dressing scoring method, apparatus, smart device and storage medium, comprising: acquiring a whole-body image of a user; determining the clothing region image in the whole-body image from the acquired whole-body image; obtaining the attendance scene selected by the user; and determining a comprehensive dressing score of the user according to the clothing region image and the attendance scene. The above method provides a reference for the user when matching an outfit, makes the device more intelligent, saves the user time when choosing what to wear, and improves the user experience.

Description

Dressing scoring method, apparatus, smart device and storage medium
Technical field
The invention belongs to the technical field of intelligent devices, and in particular relates to a dressing scoring method, apparatus, smart device and storage medium.
Background technique
With the development of society, people are no longer content with merely having enough food and clothing and pay increasing attention to their self-image. Large shopping malls have sprung up, their dazzling goods attracting large numbers of customers, and the garment industry has developed rapidly: there are more and more clothing shops in the malls, and more and more clothes in people's wardrobes. Precisely because of this, customers often find it difficult to decide what to wear before going out.
At present, intelligent terminal technology is developing rapidly. With the popularization of the smart-home concept, intelligent hardware products have become very common in people's daily lives. However, existing intelligent hardware products focus mainly on music and smart-home control scenarios; they cannot provide reference evaluations of the outfits a user wears to various occasions, so users waste considerable time matching outfits before going out.
Summary of the invention
In view of this, embodiments of the present invention provide a dressing scoring method, apparatus, smart device and storage medium, to solve the problem that existing intelligent hardware products focus mainly on music and smart-home control scenarios and cannot provide reference evaluations of the outfits a user wears to various occasions, so that users waste considerable time matching outfits before going out.
A first aspect of the present invention provides a dressing scoring method, the dressing scoring method comprising:
acquiring a whole-body image of a user;
determining a clothing region image in the whole-body image from the acquired whole-body image;
obtaining an attendance scene selected by the user;
determining a comprehensive dressing score of the user according to the clothing region image and the attendance scene.
With reference to the first aspect, in a first possible implementation of the first aspect, the step of determining the comprehensive dressing score of the user according to the clothing region image and the attendance scene comprises:
dividing the clothing region image into color blocks, and determining a color matching score of the clothing region image according to the divided color blocks;
looking up the clothing style corresponding to the clothing region image in a preset sample style library;
determining a scene matching score of the clothing region image according to the clothing style and the attendance scene;
determining the comprehensive dressing score of the user according to the color matching score and the scene matching score.
With reference to the first possible implementation of the first aspect, in a second possible implementation of the first aspect, the step of dividing the clothing region image into color blocks and determining the color matching score of the clothing region image according to the divided color blocks comprises:
merging identical colors in the clothing region image into single color blocks;
when there is more than one color block in the clothing region image, determining a first color matching score according to the number of color blocks;
obtaining the color matching degree between each pair of color blocks according to a preset color matching table;
determining a second color matching score according to the color matching degrees;
determining the color matching score of the clothing region image according to the first color matching score and the second color matching score.
With reference to the first possible implementation of the first aspect, in a third possible implementation of the first aspect, the step of looking up the clothing style corresponding to the clothing region image in the preset sample style library comprises:
extracting edge feature points of the clothing region image using an edge detection operator;
determining straight-line segments and corner points in the clothing region image according to the edge feature points;
searching the preset sample style library for a style matching the straight-line segments and corner points in the clothing region image, and determining the style of the clothes in the clothing region.
With reference to the first aspect, or the first, second or third possible implementation of the first aspect, in a fourth possible implementation of the first aspect, after the step of determining the comprehensive dressing score of the user according to the clothing region image and the attendance scene, the method comprises:
judging whether the comprehensive dressing score is lower than a preset score;
if the comprehensive dressing score is lower than the preset score, recommending a matching scheme that matches the attendance scene according to the attendance scene.
A second aspect of the present invention provides a dressing scoring apparatus, the dressing scoring apparatus comprising:
a whole-body image acquiring unit, configured to acquire a whole-body image of a user;
a clothing image determining unit, configured to determine a clothing region image in the whole-body image from the acquired whole-body image;
an attendance scene acquiring unit, configured to obtain an attendance scene selected by the user;
a comprehensive score determining unit, configured to determine a comprehensive dressing score of the user according to the clothing region image and the attendance scene.
With reference to the second aspect, in a first possible implementation of the second aspect, the comprehensive score determining unit comprises:
a color score determining module, configured to divide the clothing region image into color blocks and determine the color matching score of the clothing region image according to the divided color blocks;
a style searching module, configured to look up the clothing style corresponding to the clothing region image in a preset sample style library;
a scene score determining module, configured to determine the scene matching score of the clothing region image according to the clothing style and the attendance scene;
a comprehensive score determining module, configured to determine the comprehensive dressing score of the user according to the color matching score and the scene matching score.
With reference to the first possible implementation of the second aspect, in a second possible implementation of the second aspect, the color score determining module comprises:
a color block merging submodule, configured to merge identical colors in the clothing region image into single color blocks;
a first color score submodule, configured to determine a first color matching score according to the number of color blocks when there is more than one color block in the clothing region image;
a color matching degree determining submodule, configured to obtain the color matching degree between each pair of color blocks according to a preset color matching table;
a second color score submodule, configured to determine a second color matching score according to the color matching degrees;
a color matching submodule, configured to determine the color matching score of the clothing region image according to the first color matching score and the second color matching score.
With reference to the first possible implementation of the second aspect, in a third possible implementation of the second aspect, the scene score determining module comprises:
an edge extracting submodule, configured to extract edge feature points of the clothing region image using an edge detection operator;
a feature determining submodule, configured to determine the straight-line segments and corner points in the clothing region image according to the edge feature points;
a style determining submodule, configured to search the preset sample style library for a style matching the straight-line segments and corner points in the clothing region image, and determine the style of the clothes in the clothing region.
With reference to the second aspect, or the first, second or third possible implementation of the second aspect, in a fourth possible implementation of the second aspect, the dressing scoring apparatus further comprises:
a score comparing unit, configured to judge whether the comprehensive dressing score is lower than a preset score;
a matching recommendation unit, configured to recommend, if the comprehensive dressing score is lower than the preset score, a matching scheme that matches the attendance scene according to the attendance scene.
A third aspect of the present invention provides a smart device, comprising: a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor, when executing the computer program, implements the steps of the dressing scoring method described in the first aspect above.
A fourth aspect of the present invention provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the steps of the dressing scoring method described in the first aspect above.
Compared with the prior art, the embodiments of the present invention have the following beneficial effects: an embodiment of the present invention acquires the whole-body image of a user, determines the clothing region image in the whole-body image from the acquired whole-body image, obtains the attendance scene selected by the user, and determines the comprehensive dressing score of the user according to the clothing region image and the attendance scene. This scheme scores the user's outfit in combination with the user's attendance scene, providing a reference for the user's outfit matching, making the device more intelligent, saving the user time in matching outfits, and improving the user experience.
Detailed description of the invention
To describe the technical solutions in the embodiments of the present invention more clearly, the accompanying drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the accompanying drawings in the following description are only some embodiments of the present invention; those of ordinary skill in the art can obtain other drawings from them without any creative effort.
Fig. 1 is an implementation flowchart of a dressing scoring method provided by an embodiment of the present invention;
Fig. 2 is an implementation flowchart of step S104 of a dressing scoring method provided by an embodiment of the present invention;
Fig. 3 is an implementation flowchart of a dressing scoring method, including recommending a matching scheme, provided by an embodiment of the present invention;
Fig. 4 is a structural block diagram of a dressing scoring apparatus provided by an embodiment of the present invention;
Fig. 5 is a schematic diagram of a smart device provided by an embodiment of the present invention.
Specific embodiment
In the following description, specific details such as particular system structures and techniques are presented for the purpose of illustration rather than limitation, in order to provide a thorough understanding of the embodiments of the present invention. However, it will be clear to those skilled in the art that the present invention may also be implemented in other embodiments without these specific details. In other cases, detailed descriptions of well-known systems, apparatuses, circuits and methods are omitted, lest unnecessary details interfere with the description of the present invention.
To provide users with reference evaluations of the outfits worn to various occasions, the embodiments of the present invention provide a dressing scoring method, apparatus, smart device and storage medium, which mainly acquire a whole-body image of a user, determine the clothing region image in the whole-body image from the acquired whole-body image, obtain the attendance scene selected by the user, and determine the comprehensive dressing score of the user according to the clothing region image and the attendance scene. The above dressing scoring method, apparatus, smart device and storage medium are described below with reference to specific embodiments.
Embodiment one:
Fig. 1 shows a flowchart of a dressing scoring method provided by an embodiment of the present invention; the method flow includes steps S101 to S104. The specific implementation principle of each step is detailed as follows:
Step S101: acquire the whole-body image of the user.
Here, a whole-body image refers to an image of a person that includes the front of the face and the whole body standing upright. Specifically, after a scoring instruction from the user is detected, the whole-body image of the user is acquired. In embodiments of the present invention, the user may upload a whole-body image, or the whole-body image of the current person may be captured from multiple angles by a camera.
Step S102: determine the clothing region image in the whole-body image from the acquired whole-body image.
In embodiments of the present invention, the clothing region image is determined from the whole-body image of the person. The position of the person in the whole-body image is determined by face detection; the clothes in the whole-body image are then located according to the fixed positional relationship between the human body and the clothes, and the clothing region image is segmented from the whole-body image using a seed-growth algorithm with an adaptive growth threshold. Specifically, median filtering is applied to the whole-body image, edge detection is performed on the filtered whole-body image using the Sobel operator to obtain a gradient image, and the set of edge pixels forming the clothing contour is determined. According to this edge pixel set, the clothing region image is segmented out of the whole-body image so that the outfit can be scored.
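The pre-processing described above — median filtering followed by Sobel edge detection — can be sketched as follows. This is a minimal illustration on a tiny grayscale grid, not the patent's implementation; the function names are ours.

```python
# Minimal sketch of the pre-processing step: a 3x3 median filter
# followed by a Sobel gradient magnitude, on a grayscale image
# represented as a list of lists of pixel values.

def median_filter3(img):
    """3x3 median filter; border pixels are copied unchanged."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            window = sorted(img[y + dy][x + dx]
                            for dy in (-1, 0, 1) for dx in (-1, 0, 1))
            out[y][x] = window[4]  # median of the 9 neighborhood values
    return out

def sobel_magnitude(img):
    """Approximate gradient magnitude |Gx| + |Gy| with Sobel kernels."""
    h, w = len(img), len(img[0])
    grad = [[0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = (img[y-1][x+1] + 2*img[y][x+1] + img[y+1][x+1]
                  - img[y-1][x-1] - 2*img[y][x-1] - img[y+1][x-1])
            gy = (img[y+1][x-1] + 2*img[y+1][x] + img[y+1][x+1]
                  - img[y-1][x-1] - 2*img[y-1][x] - img[y-1][x+1])
            grad[y][x] = abs(gx) + abs(gy)
    return grad

# A tiny image with a vertical dark/bright boundary between columns 1 and 2:
img = [[0, 0, 255, 255]] * 4
edges = sobel_magnitude(median_filter3(img))
```

Pixels along the boundary receive large gradient values, which is the edge pixel set the seed-growth segmentation would start from.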
Step S103: obtain the attendance scene selected by the user.
Specifically, the smart device stores a variety of common attendance scenes in advance, including attending a wedding, attending a meeting, a job interview, etc., for the user to select. In embodiments of the present invention, the outfit combinations popular in the various attendance scenes are counted through big data, and a matching model library is generated from the statistical results. The matching model library stores the outfit combinations, including clothing style and color, that suit each attendance scene, as well as the worst combinations for each attendance scene according to the big data statistics. When the user has selected an attendance scene, the best and worst combinations can be determined from the matching model. In embodiments of the present invention, analyzing the outfit combinations of the various attendance scenes through big data improves the credibility of the matching score.
Step S104: determine the comprehensive dressing score of the user according to the clothing region image and the attendance scene.
Optionally, to improve the accuracy of the score, as shown in Fig. 2, step S104 comprises:
A1: divide the clothing region image into color blocks, and determine the color matching score of the clothing region image according to the divided color blocks.
A2: look up the clothing style corresponding to the clothing region image in a preset sample style library.
A3: determine the scene matching score of the clothing region image according to the clothing style and the attendance scene.
A4: determine the comprehensive dressing score of the user according to the color matching score and the scene matching score.
In embodiments of the present invention, color combinations determined according to outfit matching experts, together with their corresponding scores, are stored in the smart device in advance. The clothing region image segmented from the whole-body image is divided into color blocks, and the colors contained in the clothes in the clothing region image are determined, thereby determining the color matching score of the clothing region image. The scene matching score of the user is determined from the clothing style corresponding to the clothing region image and the attendance scene selected by the user. The color matching score and the scene matching score each account for fifty percent of the comprehensive dressing score.
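Under the stated 50/50 weighting, step A4 reduces to a simple combination. The sketch below assumes both sub-scores are on a 0–50 scale (the color score is built from two 25-point parts and the scene score from matching degree × 50, as described later); the function name is ours.

```python
# Sketch of step A4: combine the two 0-50 sub-scores into a
# 0-100 comprehensive dressing score, weighted 50/50.

def comprehensive_score(color_score, scene_score):
    """Sum the color matching score and the scene matching score."""
    return color_score + scene_score

total = comprehensive_score(40.0, 45.0)
```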
Optionally, step A1 comprises:
A11: merge identical colors in the clothing region image into single color blocks;
A12: when there is more than one color block in the clothing region image, determine the first color matching score according to the number of color blocks;
A13: obtain the color matching degree between each pair of color blocks according to a preset color matching table;
A14: determine the second color matching score according to the color matching degrees;
A15: determine the color matching score of the clothing region image according to the first color matching score and the second color matching score.
In embodiments of the present invention, color blocks of the same color in the clothing region image are merged. When there is more than one color block in the clothing region image, i.e. more than one color in the clothing region image, the color of each color block is determined, and thus the color combination of the clothing region image.
Optionally, when there is more than one color block, the area of each color block and the percentage of the entire clothing region that it occupies are calculated. The color blocks are grouped according to their areas and area percentages. Specifically, if the area of a color block falls in the first area threshold interval, it belongs to the first group of color blocks; if it falls in the second area threshold interval, it belongs to the second group; if it falls in the third area threshold interval, it belongs to the third group. The number of color blocks and the colors of the blocks in the first, second and third groups are obtained respectively, and the corresponding color matching score is calculated.
Illustratively, in embodiments of the present invention, the total score for the number of color blocks in each group of the clothing region image is 25 points, and the total score for the color combinations between the groups is 25 points. The color blocks are grouped by the percentage of the clothing region image they occupy into big color blocks (L), medium color blocks (M) and small color blocks (S); the area percentage of each group is as follows:
Big color blocks (L): 100%–50%
Medium color blocks (M): 49.9%–20%
Small color blocks (S): 19.9%–5%
In embodiments of the present invention, the reference factors of the color matching score include the size matching of the big (L), medium (M) and small (S) color blocks and the color combinations between the blocks. Clothes should have at least one dominant hue, i.e. a big color block: if the number of big color blocks is greater than 0, 10 points are added. There should not be too many medium color blocks in the clothes — too many look gaudy and unnatural — so if the number of medium color blocks is within a predetermined range, e.g. 0–2, it is appropriate and 7.5 points are added. Small color blocks serve well as accents; clothes without any small color blocks look rather dull, so if the clothes contain an appropriate number of small color blocks, 7.5 points are added. The color matching between blocks is also a key factor of the evaluation; for an outfit this is mainly reflected in the combinations between big color blocks, and between big and medium color blocks — the color difference between these blocks must not be too large, otherwise the outfit looks discordant. The matching degrees between color blocks of different colors are stored in advance. For example:
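The grouping table and the quantity rules above can be sketched as follows. The L/M/S thresholds follow the table; the rest is an illustrative reading of the rules (+10 for at least one big block, +7.5 for 0–2 medium blocks, +7.5 for at least one small block), not the patent's code.

```python
# Sketch of the first color matching score: group color blocks by the
# percentage of the clothing region they occupy, then score the counts.

def group_blocks(area_percentages):
    """Split block area percentages into big (L), medium (M), small (S)."""
    big = [p for p in area_percentages if p >= 50.0]          # 100%-50%
    medium = [p for p in area_percentages if 20.0 <= p < 50.0]  # 49.9%-20%
    small = [p for p in area_percentages if 5.0 <= p < 20.0]    # 19.9%-5%
    return big, medium, small

def first_color_score(area_percentages):
    """Quantity-based score, 25 points at most."""
    big, medium, small = group_blocks(area_percentages)
    score = 0.0
    if len(big) > 0:           # at least one dominant hue
        score += 10.0
    if 0 <= len(medium) <= 2:  # not too many medium blocks
        score += 7.5
    if len(small) > 0:         # some small accent blocks
        score += 7.5
    return score

# One dominant block, one medium block, one small accent block:
score = first_color_score([60.0, 28.0, 8.0])
```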
Matching degree between big color blocks: [color: red; color: blue; matching degree: 20%];
Matching degree between a big and a medium color block: [color: red; color: blue; matching degree: 60%];
Matching degree between medium color blocks: [color: red; color: blue; matching degree: 30%].
The big color blocks are paired first and the matching degree between each pair of big color blocks is determined; with num pairs, the sum of the matching degrees totalDegree is computed, and the group's mean matching degree is averageDegree = totalDegree/num. Similarly, the matching degrees between big and medium color blocks are computed and averaged, and then the mean matching degree between medium color blocks is computed. Finally, the overall average totalAverageDegree of the three means — between big color blocks, between big and medium color blocks, and between medium color blocks — is computed, and the score is calculated from it by the formula: score = totalAverageDegree * 25.
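The averaging just described can be sketched as follows. Matching degrees are written as fractions (0.6 for 60%), the pair lists are assumed to be already looked up from the pre-stored table, and the final score follows score = totalAverageDegree * 25.

```python
# Sketch of the second color matching score: average the matching
# degree within each of the three pair groups, average the three group
# means, and scale to a 0-25 score.

def mean_degree(degrees):
    """Mean matching degree of one group of color-block pairs."""
    return sum(degrees) / len(degrees) if degrees else 0.0

def second_color_score(big_pairs, big_medium_pairs, medium_pairs):
    """totalAverageDegree over the three groups, times 25."""
    groups = [mean_degree(big_pairs),
              mean_degree(big_medium_pairs),
              mean_degree(medium_pairs)]
    total_average_degree = sum(groups) / len(groups)
    return total_average_degree * 25

# Using the example table entries above (20%, 60%, 30%):
score = second_color_score([0.2], [0.6], [0.3])
```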
Optionally, to accurately determine the style of the clothes corresponding to the clothing region image, step A2 comprises:
A21: extract the edge feature points of the clothing region image using an edge detection operator;
A22: determine the straight-line segments and corner points in the clothing region image according to the edge feature points;
A23: search the preset sample style library for a style matching the straight-line segments and corner points in the clothing region image, and determine the style of the clothes in the clothing region.
In embodiments of the present invention, clothing images of various styles are obtained in advance, the straight-line segments and corner points in the clothing images of the various styles are extracted, and a sample style library is established from the clothing images of the various styles and their straight-line segments and corner points. The style of the clothes in the clothing region is then determined by searching the preset sample style library for a style matching the straight-line segments and corner points in the clothing region image.
The sample images in the sample style library mainly contain the shape features and spatial relationship features of the images. The edge contour of the clothes is determined by quantizing the clothing region image; the edge image is one of the most basic and most important features of the clothing region image. In embodiments of the present invention, the edge feature points of the clothing region image are extracted using an edge detection operator. Edge detection is mainly the measurement, detection and localization of gray-level changes in an image, and is one of the most important topics in digital image processing; today, edge detection techniques have become an important part of computer vision. The essence of edge detection is to use some algorithm to extract the boundary lines between the targets and the background in an image. Gray-level changes can be reflected by the gradient of the image's gray-level distribution, so local differential techniques — edge detection operators — can be used. Classical edge detection methods construct an edge detection operator over a small neighborhood of each pixel in the original image. Common edge detection operators include first-order differential edge detection, second-order differential edge detection, the Canny operator, etc.
The main purpose of quantizing the clothing image information is to make it easy to extract the required features from the image. Straight lines have simple geometric characteristics, lend themselves to geometric analysis, and can readily describe scenes and targets. The lines of clothes are themselves largely composed of straight or nearly straight segments, and the few curved portions can also be split into combinations of several consecutive straight segments. Therefore, in embodiments of the present invention, straight lines are first detected in the set of image feature points, and the obtained straight segments are assembled to complete the extraction of the style details in the image. In this embodiment, Hough transform line detection can be used to detect the straight lines.
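The Hough transform mentioned above can be sketched as follows: each edge point votes for every (theta, rho) line through it, and heavily voted bins are reported as detected lines. The angle step, rho rounding and threshold here are illustrative.

```python
# Minimal sketch of Hough-transform line detection over a set of
# edge feature points. Angles are quantized in whole degrees and
# rho is rounded to integer pixels.
import math
from collections import defaultdict

def hough_lines(points, threshold):
    """Return (theta_deg, rho) bins voted for by >= threshold points."""
    acc = defaultdict(int)
    for x, y in points:
        for theta_deg in range(0, 180):
            theta = math.radians(theta_deg)
            rho = round(x * math.cos(theta) + y * math.sin(theta))
            acc[(theta_deg, rho)] += 1
    return [bin_ for bin_, votes in acc.items() if votes >= threshold]

# A vertical segment at x == 3 votes for theta = 0, rho = 3:
pts = [(3, y) for y in range(10)]
lines = hough_lines(pts, threshold=10)
```

The collinear edge points all land in the (0, 3) bin, so that bin crosses the vote threshold and the segment is detected.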
In embodiments of the present invention, the smart device has a pre-established scene style library, which stores the clothing styles of different scenes and the matching degrees between the styles and the scenes. Illustratively, the clothing styles of different scenes and their matching degrees with the scenes are as follows:
Scene     Styles         Matching degree
Scene 1   1, 2, 3, 4     90%–100%
Scene 1   5, 7, 8        80%–90%
Scene 1   9              60%–70%
Scene 1   6, 13          50%–60%
After the style of the clothes in the clothing region image is determined, the style is combined with the attendance scene selected by the user: the matching degree between the selected attendance scene and the style is queried in the scene style library, and the score is calculated. The calculation formula is: scene matching score = matching degree * 50.
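The lookup and formula above can be sketched as follows. The library contents are illustrative (the midpoint of each tabulated band is used as the stored matching degree); only the formula scene matching score = matching degree * 50 comes from the text.

```python
# Sketch of the scene matching score: look up the matching degree of
# the determined style for the selected scene, then scale by 50.

SCENE_STYLE_LIBRARY = {
    ("scene 1", "style 1"): 0.95,  # 90%-100% band
    ("scene 1", "style 9"): 0.65,  # 60%-70% band
}

def scene_matching_score(scene, style):
    """scene matching score = matching degree * 50 (0 if unknown pair)."""
    degree = SCENE_STYLE_LIBRARY.get((scene, style), 0.0)
    return degree * 50

score = scene_matching_score("scene 1", "style 9")
```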
In the first embodiment of the present invention, the whole-body image of the user is acquired; the clothing region image in the whole-body image is determined from the acquired whole-body image; the clothing region image is divided into color blocks and the color matching score of the clothing region image is determined from the divided color blocks; the clothing style corresponding to the clothing region image is looked up in the preset sample style library; the attendance scene selected by the user is obtained; the scene matching score of the clothing region image is determined from the clothing style and the attendance scene; and the comprehensive dressing score of the user is determined from the color matching score and the scene matching score, which improves the accuracy of the comprehensive dressing score. This scheme scores the user's outfit in combination with the user's attendance scene, providing a reference for the user's outfit matching, making the device more intelligent, saving the user time in matching outfits, and improving the user experience.
Embodiment two:
Fig. 3 shows a flowchart of another dressing scoring method provided by an embodiment of the present invention, which additionally recommends outfit schemes matching the attendance scene to the user. The details are as follows:
Step S201: obtain a whole-body image of the user.
Step S202: determine the clothing area image in the whole-body image according to the acquired whole-body image.
Step S203: obtain the attendance scene selected by the user.
Step S204: determine the user's overall dressing score according to the clothing area image and the attendance scene.
In this embodiment, for the specifics of steps S201 to S204, refer to steps S101 to S104 of Embodiment One; they are not repeated here.
Step S205: judge whether the overall dressing score is lower than a preset score.
In embodiments of the present invention, the preset score is a baseline for outfit matching determined through big-data statistics. Whether to recommend outfit combinations to the user is decided by judging whether the user's overall dressing score is lower than the preset score.
Step S206: if the overall dressing score is lower than the preset score, recommend an outfit scheme matching the attendance scene according to the attendance scene.
In embodiments of the present invention, an outfit library is pre-stored in the smart device, and each attendance scene has a corresponding outfit scheme matching it. If the overall dressing score is lower than the preset score, an outfit scheme matching the attendance scene selected by the user is recommended, relieving the user of the worry of not knowing how to put an outfit together. Further, if the overall dressing score is not lower than the preset score, a voice compliment is played to boost the user's confidence.
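Steps S205 and S206 amount to a threshold check followed by a library lookup. A sketch under assumed values (the preset score, the outfit library, and the scene names are all invented for illustration):

```python
# Hypothetical preset score ("baseline from big-data statistics")
# and outfit library; both invented for illustration.
DEFAULT_SCORE = 75
OUTFIT_LIBRARY = {
    "business meeting": ["dark suit with light shirt"],
    "wedding": ["formal dress in soft colours"],
}

def respond(overall_score, scene, library=OUTFIT_LIBRARY,
            threshold=DEFAULT_SCORE):
    """Below the preset score, recommend the outfit schemes matched
    to the attendance scene; otherwise return a compliment."""
    if overall_score < threshold:
        return ("recommend", library.get(scene, []))
    return ("praise", [])

print(respond(60, "wedding"))
```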
Optionally, in embodiments of the present invention, a personal clothing library is pre-established in the smart device for storing images of the clothes the user already owns. When the user asks the smart device to recommend an outfit, the smart device can assemble a combination for the user from the styles of the clothes already in the personal clothing library, allowing the user to complete the outfit immediately.
In the second embodiment of the present invention, a whole-body image of the user is obtained; the clothing area image in the whole-body image is determined from the acquired whole-body image; the attendance scene selected by the user is obtained; the user's overall dressing score is determined according to the clothing area image and the attendance scene; and whether the overall dressing score is lower than the preset score is judged. If it is, an outfit scheme matching the attendance scene is recommended according to the attendance scene. This scores the user's outfit in combination with the attendance scene, provides a reference for the user's dressing, improves the efficiency of outfit matching, saves the user time, and improves user experience.
It should be understood that the serial numbers of the steps in the above embodiments do not imply an execution order; the execution order of each process should be determined by its function and internal logic, and does not constitute any limitation on the implementation of the embodiments of the present invention.
Embodiment three:
Corresponding to the dressing scoring method described in the foregoing embodiments, Fig. 4 shows a structural block diagram of a dressing scoring apparatus provided by an embodiment of the present invention. The apparatus can be applied to an intelligent terminal, which may include user equipment communicating with one or more core networks through a radio access network (RAN); the user equipment may be a mobile phone or an intelligent robot. For ease of illustration, only the parts related to the embodiments of the present invention are shown.
Referring to Fig. 4, the dressing scoring apparatus includes: a whole-body image acquiring unit 31, a clothing image determination unit 32, an attendance scene acquiring unit 33, and an overall score determination unit 34, wherein:
the whole-body image acquiring unit 31 is configured to obtain a whole-body image of the user;
the clothing image determination unit 32 is configured to determine the clothing area image in the whole-body image according to the acquired whole-body image;
the attendance scene acquiring unit 33 is configured to obtain the attendance scene selected by the user;
the overall score determination unit 34 is configured to determine the user's overall dressing score according to the clothing area image and the attendance scene.
Optionally, the overall score determination unit 34 includes:
a color score determining module, configured to divide the clothing area image into color lumps and determine the color matching score of the clothing area image according to the divided color lumps;
a style searching module, configured to search a preset sample style library for the clothing style corresponding to the clothing area image;
a scene score determining module, configured to determine the scene matching score of the clothing area image according to the clothing style and the attendance scene;
an overall score determining module, configured to determine the user's overall dressing score according to the color matching score and the scene matching score.
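The text does not state how the two sub-scores are combined. Since the scene score is capped at 50 (matching degree * 50), one plausible reading is that each sub-score contributes up to 50 points and the two are summed; that summation is an assumption, sketched here:

```python
def overall_dressing_score(colour_score, scene_score):
    """Combine the colour matching score and the scene matching score.
    Capping each at 50 and summing is an assumed reading of the text,
    consistent with scene score = matching degree * 50."""
    return min(colour_score, 50.0) + min(scene_score, 50.0)

print(overall_dressing_score(47.5, 45.0))  # 92.5
```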
Optionally, the color score determining module includes:
a color lump merging submodule, configured to merge identical colors in the clothing area image into one color lump;
a first color score submodule, configured to determine a first color matching score according to the number of color lumps when there is more than one color lump in the clothing area image;
a color matching degree determining submodule, configured to obtain the color matching degree between each pair of color lumps according to a preset color matching list;
a second color score submodule, configured to determine a second color matching score according to the color matching degree;
a color matching submodule, configured to determine the color matching score of the clothing area image according to the first color matching score and the second color matching score.
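The color-score submodules above can be sketched as follows. The weighting of the first and second scores, the single-color case, and the matching-table contents are assumptions; only the merge-count-match structure comes from the text:

```python
from collections import Counter

# Hypothetical preset color matching list: pairs of colours -> degree (0..1).
COLOR_MATCH_TABLE = {
    frozenset(["navy", "white"]): 0.9,
    frozenset(["navy", "khaki"]): 0.8,
    frozenset(["white", "khaki"]): 0.7,
}

def colour_match_score(pixels, table=COLOR_MATCH_TABLE):
    """Merge identical colours into lumps, score the lump count,
    then average the pairwise matching degrees (assumed weights)."""
    lumps = list(Counter(pixels))            # merge identical colours
    if len(lumps) <= 1:
        return 50.0                          # single colour: assume full marks
    # First score: fewer lumps score higher (hypothetical weighting).
    first = 25.0 * min(1.0, 3.0 / len(lumps))
    # Second score: mean matching degree over all lump pairs.
    pairs = [frozenset([a, b]) for i, a in enumerate(lumps)
             for b in lumps[i + 1:]]
    degree = sum(table.get(p, 0.5) for p in pairs) / len(pairs)
    second = 25.0 * degree
    return first + second

print(colour_match_score(["navy", "navy", "white"]))  # 47.5
```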
Optionally, the scene score determining module includes:
an edge extracting submodule, configured to extract the edge feature points of the clothing area image using an edge detection operator;
a feature determining submodule, configured to determine the straight segments and corner points in the clothing area image according to the edge feature points;
a style determining submodule, configured to search the preset sample style library for the style matching the straight segments and corner points in the clothing area image, and determine the style of the clothes in the clothing area.
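The edge-extraction submodule can be illustrated with a simple gradient-magnitude edge detector, a stand-in for the unspecified edge detection operator; the threshold and the toy image are assumptions:

```python
import numpy as np

def edge_feature_points(img, thresh=1.0):
    """Return (row, col) feature points whose gradient magnitude
    exceeds the threshold (a Sobel-like edge criterion)."""
    gy, gx = np.gradient(img.astype(float))
    mag = np.hypot(gx, gy)
    return np.argwhere(mag > thresh)

# Toy clothing region: a bright rectangle on a dark background.
img = np.zeros((8, 8))
img[2:6, 2:6] = 10.0
pts = edge_feature_points(img)
print(len(pts))  # points along the rectangle border only
```

The resulting points would then feed the straight-segment (e.g. Hough) and corner detection described above.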
Optionally, the dressing scoring apparatus further includes:
a score comparing unit, configured to judge whether the overall dressing score is lower than a preset score;
an outfit recommendation unit, configured to recommend an outfit scheme matching the attendance scene according to the attendance scene if the overall dressing score is lower than the preset score.
In the third embodiment of the present invention, a whole-body image of the user is obtained; the clothing area image in the whole-body image is determined from the acquired whole-body image; the attendance scene selected by the user is obtained; and the user's overall dressing score is determined according to the clothing area image and the attendance scene. This solution scores the user's outfit in combination with the attendance scene, provides a reference for the user's dressing, makes the device more intelligent, saves the user time when putting outfits together, and improves user experience.
Example IV:
Fig. 5 is a schematic diagram of a smart device provided by an embodiment of the present invention. As shown in Fig. 5, the smart device 4 of this embodiment includes: a processor 40, a memory 41, and a computer program 42, such as a dressing scoring program, stored in the memory 41 and executable on the processor 40. When executing the computer program 42, the processor 40 implements the steps in each of the above dressing scoring method embodiments, such as steps 101 to 104 shown in Fig. 1. Alternatively, when executing the computer program 42, the processor 40 implements the functions of the modules/units in each of the above apparatus embodiments, such as the functions of units 31 to 34 shown in Fig. 4.
Illustratively, the computer program 42 may be divided into one or more modules/units, which are stored in the memory 41 and executed by the processor 40 to complete the present invention. The one or more modules/units may be a series of computer program instruction segments capable of completing specific functions; the instruction segments are used to describe the execution process of the computer program 42 in the smart device 4. For example, the computer program 42 may be divided into a whole-body image acquiring unit, a clothing image determination unit, an attendance scene acquiring unit, and an overall score determination unit, whose specific functions are as follows:
the whole-body image acquiring unit is configured to obtain a whole-body image of the user;
the clothing image determination unit is configured to determine the clothing area image in the whole-body image according to the acquired whole-body image;
the attendance scene acquiring unit is configured to obtain the attendance scene selected by the user;
the overall score determination unit is configured to determine the user's overall dressing score according to the clothing area image and the attendance scene.
The smart device 4 may be an intelligent robot. The smart device 4 may include, but is not limited to, the processor 40 and the memory 41. Those skilled in the art will understand that Fig. 5 is only an example of the smart device 4 and does not constitute a limitation on it; the smart device may include more or fewer components than shown, or combine certain components, or have different components. For example, the smart device may also include input/output devices, network access devices, buses, and the like.
The processor 40 may be a central processing unit (CPU), or another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, etc. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor.
The memory 41 may be an internal storage unit of the smart device 4, such as a hard disk or internal memory of the smart device 4. The memory 41 may also be an external storage device of the smart device 4, such as a plug-in hard disk, a smart media card (SMC), a secure digital (SD) card, or a flash card equipped on the smart device 4. Further, the memory 41 may include both an internal storage unit of the smart device 4 and an external storage device. The memory 41 is used to store the computer program and other programs and data required by the smart device. The memory 41 may also be used to temporarily store data that has been output or will be output.
It will be apparent to those skilled in the art that, for convenience and brevity of description, the above division of functional units and modules is only used as an example. In practical applications, the above functions may be allocated to different functional units and modules as needed; that is, the internal structure of the apparatus may be divided into different functional units or modules to complete all or part of the functions described above. The functional units in the embodiments may be integrated into one processing unit, or each unit may exist physically alone, or two or more units may be integrated into one unit; the integrated unit may be implemented in the form of hardware or in the form of a software functional unit. In addition, the specific names of the functional units and modules are only for convenience of distinguishing them from each other and are not intended to limit the protection scope of the present application. For the specific working process of the units and modules in the above system, reference may be made to the corresponding processes in the foregoing method embodiments, which are not repeated here.
In the above embodiments, each embodiment is described with its own emphasis. For parts not detailed or recorded in one embodiment, reference may be made to the related descriptions of other embodiments.
Those of ordinary skill in the art will appreciate that the units and algorithm steps described in connection with the embodiments disclosed herein can be implemented by electronic hardware, or by a combination of computer software and electronic hardware. Whether these functions are performed by hardware or software depends on the specific application and design constraints of the technical solution. A skilled person may use different methods to implement the described functions for each specific application, but such implementation should not be considered beyond the scope of the present invention.
In the embodiments provided by the present invention, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the system embodiments described above are merely illustrative: the division of the modules or units is only a logical functional division, and in actual implementation there may be other division manners; for example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not executed. In addition, the mutual coupling or direct coupling or communication connection shown or discussed may be indirect coupling or communication connection through some interfaces, apparatuses, or units, and may be electrical, mechanical, or in other forms.
The units described as separate components may or may not be physically separated, and the components displayed as units may or may not be physical units; they may be located in one place or distributed over multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, the functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist physically alone, or two or more units may be integrated into one unit. The above integrated unit may be implemented in the form of hardware or in the form of a software functional unit.
If the integrated unit is implemented in the form of a software functional unit and sold or used as an independent product, it may be stored in a computer-readable storage medium. Based on this understanding, the present invention may implement all or part of the processes in the methods of the above embodiments by instructing relevant hardware through a computer program. The computer program may be stored in a computer-readable storage medium and, when executed by a processor, can implement the steps of each of the above method embodiments. The computer program includes computer program code, which may be in source code form, object code form, an executable file, certain intermediate forms, etc. The computer-readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB flash disk, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a read-only memory (ROM), a random access memory (RAM), an electric carrier signal, a telecommunication signal, a software distribution medium, and so on. It should be noted that the content contained in the computer-readable medium may be appropriately increased or decreased according to the requirements of legislation and patent practice in a jurisdiction; for example, in some jurisdictions, according to legislation and patent practice, computer-readable media do not include electric carrier signals and telecommunication signals.
The above embodiments are only used to illustrate the technical solutions of the present invention, not to limit them. Although the present invention has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that they may still modify the technical solutions recorded in the foregoing embodiments or replace some of the technical features with equivalents; such modifications or replacements do not cause the essence of the corresponding technical solutions to depart from the spirit and scope of the technical solutions of the embodiments of the present invention, and should all be included within the protection scope of the present invention.

Claims (10)

1. A dressing scoring method, characterized in that the dressing scoring method comprises:
obtaining a whole-body image of a user;
determining a clothing area image in the whole-body image according to the acquired whole-body image;
obtaining an attendance scene selected by the user;
determining an overall dressing score of the user according to the clothing area image and the attendance scene.
2. The dressing scoring method according to claim 1, characterized in that the step of determining the overall dressing score of the user according to the clothing area image and the attendance scene comprises:
dividing the clothing area image into color lumps, and determining a color matching score of the clothing area image according to the divided color lumps;
searching a preset sample style library for a clothing style corresponding to the clothing area image;
determining a scene matching score of the clothing area image according to the clothing style and the attendance scene;
determining the overall dressing score of the user according to the color matching score and the scene matching score.
3. The dressing scoring method according to claim 2, characterized in that the step of dividing the clothing area image into color lumps and determining the color matching score of the clothing area image according to the divided color lumps comprises:
merging identical colors in the clothing area image into one color lump;
when there is more than one color lump in the clothing area image, determining a first color matching score according to the number of color lumps;
obtaining a color matching degree between each pair of color lumps according to a preset color matching list;
determining a second color matching score according to the color matching degree;
determining the color matching score of the clothing area image according to the first color matching score and the second color matching score.
4. The dressing scoring method according to claim 2, characterized in that the step of searching the preset sample style library for the clothing style corresponding to the clothing area image comprises:
extracting edge feature points of the clothing area image using an edge detection operator;
determining straight segments and corner points in the clothing area image according to the edge feature points;
searching the preset sample style library for a style matching the straight segments and corner points in the clothing area image, and determining the style of the clothes in the clothing area.
5. The dressing scoring method according to any one of claims 1 to 4, characterized in that, after the step of determining the overall dressing score of the user according to the clothing area image and the attendance scene, the method comprises:
judging whether the overall dressing score is lower than a preset score;
if the overall dressing score is lower than the preset score, recommending an outfit scheme matching the attendance scene according to the attendance scene.
6. A dressing scoring apparatus, characterized in that the dressing scoring apparatus comprises:
a whole-body image acquiring unit, configured to obtain a whole-body image of a user;
a clothing image determination unit, configured to determine a clothing area image in the whole-body image according to the acquired whole-body image;
an attendance scene acquiring unit, configured to obtain an attendance scene selected by the user;
an overall score determination unit, configured to determine an overall dressing score of the user according to the clothing area image and the attendance scene.
7. The dressing scoring apparatus according to claim 6, characterized in that the overall score determination unit comprises:
a color score determining module, configured to divide the clothing area image into color lumps and determine the color matching score of the clothing area image according to the divided color lumps;
a style searching module, configured to search a preset sample style library for the clothing style corresponding to the clothing area image;
a scene score determining module, configured to determine the scene matching score of the clothing area image according to the clothing style and the attendance scene;
an overall score determining module, configured to determine the overall dressing score of the user according to the color matching score and the scene matching score.
8. The dressing scoring apparatus according to claim 6, characterized in that the dressing scoring apparatus further comprises:
a score comparing unit, configured to judge whether the overall dressing score is lower than a preset score;
an outfit recommendation unit, configured to recommend an outfit scheme matching the attendance scene according to the attendance scene if the overall dressing score is lower than the preset score.
9. A smart device, comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, characterized in that the processor, when executing the computer program, implements the steps of the method according to any one of claims 1 to 5.
10. A computer-readable storage medium storing a computer program, characterized in that, when the computer program is executed by a processor, the steps of the method according to any one of claims 1 to 5 are implemented.
CN201711466285.8A 2017-12-28 2017-12-28 Wear methods of marking, device, smart machine and storage medium Pending CN109978720A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201711466285.8A CN109978720A (en) 2017-12-28 2017-12-28 Wear methods of marking, device, smart machine and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201711466285.8A CN109978720A (en) 2017-12-28 2017-12-28 Wear methods of marking, device, smart machine and storage medium

Publications (1)

Publication Number Publication Date
CN109978720A true CN109978720A (en) 2019-07-05

Family

ID=67075379

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711466285.8A Pending CN109978720A (en) 2017-12-28 2017-12-28 Wear methods of marking, device, smart machine and storage medium

Country Status (1)

Country Link
CN (1) CN109978720A (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111310097A (en) * 2020-03-17 2020-06-19 李照宇 Intelligent estimation method for clothes color positioning big data based on working scene
CN111401748A (en) * 2020-03-17 2020-07-10 李照宇 Intelligent dressing matching big data evaluation method based on working scene
CN113538368A (en) * 2021-07-14 2021-10-22 Oppo广东移动通信有限公司 Image selection method, image selection device, storage medium, and electronic apparatus
CN113808118A (en) * 2021-09-24 2021-12-17 孙红 Intelligent matching method for colors of upper garment and lower garment of clothes

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104484450A (en) * 2014-12-25 2015-04-01 广东欧珀移动通信有限公司 Clothing matching recommendation method and clothing matching recommendation device based on pictures
CN104851022A (en) * 2015-04-30 2015-08-19 江苏卡罗卡国际动漫城有限公司 Fitting system
CN105760882A (en) * 2016-01-29 2016-07-13 宇龙计算机通信科技(深圳)有限公司 Image processing method and terminal device
CN105761120A (en) * 2016-03-31 2016-07-13 南京云创大数据科技股份有限公司 Virtual fitting system automatically matching fitting scene and application method
CN105808774A (en) * 2016-03-28 2016-07-27 北京小米移动软件有限公司 Information providing method and device
CN106446065A (en) * 2016-09-06 2017-02-22 珠海市魅族科技有限公司 Clothes collocation recommendation method and device

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104484450A (en) * 2014-12-25 2015-04-01 广东欧珀移动通信有限公司 Clothing matching recommendation method and clothing matching recommendation device based on pictures
CN104851022A (en) * 2015-04-30 2015-08-19 江苏卡罗卡国际动漫城有限公司 Fitting system
CN105760882A (en) * 2016-01-29 2016-07-13 宇龙计算机通信科技(深圳)有限公司 Image processing method and terminal device
CN105808774A (en) * 2016-03-28 2016-07-27 北京小米移动软件有限公司 Information providing method and device
CN105761120A (en) * 2016-03-31 2016-07-13 南京云创大数据科技股份有限公司 Virtual fitting system automatically matching fitting scene and application method
CN106446065A (en) * 2016-09-06 2017-02-22 珠海市魅族科技有限公司 Clothes collocation recommendation method and device

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111310097A (en) * 2020-03-17 2020-06-19 李照宇 Intelligent estimation method for clothes color positioning big data based on working scene
CN111401748A (en) * 2020-03-17 2020-07-10 李照宇 Intelligent dressing matching big data evaluation method based on working scene
CN111310097B (en) * 2020-03-17 2024-02-09 李照宇 Intelligent evaluation method for big data of clothes color positioning based on working scene
CN113538368A (en) * 2021-07-14 2021-10-22 Oppo广东移动通信有限公司 Image selection method, image selection device, storage medium, and electronic apparatus
CN113808118A (en) * 2021-09-24 2021-12-17 孙红 Intelligent matching method for colors of upper garment and lower garment of clothes

Similar Documents

Publication Publication Date Title
CN109978720A (en) Wear methods of marking, device, smart machine and storage medium
CN105989594B (en) A kind of image region detection method and device
CN106898026B (en) A kind of the dominant hue extracting method and device of picture
CN104484450B (en) Clothing matching based on image recommends method and clothing matching recommendation apparatus
CN104636759B (en) A kind of method and picture filter information recommendation system for obtaining picture and recommending filter information
JP2022510712A (en) Neural network training method and image matching method, as well as equipment
CN106202317A (en) Method of Commodity Recommendation based on video and device
CN110147483A (en) A kind of title method for reconstructing and device
CN106649383A (en) Clothes management method and system
CN110232253B (en) Computer device, equipment, storage medium and method for generating clothing matching scheme
KR20100005072A (en) Method and system for recommending a product based upon skin color estimated from an image
CN107992820A (en) Counter automatic selling method based on binocular vision
CN113987344B (en) Intelligent 3D garment style simulation method based on layout library and cost estimation method thereof
CN107080435A (en) Virtual wardrobe management system and method and the dress ornament marketing method based on the system
CN106354768B (en) Color-based user and commodity matching method and commodity matching recommendation method
CN109271930A (en) Micro- expression recognition method, device and storage medium
CN106951448A (en) A kind of personalization, which is worn, takes recommendation method and system
CN109360050A (en) Personal care garment management and personalized collocation recommendation intelligence system based on perceptual demand
CN107729380A (en) Clothing matching method, terminal, terminal
Miura et al. SNAPPER: fashion coordinate image retrieval system
CN108920828A (en) A kind of method and system of garment coordination
CN108932703A (en) Image processing method, picture processing unit and terminal device
TWI524286B (en) Popular with the recommended system
CN113538074A (en) Method, device and equipment for recommending clothes
CN106649300A (en) Intelligent clothing matching recommendation method and system based on cloud platform

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20190705