CN111832350A - Clothes try-on detection method and system

Clothes try-on detection method and system

Info

Publication number
CN111832350A
CN111832350A (application number CN201910312406.6A)
Authority
CN
China
Prior art keywords
user
clothing
image
information
fitting
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201910312406.6A
Other languages
Chinese (zh)
Other versions
CN111832350B (en)
Inventor
朱舒舒
陈喆
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hangzhou Hikvision Digital Technology Co Ltd
Original Assignee
Hangzhou Hikvision Digital Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hangzhou Hikvision Digital Technology Co Ltd filed Critical Hangzhou Hikvision Digital Technology Co Ltd
Priority to CN201910312406.6A
Publication of CN111832350A
Application granted
Publication of CN111832350B
Legal status: Active (current)
Anticipated expiration

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00: Scenes; Scene-specific elements
    • G06V20/20: Scenes; Scene-specific elements in augmented reality scenes
    • G06Q: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00: Commerce
    • G06Q30/02: Marketing; Price estimation or determination; Fundraising
    • G06Q30/0201: Market modelling; Market analysis; Collecting market data
    • G06Q30/0203: Market surveys; Market polls
    • G06Q30/06: Buying, selling or leasing transactions
    • G06Q30/0601: Electronic shopping [e-shopping]
    • G06Q30/0623: Item investigation
    • G06Q30/0625: Directed, with specific intent or strategy

Abstract

The application provides a clothes try-on detection method and system. The method comprises the following steps: acquiring a user fitting image captured by a camera; identifying the clothing information corresponding to the tried-on clothing in the user fitting image; and updating the fitting times corresponding to the clothing information. The user fitting image captured by the camera is recognized to determine which style of clothing is being tried on, and the fitting times of that clothing are then updated, so that a manager can learn from the recorded fitting times which styles of clothing customers entering the store prefer, making it easier to adjust the clothing display strategy and the purchasing and restocking strategy.

Description

Clothes try-on detection method and system
Technical Field
The application relates to the technical field of image processing, in particular to a method and a system for detecting try-on of clothes.
Background
At present, most of the garment industry still adopts a traditional sales mode. Throughout the garment sales process, a sales assistant completes a sale only through verbal communication with the customer, and there is no record of which styles of garments customers like. As a result, information feedback in the whole sales process is poor, many links in the process are blocked, and the goods on sale end up with broken size runs or left unsold.
Disclosure of Invention
In view of this, the present application provides a method and a system for detecting a try-on of a garment, so as to solve the problem of low efficiency of current sales management.
According to a first aspect of embodiments of the present application, there is provided a method for detecting a try-on of a garment, the method including:
acquiring a user fitting image captured by a camera;
identifying clothing information corresponding to the tried-on clothing in the user fitting image;
and updating the fitting times corresponding to the clothing information.
According to a second aspect of embodiments of the present application, there is provided a garment fitting detection system, the system comprising:
the camera is used for acquiring a user fitting image and sending the user fitting image to the processor;
and the processor is used for identifying the clothing information corresponding to the tried-on clothing in the user fitting image and updating the fitting times corresponding to the clothing information.
By applying the embodiments of the application, a user fitting image captured by the camera is acquired, the clothing information corresponding to the tried-on clothing in the user fitting image is identified, and the fitting times corresponding to the clothing information are updated.
Based on the above description, the user fitting image captured by the camera is recognized to determine which style of clothing is being tried on, and the fitting times of that clothing are then updated, so that a manager can learn from the recorded fitting times which styles of clothing customers entering the store prefer, making it easier to adjust the clothing display strategy and the purchasing and restocking strategy.
Drawings
Fig. 1 is a flowchart illustrating an embodiment of a clothes try-on detection method according to an exemplary embodiment of the present application;
fig. 2 is a block diagram of a garment try-on detection system according to an exemplary embodiment of the present application.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The embodiments described in the following exemplary embodiments do not represent all embodiments consistent with the present application. Rather, they are merely examples of systems and methods consistent with certain aspects of the present application, as detailed in the appended claims.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used in this application and the appended claims, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items.
It is to be understood that although the terms first, second, third, etc. may be used herein to describe various information, such information should not be limited to these terms. These terms are only used to distinguish one type of information from another. For example, first information may also be referred to as second information, and similarly, second information may also be referred to as first information, without departing from the scope of the present application. The word "if" as used herein may be interpreted as "upon" or "when" or "in response to a determination", depending on the context.
The current sales mode cannot record which styles of clothes customers like, and information feedback in the whole sales process is poor, so sales management efficiency is low.
In order to solve this problem, the application provides a clothes try-on detection method, which includes acquiring a user fitting image captured by a camera, identifying the clothing information corresponding to the tried-on clothing in the user fitting image, and updating the fitting times corresponding to the clothing information.
Based on the above description, the user fitting image captured by the camera is recognized to determine which style of clothing is being tried on, and the fitting times of that clothing are then updated, so that a manager can learn from the recorded fitting times which styles of clothing customers entering the store prefer, making it easier to adjust the clothing display strategy and the purchasing and restocking strategy.
The following describes the clothes try-on detection method proposed by the present application in detail with specific examples.
Fig. 1 is a flowchart of an embodiment of a clothes try-on detection method according to an exemplary embodiment of the present application. During a try-on, the user usually stands in front of a fitting mirror and checks the fitting effect there after putting on the garment, so the camera may be arranged above the fitting mirror, on the ceiling, or in a similar position where it can conveniently capture the user fitting image.
As shown in fig. 1, the method for detecting the try-on of the garment comprises the following steps:
step 101: and acquiring a user fitting image acquired by the camera.
In one possible example, the camera may monitor the field of view in real time; when a human body is detected, it starts capturing video frames, detects a human face in those frames, and uses a frame in which a face is detected as the user fitting image.
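The patent leaves the capture logic to the implementer; the following Python sketch (a minimal illustration using OpenCV, where the camera index, the Haar-cascade face detector and the polling loop are assumptions rather than part of the disclosure) shows one way such a face-gated capture could work.

```python
import cv2

def capture_fitting_image(camera_index=0):
    """Return the first camera frame in which a face is detected,
    treating that frame as the user fitting image (assumed logic)."""
    face_detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    cap = cv2.VideoCapture(camera_index)
    try:
        while True:
            ok, frame = cap.read()
            if not ok:            # camera unavailable or stream ended
                return None
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            faces = face_detector.detectMultiScale(gray, scaleFactor=1.1,
                                                   minNeighbors=5)
            if len(faces) > 0:    # a face is present, so keep this frame
                return frame
    finally:
        cap.release()
```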
Illustratively, the user fitting image may be a two-dimensional image or a three-dimensional image. In the case of three-dimensional images, the camera may be a binocular camera or a depth camera.
Step 102: and identifying clothing information corresponding to the clothing to be tried on in the user trying image.
In an embodiment, the user fitting image may be divided into a clothing sub-image and a face sub-image; a target clothing image matching the clothing sub-image is then searched for in a preset clothing database, and the clothing information corresponding to the target clothing image is taken as the clothing information corresponding to the tried-on clothing, where the clothing database contains clothing images and their clothing information.
In a possible implementation of the search for the target clothing image matching the clothing sub-image, the similarity between the clothing sub-image and each clothing image in the clothing database may be calculated; the maximum similarity is then selected from the calculated similarities, and if the maximum similarity is greater than a similarity threshold, the clothing image corresponding to the maximum similarity is taken as the matching target clothing image.
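The patent does not fix the similarity measure or the feature used; purely as an illustration, the sketch below compares normalised HSV colour histograms with OpenCV (the feature choice, the threshold value and the in-memory database layout are all assumptions) and returns a match only when the maximum similarity exceeds the threshold.

```python
import cv2

def clothing_feature(image_bgr):
    """Assumed feature: a normalised HSV colour histogram."""
    hsv = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2HSV)
    hist = cv2.calcHist([hsv], [0, 1], None, [30, 32], [0, 180, 0, 256])
    cv2.normalize(hist, hist)
    return hist

def match_clothing(clothing_sub_image, clothing_database, threshold=0.6):
    """clothing_database: list of (clothing_image, clothing_info) pairs.
    Returns the clothing_info of the most similar image, or None if the
    maximum similarity does not exceed the threshold."""
    query = clothing_feature(clothing_sub_image)
    best_info, best_score = None, -1.0
    for clothing_image, clothing_info in clothing_database:
        score = cv2.compareHist(query, clothing_feature(clothing_image),
                                cv2.HISTCMP_CORREL)
        if score > best_score:
            best_info, best_score = clothing_info, score
    return best_info if best_score > threshold else None
```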
For example, the clothing information in the clothing database may include the style ID number, color, size information and the like of the clothing, and the clothing images in the clothing database are photographs of the physical garments.
As will be understood by those skilled in the art, in the process of segmenting the fitting image of the user into the clothing sub-image and the face sub-image, the clothing and the face can be segmented by using a conventional segmentation algorithm, or the segmentation can be implemented by using a neural network based on a deep learning method.
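Segmentation itself is not specified in the patent; as a very crude stand-in (face detection plus a fixed geometric rule, both assumed for illustration only), the fitting image could be split as follows. A dedicated segmentation network would of course produce tighter masks; this heuristic only conveys the idea of the two sub-images.

```python
import cv2

def split_fitting_image(fitting_image):
    """Assumed split: face sub-image = detected face box,
    clothing sub-image = everything below the face box."""
    gray = cv2.cvtColor(fitting_image, cv2.COLOR_BGR2GRAY)
    detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return None, None
    x, y, w, h = faces[0]
    face_sub_image = fitting_image[y:y + h, x:x + w]
    clothing_sub_image = fitting_image[y + h:, :]   # body region below the face
    return clothing_sub_image, face_sub_image
```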
In another embodiment, the clothing sub-image may be input into a trained first neural network, and the first neural network identifies the clothing information of the tried-on clothing in the clothing sub-image.
For example, the first neural network may be trained with a deep-learning method, and its training samples consist of clothing images of various styles together with their clothing information labels; for instance, the clothing images and clothing information in the clothing database may be used as training samples.
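The architecture of the first neural network is not disclosed; the PyTorch sketch below (layer sizes, input resolution and the number of style classes are all assumptions) shows a small classifier of the kind that could map a clothing sub-image to a style-ID label and be trained on the clothing database.

```python
import torch
import torch.nn as nn

class ClothingNet(nn.Module):
    """Illustrative 'first neural network': clothing sub-image -> style-ID class."""
    def __init__(self, num_styles):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(32, num_styles)

    def forward(self, x):                 # x: (N, 3, H, W) clothing sub-images
        return self.classifier(self.features(x).flatten(1))

# One assumed training step with cross-entropy on (image, style-ID) samples;
# the random tensors stand in for clothing images and labels from the database.
model = ClothingNet(num_styles=100)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()
images = torch.randn(8, 3, 128, 128)
labels = torch.randint(0, 100, (8,))
loss = criterion(model(images), labels)
optimizer.zero_grad()
loss.backward()
optimizer.step()
```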
Based on the above description, after the user fitting image is divided into the clothing sub-image and the face sub-image, the face sub-image can further be used to identify the user information of the try-on user, and the identified user information is stored together with the fitting times corresponding to the identified clothing information, so that a manager can later check which users tried on which garments.
In an embodiment, in the process of recognizing the user information of the try-on user from the face sub-image, a target face image matching the face sub-image is searched for in a preset user database, and the user information corresponding to the target face image is acquired, where the user database contains face images and user information of customers who have entered the store.
The search for the target face image matching the face sub-image can follow the same principle as the search for the target clothing image matching the clothing sub-image described above, so the face search process is not described again.
For example, a face image in the user database may be one uploaded by the user when registering as a member, or a face sub-image obtained during clothes try-on detection; the user information in the user database may be information entered by the user when registering as a member, or information identified from the face sub-image, and may be one item or a combination of items such as name, gender, height, weight, favorite color and mobile phone number.
In another embodiment, the face sub-image may be input into a trained second neural network, and the second neural network recognizes the user information from the face sub-image.
For example, the second neural network may also be trained with a deep-learning method, and its training samples consist of face images of customers who have entered the store together with their user information labels; for instance, the face images and user information in the user database may be used as training samples.
It should be noted that if no target face image matching the face sub-image is found in the user database, externally input user information for the newly registered user may be received and added to the user database together with the face sub-image; alternatively, the gender of the try-on user in the face sub-image may be identified, and the identified gender may be added to the user database as the user information together with the face sub-image.
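Putting the look-up and the two fall-back behaviours together, a minimal sketch could look like the following; the `match_face` and `classify_gender` helpers and the list-of-pairs database layout are placeholders assumed for illustration.

```python
def identify_user(face_sub_image, user_database, match_face, classify_gender):
    """Return user info for the try-on user, registering a new entry
    in the user database when no matching face is found."""
    user_info = match_face(face_sub_image, user_database)  # assumed helper
    if user_info is not None:
        return user_info
    # No match found: fall back to registering the user with the recognised
    # gender (externally entered registration details could be used instead).
    user_info = {"gender": classify_gender(face_sub_image)}
    user_database.append((face_sub_image, user_info))
    return user_info
```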
It should be further noted that whether the size of the tried-on garment is suitable for the try-on user may also be determined based on the clothing sub-image or the acquired user information, and the determination result may be stored together with the acquired user information and the fitting times corresponding to the identified clothing information, so that the garment design strategy can later be adjusted by combining the user information with each user's try-on result.
The process of determining whether the size of the tried-on garment is suitable for the try-on user can include the following three implementations:
First implementation: if the acquired user information includes the height and weight of the user, the height and weight of the user can be compared respectively with the suitable height and suitable weight corresponding to the tried-on garment, and whether the size of the tried-on garment is suitable for the try-on user is judged according to the comparison result.
For example, if the height of the user is not within the suitable height range corresponding to the tried-on garment and the weight of the user is not within the suitable weight range corresponding to the tried-on garment, the size of the tried-on garment is determined not to be suitable for the try-on user; otherwise, the size is determined to be suitable.
Assuming the suitable weight range corresponding to the tried-on garment is 25 kg to 50 kg and the suitable height range is 1 m to 1.2 m, then if the height of the user is not within 1 m to 1.2 m and the weight of the user is not within 25 kg to 50 kg, the size of the tried-on garment is determined not to be suitable for the try-on user.
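Written out directly, the first implementation reduces to a range check; the sketch below uses the example ranges quoted above (25 kg to 50 kg, 1 m to 1.2 m) as default values, which are otherwise arbitrary assumptions.

```python
def size_fits_by_body(user_height_m, user_weight_kg,
                      height_range=(1.0, 1.2), weight_range=(25.0, 50.0)):
    """Unsuitable only when BOTH height and weight fall outside the
    garment's suitable ranges; otherwise suitable (first implementation)."""
    height_ok = height_range[0] <= user_height_m <= height_range[1]
    weight_ok = weight_range[0] <= user_weight_kg <= weight_range[1]
    return height_ok or weight_ok
```

For the sample garment above, `size_fits_by_body(1.5, 30.0)` still returns True, because the weight lies within 25 kg to 50 kg and the rule only rejects the size when neither measurement is in range.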
Second implementation: determining the unfolding degree of the tried-on garment in the clothing sub-image, and judging whether the size of the tried-on garment is suitable for the try-on user according to the unfolding degree.
Illustratively, the unfolding degree of the tried-on garment can be determined by identifying information such as wrinkles, curling and line deformation of the garment in the clothing sub-image, or by comparing the similarity between the clothing sub-image and an image of the garment worn by a fitting model; the comparison can be implemented with a neural network, or determined only by a linear calculation or a simple similarity calculation.
Assuming the fit unfolding degree range corresponding to the tried-on garment is 10% to 20%, if the determined unfolding degree is not within the 10% to 20% range, the size of the tried-on garment is determined not to be suitable for the try-on user.
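How the unfolding degree is computed is left open by the patent; as one crude, assumed proxy, the sketch below takes the fraction of edge pixels in the clothing sub-image (wrinkles, curling and line deformation all produce extra edges) and checks it against the 10% to 20% range from the example.

```python
import cv2
import numpy as np

def unfolding_degree(clothing_sub_image):
    """Assumed proxy: fraction of edge pixels in the clothing sub-image."""
    gray = cv2.cvtColor(clothing_sub_image, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)
    return float(np.count_nonzero(edges)) / edges.size

def size_fits_by_unfolding(clothing_sub_image, fit_range=(0.10, 0.20)):
    """Second implementation: suitable only if the degree lies in the fit range."""
    degree = unfolding_degree(clothing_sub_image)
    return fit_range[0] <= degree <= fit_range[1]
```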
Third embodiment: and determining the proportion of the sub-image of the clothes in the fitting image, and judging whether the size of the fitted clothes is suitable for a fitting user according to the proportion.
In one example, after the proportion of the clothing sub-image in the fitting image is determined, the distance between the try-on user and the fitting mirror may also be determined, and whether the size of the tried-on garment is suitable for the try-on user is judged according to the distance and the proportion.
The distance between the try-on user and the fitting mirror can be measured by a ranging sensor arranged on the fitting mirror (such as a distance sensor or a laser sensor), or determined from the user fitting image captured by the camera together with the coordinate position of the fitting mirror in the camera coordinate system. If the distance exceeds a distance threshold and the proportion is not within a preset proportion range, the size of the tried-on garment is determined not to be suitable for the try-on user.
In another example, after the proportion of the clothing sub-image in the fitting image is determined, if the user information includes the height and weight of the user, whether the size of the tried-on garment is suitable for the try-on user may be judged based on the proportion together with the height and weight of the user.
For example, if the proportion is not within the preset proportion range, or the height or weight of the user is not within the height range or weight range corresponding to the tried-on garment, the size of the tried-on garment is determined not to be suitable for the try-on user.
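For the third implementation, a minimal sketch (the thresholds, the proportion range and the way the distance is obtained are all assumptions) could compute the area proportion of the clothing sub-image in the fitting image and apply the distance-based decision rule described above.

```python
def clothing_proportion(clothing_box, image_shape):
    """Proportion of the fitting image occupied by the clothing sub-image.
    clothing_box: (x, y, w, h) in pixels; image_shape: (height, width)."""
    x, y, w, h = clothing_box
    return (w * h) / float(image_shape[0] * image_shape[1])

def size_fits_by_proportion(proportion, distance_m=None,
                            distance_threshold=1.5,
                            proportion_range=(0.2, 0.5)):
    in_range = proportion_range[0] <= proportion <= proportion_range[1]
    if distance_m is not None:
        # The patent only states the "unsuitable" case explicitly: far from
        # the mirror AND proportion outside the preset range.
        if distance_m > distance_threshold and not in_range:
            return False
        return True      # assumption for the cases the patent leaves open
    return in_range      # fallback without a distance measurement (assumed)
```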
It will be appreciated by those skilled in the art that, in addition to the three implementations listed above, whether the size of the tried-on garment is suitable for the try-on user can also be determined according to the fit of the tried-on garment in the image.
On this basis, if the acquired user information includes a mobile phone number, the user fitting image, the clothing information of the tried-on garment and the fit information can be pushed to the mobile phone using that number, and clothing information of other styles can also be pushed to that number according to the favorite color in the user information or the user's own selection.
Step 103: and updating the fitting times corresponding to the clothing information.
In an embodiment, after the fitting times corresponding to the clothing information are updated, the sales information corresponding to the clothing information may be obtained, and the clothing information, the updated fitting times and the obtained sales information may be stored; or the display position corresponding to the clothing information can be obtained, and the clothing information, the updated fitting times and the obtained display position are stored; or the sales information and the display position corresponding to the clothing information can be acquired, and the clothing information, the updated fitting times, the acquired sales information and the acquired display position can be stored.
Illustratively, the sales information may include information on the number of sales, the time of sale, and the like. The subsequent manager can adjust the display position of the tried-on clothes or the goods intake and price through the stored information.
In an embodiment, when the fitting times are less than the preset times or the sales information does not meet the preset condition, a new display position may be generated according to the updated fitting times, the obtained sales information, and the display position, so that the user places the fitted garment in the new display position.
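Step 103 together with the storage and relocation logic above can be summarised in a small record-keeping sketch; the dictionary layout, the preset count of 10 and the sales condition are all assumptions made for illustration.

```python
def record_fitting(stats, clothing_id, sales_info=None, display_position=None,
                   preset_times=10, min_sold=5):
    """Increment the fitting times for clothing_id and store the associated
    sales information and display position (step 103 plus storage)."""
    entry = stats.setdefault(clothing_id, {"fitting_times": 0})
    entry["fitting_times"] += 1
    if sales_info is not None:
        entry["sales_info"] = sales_info
    if display_position is not None:
        entry["display_position"] = display_position

    # When the garment is tried on too rarely or sells poorly, suggest a new
    # display position; the generation rule itself is left open by the patent,
    # so a placeholder string is used here.
    few_fittings = entry["fitting_times"] < preset_times
    poor_sales = sales_info is not None and sales_info.get("sold", 0) < min_sold
    if few_fittings or poor_sales:
        entry["suggested_position"] = "new position near " + str(
            display_position if display_position is not None else "the entrance")
    return entry
```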
In the embodiments of the application, a user fitting image captured by the camera is acquired, the clothing information corresponding to the tried-on clothing in the user fitting image is identified, and the fitting times corresponding to the clothing information are updated.
Based on the above description, the user fitting image captured by the camera is recognized to determine which style of clothing is being tried on, and the fitting times of that clothing are then updated, so that a manager can learn from the recorded fitting times which styles of clothing customers entering the store prefer, making it easier to adjust the clothing display strategy and the purchasing and restocking strategy.
Fig. 2 is a block diagram of a garment try-on detection system according to an exemplary embodiment of the present application, the system including a camera 210 and a processor 220:
the camera 210 is configured to acquire a fitting image of a user and send the fitting image to the processor 220;
the processor 220 is configured to identify the clothing information corresponding to the tried-on clothing in the user fitting image, and update the fitting times corresponding to the clothing information.
For example, the processor 220 may be built into the camera 210, or may be deployed in a server cloud or a terminal device.
In an alternative implementation, the camera 210 may be disposed above the fitting mirror.
In an alternative implementation, the system may further comprise (not shown in fig. 2):
and a display, configured to receive and display the clothing information corresponding to the tried-on clothing and the updated fitting times sent by the processor 220.
In an optional implementation manner, the processor 220 is further configured to, after the fitting times corresponding to the clothing information are updated, obtain a display position corresponding to the clothing information, and store the clothing information, the updated fitting times, and the obtained display position; and/or obtaining sales information corresponding to the clothing information, and storing the clothing information, the updated fitting times and the obtained sales information; and/or obtaining the sales information and the display position corresponding to the clothing information, and storing the clothing information, the updated fitting times, the obtained sales information and the display position.
In an optional implementation manner, the processor 220 is further configured to generate a new display position according to the updated fitting times, the acquired sales information and the display position when the fitting times are less than a preset number or the sales information does not meet a preset condition.
In an optional implementation manner, the processor 220 is specifically configured to, in the process of identifying the clothing information corresponding to the tried-on clothing in the user fitting image, divide the user fitting image into a clothing sub-image and a face sub-image; and search a preset clothing database for a target clothing image matching the clothing sub-image, and take the clothing information corresponding to the target clothing image as the clothing information corresponding to the tried-on clothing, where the clothing database contains clothing images and clothing information.
In an optional implementation manner, the processor 220 is further configured to, after the user fitting image is divided into a clothing sub-image and a face sub-image, search a target face image matched with the face sub-image from a preset user database, and acquire user information corresponding to the target face image, where the user database includes a face image and user information of a store-entering user; and correspondingly storing the obtained user information and the fitting times corresponding to the identified clothing information.
In an optional implementation manner, the processor 220 is further configured to determine whether the size of the tried-on garment is suitable for the trying-on user based on the clothing sub-image or the acquired user information; and correspondingly storing the judgment result, the acquired user information and the fitting times corresponding to the identified clothing information.
In an optional implementation manner, the processor 220 is further specifically configured to, in the process of determining whether the size of the tried-on garment is suitable for the trying-on user based on the clothing sub-image or the obtained user information, when the user information includes a height and a weight of the user, compare the height and the weight of the user with a suitable height and a suitable weight corresponding to the tried-on garment, and determine whether the size of the tried-on garment is suitable for the trying-on user according to an obtained comparison result; or determining the expansion degree of the tried-on garment in the garment sub-image, and judging whether the size of the tried-on garment is suitable for a trying-on user or not according to the expansion degree; or, determining the proportion of the clothing sub-image in the fitting image, and judging whether the size of the fitted clothing is suitable for a fitting user according to the proportion.
In an optional implementation manner, the processor 220 is further specifically configured to determine a distance between the try-on user and the fitting mirror in a process of determining whether the size of the tried-on garment is suitable for the try-on user according to the ratio, and determine whether the size of the tried-on garment is suitable for the try-on user according to the distance and the ratio; or when the user information comprises the height and the weight of the user, judging whether the size of the tried-on clothes is suitable for the trying-on user according to the proportion and the height and the weight of the user.
The implementation processes of the functions and actions of the components in the system are specifically described in the implementation processes of the corresponding steps in the method, and are not described herein again.
For the system embodiment, since it basically corresponds to the method embodiment, reference may be made to the partial description of the method embodiment for relevant points.
Other embodiments of the present application will be apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed herein. This application is intended to cover any variations, uses, or adaptations of the invention following, in general, the principles of the application and including such departures from the present disclosure as come within known or customary practice within the art to which the invention pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the application being indicated by the following claims.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
The above description is only exemplary of the present application and should not be taken as limiting the present application, as any modification, equivalent replacement, or improvement made within the spirit and principle of the present application should be included in the scope of protection of the present application.

Claims (11)

1. A method for detecting a try-on of a garment, the method comprising:
acquiring a user fitting image captured by a camera;
identifying clothing information corresponding to the tried-on clothing in the user fitting image;
and updating the fitting times corresponding to the clothing information.
2. The method of claim 1, wherein after updating the fitting times corresponding to the clothing information, the method further comprises:
acquiring a display position corresponding to the clothing information, and storing the clothing information, the updated fitting times and the acquired display position; and/or,
acquiring sales information corresponding to the clothing information, and storing the clothing information, the updated fitting times and the acquired sales information; and/or,
and acquiring sales information and display positions corresponding to the clothing information, and storing the clothing information, the updated fitting times, the acquired sales information and the acquired display positions.
3. The method of claim 2, further comprising:
and when the fitting times are less than the preset times or the sales information does not meet the preset condition, generating a new display position according to the updated fitting times, the acquired sales information and the display position.
4. The method of claim 1, wherein identifying clothing information corresponding to the tried-on clothing in the user fitting image comprises:
dividing the user fitting image into a clothing sub-image and a face sub-image;
and searching a target clothing image matched with the clothing sub-image from a preset clothing database, and taking clothing information corresponding to the target clothing image as clothing information corresponding to the clothing to be tried on, wherein the clothing database comprises the clothing image and the clothing information.
5. The method of claim 4, wherein after segmenting the user fitting image into a clothing sub-image and a face sub-image, the method further comprises:
searching a target face image matched with the face subimage from a preset user database, and acquiring user information corresponding to the target face image, wherein the user database comprises a face image and user information of a store-entering user;
and correspondingly storing the obtained user information and the fitting times corresponding to the identified clothing information.
6. The method of claim 5, further comprising:
judging whether the size of the tried-on clothing is suitable for the trying-on user based on the clothing sub-image or the acquired user information;
and correspondingly storing the judgment result, the acquired user information and the fitting times corresponding to the identified clothing information.
7. The method of claim 6, wherein determining whether the size of the fitted garment is suitable for the fitting user based on the sub-image of the garment or the obtained user information comprises:
when the user information comprises the height and the weight of the user, comparing the height and the weight of the user respectively with the suitable height and the suitable weight corresponding to the tried-on clothes, and judging whether the size of the tried-on clothes is suitable for the trying-on user according to the obtained comparison result; or,
determining the unfolding degree of the tried-on garment in the clothing sub-image, and judging whether the size of the tried-on garment is suitable for the trying-on user according to the unfolding degree; or,
and determining the proportion of the sub-image of the clothes in the fitting image, and judging whether the size of the fitted clothes is suitable for a fitting user according to the proportion.
8. The method of claim 7, wherein determining whether the size of the tried-on garment is suitable for the trying-on user according to the ratio comprises:
determining the distance between the trying-on user and the fitting mirror, and judging whether the size of the tried-on clothes is suitable for the trying-on user according to the distance and the proportion; or,
and when the user information comprises the height and the weight of the user, judging whether the size of the tried-on clothes is suitable for the trying-on user according to the proportion and the height and the weight of the user.
9. A garment try-on detection system, the system comprising:
the camera is used for acquiring a user fitting image and sending the user fitting image to the processor;
and the processor is used for identifying the clothing information corresponding to the tried-on clothing in the user fitting image and updating the fitting times corresponding to the clothing information.
10. The system of claim 9, wherein the camera is disposed above the fitting mirror.
11. The system of claim 9, further comprising:
and the display is used for receiving and displaying the clothing information corresponding to the tried-on clothing and the updated fitting times sent by the processor.
CN201910312406.6A 2019-04-18 2019-04-18 Clothing try-on detection method and system Active CN111832350B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910312406.6A CN111832350B (en) 2019-04-18 2019-04-18 Clothing try-on detection method and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910312406.6A CN111832350B (en) 2019-04-18 2019-04-18 Clothing try-on detection method and system

Publications (2)

Publication Number Publication Date
CN111832350A true CN111832350A (en) 2020-10-27
CN111832350B CN111832350B (en) 2023-12-29

Family

ID=72914817

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910312406.6A Active CN111832350B (en) 2019-04-18 2019-04-18 Clothing try-on detection method and system

Country Status (1)

Country Link
CN (1) CN111832350B (en)

Patent Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH09325993A (en) * 1996-06-06 1997-12-16 Toshiba Corp Sales floor managing system
WO2008003192A1 (en) * 2006-06-29 2008-01-10 The Hong Kong Polytechnic University An intellectual fitting system and method
KR20120007181A (en) * 2010-07-14 2012-01-20 김만기 Clothing fitting information-based clothing purchasing system and method
CN203433545U (en) * 2013-08-16 2014-02-12 重庆和航科技股份有限公司 Statistical system and acquisition terminal for clothes try-on information based on Internet of Things
CN105608238A (en) * 2014-11-21 2016-05-25 中兴通讯股份有限公司 Clothes trying-on method and device
CN204288293U (en) * 2014-11-28 2015-04-22 常州市武进区半导体照明应用技术研究院 A kind of dressing system
CN106097011A (en) * 2016-06-15 2016-11-09 广州信聚丰信息科技有限公司 Intelligence shops intelligence patrols shop and consumer behavior Requirement Analysis system
CN106683150A (en) * 2016-12-30 2017-05-17 中南大学 Dressing mirror system and data processing method thereof
CN108446931A (en) * 2018-03-21 2018-08-24 广州市迪如服装有限公司 A kind of clothing enterprises management system and feedback method is tried on based on the system
CN108596730A (en) * 2018-04-26 2018-09-28 北京超满意科技有限责任公司 Processing method, device and the smart machine of dress ornament information
CN108985865A (en) * 2018-08-27 2018-12-11 广州联欣自动识别技术有限公司 The customer data analysis method and system of intelligent shops
CN109242556A (en) * 2018-08-27 2019-01-18 广州联欣自动识别技术有限公司 Intelligent clothing shops management method and system
CN109523367A (en) * 2018-11-29 2019-03-26 上海与德通讯技术有限公司 A kind of data analysing method, device, server and storage medium

Also Published As

Publication number Publication date
CN111832350B (en) 2023-12-29

Similar Documents

Publication Publication Date Title
JP7311524B2 (en) Method and Apparatus and Intelligent Shelving System for Identifying Items Purchased by a User
CN106846122B (en) Commodity data processing method and device
TWI778030B (en) Store apparatus, store management method and program
US11138420B2 (en) People stream analysis method, people stream analysis apparatus, and people stream analysis system
JP2009003701A (en) Information system and information processing apparatus
WO2019038968A1 (en) Storefront device, storefront system, storefront management method, and program
US11475500B2 (en) Device and method for item recommendation based on visual elements
TW201411515A (en) Interactive clothes searching in online stores
JP2013168132A (en) Commodity retrieval device, method and program
TW201947495A (en) Method and device for human-machine interaction in a storage unit, storage unit and storage medium
CN111028029B (en) Off-line commodity recommendation method and device and electronic equipment
CN107067290A (en) Data processing method and device
US20170358135A1 (en) Augmenting the Half-Mirror to Display Additional Information in Retail Environments
US10474919B2 (en) Method for determining and displaying products on an electronic display device
JP2019174959A (en) Commodity shelf position registration program and information processing apparatus
JP2015230616A (en) Image processing method and image processor
KR20140107734A (en) Virtual fitting method and system
EP3074844A1 (en) Estimating gaze from un-calibrated eye measurement points
CN111832350B (en) Clothing try-on detection method and system
WO2019192455A1 (en) Store system, article matching method and apparatus, and electronic device
CN113159876A (en) Clothing matching recommendation device and method and storage medium
JP7109520B2 (en) Purchased product estimation device
CN115661903A (en) Map recognizing method and device based on spatial mapping collaborative target filtering
CN115578777B (en) Image recognizing method and device for obtaining target based on space mapping
WO2023209955A1 (en) Information processing device, information processing method, and recording medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant