US20220005101A1 - System and method for social style mapping - Google Patents
- Publication number
- US20220005101A1 (application US 17/478,737)
- Authority
- US
- United States
- Prior art keywords
- user
- stylist
- color
- metadata
- styling
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Links
- 238000000034 method Methods 0.000 title claims description 69
- 238000013507 mapping Methods 0.000 title description 3
- 230000008569 process Effects 0.000 claims description 45
- 238000003860 storage Methods 0.000 claims description 11
- 239000003086 colorant Substances 0.000 claims description 9
- 238000010801 machine learning Methods 0.000 claims description 4
- 238000003058 natural language processing Methods 0.000 claims description 3
- 230000001932 seasonal effect Effects 0.000 claims description 2
- 230000000295 complement effect Effects 0.000 claims 1
- 238000013500 data storage Methods 0.000 claims 1
- 238000010586 diagram Methods 0.000 description 16
- 230000006870 function Effects 0.000 description 6
- 238000004891 communication Methods 0.000 description 3
- 238000004590 computer program Methods 0.000 description 3
- 239000000463 material Substances 0.000 description 3
- 238000012986 modification Methods 0.000 description 3
- 230000004048 modification Effects 0.000 description 3
- 230000007246 mechanism Effects 0.000 description 2
- 230000004044 response Effects 0.000 description 2
- 229920000742 Cotton Polymers 0.000 description 1
- 230000008901 benefit Effects 0.000 description 1
- 238000004040 coloring Methods 0.000 description 1
- 238000012790 confirmation Methods 0.000 description 1
- 238000010276 construction Methods 0.000 description 1
- 238000004519 manufacturing process Methods 0.000 description 1
- 238000005259 measurement Methods 0.000 description 1
- 229920000728 polyester Polymers 0.000 description 1
- 238000012552 review Methods 0.000 description 1
- 238000012549 training Methods 0.000 description 1
- 230000001131 transforming effect Effects 0.000 description 1
- 238000010200 validation analysis Methods 0.000 description 1
- 210000002268 wool Anatomy 0.000 description 1
Images
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q30/00—Commerce
- G06Q30/06—Buying, selling or leasing transactions
- G06Q30/0601—Electronic shopping [e-shopping]
- G06Q30/0631—Item recommendations
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/50—Information retrieval; Database structures therefor; File system structures therefor of still image data
- G06F16/58—Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
- G06F16/583—Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using metadata automatically derived from the content
- G06F16/5838—Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using metadata automatically derived from the content using colour
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q30/00—Commerce
- G06Q30/06—Buying, selling or leasing transactions
- G06Q30/0601—Electronic shopping [e-shopping]
- G06Q30/0641—Shopping interfaces
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q30/00—Commerce
- G06Q30/06—Buying, selling or leasing transactions
- G06Q30/0601—Electronic shopping [e-shopping]
- G06Q30/0641—Shopping interfaces
- G06Q30/0643—Graphical representation of items or shoppers
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B19/00—Teaching not covered by other main groups of this subclass
- G09B19/0023—Colour matching, recognition, analysis, mixture or the like
Definitions
- the present disclosure is directed to a specialized system that remotely manages wardrobes of individuals via an information network.
- the system obtains, via an information network, an image of an apparel item and text describing the apparel item.
- the system also determines descriptive metadata of the apparel item by extracting information from the text.
- the system determines categorization metadata of the apparel item by analyzing the image.
- the system stores the image, the descriptive metadata, and the categorization metadata in a virtual closet.
- the system provides, via the information network, a recommendation of an outfit from a stylist to a user based on the virtual closet.
- FIG. 1 illustrates a block diagram of an exemplary environment for implementing a system in accordance with aspects of the present disclosure
- FIG. 2 illustrates a flow diagram of an exemplary intake process in accordance with aspects of the present disclosure
- FIG. 3 illustrates exemplary data structures in accordance with aspects of the present disclosure
- FIG. 4 illustrates a flow diagram of an exemplary process for determining color categorization metadata in accordance with aspects of the present disclosure
- FIG. 5 illustrates a functional flow diagram of an exemplary process for determining color categorization metadata in accordance with aspects of the present disclosure
- FIG. 6 illustrates exemplary color categorization metadata in accordance with aspects of the present disclosure
- FIG. 7 illustrates a flow diagram of an exemplary process for determining a category of an item in accordance with aspects of the present disclosure
- FIG. 8 illustrates exemplary type categorization metadata in accordance with aspects of the present disclosure.
- FIG. 9 illustrates a flow diagram of an exemplary recommendation process in accordance with aspects of the present disclosure.
- the present disclosure is generally directed to a specialized system that remotely manages wardrobes of individuals via an information network.
- the system stores personal wardrobes of users in virtual closets and links the users with stylists (e.g., individual or automated) that combine existing items in the personal wardrobes and/or new items that can be purchased from retailers.
- systems in accordance with the present disclosure address the technical challenge of sharing information among users, retailers, and stylists over the Internet.
- systems in accordance with the present disclosure address the technical challenge of transforming the information received from the users and the retailers into a new representation that enables efficient and accurate matching of items for providing fashion recommendations to users.
- FIG. 1 illustrates a block diagram of an environment for implementing systems and processes in accordance with aspects of the present disclosure.
- the environment includes one or more retailers 10 , one or more users 15 , one or more stylists 20 , and a styling system 25 , each of which can be remotely located and communicatively linked via an information network (e.g., the Internet).
- the retailers 10 can be merchants of apparel items, which include clothes, footwear, accessories, jewelry, fashion products, and the like.
- the users 15 can be consumers of such apparel items.
- the stylists 20 can be individuals employed or contracted by the retailers 10 and/or an operator of the styling system 25 that provide recommendations to the users 15 regarding combinations of apparel items.
- the styling system 25 creates, maintains, and/or curates virtual closets that can mix and match apparel items from the retailers 10 with apparel items already in the wardrobe of a particular one of the users 15 .
- each of the retailers 10 can provide images (e.g., photographs) and respective descriptions 70 describing apparel items available to purchase to the styling system 25 , which are stored in virtual closets (e.g., virtual closet 128 ).
- each of the users 15 can provide images and respective descriptions 80 describing apparel items in their personal wardrobes to create virtual closets (e.g., virtual closet 130 ).
- the virtual closets can be stored at the styling system 25 or locally at the retailers 10 and users 15 .
- the users 15 can populate their virtual closets with purchases from the retailers 10 based on recommendations provided by one or more of the stylists 20 using information generated by the styling system 25 .
- the retailers 10 and/or stylists 20 can push (e.g., periodically suggest) recommendations 90 to the users 15 based on new arrivals and/or updates to retailer-specific items.
- the styling system 25 includes hardware and software that perform the processes and functions described herein.
- the styling system 25 includes a computing device 120 , an input/output (I/O) device 122 , and a storage system 125 .
- the I/O device 122 can include any device that enables the retailers 10 , the users 15 , and/or the stylists 20 to interact with the computing device 120 (e.g., a user interface) and/or any device that enables the computing device 120 to communicate with the retailers 10 , the users 15 , and/or the stylists 20 using any type of communications link.
- the communications link includes a wide-area network, such as the Internet.
- the I/O device 122 can be, for example, a touchscreen display, pointer device, keyboard, etc.
- the storage system 125 can comprise a computer-readable, non-volatile hardware storage device that stores information and computer-readable program instructions.
- the storage system 125 can be one or more flash drives and/or hard disk drives.
- the storage system 125 stores retailer data 127 , user data 129 , stylist data 131 , and reference data 133 .
- the retailer data 127 , and the user data 129 include virtual closets 128 and 130 , which include images having respective metadata describing the subject matter of the images (e.g., information describing items of clothing depicted in a respective one of the images).
- the information stored in the retailer data 127 and user data 129 can be updated, managed, and/or curated by their respective retailers 10 and/or users 15 (via, e.g., I/O interface 122 ).
- the user data 129 can include items owned by the user.
- the information stored in user data 129 can include one or more wish lists for items that the user does not yet own, but has identified for purchase. For example, the user can selectively add items to the wish list based on a recommendation from a stylist 20 and/or based on personal selections from online catalogs of the retailers 10 .
- the information stored in user data 129 can include collections (e.g., virtual suitcases) of items selected by the user as belonging to a common theme (e.g., items for a beach vacation). Still further, the information stored in the user data 129 can include matched sets (e.g., virtual outfits), which are a collection of clothing items that are brought together to form combinations with some fashion sense. Note that a matched set may be comprised of both owned and new items and still “saved” to the user's closet. The matched sets can include recommended outfits, which are outfits that may be recommended to the user 15 by a stylist 20 . If the user 15 likes the recommended outfit they may be able to save it to their closet.
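The virtual-closet structures described above (owned items, wish lists, virtual suitcases, and matched sets that may mix owned and new items) could be modeled roughly as follows. This is an illustrative sketch; all class and field names are assumptions, not taken from the patent.

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class ApparelItem:
    name: str
    owned: bool                       # False for wish-list / retailer items
    tags: List[str] = field(default_factory=list)

@dataclass
class Outfit:                         # a "matched set" / virtual outfit
    items: List[ApparelItem]
    recommended_by: str = ""          # stylist id, if recommended

    def fully_owned(self) -> bool:
        # a matched set may combine owned and new items
        return all(i.owned for i in self.items)

@dataclass
class VirtualCloset:
    owned: List[ApparelItem] = field(default_factory=list)
    wish_list: List[ApparelItem] = field(default_factory=list)
    suitcases: Dict[str, List[ApparelItem]] = field(default_factory=dict)
    outfits: List[Outfit] = field(default_factory=list)

closet = VirtualCloset()
boots = ApparelItem("camo boots", owned=True, tags=["boots", "camo"])
shirt = ApparelItem("white blouse", owned=False, tags=["top"])
closet.owned.append(boots)
closet.wish_list.append(shirt)
closet.outfits.append(Outfit([boots, shirt], recommended_by="stylist-1"))
```

A recommended outfit saved this way can reference both closet items and wish-list items, matching the "saved to the user's closet" behavior described above.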
- the computing device 120 includes one or more processors 140 (e.g., microprocessor, microchip, or application-specific integrated circuit), one or more memory devices 141 (e.g., RAM and ROM), one or more I/O interfaces 143 , and one or more network interfaces 145 .
- the memory device 141 can include a local memory (e.g., a random access memory and a cache memory) employed during execution of program instructions.
- the computing device 120 includes at least one communication channel (e.g., a data bus) by which it communicates with the I/O device 122 and the storage system 125 .
- the processor 140 executes computer-readable program instructions for an operating system and application programs, which can be stored in the memory device 141 and/or storage system 125 . Moreover, the processor 140 can execute computer-readable program instructions stored in the memory devices 141 or the storage system 125 for an intake module 151 , a categorization module 153 , and a matching module 155 .
- the intake module 151 can receive images from, e.g., the retailers 10 and the users 15 , and process the images to, e.g., crop the images, rescale the images, remove backgrounds from the images, and normalize the light and/or coloring of the images.
- removing a background comprises entirely removing elements in an image that are not part of an apparel item depicted in the image such that only the apparel item and a substantially uniform (e.g., monochrome) background remain in the image.
- the categorization module 153 can process images received by the intake module to generate metadata (e.g., tags) that describes, classifies, and/or categorizes the images.
- the categorization module 153 may use computer vision to automatically generate the metadata from received images. For example, given an image of a shirt, the categorization module 153 may analyze the image and output tags, such as shirt, long-sleeve, button-down, vertical stripes, etc.
- the categorization module 153 can include both a feature matcher to look at specific clothing features as well as a representation-based matcher that utilizes a catalog of clothing to build representative samples of different categories of clothing to assist in matching.
- the matching module 155 can determine matches between items contained in the retailer data 127 and/or the user data 129 . For example, for an item identified by a user 15 in a styling request 85 , the matching module 155 may provide a confidence-ranked selection of matching clothing from the retailers 10 or the personal wardrobe of the user 15 . The matching can be performed manually by the stylists 20 , and/or automatically by the matching module 155 . In embodiments, the logic of the matching module 155 goes beyond rule-based decisions. For example, the matching module 155 can take into account combined colors and patterns. Further, the matching module 155 can be built to understand current and seasonal fashion trends to ensure the suggestions are up to date. Moreover, the matching module 155 can take into account both the user's personal wardrobe and retail catalogs to match items that are wholly owned by the user, items that are completely new, or any combination in-between.
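A minimal sketch of how such a confidence-ranked selection might be produced by comparing item metadata; the overlap-based scoring rule below is an assumption for illustration, as the patent does not specify the matching logic.

```python
def match_score(request_tags, candidate_tags):
    """Jaccard overlap between tag sets as a crude match confidence."""
    a, b = set(request_tags), set(candidate_tags)
    return len(a & b) / len(a | b) if a | b else 0.0

def rank_matches(request_tags, catalog):
    """Return (item, score) pairs sorted by descending confidence.

    `catalog` maps item names to tag lists and could combine the user's
    wardrobe (user data 129) with retailer inventory (retailer data 127).
    """
    scored = [(item, match_score(request_tags, tags))
              for item, tags in catalog.items()]
    return sorted(scored, key=lambda p: p[1], reverse=True)

catalog = {
    "denim jacket": ["casual", "blue", "jacket"],
    "silk blouse":  ["formal", "white", "top"],
    "chinos":       ["casual", "beige", "pants"],
}
ranked = rank_matches(["casual", "jacket"], catalog)
# the denim jacket shares the most tags with the request, so it ranks first
```

A production matcher would, as the passage notes, go beyond such rules to weigh color combinations, patterns, and trend data.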
- the computing device 120 can comprise any general purpose computing article of manufacture capable of executing computer program instructions installed thereon (e.g., a personal computer, server, etc.). However, the computing device 120 is only representative of various possible equivalent-computing devices that can perform the processes described herein. To this extent, in embodiments, the functionality provided by the computing device 120 can be any combination of general and/or specific purpose hardware and/or computer program instructions. In each embodiment, the program instructions and hardware can be created using standard programming and engineering techniques, respectively.
- FIG. 2 illustrates an exemplary flow diagram of an intake process in accordance with aspects of the present disclosure.
- the process 200 (which may be, e.g., performed by execution of intake module 151 by processor 140 ) obtains item images and associated description data.
- the images can be obtained from one or more retailers (e.g., retailers 10 ) and/or one or more users (e.g., users 15 ).
- the description data can be information, including item name, gender, type (e.g., pants, tops, skirt, active wear, bags, blazers, suits, coats and jackets, dresses, intimates and sleepwear, jeans, jewelry, jumpsuits and rompers, shoes, shorts, sweaters, etc.), use category (e.g., work, formal, casual, etc.), material (e.g., silk, cotton, wool, polyester), season (e.g., spring, summer, winter, fall, holiday, etc.), and measurements (e.g., size and/or physical dimensions).
- the process 200 (e.g., by execution of the intake module 151 ) normalizes the images obtained at 203 .
- the normalizing includes cropping and/or scaling the images to a standard size. Additionally, the normalizing can include normalizing the luminance of the images. Further, the normalizing can include removing backgrounds from the images.
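The normalization steps above (rescaling to a standard size and normalizing luminance) can be sketched on a simple grayscale image represented as a 2D list. This is a conceptual illustration only; the rescale method (nearest neighbor) and the target mean luminance are assumptions.

```python
def rescale(img, w, h):
    """Nearest-neighbor rescale of a 2D list of pixel values to w x h."""
    src_h, src_w = len(img), len(img[0])
    return [[img[r * src_h // h][c * src_w // w] for c in range(w)]
            for r in range(h)]

def normalize_luminance(img, target_mean=128.0):
    """Linearly scale pixel values so the mean luminance hits target_mean."""
    flat = [p for row in img for p in row]
    mean = sum(flat) / len(flat)
    gain = target_mean / mean if mean else 1.0
    return [[min(255, int(p * gain)) for p in row] for row in img]

dark = [[40, 60], [80, 100]]          # a dim 2x2 "image"
resized = rescale(dark, 4, 4)          # standard size
bright = normalize_luminance(resized)  # brightness brought to mid-gray
```

In practice an image library would perform these operations, with background removal as an additional segmentation step.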
- the process 200 determines descriptive metadata for items in the images obtained at 203 .
- the metadata can be extracted from text of the descriptive information received with images.
- the text can be obtained by character recognition or included in a data file (e.g., a JSON file).
- the text can be recognized by performing a bag-of-words search for predetermined words and phrases or by using natural language processing.
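The bag-of-words search described above can be sketched as a scan of the description text against a predetermined vocabulary. The vocabulary and field names are illustrative assumptions drawn from the examples earlier in this section.

```python
import re

# vocabulary of words to search for; illustrative, not from the patent
VOCAB = {
    "type":     ["pants", "top", "skirt", "dress", "jeans", "shoes"],
    "material": ["silk", "cotton", "wool", "polyester"],
    "season":   ["spring", "summer", "winter", "fall", "holiday"],
    "use":      ["work", "formal", "casual"],
}

def extract_metadata(description):
    """Bag-of-words scan: record every vocabulary word found in the text."""
    words = set(re.findall(r"[a-z]+", description.lower()))
    return {field: [w for w in terms if w in words]
            for field, terms in VOCAB.items()}

meta = extract_metadata("Casual cotton summer skirt, knee length")
```

A natural-language-processing approach, also mentioned above, could instead parse phrases and handle synonyms that a fixed word list would miss.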
- the process 200 determines categorization metadata for the images normalized at 211 .
- the categorization metadata can be determined by analyzing the images using computer intelligence to categorize the images based on color and type. For example, based on a color model, the items in the images can be categorized into one or more of a number (e.g., 26) of predetermined color categories.
- the apparel in the images can be categorized into one or more apparel types (e.g., shirt, jacket, pants, skirt, dress, accessory, etc.). Further, for each of the color model and/or the type model, the process 200 can determine a confidence value associated with each of the determined colors and types. The confidence value can represent a probability that the determined color or type is correct.
- the process 200 stores the categorization metadata determined at 215 in association with the respective images normalized at 207 .
- if the images are obtained from a retailer (e.g., retailers 10 ), the images and the respective metadata can be stored in retailer data (e.g., retailer data 127 in storage system 125 ), and can be associated together in a virtual closet (e.g., virtual closet 128 ).
- if the images are obtained from a user (e.g., users 15 ), the images and the respective metadata can be stored in user data (e.g., user data 129 in storage system 125 ), and can be associated together in a virtual closet (e.g., virtual closet 130 ).
- FIG. 3 illustrates exemplary data structures in accordance with aspects of the present disclosure.
- the data structures include images 301 and metadata 303 .
- the images 301 and the metadata 303 can be the same or similar to those previously disclosed herein.
- an image 301 may be of a skirt that is associated with descriptive information.
- the metadata 303 can include information extracted from the description (e.g., by intake module 151 ) and information extracted from the image 301 (e.g., by categorization module 153 ).
- the metadata 303 can include information relating the item in the image 301 to one or more other items (“related”) determined by the matching module 155 .
- FIG. 4 illustrates a flow diagram of an exemplary process 400 for determining color categorization metadata in accordance with aspects of the present disclosure.
- the process 400 determines a color model.
- determining the color model includes defining a number (e.g., a set) of predetermined color categories at 4021 .
- defining the color model includes providing the model, at 4023 , with color samples corresponding to each of the categories defined at 4021 .
- defining the color model includes generating color maps at 4025 corresponding to each of the categories defined at 4021 using the color samples provided at 4023 .
- the color maps define a combination of color ranges (e.g., Cr, Cb) corresponding to each of the predetermined color categories in a color scheme (e.g., chroma subsampling). Further, each of the color maps is indexed by a corresponding luminance value (e.g., Y) of the respective color sample.
- the process 400 trains the color model defined at 402 .
- training includes providing reference images (e.g., stored in reference data 133 ) having known colors and luminance values.
- the color model defined at 402 and trained at 409 is applied to images obtained from one or more retailers (e.g. retailers 10 ) and one or more users (e.g., users 15 ), which determines the colors in the images (e.g., Cr, Cb), as well as a respective luminance value (Y) for each of the colors.
- the determination is performed for one or more pixel areas included in each of the images.
- FIG. 5 illustrates a functional flow diagram of an exemplary process 500 for determining color categorization metadata in accordance with aspects of the present disclosure.
- the process 500 may be the same or part of the process 400 previously described.
- the process 500 categorizes the color of an item (e.g., an apparel item, such as a top) using the color maps of a color model, which may be the same as those previously described herein.
- the color model comprises a database of color maps (e.g., mapping Cr versus Cb) indexed by their respective luminance values (Y).
- the process 500 categorizes the color of a particular item in an image by determining the color (Cr, Cb) and luminance value (Y) (e.g., 501 , 503 , or 505 ) of an area of the image (e.g., a pixel or a co-located group of several pixels). Based on the luminance value (Y) of the area, the process 500 retrieves a color map 507 from the database having the closest (e.g., the same or nearest) luminance value.
- the process 500 determines the category of color (e.g., green, blue, red) corresponding to areas 511 , 513 , 515 (e.g., each having a range of Cr values vs. a range of Cb values) of the categories within the color map 507 .
- the color 501 may have Cr, Cb values corresponding to a position within the “green” area 511 of the color map 507 .
- the process 500 selects the predetermined color category (e.g., “green”) corresponding to the color range in the color map 507 .
- the process 500 can assign a confidence score to the selected color category based on the nearness of the colors 501 , 503 , 505 , to the edges of the color areas 511 , 513 , 515 in the color map 507 .
- the color 501 may be in the center of the green area 511 of the color map 507 . Accordingly, the process 500 can assign a high confidence value.
- the color 503 may be on the edge of both a red region 513 and a pink region 515 of the color map 507 . Accordingly, the process 500 may categorize the color 503 as both “red” and “pink,” each having a low confidence value relative to the confidence value of color 501 .
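The luminance-indexed lookup and edge-distance confidence described above can be sketched as follows. The region boundaries and the confidence formula (distance to the nearest region edge, scaled by region size) are illustrative assumptions; the patent specifies only that confidence reflects nearness to the edges of the color areas.

```python
# Each color map is keyed by a luminance (Y) level and holds named regions,
# each a (cr_min, cr_max, cb_min, cb_max) box. All values are illustrative.
COLOR_MAPS = {
    128: [("green",   0, 110,   0, 110),
          ("red",   170, 255,   0, 130),
          ("pink",  170, 255, 130, 255)],
}

def categorize(y, cr, cb, maps=COLOR_MAPS):
    """Retrieve the map with the closest luminance, then report every
    region containing (cr, cb); confidence grows with distance from the
    region's edges, so boundary colors get low confidence in both regions."""
    level = min(maps, key=lambda k: abs(k - y))
    hits = []
    for name, cr0, cr1, cb0, cb1 in maps[level]:
        if cr0 <= cr <= cr1 and cb0 <= cb <= cb1:
            edge = min(cr - cr0, cr1 - cr, cb - cb0, cb1 - cb)
            half = min(cr1 - cr0, cb1 - cb0) / 2
            hits.append((name, round(edge / half, 2)))
    return sorted(hits, key=lambda h: h[1], reverse=True)

center = categorize(130, 55, 55)    # middle of "green": one confident hit
edge = categorize(130, 200, 130)    # red/pink boundary: two low-confidence hits
```

This mirrors the behavior of colors 501 and 503: a color deep inside one region yields a single high-confidence category, while a boundary color is tagged with both adjacent categories at low confidence.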
- FIG. 6 illustrates examples of color categories and confidence values output by the process 400 and/or process 500 in accordance with aspects of the present disclosure.
- a particular item can be associated with one or more color categories.
- a monochrome shirt 603 can belong to a single color category 611 and have a high confidence value 613 .
- a multicolor shirt 605 or a patterned shirt 607 can be associated with a number of such categories, each having a relatively lower confidence value than the monochrome item 603 .
- efficient and accurate categorizations of multiple colors can be obtained. Thereby, systems in accordance with the present disclosure can quickly match different items with great accuracy.
- FIG. 7 illustrates an exemplary flow diagram of a process 700 for determining the category of an apparel item (e.g. long sleeve shirt, etc.) in accordance with aspects of the present disclosure.
- the process 700 defines an item model.
- defining the item model includes defining a number (e.g., a set) of predetermined item categories ( 7021 ) through shape definitions, machine learning, or some combination of both.
- defining the category model ( 7025 ) includes providing the item samples ( 7023 ) corresponding to each of the categories defined in 7021 .
- defining the category model includes generating category maps corresponding to each of the categories defined in 7021 using the item sample provided in 7023 .
- the item model defined and trained at 702 is applied to images obtained from one or more retailers (e.g., retailers 10 ) or one or more users (e.g., users 15 ), which determines the item type in the images ( 709 ). Further, in embodiments, in addition to the item type determined at 709 , the model can also determine a view (e.g., front, back, side, etc.), a category of item (e.g., dress, top, etc.), and/or other metadata (e.g., stripes, sleeve length, etc.).
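The representation-based matching described for the item model (building representative samples of categories and matching against them) could be sketched as a nearest-representative classifier. The feature vectors here (e.g., a height/width ratio and a symmetry score) and all values are hypothetical; real features would come from computer vision.

```python
import math

# Representative feature vectors per category, e.g. (height/width ratio,
# symmetry score). Purely illustrative features and numbers.
REPRESENTATIVES = {
    "dress": (2.2, 0.90),
    "top":   (1.0, 0.80),
    "pants": (1.8, 0.95),
}

def classify(features, reps=REPRESENTATIVES):
    """Nearest-representative match; confidence shrinks with distance."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    best = min(reps, key=lambda cat: dist(features, reps[cat]))
    return best, round(1.0 / (1.0 + dist(features, reps[best])), 2)

label, conf = classify((2.1, 0.88))
# an item proportioned like a dress lands nearest the "dress" representative
```

The feature matcher mentioned earlier would complement this by checking specific clothing features (sleeves, buttons, stripes) directly.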
- FIG. 8 illustrates examples of outputs from the process 700 shown in FIG. 7 .
- the output of the item model (e.g., at 709 ) can include one or more of the predefined item categories (e.g., defined at 7021 ), each with a confidence value (e.g., −1.5 to −0.5).
- FIG. 9 illustrates a flow diagram of a process 900 for matching items and making recommendations in accordance with aspects of the present disclosure.
- the process 900 (e.g., processor 140 executing matching module 155 ) queues requests (e.g., styling requests 85 ) to the styling system (e.g., styling system 25 ) from a user (e.g., one of users 15 ).
- each request can include one or more items that the user wishes a stylist to use as a basis for making a recommendation, and can be further used for matching items or similar items.
- the metadata generated through apparel and color categories can be used to aid the stylist to quickly find style matches to the item or items in the user's request. If the stylist is a virtual stylist, the metadata is used to create recommendations to the user.
- all incoming requests from users can be placed into a request queue from which a number of stylists (e.g., stylists 20 ) can access the requests.
- requests may first pass to stylists of retailers (e.g., one of retailers 10 ) before going to a stylist of the styling system or ultimately a virtual stylist.
- the stylist of the styling system may be able to respond to the user's request, but only after a predetermined amount of time passes to give stylists of the retailers a chance to respond, and thereby not overload stylists of the styling system.
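The retailer-first routing with a waiting period could be sketched as a queue whose entries become visible to system stylists only after a grace period. The grace-period length, class names, and claim semantics are assumptions for illustration.

```python
import heapq
import itertools
import time

GRACE_SECONDS = 300  # assumed delay before system stylists may claim

class RequestQueue:
    """Requests are claimable by retailer stylists immediately and by
    system (or virtual) stylists only after a grace period, as described
    above, so that retailer stylists get the first chance to respond."""

    def __init__(self):
        self._heap = []                 # (submitted_at, seq, request)
        self._ids = itertools.count()   # tie-breaker for equal timestamps

    def submit(self, request, now=None):
        now = time.time() if now is None else now
        heapq.heappush(self._heap, (now, next(self._ids), request))

    def claim(self, stylist_kind, now=None):
        now = time.time() if now is None else now
        if not self._heap:
            return None
        submitted, _, request = self._heap[0]
        if stylist_kind != "retailer" and now - submitted < GRACE_SECONDS:
            return None   # grace period still reserved for retailer stylists
        heapq.heappop(self._heap)
        return request

q = RequestQueue()
q.submit("style my camo boots", now=0)
a = q.claim("system", now=60)      # too early for a system stylist
b = q.claim("retailer", now=60)    # a retailer stylist may claim at once
```

This keeps system stylists from being overloaded while still guaranteeing every request is eventually claimable.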
- a user may be able to explicitly make a request of a specific retailer, regardless of stylist.
- the user can explicitly make a request of a specific stylist regardless of retailer or stylist of the styling system.
- the process establishes a link between the user and a stylist.
- a stylist at a retailer can assist a user to create an account with the styling system.
- the stylist can link his/her stylist account (e.g., stored in stylist data 131 ) to the user such that the user may perform a direct request.
- the user can make a general request without specifically directing the request to a stylist or a retailer; the user request may be placed into the request queue. Multiple stylists may be able to respond, but it may be the case that only a single stylist can do so.
- the user can preselect a number (e.g., 5) of favorite retailers to which the user will be linked by the styling system.
- the styling system automatically chooses the retailers for the user based on brands included in the closet of the user (e.g., stored in user data 129 ). For example, the styling system can choose the 5 least busy retailers (e.g., based on information in retailer data 127 ); or the system can choose a first set of retailers based on the user's closet and 2 additional retailers randomly to expand the user's awareness of other brands and/or retailers. If no retailer stylists are available, the system would automatically choose the stylist of the styling system or a virtual stylist, if available.
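The retailer-selection logic above (closet brands first, least busy preferred, plus random extras for discovery) can be sketched as follows; the split between primary and extra slots is an illustrative assumption.

```python
import random

def choose_retailers(closet_brands, retailer_load, n_primary=3, n_extra=2):
    """Pick retailers whose brands appear in the user's closet (least busy
    first), then pad with random others to broaden the user's exposure
    to new brands and retailers."""
    matching = sorted((r for r in retailer_load if r in closet_brands),
                      key=retailer_load.get)[:n_primary]
    others = [r for r in retailer_load if r not in matching]
    extras = random.sample(others, min(n_extra, len(others)))
    return matching + extras

# hypothetical retailers with their current request load
load = {"A": 2, "B": 9, "C": 1, "D": 5, "E": 0}
picked = choose_retailers(closet_brands={"A", "B", "C"}, retailer_load=load)
# the least busy closet-brand retailers ("C", then "A") always lead the list
```

A fallback to the styling system's own stylist or a virtual stylist would apply when `picked` contains no retailer with an available stylist.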
- the stylist using the system can determine items in the user's wardrobe and/or the retailer's inventory that match the requested item based on a comparison of their respective metadata.
- the styling system can generate a list of one or more items for reference by the stylist.
- the stylist may use the generated metadata of the user's wardrobe and/or retailer's inventory to search and find style matches to the item or items in the user's request in a time-efficient manner.
- the styling system may provide a confidence-ranked listing of matching clothing from the retailers or the personal wardrobe of the user.
- the process provides recommendations from the stylist to the user based on style matches determined at 907 .
- the stylist may have access to the user's information.
- the stylist can view the user profile, which may include: body size and dimensions, preferred brands, preferred price range, closet items, and saved outfits; a saved outfit may also provide alternatives within the outfit.
- for example, the stylist may provide additional options for shirts in addition to the shirt that defines the outfit. If the user does not specify an item to build around, the stylist may still provide alternatives.
- the stylist may add commentary onto the outfit that is being created. The stylist may save the outfit whereupon it may be added to the user's virtual closet for later review by the user (which may kick off a notification to the user).
- a retailer may only be able to recommend pairings involving items from the user's closet or the retailer's own offerings. While the platform supports openness, retailers' preferences may be factored into the styling system, as retailers may not want to promote other retailers' products (e.g., one retailer's employees would be perusing the catalogs of other retailers). Additionally, the stylists may not profit on such an offering, so there may be little financial incentive.
- the sharing mechanism may resemble a social messaging scheme, with one-way following. However, the person being followed may be able to block followers (and therefore may need to be able to see a list of people following them).
- a user may also be able to follow other people. When looking at a person, a stream of posted items may be available (may involve browsing their closet, or a date sorted list of what they shared, or other). Messaging between users may be a feature (i.e., outside of comments). Thus, a user may share a whole wish list or suitcase, or may share just items/outfits.
- a further embodiment to the sharing process can be to allow users to perform styling system operations on other users without the need for retailer stylists, styling system stylists, or virtual stylist.
- the process 900 may proceed as follows.
- the user owns a pair of camo boots.
- the user can photograph the boots using a camera on their mobile phone and upload the photograph to the user's closet.
- the user can then send a request to the styling system stating, “I would like to wear these boots out on Friday night in the city. Can you help me create some outfits around them? They are a little out of my comfort zone so I am looking for something fun and cute but comfortable.”
- the user can receive confirmation from the styling system that their request has been sent.
- a stylist can be notified that a request from the user has been submitted. The stylist then reads the request and sees the item the user wants styled around.
- the stylist reads the user's profile and browses the user's closet to get a better idea of the user's style. The stylist then creates outfits based on items already in the user's closet (assuming the user has a fairly full closet) and then creates outfit options around entirely new items based on the brand preferences in the user's profile. The stylist may send her “favorite” outfit back first with comments on the back: “I chose this look because the style is really comfortable and feminine, and the shirt is subtle, allowing the boots to complete the outfit without being too flashy.” The stylist may have the option of saving the outfit to their closet before sending it to the user.
- the stylist sends the outfits to the user and receives a notification that “outfits delivered successfully.” Another box appears prompting “style Dakota1?” The stylist moves on to the next user and repeats the above process. The user receives a notification: “you've been styled!” The user may be immediately taken to the recommended new outfits.
- In one embodiment, the user may see the first outfit and flip it over for more information. She reads what the stylist's inspiration was behind the outfit. She can now scroll through the different outfits that she was given. The user then stops on outfit number 4 and isn't crazy about the shirt suggested. The user clicks on the photo of the shirt and selects the swap button. A new shirt is recommended.
- the swapping can happen by examining the metadata and performing an automatic swap based on the style matches generated by the system, or by returning to the stylist, who performs the swap. If the user doesn't like the result, she can decide to swap again. This time the user remembers that a shirt she already owns may look cute. She selects “my closet,” then “shirts,” and selects the shirt she already owns. This is now the new outfit, and she clicks save. The user has the option to “keep” or “delete” any of the outfits suggested and the items within them. If the user deletes an item from an outfit, she will be warned that doing so creates an incomplete outfit. The system may prompt, “Would you like to add a new item now?” If she chooses yes, she goes through options from her closet or the catalog. If the user chooses no, an icon will appear in the corner indicating an incomplete outfit.
- the process 900 may proceed as follows.
- the stylist may be a professional fashion consultant employed as a personal shopper for a large fashion retailer.
- the stylist may have a lot of repeat and loyal customers, but most of them are from out of town.
- the stylist can upload a customer's purchase history into that customer's closet in the styling system. If the retailer introduces a new item for sale (e.g., a new navy blazer), the stylist may determine that the new item is one that the user would enjoy and that would fit well in her closet.
- the stylist can then access the user's closet and start browsing through her items.
- the stylist creates outfits based on the new item.
- the stylist can save one or more such outfits before sending them to the user.
- the user receives a notification stating that “your stylist has created outfits for you!”
- the user opens the outfits and flips over the first outfit created, reading the stylist's comment: “I thought this blazer would look great with the pants you recently purchased. It's perfect for a cool summer evening.”
- each block in the flowcharts, block diagrams, and/or illustrations may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s).
Abstract
Description
- This application is a continuation of U.S. patent application Ser. No. 16/531,972, filed Aug. 5, 2019, which is a continuation of U.S. patent application Ser. No. 14/806,345, filed Jul. 22, 2015, which claims priority to U.S. Provisional Patent Application No. 62/027,388, which was filed on Jul. 22, 2014, the contents of which are incorporated herein by reference in their entireties.
- Numerous attempts have been made to provide services for fashion over the Internet. For example, some services generate personalized shopping plans by collecting images of the user and creating models of the user dressed in various pieces of apparel. Other services provide a mechanism for storing a user's clothing in a virtual wardrobe closet that allows the user to gather separate articles of clothing to form a coordinated outfit. However, such services are limited because they are unidirectional. For example, they suggest items without accounting for the user's existing wardrobe. Hence, these services do not provide the fashion inspiration, validation, and empowerment that users desire.
- The present disclosure is directed to a specialized system that remotely manages wardrobes of individuals via an information network. In accordance with aspects of the present disclosure, the system obtains, via an information network, an image of an apparel item and text describing the apparel item. The system also determines descriptive metadata of the apparel item by extracting information from the text. Further, the system determines categorization metadata of the apparel item by analyzing the image. Additionally, the system stores the image, the descriptive metadata, and the categorization metadata in a virtual closet. Moreover, the system provides, via the information network, a recommendation of an outfit from a stylist to a user based on the virtual closet.
- The foregoing summary, as well as the following detailed description, is better understood when read in conjunction with the appended drawings. For the purpose of illustrating the invention, the drawings detail exemplary constructions of the invention; however, the invention is not limited to the specific methods and systems disclosed herein.
-
FIG. 1 illustrates a block diagram of an exemplary environment for implementing a system in accordance with aspects of the present disclosure; -
FIG. 2 illustrates a flow diagram of an exemplary intake process in accordance with aspects of the present disclosure; -
FIG. 3 illustrates exemplary data structures in accordance with aspects of the present disclosure; -
FIG. 4 illustrates a flow diagram of an exemplary process for determining color categorization metadata in accordance with aspects of the present disclosure; -
FIG. 5 illustrates a functional flow diagram of an exemplary process for determining color categorization metadata in accordance with aspects of the present disclosure; -
FIG. 6 illustrates exemplary color categorization metadata in accordance with aspects of the present disclosure; -
FIG. 7 illustrates a flow diagram of an exemplary process for determining a category of an item in accordance with aspects of the present disclosure; -
FIG. 8 illustrates exemplary type categorization metadata in accordance with aspects of the present disclosure; and -
FIG. 9 illustrates a flow diagram of an exemplary recommendation process in accordance with aspects of the present disclosure. - The present disclosure is generally directed to a specialized system that remotely manages wardrobes of individuals via an information network. In accordance with aspects of the present disclosure, the system stores personal wardrobes of users in virtual closets and links the users with stylists (e.g., individual or automated) that combine existing items in the personal wardrobes and/or new items that can be purchased from retailers. By using a wide-area network (e.g., the Internet) to link remotely located stylists with the personal wardrobes of users and with items (e.g., apparel, including accessories) available from retailers, systems in accordance with the present disclosure address the technical challenge of sharing information among users, retailers, and stylists over the Internet. Additionally, systems in accordance with the present disclosure address the technical challenge of transforming the information received from the users and the retailers into a new representation that enables efficient and accurate matching of items for providing fashion recommendations to users.
-
FIG. 1 illustrates a block diagram of an environment for implementing systems and processes in accordance with aspects of the present disclosure. The environment includes one or more retailers 10, one or more users 15, one or more stylists 20, and a styling system 25, each of which can be remotely located and communicatively linked via an information network (e.g., the Internet). The retailers 10 can be merchants of apparel items, which include clothes, footwear, accessories, jewelry, fashion products, and the like. The users 15 can be consumers of such apparel items. The stylists 20 can be individuals employed or contracted by the retailers 10 and/or an operator of the styling system 25 that provide recommendations to the users 15 regarding combinations of apparel items. - In embodiments, the
styling system 25 creates, maintains, and/or curates virtual closets that can mix and match apparel items from the retailers 10 with apparel items already in a wardrobe of a particular one of the users 15. For example, each of the retailers 10 can provide images (e.g., photographs) and respective descriptions 70 describing apparel items available to purchase to the styling system 25, which are stored in virtual closets (e.g., virtual closet 128). Additionally, each of the users 15 can provide images and respective descriptions 80 describing apparel items in their personal wardrobes to create virtual closets (e.g., virtual closet 130). The virtual closets can be stored at the styling system 25 or locally at the retailers 10 and users 15. - In accordance with aspects of the present disclosure, the users 15 can populate their virtual closets with purchases from the
retailers 10 based on recommendations provided by one or more of the stylists 20 using information generated by the styling system 25. Moreover, in response to styling requests 85 from the users 15, the retailers 10 and/or stylists 20 can push (e.g., periodically suggest) recommendations 90 to the users 15 based on new arrivals and/or updates to retailer-specific items. - In accordance with aspects of the present disclosure, the
styling system 25 includes hardware and software that perform the processes and functions described herein. In particular, the styling system 25 includes a computing device 120, an input/output (I/O) device 122, and a storage system 125. The I/O device 122 can include any device that enables the retailers 10, the users 15, and/or the stylists 20 to interact with the computing device 120 (e.g., a user interface) and/or any device that enables the computing device 120 to communicate with the retailers 10, the users 15, and/or the stylists 20 using any type of communications link. In embodiments, the communications link includes a wide-area network, such as the Internet. The I/O device 122 can be, for example, a touchscreen display, pointer device, keyboard, etc. - The
storage system 125 can comprise a computer-readable, non-volatile hardware storage device that stores information and computer-readable program instructions. For example, the storage system 125 can be one or more flash drives and/or hard disk drives. In accordance with aspects of the present disclosure, the storage system 125 stores retailer data 127, user data 129, stylist data 131, and reference data 133. In accordance with aspects of the present disclosure, the retailer data 127 and the user data 129 include virtual closets 128 and 130, respectively. - In accordance with aspects of the present disclosure, the information stored in the
retailer data 127 and user data 129 can be updated, managed, and/or curated by their respective retailers 10 and/or users 15 (via, e.g., I/O interface 122). For each user 15, the user data 129 can include items owned by the user. Additionally, the information stored in user data 129 can include one or more wish lists for items that the user does not yet own, but has identified for purchase. For example, the user can selectively add items to the wish list based on a recommendation from a stylist 20 and/or based on personal selections from online catalogs of the retailers 10. Further, the information stored in user data 129 can include collections (e.g., virtual suitcases) of items selected by the user as belonging to a common theme (e.g., items for a beach vacation). Still further, the information stored in the user data 129 can include matched sets (e.g., virtual outfits), which are collections of clothing items that are brought together to form combinations with some fashion sense. Note that a matched set may comprise both owned and new items and still be “saved” to the user's closet. The matched sets can include recommended outfits, which are outfits that may be recommended to the user 15 by a stylist 20. If the user 15 likes a recommended outfit, they may be able to save it to their closet. - In embodiments, the
computing device 120 includes one or more processors 140 (e.g., microprocessor, microchip, or application-specific integrated circuit), one or more memory devices 141 (e.g., RAM and ROM), one or more I/O interfaces 143, and one or more network interfaces 145. The memory device 141 can include a local memory (e.g., a random access memory and a cache memory) employed during execution of program instructions. Additionally, the computing device 120 includes at least one communication channel (e.g., a data bus) by which it communicates with the I/O device 122 and the storage system 125. - The
processor 140 executes computer-readable program instructions for an operating system and application programs, which can be stored in the memory device 141 and/or storage system 125. Moreover, the processor 140 can execute computer-readable program instructions stored in the memory devices 141 or the storage system 125 for an intake module 151, a categorization module 153, and a matching module 155. In accordance with aspects of the present disclosure, the intake module 151 can receive images from, e.g., the retailers 10 and the users 15, and process the images to, e.g., crop the images, rescale the images, remove backgrounds from the images, and normalize the light and/or coloring of the images. In embodiments, removing a background comprises entirely removing elements in an image that are not part of an apparel item depicted in the image such that only the apparel item and a substantially uniform (e.g., monochrome) background remain in the image. - In accordance with aspects of the present disclosure, the
categorization module 153 can process images received by the intake module 151 to generate metadata (e.g., tags) that describes, classifies, and/or categorizes the images. In embodiments, the categorization module 153 may use computer vision to automatically generate the metadata from received images. For example, given an image of a shirt, the categorization module 153 may analyze the image and output tags, such as shirt, long-sleeve, button-down, vertical stripes, etc. The categorization module 153 can include both a feature matcher that looks at specific clothing features as well as a representation-based matcher that utilizes a catalog of clothing to build representative samples of different categories of clothing to assist in matching. - In accordance with aspects of the present disclosure, the
matching module 155 can determine matches between items contained in the retailer data 127 and/or the user data 129. For example, for an item identified by a user 15 in a styling request 85, the matching module 155 may provide a confidence-ranked selection of matching clothing from the retailers 10 or the personal wardrobe of the user 15. The matching can be performed manually by the stylists 20 and/or automatically by the matching module 155. In embodiments, the logic of the matching module 155 goes beyond rule-based decisions. For example, the matching module 155 can take into account combined colors and patterns. Further, the matching module 155 can be built to understand current and seasonal fashion trends to ensure the suggestions are up to date. Moreover, the matching module 155 can take into account both the user's personal wardrobe and retail catalogs to match items that are wholly owned by the user, items that are completely new, or any combination in-between. - The
computing device 120 can comprise any general purpose computing article of manufacture capable of executing computer program instructions installed thereon (e.g., a personal computer, server, etc.). However, the computing device 120 is only representative of various possible equivalent-computing devices that can perform the processes described herein. To this extent, in embodiments, the functionality provided by the computing device 120 can be any combination of general and/or specific purpose hardware and/or computer program instructions. In each embodiment, the program instructions and hardware can be created using standard programming and engineering techniques, respectively. -
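The representation-based matcher described above, which builds representative samples of clothing categories from a catalog, can be sketched as a nearest-centroid classifier. This is a minimal illustration under stated assumptions: catalog images are presumed already reduced to numeric feature vectors, and all feature values and names are hypothetical, not taken from the disclosure.

```python
# Illustrative nearest-centroid sketch of a representation-based matcher.
def centroid(vectors):
    # Average a list of equal-length feature vectors.
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def build_prototypes(catalog):
    # catalog: {category: [feature vectors of labeled catalog items]}
    return {cat: centroid(vecs) for cat, vecs in catalog.items()}

def classify(vector, prototypes):
    # Pick the category whose representative sample is nearest.
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(prototypes, key=lambda cat: dist2(vector, prototypes[cat]))

# Hypothetical two-feature catalog for illustration only.
catalog = {
    "shirt": [[1.0, 0.1], [0.9, 0.2]],
    "pants": [[0.1, 1.0], [0.2, 0.9]],
}
prototypes = build_prototypes(catalog)
```

In practice, such representative samples would be learned from many labeled catalog images, but the lookup step would have the same shape.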
FIG. 2 illustrates an exemplary flow diagram of an intake process in accordance with aspects of the present disclosure. At 203, the process 200 (which may be, e.g., performed by execution of intake module 151 by processor 140) obtains item images and associated description data. The images can be obtained from one or more retailers (e.g., retailers 10) and/or one or more users (e.g., users 15). The description data can be information including item name, gender, type (e.g., pants, tops, skirt, active wear, bags, blazers, suits, coats and jackets, dresses, intimates and sleepwear, jeans, jewelry, jumpsuits and rompers, shoes, shorts, sweaters, etc.), use category (e.g., work, formal, casual, etc.), material (e.g., silk, cotton, wool, polyester), season (e.g., spring, summer, winter, fall, holiday, etc.), and measurements (e.g., size and/or physical dimensions). - At 207, the process 200 (e.g., by execution of the intake module 151) normalizes the images obtained at 203. In embodiments, the normalizing includes cropping and/or scaling the images to a standard size. Additionally, the normalizing can include normalizing the luminance of the images. Further, the normalizing can include removing backgrounds from the images.
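The normalization steps at 207 (cropping, scaling to a standard size, and luminance normalization) can be sketched as follows. This is a minimal illustration operating on nested lists of grayscale pixel values rather than real image files; the helper names are hypothetical.

```python
def center_crop_square(img):
    # Crop the longer dimension to a centered square.
    h, w = len(img), len(img[0])
    side = min(h, w)
    top, left = (h - side) // 2, (w - side) // 2
    return [row[left:left + side] for row in img[top:top + side]]

def rescale(img, size):
    # Nearest-neighbor rescale to a standard size x size image.
    h, w = len(img), len(img[0])
    return [[img[r * h // size][c * w // size] for c in range(size)]
            for r in range(size)]

def normalize_luminance(img, target_mean=128):
    # Shift pixel values so the image's mean luminance hits a target level.
    flat = [p for row in img for p in row]
    shift = target_mean - sum(flat) / len(flat)
    return [[max(0, min(255, round(p + shift))) for p in row] for row in img]

# Example: a 6x8 "image" standardized to a 4x4 thumbnail.
standardized = normalize_luminance(rescale(center_crop_square(
    [[50] * 8 for _ in range(6)]), 4))
```

A production intake pipeline would use a real imaging library and also remove backgrounds, which this sketch omits.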
- At 211, the process 200 (e.g., by execution of the categorization module 153) determines descriptive metadata for items in the images obtained at 203. In embodiments, the metadata can be extracted from text of the descriptive information received with images. For example, the text can be obtained by character recognition or included in a data file (e.g., a JSON file). In embodiments, the text can be recognized by performing a bag-of-words search for predetermined words and phrases or by using natural language processing.
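The bag-of-words search for predetermined words and phrases described at 211 can be sketched as follows; the vocabulary below is a small hypothetical sample for illustration, not the disclosure's actual word list.

```python
# Predetermined vocabulary, grouped by metadata facet (hypothetical sample).
VOCABULARY = {
    "type": {"pants", "skirt", "dress", "blazer", "shirt"},
    "material": {"silk", "cotton", "wool", "polyester"},
    "season": {"spring", "summer", "winter", "fall"},
    "use": {"work", "formal", "casual"},
}

def extract_metadata(description):
    # Normalize the text to lowercase words and intersect each facet's
    # predetermined vocabulary with the words actually present.
    words = {w.strip(".,!?").lower() for w in description.split()}
    return {facet: sorted(words & terms) for facet, terms in VOCABULARY.items()}

meta = extract_metadata("Casual cotton skirt, perfect for summer.")
```

Natural language processing, as also mentioned at 211, could replace this simple set intersection while producing the same kind of faceted tags.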
- At 215, the process 200 (e.g., by execution of the
categorization module 153 by the processor 140) determines categorization metadata for the images normalized at 207. In embodiments, the categorization metadata can be determined by analyzing the images using computer intelligence to categorize the images based on color and type. For example, based on a color model, the items in the images can be categorized into one or more of a number (e.g., 26) of color categories. Additionally or alternatively, based on a type model, the apparel in the images can be categorized into one or more apparel types (e.g., shirt, jacket, pants, skirt, dress, accessory, etc.). Further, for each of the color model and/or the type model, the process 200 can determine a confidence value associated with each of the determined colors and types. The confidence value can represent a probability that the determined color or type is correct. - At 219, the
process 200 stores the categorization metadata determined at 215 in association with the respective images normalized at 207. Where the images are obtained from a retailer (e.g., retailers 10), the images and the respective metadata can be stored in retailer data (e.g., retailer data 127 in storage system 125), and can be associated together as a virtual closet (e.g., virtual closet 128). Where the images are obtained from a user, the images and the respective metadata can be stored in user data (e.g., user data 129 in storage system 125), and can be associated together as a virtual closet (e.g., virtual closet 130). -
FIG. 3 illustrates exemplary data structures in accordance with aspects of the present disclosure. In embodiments, the data structures include images 301 and metadata 303. The images 301 and the metadata 303 can be the same as or similar to those previously disclosed herein. For example, an image 301 may be of a skirt that is associated with descriptive information. The metadata 303 can include information extracted from the description (e.g., by intake module 151) and information extracted from the image 301 (e.g., by categorization module 153). Additionally, the metadata 303 can include information relating the item in the image 301 to one or more other items (“related”) determined by the matching module 155. While FIG. 3 only shows details of one image and a corresponding metadata record for the sake of illustration, it is understood that there can be a large number of images and metadata records (e.g., tens, hundreds, thousands, etc.) corresponding to each user (e.g., users 15). -
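One plausible shape for a single metadata record 303 associated with an image 301 is sketched below, combining description-derived fields, image-derived categorization tags with confidence values, and links to related items. The field names and identifiers are illustrative assumptions, not taken from the disclosure.

```python
# Hypothetical metadata record for one apparel item image.
metadata = {
    "item_id": "user15-0042",
    "from_description": {           # extracted by the intake module
        "type": "skirt",
        "material": "cotton",
        "season": "summer",
    },
    "from_image": {                 # generated by the categorization module
        "colors": [("green", 0.9), ("blue", 0.3)],  # (category, confidence)
        "category": ("skirt", 0.8),
    },
    "related": ["user15-0007", "retailer10-0311"],  # matched items
}
```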
FIG. 4 illustrates a flow diagram of an exemplary process 400 for determining color categorization metadata in accordance with aspects of the present disclosure. At 402, the process 400 determines a color model. In embodiments, determining the color model includes defining a number (e.g., a set) of predetermined color categories at 4021. Additionally, defining the color model includes providing, at 4023, color samples to the model corresponding to each of the categories defined at 4021. Further, defining the color model includes generating, at 4025, color maps corresponding to each of the categories defined at 4021 using the color samples provided at 4023. In accordance with aspects of the present disclosure, the color maps define a combination of color ranges (e.g., Cr, Cb) corresponding to each of the predetermined color categories in a color scheme (e.g., chroma subsampling). Further, each of the color maps is indexed by a corresponding luminance value (e.g., Y) of the respective color sample. - At 405, the
process 400 trains the color model defined at 402. In embodiments, training includes providing reference images (e.g., stored in reference data 133) having known colors and luminance values. At 409, the color model defined at 402 and trained at 405 is applied to images obtained from one or more retailers (e.g., retailers 10) and one or more users (e.g., users 15), which determines the colors in the images (e.g., Cr, Cb), as well as a respective luminance value (Y) for each of the colors. In embodiments, the determination is performed for one or more pixel areas included in each of the images. -
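Determining the (Y, Cb, Cr) values for a pixel area can start from an ordinary RGB pixel using the standard BT.601 conversion, which is sketched below. The conversion formula itself is standard; its use here as the front end of the color-map lookup is an illustrative assumption.

```python
def rgb_to_ycbcr(r, g, b):
    # BT.601 full-range conversion from RGB to a luminance value (Y)
    # and chroma components (Cb, Cr), the representation on which the
    # color maps described above are keyed.
    y = 0.299 * r + 0.587 * g + 0.114 * b
    cb = 128 - 0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 128 + 0.5 * r - 0.418688 * g - 0.081312 * b
    return y, cb, cr
```

Because Y is separated from the chroma pair, the same garment photographed under brighter or dimmer light lands in a different luminance-indexed map while its (Cb, Cr) position stays comparable.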
FIG. 5 illustrates a functional flow diagram of an exemplary process 500 for determining color categorization metadata in accordance with aspects of the present disclosure. The process 500 may be the same as or part of the process 400 previously described. The process 500 categorizes the color of an item (e.g., an apparel item, such as a top) using the color maps of a color model, which may be the same as those previously described herein. In embodiments, the color model comprises a database of color maps (e.g., mapping Cr versus Cb) indexed by their respective luminance values (Y). Thus, in accordance with aspects of the present disclosure, the process 500 categorizes the color of a particular item in an image by determining the color (Cr, Cb) and luminance value (Y) (e.g., 501, 503, or 505) of an area of the image (e.g., a pixel or a co-located group of several pixels). Based on the luminance value (Y) of the area, the process 500 retrieves a color map 507 from the database having the closest (e.g., the same or nearest) luminance value. Using the retrieved color map 507, the process 500 determines the category of color (e.g., green, blue, red) corresponding to areas 511, 513, and 515 of the color map 507. For example, the color 501 may have Cr, Cb values corresponding to a position within the “green” area 511 of the color map 507. Accordingly, the process 500 selects the predetermined color category (e.g., “green”) corresponding to the color range in the color map 507. - Additionally, in accordance with aspects of the present disclosure, the
process 500 can assign a confidence score to the selected color category based on the nearness of the colors 501, 503, and 505 to the color areas 511, 513, and 515 of the color map 507. For example, the color 501 may be in the center of the green area 511 of the color map 507. Accordingly, the process 500 can assign a high confidence value. In comparison, the color 503 may be on the edge of both a red region 513 and a pink region 515 of the color map 507. Accordingly, the process 500 may categorize the color 503 as both “red” and “pink,” each having a low confidence value relative to the confidence value of color 501. -
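The region lookup and distance-based confidence just described can be sketched as below. The (Cb, Cr) region centers and the linear confidence falloff are hypothetical choices for illustration; a real lookup would first select the color map indexed by the area's luminance value.

```python
import math

# Hypothetical (Cb, Cr) centers for a few regions of one color map.
REGIONS = {"green": (90.0, 100.0), "red": (110.0, 180.0), "pink": (125.0, 170.0)}

def categorize(cb, cr, cutoff=30.0):
    # Return every region within `cutoff` of the sampled chroma values,
    # with a confidence that falls off linearly with distance, so a
    # sample near a region center scores high while a sample on the
    # boundary of two regions matches both with lower confidence.
    matches = []
    for name, (rcb, rcr) in REGIONS.items():
        d = math.hypot(cb - rcb, cr - rcr)
        if d <= cutoff:
            matches.append((name, round(1.0 - d / cutoff, 2)))
    return sorted(matches, key=lambda t: -t[1])
```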
FIG. 6 illustrates examples of color categories and confidence values output by the process 400 and/or process 500 in accordance with aspects of the present disclosure. As is shown in FIG. 6, a particular item can be associated with one or more color categories. For example, a monochrome shirt 603 can belong to a single color category 611 and have a high confidence value 613. Further, a multicolor shirt 605 or a patterned shirt 607 can be associated with a number of such categories, each having a relatively lower confidence value than the monochrome item 603. Thus, by categorizing items using multiple areas, efficient and accurate categorizations of multiple colors can be obtained. Thereby, systems in accordance with the present disclosure can quickly match different items with great accuracy. -
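Aggregating per-area color decisions into item-level categories, as FIG. 6 suggests, can be sketched as a simple tally: a monochrome item concentrates on one category with high confidence, while a patterned item spreads across several with proportionally lower confidence. The fraction-based confidence is an illustrative assumption.

```python
from collections import Counter

def item_colors(area_categories):
    # area_categories: the winning color category for each sampled area
    # of the item image. Confidence per category is its share of areas.
    counts = Counter(area_categories)
    total = len(area_categories)
    return {color: round(n / total, 2) for color, n in counts.most_common()}

mono = item_colors(["navy"] * 8)              # monochrome item
striped = item_colors(["white", "navy"] * 4)  # patterned item
```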
FIG. 7 illustrates an exemplary flow diagram of a process 700 for determining the category of an apparel item (e.g., long sleeve shirt, etc.) in accordance with aspects of the present disclosure. At 702, the process 700 defines an item model. In embodiments, defining the item model includes defining a number (e.g., a set) of predetermined item categories (7021) through shape definitions, machine learning, or some combination of both. Additionally, defining the category model (7025) includes providing the item samples (7023) corresponding to each of the categories defined at 7021. Further, defining the category model includes generating category maps corresponding to each of the categories defined at 7021 using the item samples provided at 7023.
-
FIG. 8 illustrates examples of outputs from the process 700 shown in FIG. 7. In embodiments, the output of the item model (e.g., at 709) is a mapping between the predefined item categories (e.g., at 7021) and a confidence value (e.g., −1.5 to −0.5) for a particular item in an image as belonging to each of the item categories (e.g., bottom, dress, long coat, skirt, shorts, etc.). -
FIG. 9 illustrates a flow diagram of a process 900 for matching items and making recommendations in accordance with aspects of the present disclosure. At 903, the process 900 (e.g., processor 140 executing matching module 155) queues requests (e.g., styling requests 85) to the styling system (e.g., styling system 25) from a user (e.g., one of users 15). Each request can include one or more items that the user wishes a stylist to use as a basis for making a recommendation, and these items can be further used for matching identical or similar items. The metadata generated through apparel and color categories can be used to aid the stylist in quickly finding style matches to the item or items in the user's request. If the stylist is a virtual stylist, the metadata is used to create recommendations to the user. In embodiments, all incoming requests from users can be placed into a request queue from which a number of stylists (e.g., stylists 20) can access the requests. In other embodiments, requests may first pass to stylists of retailers (e.g., one of retailers 10) before going to a stylist of the styling system or, ultimately, a virtual stylist. For example, the stylist of the styling system may be able to respond to the user's request, but only after a predetermined amount of time passes to give stylists of the retailers a chance to respond, thereby not overloading stylists of the styling system. In yet another embodiment, a user may be able to explicitly make a request of a specific retailer, regardless of stylist. In another embodiment, the user can explicitly make a request of a specific stylist, regardless of retailer or the stylist of the styling system. - At 905, the process establishes a link between the user and a stylist. In embodiments, to link a stylist with a user, either the user or the stylist can enter the email address of the other. For example, a stylist at a retailer can assist a user to create an account with the styling system.
In doing so, the stylist can link his/her stylist account (e.g., stored in stylist data 131) to the user such that the user may perform a direct request. In other embodiments, the user can make a general request without specifically directing the request to a stylist or a retailer; the user request may be placed into the request queue. Multiple stylists may be able to respond, but it may be the case that only a single stylist can do so. Additionally or alternatively, the user can preselect a number (e.g., 5) of favorite retailers to which the user will be linked by the styling system. In embodiments, the styling system automatically chooses the retailers for the user based on brands included in the closet of the user (e.g., stored in user data 129). For example, the styling system can choose the 5 least busy retailers (e.g., based on information in retailer data 127); or the system can choose a first set of retailers based on the user's closet and 2 additional retailers randomly to expand the user's awareness of other brands and/or retailers. If no retailer stylists are available, the system would automatically choose the stylist of the styling system or a virtual stylist, if available.
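The automatic retailer-selection logic just described (prefer the least busy retailers carrying brands already in the user's closet, then add a couple of random others to broaden the user's awareness) can be sketched as follows; all names and the exact tie-breaking rules are hypothetical.

```python
import random

def choose_retailers(closet_brands, retailers, busyness, k=5, extras=2):
    # retailers: {retailer name: set of brands it carries}
    # busyness:  {retailer name: current request load}
    matching = [r for r in retailers if retailers[r] & set(closet_brands)]
    matching.sort(key=lambda r: busyness[r])   # least busy first
    chosen = matching[:k - extras]
    # Add a few random retailers outside the chosen set for discovery.
    rest = [r for r in retailers if r not in chosen]
    chosen += random.sample(rest, min(extras, len(rest)))
    return chosen
```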
- At 907, the stylist using the system (e.g., via matching
module 155 executed by processor 140) can determine items in the user's wardrobe and/or the retailer's inventory that match the requested item based on a comparison of their respective metadata. In embodiments, the styling system can generate a list of one or more items for reference by the stylist. In other embodiments, the stylist may use the generated metadata of the user's wardrobe and/or retailer's inventory to search and find style matches to the item or items in the user's request in a time-efficient manner. For example, the styling system may provide a confidence-ranked listing of matching clothing from the retailers or the personal wardrobe of the user. - At 909, the process provides recommendations from the stylist to the user based on style matches determined at 907. After being linked, the stylist may have access to the user's information. For example, the stylist can view the user profile which may include: body size and dimensions, preferred brands, preferred price range, closet items, and saved outfits, which may also provide alternatives in the outfit. For example, if the user sends a request to build an outfit around a pair of pants, the stylist may provide additional options for shirts in addition to the one that defines the outfit. If the user does not specify an item to build around, the stylist may still provide alternatives. The stylist may add commentary onto the outfit that is being created. The stylist may save the outfit whereupon it may be added to the user's virtual closet for later review by the user (which may kick off a notification to the user).
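The confidence-ranked listing at 907 can be sketched as a comparison of item metadata followed by a descending sort. The tag-overlap score below is a toy stand-in, since the disclosure describes matching logic that goes beyond simple rules (combined colors, patterns, trends); only the shape of the ranked output is illustrated, and all item names are hypothetical.

```python
def match_score(request_meta, candidate_meta):
    # Toy score: fraction of the request's tags shared by the candidate.
    shared = set(request_meta["tags"]) & set(candidate_meta["tags"])
    return len(shared) / len(request_meta["tags"])

def ranked_matches(request_meta, inventory):
    # inventory: {item name: metadata} drawn from the user's wardrobe
    # and/or the retailer's catalog; best matches come first.
    scored = [(match_score(request_meta, m), name)
              for name, m in inventory.items()]
    return sorted(scored, reverse=True)

request = {"tags": ["casual", "fall", "green"]}
inventory = {
    "wool sweater": {"tags": ["casual", "fall", "cream"]},
    "silk blouse": {"tags": ["formal", "spring", "green"]},
}
ranking = ranked_matches(request, inventory)
```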
- In embodiments, it may be desirable to limit the recommendation options of a stylist of a retailer. In other words, a retailer may only be able to recommend pairings involving items from the user's closet or the retailer's own offerings. While the platform supports openness, retailers' preferences may be factored into the styling system, as retailers may not want to promote other retailers' products (e.g., one retailer's employees would be perusing the catalogs of other retailers). Additionally, the stylists may not profit from such an offering, so there may be little financial incentive. Similarly, if any stylist from a given retailer can recommend items from any other retailer, then a bias problem may arise in which certain retailer stylists are selected more frequently by users than others (due to the cachet of having an outfit designed by, e.g., a famous or otherwise desirable brand or retailer, even though that stylist does not recommend apparel from that brand or retailer), thereby potentially reducing the crowd advantage.
- At 913, the user can share the recommendations with other users. The sharing mechanism may resemble a social messaging scheme with one-way following. However, the person being followed may be able to block followers (and therefore may need to be able to see the list of people following them). There may be three levels of privacy: Private, Followers, and Public. Private may mean no one else can see the content; Followers may mean only unblocked followers can see it; Public may mean it is open to the world. A user may also be able to follow other people. When looking at a person, a stream of posted items may be available (which may involve browsing their closet, a date-sorted list of what they shared, or other views). Messaging between users may be a feature (i.e., outside of comments). Thus, a user may share a whole wish list or suitcase, or may share just items/outfits. In a further embodiment of the sharing process, users may perform styling system operations on other users without the need for retailer stylists, styling system stylists, or virtual stylists.
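The three privacy levels above can be expressed as a simple visibility check. The exact semantics (e.g., whether blocking applies only to the Followers level) are an assumption for this sketch; the names and data shapes are illustrative.

```python
def can_view(owner, viewer, privacy, followers, blocked):
    """Return whether `viewer` may see content posted by `owner` under
    one of the three assumed privacy levels:
    Private   -> only the owner
    Followers -> only unblocked followers
    Public    -> open to the world
    """
    if viewer == owner:
        return True
    if privacy == "Private":
        return False
    if privacy == "Followers":
        return viewer in followers and viewer not in blocked
    return True  # Public
```

For example, a blocked follower fails the Followers check but the owner can always see their own content.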
- From the perspective of the user, the
process 900 may proceed as follows. The user owns a pair of camo boots. The user can photograph the boots using a camera on their mobile phone and upload the photograph to the user's closet. The user can then send a request to the styling system stating, “I would like to wear these boots out on Friday night in the city. Can you help me create some outfits around them? They are a little out of my comfort zone so I am looking for something fun and cute but comfortable.” In response, the user can receive confirmation from the styling system that their request has been sent. A stylist can be notified that a request from the user has been submitted. The stylist then reads the request and sees the item the user wants styled around. The stylist reads the user's profile and browses the user's closet to get a better idea of the user's style. The stylist then creates outfits based on items already in the user's closet (assuming the user has a fairly full closet) and then creates outfit options around entirely new items based on the brand preferences in the user's profile. The stylist may send her “favorite” outfit back first with comments on the back: “I chose this look because the style is really comfortable and feminine, and the shirt is subtle, allowing the boots to complete the outfit without being too flashy.” The stylist may have the option of saving the outfit to their closet before sending it to the user. The stylist sends the outfits to the user and receives a notification that “outfits delivered successfully.” Another box appears prompting “style Dakota1?” The stylist moves on to the next user and repeats the above process. The user receives a notification: “you've been styled!” The user may be immediately taken to the recommended new outfits. In one embodiment, the user may see the first outfit and flip it over for more information. She reads what the stylist's inspiration was behind the outfit. She can now scroll through the different outfits that she was given.
The user then stops on outfit number 4 and isn't crazy about the shirt suggested. The user clicks on the photo of the shirt and selects the swap button. A new shirt is recommended. The swapping can happen by examining the metadata and performing an automatic swap based on the style matches generated by the system, or by returning to the stylist, who performs the swap. Suppose the user doesn't like that suggestion either and decides to swap again. This time the user remembers that a shirt she already owns may look cute. She selects “my closet,” then “shirts,” and selects the shirt she already owns. This is now the new outfit, and she clicks save. The user has the option to “keep” or “delete” any of the outfits suggested and the items within them. The user will be warned that deleting an item from an outfit will create an incomplete outfit. The system may prompt, “Would you like to add a new item now?” If she chooses yes, then she goes through options from her closet or catalog. If the user chooses no, an icon will appear in the corner indicating an incomplete outfit. - From the perspective of the stylist, the
process 900 may proceed as follows. The stylist may be a professional fashion consultant employed as a personal shopper for a large fashion retailer. The stylist may have a lot of repeat and loyal customers, but most of them are from out of town. The stylist can upload their purchase history into their closet in the styling system. If the retailer introduces a new item for sale (e.g., a new navy blazer), the stylist may determine that the new item is one that the user would enjoy and that would fit in her closet well. The stylist can then access the user's closet and start browsing through her items. Then, using items already included in the user's closet, the stylist creates outfits based on the new item. The stylist can save one or more such outfits before sending them to the user. The user receives a notification stating that “your stylist has created outfits for you!” The user opens the outfits and flips over the first outfit to read the stylist's comment: “I thought this blazer would look great with the pants you recently purchased. It's perfect for a cool summer evening.” - The above features have been described in detail with particular reference to certain preferred embodiments thereof, but it will be understood that variations and modifications can be effected within the spirit and scope of the disclosure. One of skill in the art will appreciate that each of the above are exemplary implementations and are not to be construed as a limitation on the scope of the present disclosure.
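The metadata-based automatic swap described in the user walkthrough above could be sketched as a greedy tag-overlap replacement. This is a hypothetical illustration: the function name, data shapes, and greedy strategy are assumptions, not the claimed method.

```python
def auto_swap(outfit, reject_id, catalog, item_meta):
    """Replace a rejected outfit item with the catalog item whose metadata
    tags best overlap those of the rejected item, excluding items already
    in the outfit. `item_meta` maps item id -> set of metadata tags."""
    target = item_meta[reject_id]
    candidates = [i for i in catalog if i not in outfit]
    # Greedy choice: the candidate sharing the most tags with the reject.
    best = max(candidates, key=lambda i: len(item_meta[i] & target), default=None)
    return [best if i == reject_id else i for i in outfit]
```

In the walkthrough, the catalog could be the retailer's inventory for the first swap and the user's own closet when she chooses a shirt she already owns.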
- The flowcharts, block diagrams, and/or illustrations in the Figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present invention. In this regard, each block in the flowcharts, block diagrams, and/or illustrations may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s).
- It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems that perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
- The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used herein, the singular forms “a”, “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms “comprises” and/or “comprising,” when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
- The corresponding structures, materials, acts, and equivalents of all means or step plus function elements in any ensuing claims are intended to include any structure, material, or act for performing the function in combination with other claimed elements as specifically claimed. The description of the present invention has been presented for purposes of illustration and description, but is not intended to be exhaustive or limited to the invention in the form disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the invention. The embodiment was chosen and described in order to best explain the principles of the invention and the practical application, and to enable others of ordinary skill in the art to understand the invention for various embodiments with various modifications as are suited to the particular use contemplated.
Claims (20)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US17/478,737 US20220005101A1 (en) | 2014-07-22 | 2021-09-17 | System and method for social style mapping |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201462027388P | 2014-07-22 | 2014-07-22 | |
US14/806,345 US10373231B2 (en) | 2014-07-22 | 2015-07-22 | System and method for social style mapping |
US16/531,972 US11132734B2 (en) | 2014-07-22 | 2019-08-05 | System and method for social style mapping |
US17/478,737 US20220005101A1 (en) | 2014-07-22 | 2021-09-17 | System and method for social style mapping |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/531,972 Continuation US11132734B2 (en) | 2014-07-22 | 2019-08-05 | System and method for social style mapping |
Publications (1)
Publication Number | Publication Date |
---|---|
US20220005101A1 true US20220005101A1 (en) | 2022-01-06 |
Family
ID=55167070
Family Applications (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/806,345 Active 2037-04-29 US10373231B2 (en) | 2014-07-22 | 2015-07-22 | System and method for social style mapping |
US16/531,972 Active US11132734B2 (en) | 2014-07-22 | 2019-08-05 | System and method for social style mapping |
US17/478,737 Pending US20220005101A1 (en) | 2014-07-22 | 2021-09-17 | System and method for social style mapping |
Family Applications Before (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/806,345 Active 2037-04-29 US10373231B2 (en) | 2014-07-22 | 2015-07-22 | System and method for social style mapping |
US16/531,972 Active US11132734B2 (en) | 2014-07-22 | 2019-08-05 | System and method for social style mapping |
Country Status (1)
Country | Link |
---|---|
US (3) | US10373231B2 (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20210150287A1 (en) * | 2019-11-20 | 2021-05-20 | Samsung Electronics Co., Ltd. | Apparatus and method of using ai metadata related to image quality |
Families Citing this family (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106033195A (en) * | 2015-03-10 | 2016-10-19 | 青岛海尔洗衣机有限公司 | Intelligent clothes management device |
US20160307251A1 (en) * | 2015-04-15 | 2016-10-20 | Mastercard International Incorporated | Smart closet |
US10885809B2 (en) | 2015-05-21 | 2021-01-05 | Gammakite, Inc. | Device for language teaching with time dependent data memory |
US11704692B2 (en) * | 2016-05-12 | 2023-07-18 | Pinterest, Inc. | Promoting representations of items to users on behalf of sellers of those items |
JP6922400B2 (en) * | 2017-05-15 | 2021-08-18 | 富士通株式会社 | Fashion analysis program, fashion analyzer and fashion analysis method |
US11049166B2 (en) * | 2017-07-12 | 2021-06-29 | Glamhive, Inc. | Systems and methods for managing product recommendations and affiliate links |
US11232511B1 (en) * | 2017-11-28 | 2022-01-25 | A9.Com, Inc. | Computer vision based tracking of item utilization |
US20240037858A1 (en) * | 2022-07-28 | 2024-02-01 | Snap Inc. | Virtual wardrobe ar experience |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20110320317A1 (en) * | 2008-06-20 | 2011-12-29 | Google Inc., A Delaware Corporation | Image capture for purchases |
US20130282526A1 (en) * | 2012-04-18 | 2013-10-24 | Mastercard International Incorporated | Method and system for displaying product information on a consumer device |
US20140035913A1 (en) * | 2012-08-03 | 2014-02-06 | Ebay Inc. | Virtual dressing room |
US20140310304A1 (en) * | 2013-04-12 | 2014-10-16 | Ebay Inc. | System and method for providing fashion recommendations |
US8903182B1 (en) * | 2012-03-08 | 2014-12-02 | Google Inc. | Image classification |
US20150170250A1 (en) * | 2009-12-17 | 2015-06-18 | Navneet Dalal | Recommendation engine for clothing and apparel |
US9401032B1 (en) * | 2014-06-26 | 2016-07-26 | Amazon Technologies, Inc. | Image-based color palette generation |
US9672556B2 (en) * | 2013-08-15 | 2017-06-06 | Nook Digital, Llc | Systems and methods for programatically classifying text using topic classification |
Family Cites Families (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9768965B2 (en) | 2009-05-28 | 2017-09-19 | Adobe Systems Incorporated | Methods and apparatus for validating a digital signature |
US20110142335A1 (en) * | 2009-12-11 | 2011-06-16 | Bernard Ghanem | Image Comparison System and Method |
US8855375B2 (en) * | 2012-01-12 | 2014-10-07 | Kofax, Inc. | Systems and methods for mobile image capture and processing |
US8947528B2 (en) * | 2012-05-22 | 2015-02-03 | Eastman Kodak Company | Container-classification identification using directional-antenna RFID |
US20140180864A1 (en) * | 2012-12-20 | 2014-06-26 | Ebay Inc. | Personalized clothing recommendation system and method |
US20140279186A1 (en) * | 2013-03-13 | 2014-09-18 | Yahoo! Inc. | Digital wardrobe with recommender system |
US9372721B2 (en) * | 2013-05-09 | 2016-06-21 | Ricoh Company, Ltd. | System for processing data received from various data sources |
- 2015-07-22: US 14/806,345, published as US10373231B2 (active)
- 2019-08-05: US 16/531,972, published as US11132734B2 (active)
- 2021-09-17: US 17/478,737, published as US20220005101A1 (pending)
Patent Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20110320317A1 (en) * | 2008-06-20 | 2011-12-29 | Google Inc., A Delaware Corporation | Image capture for purchases |
US20150170250A1 (en) * | 2009-12-17 | 2015-06-18 | Navneet Dalal | Recommendation engine for clothing and apparel |
US8903182B1 (en) * | 2012-03-08 | 2014-12-02 | Google Inc. | Image classification |
US20130282526A1 (en) * | 2012-04-18 | 2013-10-24 | Mastercard International Incorporated | Method and system for displaying product information on a consumer device |
US20140035913A1 (en) * | 2012-08-03 | 2014-02-06 | Ebay Inc. | Virtual dressing room |
US20140310304A1 (en) * | 2013-04-12 | 2014-10-16 | Ebay Inc. | System and method for providing fashion recommendations |
US9672556B2 (en) * | 2013-08-15 | 2017-06-06 | Nook Digital, Llc | Systems and methods for programatically classifying text using topic classification |
US9401032B1 (en) * | 2014-06-26 | 2016-07-26 | Amazon Technologies, Inc. | Image-based color palette generation |
Non-Patent Citations (1)
Title |
---|
Cheng, Ching-I, A Decision Making Framework for Dressing Consultant, 01 April 2007, IEEE Symposium on Computational Intelligence in Multi-Criteria Decision-Making, pp. 267-271 (Year: 2007) * |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20210150287A1 (en) * | 2019-11-20 | 2021-05-20 | Samsung Electronics Co., Ltd. | Apparatus and method of using ai metadata related to image quality |
US11636626B2 (en) * | 2019-11-20 | 2023-04-25 | Samsung Electronics Co., Ltd. | Apparatus and method of using AI metadata related to image quality |
Also Published As
Publication number | Publication date |
---|---|
US20190362411A1 (en) | 2019-11-28 |
US10373231B2 (en) | 2019-08-06 |
US11132734B2 (en) | 2021-09-28 |
US20160027088A1 (en) | 2016-01-28 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20220005101A1 (en) | System and method for social style mapping | |
US10963944B2 (en) | System and method for fashion recommendations | |
US10242396B2 (en) | Automatic color palette based recommendations for affiliated colors | |
US20180308151A1 (en) | Enhancing revenue of a retailer by making a recommendation to a customer | |
KR101886161B1 (en) | Method for providing clothing management service based on ai | |
JP5443854B2 (en) | Computer-implemented method to facilitate social networking based on fashion-related information | |
US20170039628A1 (en) | Image processing method and apparatus | |
CN113454670A (en) | Extending machine learning training data to generate artificial intelligence recommendation engine | |
US11157988B2 (en) | System and method for fashion recommendations | |
KR20200046924A (en) | The Automatic Recommendation System and Method of the Fashion Coordination | |
WO2010002920A1 (en) | System and method for networking shops online and offline | |
US20090281922A1 (en) | Method and system for selecting clothing items according to predetermined criteria | |
KR102550214B1 (en) | Artificial intelligence-based styling recommendation system for body parts and situations | |
US11509712B2 (en) | Fashion item analysis based on user ensembles in online fashion community | |
US11544768B2 (en) | System and method for fashion recommendations | |
KR20200091593A (en) | The user fitting type automatic online fashion coordination matching method | |
US11526925B2 (en) | System and method for fashion recommendations | |
WO2020079235A1 (en) | Method and apparatus for accessing clothing | |
US11430043B2 (en) | System and method for fashion recommendations | |
KR20200048508A (en) | Method and appatus for providing personalized clothing information | |
CN111507790A (en) | Processing method of clothing matching information, data object processing method, system and equipment | |
NL2022937B1 (en) | Method and Apparatus for Accessing Clothing | |
US11328339B2 (en) | System and method for fashion recommendations | |
US11790429B2 (en) | Systems and methods for interpreting colors and backgrounds of maps | |
KR102512327B1 (en) | How to operate a shopping mall that provides customized denim product information |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: REMOTERETAIL, INC., DISTRICT OF COLUMBIA Free format text: CHANGE OF NAME;ASSIGNOR:SNAP+STYLE, INC;REEL/FRAME:057538/0863 Effective date: 20200609 Owner name: SNAP+STYLE, INC, DISTRICT OF COLUMBIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:JENSEN, ANNA;FERNANDEZ, RAUL;MYERS, GARY;AND OTHERS;SIGNING DATES FROM 20150817 TO 20150820;REEL/FRAME:057520/0470 |
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
STPP | Information on status: patent application and granting procedure in general |
Free format text: FINAL REJECTION MAILED |
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE AFTER FINAL ACTION FORWARDED TO EXAMINER |
STPP | Information on status: patent application and granting procedure in general |
Free format text: ADVISORY ACTION MAILED |
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |