US12175777B2 - Systems and methods for fashion accessory evaluation - Google Patents


Info

Publication number
US12175777B2
US12175777B2
Authority
US
United States
Prior art keywords
fashion accessory
replacement
fashion
accessory
area
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active, expires
Application number
US18/055,923
Other versions
US20230252805A1 (en)
Inventor
Ya Hsuan CHEN
Ming-Chun Ko
Current Assignee
Perfect Mobile Corp
Original Assignee
Perfect Mobile Corp
Priority date
Filing date
Publication date
Application filed by Perfect Mobile Corp
Priority to US18/055,923
Assigned to Perfect Mobile Corp (Assignors: CHEN, YA HSUAN; KO, MING-CHUN)
Publication of US20230252805A1
Application granted
Publication of US12175777B2
Legal status: Active
Expiration adjusted

Classifications

    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/60Type of objects
    • G06V20/66Trinkets, e.g. shirt buttons or jewellery items
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00Commerce
    • G06Q30/06Buying, selling or leasing transactions
    • G06Q30/0601Electronic shopping [e-shopping]
    • G06Q30/0641Electronic shopping [e-shopping] utilising user interfaces specially adapted for shopping
    • G06Q30/0643Electronic shopping [e-shopping] utilising user interfaces specially adapted for shopping graphically representing goods, e.g. 3D product representation
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/002D [Two Dimensional] image generation
    • G06T11/60Editing figures and text; Combining figures or text
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/77Retouching; Inpainting; Scratch removal
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/50Depth or shape recovery
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/60Analysis of geometric attributes
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • G06V10/60Extraction of image or video features relating to illumination properties, e.g. using a reflectance or lighting model
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/20Scenes; Scene-specific elements in augmented reality scenes

Definitions

  • the present disclosure generally relates to systems and methods for evaluating fashion accessories.
  • a computing device obtains an image of a user and detects at least one fashion accessory depicted in the image.
  • the computing device determines a fashion accessory category for each of the at least one detected fashion accessory and retrieves at least one candidate fashion accessory associated with the accessory category from a data store.
  • the computing device determines attributes of the fashion accessory and a replacement fashion accessory and performs virtual application of the replacement fashion accessory on the user based on the attributes of the fashion accessory and the replacement fashion accessory.
  • Another embodiment is a system that comprises a memory storing instructions and a processor coupled to the memory.
  • the processor is configured by the instructions to obtain an image of a user and detect at least one fashion accessory depicted in the image.
  • the processor is further configured by the instructions to determine a fashion accessory category for each of the at least one detected fashion accessory and retrieve at least one candidate fashion accessory associated with the accessory category from a data store.
  • the processor is further configured by the instructions to determine attributes of the fashion accessory and a replacement fashion accessory and perform virtual application of the replacement fashion accessory on the user based on the attributes of the fashion accessory and the replacement fashion accessory.
  • Another embodiment is a non-transitory computer-readable storage medium storing instructions to be implemented by a computing device.
  • the computing device comprises a processor, wherein the instructions, when executed by the processor, cause the computing device to obtain an image of a user and detect at least one fashion accessory depicted in the image.
  • the processor is further configured by the instructions to determine a fashion accessory category for each of the at least one detected fashion accessory and retrieve at least one candidate fashion accessory associated with the accessory category from a data store.
  • the processor is further configured by the instructions to determine attributes of the fashion accessory and a replacement fashion accessory and perform virtual application of the replacement fashion accessory on the user based on the attributes of the fashion accessory and the replacement fashion accessory.
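The claimed pipeline above (obtain image, detect accessories, categorize, retrieve candidates) can be sketched end to end. Everything below — the function names, the toy detector, and the in-memory data store — is an illustrative assumption, not the patent's actual implementation:

```python
# Illustrative sketch of the claimed pipeline; all names are hypothetical.

def evaluate_accessories(image, detector, data_store):
    """Detect worn accessories, fetch same-category candidates from the
    data store, and return (accessory, candidates) pairs."""
    results = []
    for accessory in detector(image):              # e.g. earrings, handbag
        category = accessory["category"]
        candidates = data_store.get(category, [])  # candidate fashion accessories
        results.append((accessory, candidates))
    return results

# Minimal stand-ins for the detector and the data store:
def toy_detector(image):
    return [{"name": "pearl earrings", "category": "earrings"}]

store = {"earrings": ["hoop earrings", "chain earrings"]}

pairs = evaluate_accessories("user.jpg", toy_detector, store)
```

The user would then select one of the returned candidates as the replacement accessory.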
  • FIG. 1 is a block diagram of a computing device configured to provide an accessory evaluation service according to various embodiments of the present disclosure.
  • FIG. 2 is a schematic diagram of the computing device of FIG. 1 in accordance with various embodiments of the present disclosure.
  • FIG. 3 is a top-level flowchart illustrating examples of functionality implemented as portions of the computing device of FIG. 1 for providing an accessory evaluation service according to various embodiments of the present disclosure.
  • FIG. 4 illustrates an example user interface generated by the computing device of FIG. 1 according to various embodiments of the present disclosure.
  • FIG. 5 illustrates another example user interface for selecting a replacement fashion accessory according to various embodiments of the present disclosure.
  • FIG. 6 illustrates determination of areas occupied by an original fashion accessory and a selected replacement fashion accessory according to various embodiments of the present disclosure.
  • FIG. 7 illustrates the region in which inpainting is applied based on areas occupied by an original fashion accessory and a selected replacement fashion accessory according to various embodiments of the present disclosure.
  • FIG. 8 illustrates virtual application of the selected replacement fashion accessory according to various embodiments of the present disclosure.
  • FIG. 9 illustrates lighting attributes associated with an original fashion accessory according to various embodiments of the present disclosure.
  • FIG. 10 illustrates duplication of the lighting attributes shown in FIG. 9 on the selected replacement fashion accessory according to various embodiments of the present disclosure.
  • FIG. 11 illustrates detected fashion accessories being displayed based on a predefined priority according to various embodiments of the present disclosure.
  • the present disclosure is directed to systems and methods for providing an accessory evaluation service that detects the presence of one or more fashion accessories currently worn by an individual depicted in an image.
  • the accessory evaluation service retrieves candidate fashion accessories based on the detected fashion accessories and presents the candidate fashion accessories to the individual. The individual is then able to select a new fashion accessory and replace the fashion accessory currently worn by the individual, thereby allowing the individual to try on new fashion accessories without actually purchasing the fashion accessories.
  • FIG. 1 is a block diagram of a computing device 102 in which the embodiments disclosed herein may be implemented.
  • the computing device 102 may comprise one or more processors that execute machine executable instructions to perform the features described herein.
  • the computing device 102 may be embodied as a computing device such as, but not limited to, a smartphone, a tablet-computing device, a laptop, and so on.
  • a fashion accessory evaluation service 104 executes on a processor of the computing device 102 and includes an import module 106 , an accessory detector 108 , an object modifier 110 , and an image editor 112 .
  • the import module 106 is configured to obtain digital images of a user wearing one or more original fashion accessories.
  • the fashion accessory evaluation service 104 allows the user to select other desired fashion accessories to try on in place of the original fashion accessories worn by the user.
  • the import module 106 is configured to cause a camera (e.g., front-facing camera) of the computing device 102 to capture an image or a video of a user of the computing device 102 .
  • the import module 106 may obtain an image or video of the user from another device or server where the computing device 102 may be equipped with the capability to connect to the Internet.
  • the images obtained by the import module 106 may be encoded in any of a number of formats including, but not limited to, JPEG (Joint Photographic Experts Group) files, TIFF (Tagged Image File Format) files, PNG (Portable Network Graphics) files, GIF (Graphics Interchange Format) files, BMP (bitmap) files or any number of other digital formats.
  • the video may be encoded in formats including, but not limited to, Motion Picture Experts Group (MPEG)-1, MPEG-2, MPEG-4, H.264, Third Generation Partnership Project (3GPP), 3GPP-2, Standard-Definition Video (SD-Video), High-Definition Video (HD-Video), Digital Versatile Disc (DVD) multimedia, Video Compact Disc (VCD) multimedia, High-Definition Digital Versatile Disc (HD-DVD) multimedia, Digital Television Video/High-definition Digital Television (DTV/HDTV) multimedia, Audio Video Interleave (AVI), Digital Video (DV), QuickTime (QT) file, Windows Media Video (WMV), Advanced System Format (ASF), Real Media (RM), Flash Media (FLV), an MPEG Audio Layer III (MP3), an MPEG Audio Layer II (MP2), Waveform Audio Format (WAV), Windows Media Audio (WMA), 360 degree video, 3D scan model, or any number of other digital formats.
  • the accessory detector 108 is configured to detect a fashion accessory currently worn by the user in the image obtained by the import module 106 .
  • the accessory detector 108 performs image semantic segmentation on original fashion accessories depicted in the image of the user and searches a data store 116 for other candidate fashion accessories 118 that fall within the same category.
  • For example, a first category (e.g., “CATEGORY 1”) can comprise one type of fashion accessory, another category can comprise handbags, and yet another category can comprise bracelets.
  • the candidate fashion accessories 118 retrieved by the accessory detector 108 are then presented to the user in a user interface displayed on the computing device 102 , thereby allowing the user to select one or more desired fashion accessories to replace fashion accessories currently worn by the user.
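A semantic-segmentation stage such as the one described above produces a per-pixel label map, from which the region occupied by each accessory can be derived. The following is a hedged, pure-Python sketch under assumed label names ("background", "necklace", "handbag"); a real segmenter would emit class IDs per pixel:

```python
# Hypothetical sketch: group pixels by their semantic-segmentation label,
# ignoring the background class, to get per-accessory regions.
from collections import defaultdict

def regions_by_category(label_map):
    """Return {label: set of (x, y) pixel coordinates} for non-background labels."""
    regions = defaultdict(set)
    for y, row in enumerate(label_map):
        for x, label in enumerate(row):
            if label != "background":
                regions[label].add((x, y))
    return dict(regions)

label_map = [
    ["background", "necklace", "necklace"],
    ["background", "background", "handbag"],
]
regions = regions_by_category(label_map)
```

Each region's label would then drive the category lookup in the data store 116.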
  • the accessory detector 108 may comprise an accessory size detection module (not shown) configured to determine size or area attributes of the fashion accessories currently worn by the user.
  • the accessory size detection module may be configured to determine the area occupied by the fashion accessories currently worn by the user relative to the entire image obtained by the import module 106 .
  • the accessory detector 108 utilizes depth and size information derived by a front-facing camera of the computing device 102 to determine size or area attributes of the fashion accessories currently worn by the user.
  • the object modifier 110 is configured to obtain a selection from the user of one or more of the candidate fashion accessories 118 where the selection comprises one or more replacement fashion accessories. As the size and shape of the replacement fashion accessory may differ from that of the fashion accessory currently worn by the user, the object modifier 110 analyzes attributes of the detected (original) fashion accessory and of the replacement fashion accessory. For example, the object modifier 110 may be configured to determine a first area occupied by the detected fashion accessory in the image. The object modifier 110 also determines a second area to be occupied by the replacement fashion accessory in the image. In particular, the object modifier 110 determines the second area occupied by the replacement fashion accessory when the replacement fashion accessory is virtually applied to the user in the image. The object modifier 110 then determines a difference between the first area and the second area.
  • Virtual application of the replacement fashion accessory on the user is then performed by the image editor 112 based on the difference between the first area and the second area.
  • the image editor 112 may perform virtual application of the replacement fashion accessory on the user by covering the detected fashion accessory with the replacement fashion accessory or removing the detected fashion accessory when the second area is greater than the first area.
  • the image editor 112 may perform virtual application of the replacement fashion accessory on the user by removing the detected fashion accessory and performing inpainting in an area around the replacement fashion accessory when the second area is less than the first area. Inpainting generally refers to the process of reconstructing lost or deteriorated parts of images and videos.
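The area-difference decision described above can be reduced to a small comparison. This is a sketch of the assumed logic only: when the replacement occupies at least as much area as the original, covering suffices; when it occupies less, the original is removed and the leftover region is inpainted.

```python
# Assumed decision logic for virtual application based on occupied areas.

def application_strategy(first_area, second_area):
    """first_area: pixels occupied by the detected (original) accessory;
    second_area: pixels the replacement accessory will occupy."""
    diff = second_area - first_area
    if diff >= 0:
        return "cover", diff
    return "remove_and_inpaint", -diff  # roughly |diff| pixels to inpaint

strategy, pixels = application_strategy(first_area=1200, second_area=800)
```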
  • the image editor 112 performs virtual application of the replacement fashion accessory on a segment-by-segment basis within the image where each segment has a predetermined size.
  • each segment may be defined according to a predetermined number of pixels.
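The segment-by-segment processing above amounts to tiling the image into boxes of a predetermined pixel size. A minimal sketch, with an assumed tile size and box representation:

```python
# Hypothetical tiling of an image into fixed-size segments so that virtual
# application can proceed one segment at a time.

def segments(width, height, tile):
    """Yield (x0, y0, x1, y1) boxes of at most tile x tile pixels
    covering the whole image, clipped at the right and bottom edges."""
    for y in range(0, height, tile):
        for x in range(0, width, tile):
            yield (x, y, min(x + tile, width), min(y + tile, height))

boxes = list(segments(width=100, height=60, tile=32))
```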
  • Other attributes analyzed by the object modifier 110 may include lighting attributes of the detected fashion accessory in the image.
  • the object modifier 110 may be configured to determine such lighting attributes as the angle of lighting incident on the detected fashion accessory in the image, a color of the lighting incident on the detected fashion accessory in the image, and shading on the detected fashion accessory in the image. The object modifier 110 then reproduces these lighting attributes on the replacement fashion accessory to generate a modified replacement fashion accessory and performs virtual application of the modified replacement fashion accessory on the user to produce a more accurate depiction of the replacement fashion accessory in the image.
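Carrying the lighting attributes over to the replacement accessory can be sketched as copying the detected attributes onto the replacement before compositing. The attribute dictionary and its keys below are illustrative assumptions, not the patent's data model:

```python
# Sketch of transferring lighting attributes (incident angle, light color,
# shaded region) from the detected accessory to the replacement.

def transfer_lighting(original_attrs, replacement):
    """Return a modified replacement carrying the original's lighting."""
    modified = dict(replacement)
    modified["light_angle"] = original_attrs["light_angle"]
    modified["light_color"] = original_attrs["light_color"]
    modified["shaded_region"] = original_attrs["shaded_region"]
    return modified

orig = {"light_angle": 45.0, "light_color": (255, 240, 220),
        "shaded_region": {(3, 4), (3, 5)}}
new_bag = transfer_lighting(orig, {"name": "replacement handbag"})
```

The modified replacement is what gets virtually applied, so its shading matches the scene.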
  • FIG. 2 illustrates a schematic block diagram of the computing device 102 in FIG. 1 .
  • the computing device 102 may be embodied as a desktop computer, portable computer, dedicated server computer, multiprocessor computing device, smart phone, tablet, and so forth.
  • the computing device 102 comprises memory 214 , a processing device 202 , a number of input/output interfaces 204 , a network interface 206 , a display 208 , a peripheral interface 211 , and mass storage 226 , wherein each of these components are connected across a local data bus 210 .
  • the processing device 202 may include a custom made processor, a central processing unit (CPU), or an auxiliary processor among several processors associated with the computing device 102 , a semiconductor based microprocessor (in the form of a microchip), a macroprocessor, one or more application specific integrated circuits (ASICs), a plurality of suitably configured digital logic gates, and so forth.
  • the memory 214 may include one or a combination of volatile memory elements (e.g., random-access memory (RAM, such as DRAM, and SRAM, etc.)) and nonvolatile memory elements (e.g., ROM, hard drive, tape, CDROM, etc.).
  • the memory 214 typically comprises a native operating system 216 , one or more native applications, emulation systems, or emulated applications for any of a variety of operating systems and/or emulated hardware platforms, emulated operating systems, etc.
  • the applications may include application specific software that may comprise some or all the components of the computing device 102 displayed in FIG. 1 .
  • the components are stored in memory 214 and executed by the processing device 202 , thereby causing the processing device 202 to perform the operations/functions disclosed herein.
  • the components in the computing device 102 may be implemented by hardware and/or software.
  • Input/output interfaces 204 provide interfaces for the input and output of data.
  • Where the computing device 102 comprises a personal computer, these components may interface with one or more input/output interfaces 204 , which may comprise a keyboard or a mouse, as shown in FIG. 2 .
  • the display 208 may comprise a computer monitor, a plasma screen for a PC, a liquid crystal display (LCD) on a hand held device, a touchscreen, or other display device.
  • a non-transitory computer-readable medium stores programs for use by or in connection with an instruction execution system, apparatus, or device. More specific examples of a computer-readable medium may include by way of example and without limitation: a portable computer diskette, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM, EEPROM, or Flash memory), and a portable compact disc read-only memory (CDROM) (optical).
  • FIG. 3 is a flowchart 300 in accordance with various embodiments for providing an accessory evaluation service, where the operations are performed by the computing device 102 of FIG. 1 . It is understood that the flowchart 300 of FIG. 3 provides merely an example of the different types of functional arrangements that may be employed to implement the operation of the various components of the computing device 102 . As an alternative, the flowchart 300 of FIG. 3 may be viewed as depicting an example of steps of a method implemented in the computing device 102 according to one or more embodiments.
  • flowchart 300 of FIG. 3 shows a specific order of execution, it is understood that the order of execution may differ from that which is displayed. For example, the order of execution of two or more blocks may be scrambled relative to the order shown. In addition, two or more blocks shown in succession in FIG. 3 may be executed concurrently or with partial concurrence. It is understood that all such variations are within the scope of the present disclosure.
  • the computing device 102 obtains an image of a user.
  • the computing device 102 detects at least one fashion accessory depicted in the image.
  • the fashion accessory may comprise, for example, a necklace, rings, earrings, bracelets, watches, hats, and/or a handbag.
  • the computing device 102 determines a fashion accessory category for each of the at least one detected fashion accessory. For some embodiments, when more than one fashion accessory is detected, the user is given the opportunity to select one or more of the detected fashion accessories to replace. For example, the detected fashion accessories may be highlighted or thumbnails representing the detected fashion accessories may be grouped together and displayed in the user interface, where the user is able to select one or more of the detected fashion accessories to replace.
  • the detected fashion accessories may be displayed in a particular order according to a desired priority set by the user.
  • FIG. 11 shows detected fashion accessories being displayed based on a predefined priority.
  • a thumbnail of a handbag 1102 is shown first on the left, followed by a thumbnail of a necklace 1104 , and so on.
  • the priority may be specified by the user or may be based on popularity. Note that other layouts may be used.
  • fashion accessories comprising handbags 1102 may be assigned a highest priority and shown at the top followed, for example, by necklaces 1104 and so on.
  • the computing device 102 retrieves at least one candidate fashion accessory 118 associated with the accessory category from a data store.
  • an artificial intelligence (AI) engine executing in the computing device 102 retrieves the candidate fashion accessories 118 from the data store.
  • the data store is not limited to being implemented in the computing device 102 and may comprise cloud storage where the computing device 102 accesses data from the data store over a network.
  • each candidate fashion accessory 118 in the data store has associated metadata where the metadata may include one or more labels or tags.
  • the metadata for a pearl necklace may include two labels—“pearl” and “necklace.”
  • the AI engine may be configured to derive metadata describing the user's current makeup and clothing style to retrieve at least one candidate fashion accessory 118 associated with the accessory category from the data store.
  • the metadata includes clothing styles that may be characterized as, for example, casual daily wear, luxury or party outfits, colorful clothing, and so on.
  • the computing device 102 examines the metadata associated with each fashion accessory to identify suitable candidates.
  • the computing device 102 may determine the type or category of detected fashion accessory and perform a search in the data store for an associated label(s) using a keyword search or other search technique.
  • Fashion accessories in the data store with label(s) that match that of the detected fashion accessory may be prioritized such that other fashion accessories in the data store with labels that only partially match may be assigned a lower priority when candidate fashion accessories are displayed to the user.
  • the candidate fashion accessory with the highest priority may be shown at the top.
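The label-matching priority described above can be sketched as a ranking by how many of the detected accessory's labels each candidate's metadata shares, so full matches sort ahead of partial matches. The tag sets below are invented examples:

```python
# Hedged sketch of label-based candidate prioritization.

def rank_candidates(detected_labels, candidates):
    """Sort candidates by the number of labels they share with the
    detected accessory (more shared labels = higher priority)."""
    detected = set(detected_labels)
    def score(candidate):
        return len(detected & set(candidate["labels"]))
    return sorted(candidates, key=score, reverse=True)

candidates = [
    {"name": "gold chain",     "labels": ["necklace"]},
    {"name": "pearl necklace", "labels": ["pearl", "necklace"]},
]
ranked = rank_candidates(["pearl", "necklace"], candidates)
```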
  • the computing device 102 determines attributes of the fashion accessory and a replacement fashion accessory. For some embodiments, this comprises utilizing depth and size information derived by a front-facing camera of the computing device 102 to determine a size of the fashion accessory. For some embodiments, the computing device 102 determines the attributes of the fashion accessory and the replacement fashion accessory by determining a first area occupied by the detected fashion accessory in the image and determining a second area to be occupied by the replacement fashion accessory in the image. The computing device 102 determines a difference between the first area and the second area. The computing device 102 then performs virtual application of the replacement fashion accessory on the user based on the difference between the first area and the second area.
  • the computing device 102 performs virtual application of the replacement fashion accessory on the user where inpainting is performed in an area around the replacement fashion accessory when the second area is less than the first area.
  • the computing device 102 performs virtual application of the replacement fashion accessory on the user by covering the detected fashion accessory with the replacement fashion accessory when the second area is greater than the first area.
  • the computing device 102 determines lighting attributes of the detected fashion accessory in the image and reproduces the lighting attributes on the replacement fashion accessory to generate a modified replacement fashion accessory. Virtual application of the modified replacement fashion accessory is then performed on the user.
  • the lighting attributes can comprise, for example, the angle of lighting incident on the detected fashion accessory in the image, a color of the lighting incident on the detected fashion accessory in the image, and shading on the detected fashion accessory in the image.
  • the computing device 102 performs virtual application of the replacement fashion accessory on the user based on the attributes of the fashion accessory and the replacement fashion accessory. Thereafter, the process in FIG. 3 ends.
  • FIG. 4 illustrates an example user interface 402 provided on a display of the computing device 102 whereby an image of the user is displayed.
  • the import module 106 executing in the computing device 102 can be configured to cause a camera (e.g., front-facing camera) of the computing device 102 to capture an image or a video of a user of the computing device 102 .
  • the image depicts the user wearing a fashion accessory 404 (i.e., earrings).
  • the accessory detector 108 executing in the computing device 102 detects the presence of one or more fashion accessories 404 currently worn by an individual depicted in an image.
  • the computing device 102 retrieves candidate fashion accessories 406 based on the one or more detected fashion accessories and presents the candidate fashion accessories to the individual, thereby giving the individual an opportunity to try on one or more replacement fashion accessories.
  • the computing device 102 detects the presence of earrings currently worn by the user. The computing device 102 then accesses the data store 116 and retrieves products that fall within the category of earrings. The retrieved earrings are then presented to the user in the user interface 402 , and the user selects a desired set of replacement earrings.
  • FIG. 5 illustrates another example user interface for selecting a replacement fashion accessory according to various embodiments of the present disclosure.
  • the user interface 502 displays an image depicting the user wearing a fashion accessory 504 (i.e., handbag).
  • the computing device 102 detects the presence of one or more fashion accessories 504 currently worn by an individual depicted in an image.
  • the computing device 102 retrieves candidate fashion accessories 506 based on the one or more detected fashion accessories and presents the candidate fashion accessories to the individual.
  • the individual selects a new fashion accessory and replaces the fashion accessory 504 currently worn by the individual, thereby allowing the individual to try on new fashion accessories without actually purchasing the fashion accessories.
  • the computing device 102 detects the presence of a handbag carried by the user.
  • the computing device 102 then accesses the data store 116 ( FIG. 1 ) and retrieves products that fall within the category of handbags.
  • the retrieved handbags are then presented to the user, and the user selects a desired handbag to replace the handbag currently depicted in the image.
  • FIG. 6 illustrates determination of areas occupied by an original fashion accessory and a selected replacement fashion accessory according to various embodiments of the present disclosure.
  • the user selects a replacement handbag from the candidate handbags presented to the user.
  • the object modifier 110 described earlier determines a first area 602 in the image currently occupied by the original handbag.
  • the object modifier 110 also determines a second area 604 to be occupied by the selected replacement handbag.
  • the image editor 112 performs virtual application of the selected replacement handbag and performs other post-processing operations (e.g., inpainting) as needed.
  • FIG. 7 illustrates the region in which inpainting is applied based on areas occupied by an original fashion accessory and a selected replacement fashion accessory according to various embodiments of the present disclosure.
  • the image editor 112 may perform virtual application of the replacement fashion accessory on the user by removing the detected fashion accessory and performing inpainting in an area around the selected replacement fashion accessory when the area 702 occupied by the selected replacement fashion accessory is less than the area 704 occupied by the original fashion accessory.
  • Inpainting generally refers to the process of reconstructing lost or deteriorated parts of images and videos.
  • Since removal of the original handbag from the image leaves a void larger than the area occupied by the selected replacement handbag, the image editor 112 performs inpainting in the shaded region shown in FIG. 7 to reconstruct portions of the image, as voids still exist after the selected replacement handbag is virtually applied to the user. In the example shown in FIG. 7 , the voids exist on portions of the user and the background region of the image.
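With pixel masks represented as coordinate sets, the region to inpaint is simply the part of the original accessory's footprint not covered by the replacement. This sketch computes only that region; filling those pixels from surrounding content is the inpainting step itself and is not shown:

```python
# Sketch: the inpaint region is the set difference of the two footprints.

def inpaint_region(original_mask, replacement_mask):
    """Pixels left as voids once the original accessory is removed and
    the replacement is composited on top."""
    return original_mask - replacement_mask

original = {(x, y) for x in range(4) for y in range(4)}     # 4x4 footprint
replacement = {(x, y) for x in range(2) for y in range(2)}  # smaller footprint
voids = inpaint_region(original, replacement)
```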
  • virtual application of the selected replacement fashion accessory may involve the use of both inpainting and covering techniques where this depends, for example, on such attributes of the fashion accessories as the construction material (e.g., diamond, crystal), geometry (e.g., symmetric versus asymmetric construction), light transmission properties (e.g., opaque versus transparent), and so on.
  • the use of both inpainting and covering techniques may be utilized if the fashion accessory being replaced differs significantly in size, construction material, etc. from the selected replacement fashion accessory. For example, suppose the user is initially wearing heart-shaped earrings and wishes to replace these earrings with long chain earrings. In this example, both inpainting and covering techniques are required during virtual application of the selected fashion accessory due to the difference in size and shape.
  • FIG. 8 illustrates virtual application of the selected handbag after inpainting is performed by the image editor 112 ( FIG. 1 ) according to various embodiments of the present disclosure.
  • FIG. 9 illustrates lighting attributes associated with an original fashion accessory according to various embodiments of the present disclosure.
  • the object modifier 110 may be configured to determine other attributes relating to the fashion accessory detected in the image of the individual.
  • the object modifier 110 is configured to determine lighting attributes that may include, for example, the angle of lighting incident on the detected fashion accessory in the image, a color of the lighting incident on the detected fashion accessory in the image, and shading on the detected fashion accessory in the image.
  • the location of a light source causes a portion 904 of the detected handbag 902 to be shaded. The user selects a replacement handbag from among candidate handbags 906 presented to the user.
  • FIG. 10 illustrates duplication of the lighting attributes shown in FIG. 9 on the selected replacement handbag according to various embodiments of the present disclosure.
  • the object modifier 110 ( FIG. 1 ) reproduces the lighting attributes detected for the original handbag on the selected replacement handbag 1002 to generate a modified replacement handbag 1002 and performs virtual application of the modified replacement handbag 1002 on the user to produce a more accurate depiction of the replacement handbag 1002 in the image whereby a portion 1004 of the replacement handbag 1002 is similarly shaded.


Abstract

A computing device obtains an image of a user and detects at least one fashion accessory depicted in the image. The computing device determines a fashion accessory category for each of the at least one detected fashion accessory and retrieves at least one candidate fashion accessory associated with the accessory category from a data store. The computing device determines attributes of the fashion accessory and a replacement fashion accessory and performs virtual application of the replacement fashion accessory on the user based on the attributes of the fashion accessory and the replacement fashion accessory.

Description

CROSS-REFERENCE TO RELATED APPLICATION
This application claims priority to, and the benefit of, U.S. Provisional Patent Application entitled, “Accessory replacement system,” having Ser. No. 63/308,795, filed on Feb. 10, 2022, which is incorporated by reference in its entirety.
TECHNICAL FIELD
The present disclosure generally relates to systems and methods for evaluating fashion accessories.
SUMMARY
In accordance with one embodiment, a computing device obtains an image of a user and detects at least one fashion accessory depicted in the image. The computing device determines a fashion accessory category for each of the at least one detected fashion accessory and retrieves at least one candidate fashion accessory associated with the accessory category from a data store. The computing device determines attributes of the fashion accessory and a replacement fashion accessory and performs virtual application of the replacement fashion accessory on the user based on the attributes of the fashion accessory and the replacement fashion accessory.
Another embodiment is a system that comprises a memory storing instructions and a processor coupled to the memory. The processor is configured by the instructions to obtain an image of a user and detect at least one fashion accessory depicted in the image. The processor is further configured by the instructions to determine a fashion accessory category for each of the at least one detected fashion accessory and retrieve at least one candidate fashion accessory associated with the accessory category from a data store. The processor is further configured by the instructions to determine attributes of the fashion accessory and a replacement fashion accessory and perform virtual application of the replacement fashion accessory on the user based on the attributes of the fashion accessory and the replacement fashion accessory.
Another embodiment is a non-transitory computer-readable storage medium storing instructions to be implemented by a computing device. The computing device comprises a processor, wherein the instructions, when executed by the processor, cause the computing device to obtain an image of a user and detect at least one fashion accessory depicted in the image. The processor is further configured by the instructions to determine a fashion accessory category for each of the at least one detected fashion accessory and retrieve at least one candidate fashion accessory associated with the accessory category from a data store. The processor is further configured by the instructions to determine attributes of the fashion accessory and a replacement fashion accessory and perform virtual application of the replacement fashion accessory on the user based on the attributes of the fashion accessory and the replacement fashion accessory.
Other systems, methods, features, and advantages of the present disclosure will be or become apparent to one with skill in the art upon examination of the following drawings and detailed description. It is intended that all such additional systems, methods, features, and advantages be included within this description, be within the scope of the present disclosure, and be protected by the accompanying claims.
BRIEF DESCRIPTION OF THE DRAWINGS
Various aspects of the disclosure are better understood with reference to the following drawings. The components in the drawings are not necessarily to scale, with emphasis instead being placed upon clearly illustrating the principles of the present disclosure. Moreover, in the drawings, like reference numerals designate corresponding parts throughout the several views.
FIG. 1 is a block diagram of a computing device configured to provide an accessory evaluation service according to various embodiments of the present disclosure.
FIG. 2 is a schematic diagram of the computing device of FIG. 1 in accordance with various embodiments of the present disclosure.
FIG. 3 is a top-level flowchart illustrating examples of functionality implemented as portions of the computing device of FIG. 1 for providing an accessory evaluation service according to various embodiments of the present disclosure.
FIG. 4 illustrates an example user interface generated by the computing device of FIG. 1 according to various embodiments of the present disclosure.
FIG. 5 illustrates another example user interface for selecting a replacement fashion accessory according to various embodiments of the present disclosure.
FIG. 6 illustrates determination of areas occupied by an original fashion accessory and a selected replacement fashion accessory according to various embodiments of the present disclosure.
FIG. 7 illustrates the region in which inpainting is applied based on areas occupied by an original fashion accessory and a selected replacement fashion accessory according to various embodiments of the present disclosure.
FIG. 8 illustrates virtual application of the selected replacement fashion accessory according to various embodiments of the present disclosure.
FIG. 9 illustrates lighting attributes associated with an original fashion accessory according to various embodiments of the present disclosure.
FIG. 10 illustrates duplication of the lighting attributes shown in FIG. 9 on the selected replacement fashion accessory according to various embodiments of the present disclosure.
FIG. 11 illustrates detected fashion accessories being displayed based on a predefined priority according to various embodiments of the present disclosure.
DETAILED DESCRIPTION
The subject disclosure is now described with reference to the drawings, where like reference numerals are used to refer to like elements throughout the following description. Other aspects, advantages, and novel features of the disclosed subject matter will become apparent from the following detailed description and corresponding drawings.
Consumers have access to a wide selection of fashion accessories through department stores, online retailers, and so on. However, purchasing fashion accessories can be costly, and trying on fashion accessories can also be time-consuming. Therefore, there is a need for an improved, cost-effective platform that allows consumers to efficiently evaluate a variety of fashion accessories. The present disclosure is directed to systems and methods for providing an accessory evaluation service that detects the presence of one or more fashion accessories currently worn by an individual depicted in an image. In example embodiments, the accessory evaluation service retrieves candidate fashion accessories based on the detected fashion accessories and presents the candidate fashion accessories to the individual. The individual is then able to select a new fashion accessory and replace the fashion accessory currently worn by the individual, thereby allowing the individual to try on new fashion accessories without actually purchasing the fashion accessories.
A system for implementing an accessory evaluation service is first described, followed by a discussion of the operation of the components within the system. FIG. 1 is a block diagram of a computing device 102 in which the embodiments disclosed herein may be implemented. The computing device 102 may comprise one or more processors that execute machine-executable instructions to perform the features described herein. For example, the computing device 102 may be embodied as a smartphone, a tablet computing device, a laptop, and so on.
A fashion accessory evaluation service 104 executes on a processor of the computing device 102 and includes an import module 106, an accessory detector 108, an object modifier 110, and an image editor 112. The import module 106 is configured to obtain digital images of a user wearing one or more original fashion accessories. The fashion accessory evaluation service 104 allows the user to select other desired fashion accessories to try on in place of the original fashion accessories worn by the user. For some embodiments, the import module 106 is configured to cause a camera (e.g., front-facing camera) of the computing device 102 to capture an image or a video of a user of the computing device 102. Alternatively, the import module 106 may obtain an image or video of the user from another device or server where the computing device 102 may be equipped with the capability to connect to the Internet.
The images obtained by the import module 106 may be encoded in any of a number of formats including, but not limited to, JPEG (Joint Photographic Experts Group) files, TIFF (Tagged Image File Format) files, PNG (Portable Network Graphics) files, GIF (Graphics Interchange Format) files, BMP (bitmap) files or any number of other digital formats. The video may be encoded in formats including, but not limited to, Motion Picture Experts Group (MPEG)-1, MPEG-2, MPEG-4, H.264, Third Generation Partnership Project (3GPP), 3GPP-2, Standard-Definition Video (SD-Video), High-Definition Video (HD-Video), Digital Versatile Disc (DVD) multimedia, Video Compact Disc (VCD) multimedia, High-Definition Digital Versatile Disc (HD-DVD) multimedia, Digital Television Video/High-definition Digital Television (DTV/HDTV) multimedia, Audio Video Interleave (AVI), Digital Video (DV), QuickTime (QT) file, Windows Media Video (WMV), Advanced System Format (ASF), Real Media (RM), Flash Media (FLV), an MPEG Audio Layer III (MP3), an MPEG Audio Layer II (MP2), Waveform Audio Format (WAV), Windows Media Audio (WMA), 360 degree video, 3D scan model, or any number of other digital formats.
The accessory detector 108 is configured to detect a fashion accessory currently worn by the user in the image obtained by the import module 106. For some embodiments, the accessory detector 108 performs image semantic segmentation on original fashion accessories depicted in the image of the user and searches a data store 116 for other candidate fashion accessories 118 that fall within the same category. For example, a first category (e.g., “CATEGORY 1”) can comprise necklaces while another category can comprise handbags. Yet another category can comprise bracelets. The candidate fashion accessories 118 retrieved by the accessory detector 108 are then presented to the user in a user interface displayed on the computing device 102, thereby allowing the user to select one or more desired fashion accessories to replace fashion accessories currently worn by the user. The accessory detector 108 may comprise an accessory size detection module (not shown) configured to determine size or area attributes of the fashion accessories currently worn by the user. In particular, the accessory size detection module may be configured to determine the area occupied by the fashion accessories currently worn by the user relative to the entire image obtained by the import module 106. For some embodiments, the accessory detector 108 utilizes depth and size information derived by a front-facing camera of the computing device 102 to determine size or area attributes of the fashion accessories currently worn by the user.
The object modifier 110 is configured to obtain a selection from the user of one or more of the candidate fashion accessories 118 where the selection comprises one or more replacement fashion accessories. As the size and shape of the replacement fashion accessory may differ from that of the fashion accessory currently worn by the user, the object modifier 110 analyzes attributes of the detected (original) fashion accessory and of the replacement fashion accessory. For example, the object modifier 110 may be configured to determine a first area occupied by the detected fashion accessory in the image. The object modifier 110 also determines a second area to be occupied by the replacement fashion accessory in the image. In particular, the object modifier 110 determines the second area occupied by the replacement fashion accessory when the replacement fashion accessory is virtually applied to the user in the image. The object modifier 110 then determines a difference between the first area and the second area.
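The area comparison performed by the object modifier 110 can be sketched in a few lines. This fragment is illustrative rather than the patented implementation; it assumes the segmentation step yields per-accessory boolean masks (NumPy arrays aligned with the image), and the function names are hypothetical:

```python
import numpy as np

def mask_area(mask: np.ndarray) -> int:
    """Number of pixels covered by a boolean segmentation mask."""
    return int(np.count_nonzero(mask))

def area_difference(original_mask: np.ndarray, replacement_mask: np.ndarray) -> int:
    """Signed difference between the second area (to be occupied by the
    replacement accessory) and the first area (occupied by the detected
    accessory). A positive value means the replacement is larger."""
    return mask_area(replacement_mask) - mask_area(original_mask)
```

The sign of this difference is what drives the choice between covering and inpainting.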
Virtual application of the replacement fashion accessory on the user is then performed by the image editor 112 based on the difference between the first area and the second area. For example, the image editor 112 may perform virtual application of the replacement fashion accessory on the user by covering the detected fashion accessory with the replacement fashion accessory or removing the detected fashion accessory when the second area is greater than the first area. On the other hand, the image editor 112 may perform virtual application of the replacement fashion accessory on the user by removing the detected fashion accessory and performing inpainting in an area around the replacement fashion accessory when the second area is less than the first area. Inpainting generally refers to the process of reconstructing lost or deteriorated parts of images and videos. In this case, since removal of the detected fashion accessory from the image leaves a void larger than the area occupied by the replacement fashion accessory, inpainting is performed to reconstruct portions of the image when voids still exist after the replacement fashion accessory is virtually applied to the user. The voids may exist on portions of the user and/or the background region of the image. For some embodiments, the image editor 112 performs virtual application of the replacement fashion accessory on a segment-by-segment basis within the image where each segment has a predetermined size. For example, each segment may be defined according to a predetermined number of pixels.
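The cover-versus-inpaint decision described above can be expressed as a small planning step. The sketch below is an assumption-laden illustration (boolean NumPy masks, hypothetical function name); the actual pixel reconstruction would be delegated to an inpainting routine such as OpenCV's `cv2.inpaint`, which is outside this fragment:

```python
import numpy as np

def plan_virtual_application(original_mask: np.ndarray, replacement_mask: np.ndarray):
    """Choose a compositing strategy for the replacement accessory.

    Returns ("cover", None) when the replacement is at least as large as
    the detected accessory, or ("inpaint", region) where `region` marks
    the pixels the original accessory occupied that the replacement does
    not cover and that therefore need reconstruction."""
    first_area = np.count_nonzero(original_mask)
    second_area = np.count_nonzero(replacement_mask)
    if second_area >= first_area:
        return "cover", None
    # Void: original-accessory pixels left exposed by the smaller
    # replacement (the shaded region of FIG. 7).
    region = original_mask & ~replacement_mask
    return "inpaint", region
```

In a segment-by-segment variant, the same decision could be evaluated per fixed-size tile of the image rather than once globally.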
Other attributes analyzed by the object modifier 110 may include lighting attributes of the detected fashion accessory in the image. In particular, the object modifier 110 may be configured to determine such lighting attributes as the angle of lighting incident on the detected fashion accessory in the image, a color of the lighting incident on the detected fashion accessory in the image, and shading on the detected fashion accessory in the image. The object modifier 110 then reproduces these lighting attributes on the replacement fashion accessory to generate a modified replacement fashion accessory and performs virtual application of the modified replacement fashion accessory on the user to produce a more accurate depiction of the replacement fashion accessory in the image.
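One simple way to realize the shading-reproduction step is to estimate a relative-brightness map over the original accessory's region and multiply it into the replacement. This is a minimal sketch under assumed inputs (grayscale float patches of matching shape in [0, 1]), not the disclosed algorithm, which also accounts for lighting angle and color:

```python
import numpy as np

def transfer_shading(original_region: np.ndarray, replacement_region: np.ndarray) -> np.ndarray:
    """Reproduce the original accessory's shading on the replacement.

    The shading map is the original's per-pixel brightness relative to
    its mean brightness; multiplying the replacement by it darkens the
    portion that was shaded on the original accessory."""
    mean_brightness = float(original_region.mean())
    shading = original_region / max(mean_brightness, 1e-6)
    return np.clip(replacement_region * shading, 0.0, 1.0)
```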
FIG. 2 illustrates a schematic block diagram of the computing device 102 in FIG. 1 . The computing device 102 may be embodied as a desktop computer, portable computer, dedicated server computer, multiprocessor computing device, smart phone, tablet, and so forth. As shown in FIG. 2 , the computing device 102 comprises memory 214, a processing device 202, a number of input/output interfaces 204, a network interface 206, a display 208, a peripheral interface 211, and mass storage 226, wherein each of these components is connected across a local data bus 210.
The processing device 202 may include a custom-made processor, a central processing unit (CPU), or an auxiliary processor among several processors associated with the computing device 102, a semiconductor-based microprocessor (in the form of a microchip), a macroprocessor, one or more application specific integrated circuits (ASICs), a plurality of suitably configured digital logic gates, and so forth.
The memory 214 may include one or a combination of volatile memory elements (e.g., random-access memory (RAM, such as DRAM, and SRAM, etc.)) and nonvolatile memory elements (e.g., ROM, hard drive, tape, CDROM, etc.). The memory 214 typically comprises a native operating system 216, one or more native applications, emulation systems, or emulated applications for any of a variety of operating systems and/or emulated hardware platforms, emulated operating systems, etc. For example, the applications may include application-specific software that may comprise some or all of the components of the computing device 102 depicted in FIG. 1 .
In accordance with such embodiments, the components are stored in memory 214 and executed by the processing device 202, thereby causing the processing device 202 to perform the operations/functions disclosed herein. For some embodiments, the components in the computing device 102 may be implemented by hardware and/or software.
Input/output interfaces 204 provide interfaces for the input and output of data. For example, where the computing device 102 comprises a personal computer, these components may interface with one or more input/output interfaces 204, which may comprise a keyboard or a mouse, as shown in FIG. 2 . The display 208 may comprise a computer monitor, a plasma screen for a PC, a liquid crystal display (LCD) on a handheld device, a touchscreen, or other display device.
In the context of this disclosure, a non-transitory computer-readable medium stores programs for use by or in connection with an instruction execution system, apparatus, or device. More specific examples of a computer-readable medium may include by way of example and without limitation: a portable computer diskette, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM, EEPROM, or Flash memory), and a portable compact disc read-only memory (CDROM) (optical).
Reference is made to FIG. 3 , which is a flowchart 300 in accordance with various embodiments for providing an accessory evaluation service, where the operations are performed by the computing device 102 of FIG. 1 . It is understood that the flowchart 300 of FIG. 3 provides merely an example of the different types of functional arrangements that may be employed to implement the operation of the various components of the computing device 102. As an alternative, the flowchart 300 of FIG. 3 may be viewed as depicting an example of steps of a method implemented in the computing device 102 according to one or more embodiments.
Although the flowchart 300 of FIG. 3 shows a specific order of execution, it is understood that the order of execution may differ from that which is displayed. For example, the order of execution of two or more blocks may be scrambled relative to the order shown. In addition, two or more blocks shown in succession in FIG. 3 may be executed concurrently or with partial concurrence. It is understood that all such variations are within the scope of the present disclosure.
At block 310, the computing device 102 obtains an image of a user. At block 320, the computing device 102 detects at least one fashion accessory depicted in the image. The fashion accessory may comprise, for example, a necklace, a ring, earrings, a bracelet, a watch, a hat, and/or a handbag. At block 330, the computing device 102 determines a fashion accessory category for each of the at least one detected fashion accessory. For some embodiments, when more than one fashion accessory is detected, the user is given the opportunity to select one or more of the detected fashion accessories to replace. For example, the detected fashion accessories may be highlighted, or thumbnails representing the detected fashion accessories may be grouped together and displayed in the user interface, where the user is able to select one or more of the detected fashion accessories to replace. For some embodiments, the detected fashion accessories may be displayed in a particular order according to a desired priority set by the user. To illustrate, reference is made to FIG. 11 , which shows detected fashion accessories being displayed based on a predefined priority. In the example shown, a thumbnail of a handbag 1102 is shown first on the left, followed by a thumbnail of a necklace 1104 and so on. The priority may be specified by the user or may be based on popularity. Note that other layouts may be used. For example, fashion accessories comprising handbags 1102 may be assigned the highest priority and shown at the top, followed, for example, by necklaces 1104 and so on.
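The priority-ordered display can be sketched as a sort over the detected accessory categories. The priority table below is hypothetical (the disclosure gives handbags the highest position in one example), and unknown categories fall to the end:

```python
# Assumed priority table: lower value = displayed first.
PRIORITY = {"handbag": 0, "necklace": 1, "earrings": 2, "bracelet": 3}

def order_thumbnails(detected_categories, priority=PRIORITY):
    """Sort detected accessory categories for display; categories
    without an assigned priority are placed last."""
    return sorted(detected_categories,
                  key=lambda category: priority.get(category, len(priority)))
```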
Referring back to FIG. 3 , at block 340, the computing device 102 retrieves at least one candidate fashion accessory 118 associated with the accessory category from a data store. For some embodiments, an artificial intelligence (AI) engine executing in the computing device 102 retrieves the candidate fashion accessories 118 from the data store. Note that the data store is not limited to being implemented in the computing device 102 and may comprise cloud storage where the computing device 102 accesses data from the data store over a network.
For some embodiments, each candidate fashion accessory 118 in the data store has associated metadata where the metadata may include one or more labels or tags. For example, the metadata for a pearl necklace may include two labels—“pearl” and “necklace.” For some embodiments, the AI engine may be configured to derive metadata describing the user's current makeup and clothing style to retrieve at least one candidate fashion accessory 118 associated with the accessory category from the data store. The metadata includes clothing styles that may be characterized as, for example, casual daily wear, luxury or party outfits, colorful clothing, and so on. When accessing the data store, the computing device 102 examines the metadata associated with each fashion accessory to identify suitable candidates. For example, the computing device 102 may determine the type or category of detected fashion accessory and perform a search in the data store for an associated label(s) using a keyword search or other search technique. Fashion accessories in the data store with label(s) that match that of the detected fashion accessory may be prioritized such that other fashion accessories in the data store with labels that only partially match may be assigned a lower priority when candidate fashion accessories are displayed to the user. The candidate fashion accessory with the highest priority may be shown at the top.
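A label search of the kind described can be sketched as a ranking over the catalog. The catalog shape (candidate name mapped to its set of metadata labels) and the scoring rule are assumptions for illustration; candidates whose labels fully match the detected accessory's labels rank above partial matches, which rank above non-matches:

```python
def rank_candidates(detected_labels, candidates):
    """Rank candidate accessories by the number of metadata labels they
    share with the detected accessory.

    `candidates` maps a candidate name to its set of labels; ties are
    broken alphabetically so the ordering is deterministic."""
    detected = set(detected_labels)

    def score(item):
        name, labels = item
        return (-len(detected & set(labels)), name)

    return [name for name, _ in sorted(candidates.items(), key=score)]
```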
At block 350, the computing device 102 determines attributes of the fashion accessory and a replacement fashion accessory. For some embodiments, this comprises utilizing depth and size information derived by a front-facing camera of the computing device 102 to determine a size of the fashion accessory. For some embodiments, the computing device 102 determines the attributes of the fashion accessory and the replacement fashion accessory by determining a first area occupied by the detected fashion accessory in the image and determining a second area to be occupied by the replacement fashion accessory in the image. The computing device 102 determines a difference between the first area and the second area. The computing device 102 then performs virtual application of the replacement fashion accessory on the user based on the difference between the first area and the second area. In particular, the computing device 102 performs virtual application of the replacement fashion accessory on the user where inpainting is performed in an area around the replacement fashion accessory when the second area is less than the first area. On the other hand, the computing device 102 performs virtual application of the replacement fashion accessory on the user by covering the detected fashion accessory with the replacement fashion accessory when the second area is greater than the first area.
For some embodiments, the computing device 102 determines lighting attributes of the detected fashion accessory in the image and reproduces the lighting attributes on the replacement fashion accessory to generate a modified replacement fashion accessory. Virtual application of the modified replacement fashion accessory is then performed on the user. The lighting attributes can comprise, for example, the angle of lighting incident on the detected fashion accessory in the image, a color of the lighting incident on the detected fashion accessory in the image, and shading on the detected fashion accessory in the image. At block 360, the computing device 102 performs virtual application of the replacement fashion accessory on the user based on the attributes of the fashion accessory and the replacement fashion accessory. Thereafter, the process in FIG. 3 ends.
To further illustrate various aspects of the present disclosure, reference is made to the following figures. FIG. 4 illustrates an example user interface 402 provided on a display of the computing device 102 whereby an image of the user is displayed. As described above, the import module 106 executing in the computing device 102 can be configured to cause a camera (e.g., front-facing camera) of the computing device 102 to capture an image or a video of a user of the computing device 102. The image depicts the user wearing a fashion accessory 404 (i.e., earrings). As described above, the accessory detector 108 executing in the computing device 102 detects the presence of one or more fashion accessories 404 currently worn by an individual depicted in an image. The computing device 102 then retrieves candidate fashion accessories 406 based on the one or more detected fashion accessories and presents the candidate fashion accessories to the individual, thereby giving the individual an opportunity to try on one or more replacement fashion accessories.
The individual selects a new fashion accessory and replaces the fashion accessory currently worn by the individual, thereby allowing the individual to try on new fashion accessories without actually purchasing the fashion accessories. In the example shown in FIG. 4 , the computing device 102 detects the presence of earrings currently worn by the user. The computing device 102 then accesses the data store 116 and retrieves products that fall within the category of earrings. The retrieved earrings are then presented to the user in the user interface 402, and the user selects a desired set of replacement earrings.
FIG. 5 illustrates another example user interface for selecting a replacement fashion accessory according to various embodiments of the present disclosure. The user interface 502 displays an image depicting the user wearing a fashion accessory 504 (i.e., handbag). As described above, the computing device 102 detects the presence of one or more fashion accessories 504 currently worn by an individual depicted in an image. The computing device 102 then retrieves candidate fashion accessories 506 based on the one or more detected fashion accessories and presents the candidate fashion accessories to the individual. By using a touchscreen interface or other input device, the individual selects a new fashion accessory and replaces the fashion accessory 504 currently worn by the individual, thereby allowing the individual to try on new fashion accessories without actually purchasing the fashion accessories. In the example shown in FIG. 5 , the computing device 102 detects the presence of a handbag carried by the user. The computing device 102 then accesses the data store 116 (FIG. 1 ) and retrieves products that fall within the category of handbags. The retrieved handbags are then presented to the user, and the user selects a desired handbag to replace the handbag currently depicted in the image.
FIG. 6 illustrates determination of areas occupied by an original fashion accessory and a selected replacement fashion accessory according to various embodiments of the present disclosure. Referring back to the example in FIG. 5 , the user selects a replacement handbag from the candidate handbags presented to the user. Referring now to FIG. 6 , the object modifier 110 described earlier determines a first area 602 in the image currently occupied by the original handbag. The object modifier 110 also determines a second area 604 to be occupied by the selected replacement handbag. As described above, the image editor 112 performs virtual application of the selected replacement handbag and performs other post-processing operations (e.g., inpainting) as needed.
FIG. 7 illustrates the region in which inpainting is applied based on areas occupied by an original fashion accessory and a selected replacement fashion accessory according to various embodiments of the present disclosure. As described above, the image editor 112 (FIG. 1 ) may perform virtual application of the replacement fashion accessory on the user by removing the detected fashion accessory and performing inpainting in an area around the selected replacement fashion accessory when the area 702 occupied by the selected replacement fashion accessory is less than the area 704 occupied by the original fashion accessory. Inpainting generally refers to the process of reconstructing lost or deteriorated parts of images and videos.
In the example shown, since removal of the original handbag from the image leaves a void larger than the area occupied by the selected replacement handbag, the image editor 112 performs inpainting to the shaded region shown in FIG. 7 in order to reconstruct portions of the image since voids still exist after the selected replacement handbag is virtually applied to the user. In the example shown in FIG. 7 , the voids exist on portions of the user and the background region of the image.
In some instances, virtual application of the selected replacement fashion accessory may involve both inpainting and covering techniques, depending, for example, on such attributes of the fashion accessories as the construction material (e.g., diamond, crystal), geometry (e.g., symmetric versus asymmetric construction), light transmission properties (e.g., opaque versus transparent), and so on. In particular, both inpainting and covering techniques may be utilized when the fashion accessory being replaced differs significantly in size, construction material, or other attributes from the selected replacement fashion accessory. For example, suppose the user is initially wearing heart-shaped earrings and wishes to replace them with long chain earrings. In this example, both inpainting and covering techniques are required during virtual application of the selected fashion accessory due to the difference in size and shape. As discussed above, however, the size and shape of the fashion accessories are not the only attributes taken into consideration during the virtual application process; other attributes include the construction materials of the fashion accessories, the light transmission properties, and so on. FIG. 8 illustrates virtual application of the selected handbag after inpainting is performed by the image editor 112 (FIG. 1) according to various embodiments of the present disclosure.
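The choice among inpainting, covering, or both can be sketched as a dispatch on the area difference, mirroring the cases described above; the strategy names are illustrative labels, not terms from the disclosure:

```python
def application_strategy(first_area, second_area):
    # first_area: pixels occupied by the original accessory;
    # second_area: pixels the replacement will occupy.
    if second_area < first_area:
        # Replacement is smaller: a void remains, so inpaint around it.
        return ("inpaint",)
    if second_area > first_area:
        # Replacement is larger: it covers the original, but accessories of
        # differing shape may still leave slivers needing partial inpainting.
        return ("cover", "partial_inpaint")
    return ("cover",)

print(application_strategy(16, 4))   # ('inpaint',)
print(application_strategy(4, 16))   # ('cover', 'partial_inpaint')
```

A fuller version would also branch on material and transparency attributes, as the text notes.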
FIG. 9 illustrates lighting attributes associated with an original fashion accessory according to various embodiments of the present disclosure. As described above, the object modifier 110 (FIG. 1) may be configured to determine other attributes relating to the fashion accessory detected in the image of the individual. For some embodiments, the object modifier 110 is configured to determine lighting attributes that may include, for example, the angle of lighting incident on the detected fashion accessory in the image, a color of the lighting incident on the detected fashion accessory in the image, and shading on the detected fashion accessory in the image. In the example shown in FIG. 9, the location of a light source causes a portion 904 of the detected handbag 902 to be shaded. The user selects a replacement handbag from among candidate handbags 906 presented to the user.
FIG. 10 illustrates duplication of the lighting attributes shown in FIG. 9 on the selected replacement handbag according to various embodiments of the present disclosure. The object modifier 110 (FIG. 1) reproduces the lighting attributes detected for the original handbag on the selected replacement handbag to generate a modified replacement handbag 1002, and then performs virtual application of the modified replacement handbag 1002 on the user. This produces a more accurate depiction of the replacement handbag 1002 in the image, whereby a portion 1004 of the replacement handbag 1002 is similarly shaded.
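Duplicating a detected shadow on the replacement can be approximated by multiplicatively darkening the replacement wherever the original was shaded; the 0.6 factor, the function name, and the simple multiplicative lighting model are assumptions for illustration only:

```python
import numpy as np

def transfer_shading(replacement_rgb, shade_mask, factor=0.6):
    # Darken the replacement accessory wherever the original accessory was
    # shaded, approximating the detected lighting with a multiplicative model.
    out = replacement_rgb.astype(np.float64)
    out[shade_mask] *= factor
    return np.clip(out, 0, 255).astype(np.uint8)

bag = np.full((4, 4, 3), 200, dtype=np.uint8)   # uniform replacement crop
shade = np.zeros((4, 4), dtype=bool)
shade[2:, :] = True                             # lower half was shaded
shaded_bag = transfer_shading(bag, shade)
print(shaded_bag[0, 0, 0], shaded_bag[3, 0, 0])  # 200 120
```

Reproducing the incident angle and light color would require a richer model, but the shading-mask transfer above conveys the core idea.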
It should be emphasized that the above-described embodiments of the present disclosure are merely possible examples of implementations set forth for a clear understanding of the principles of the disclosure. Many variations and modifications may be made to the above-described embodiment(s) without departing substantially from the spirit and principles of the disclosure. All such modifications and variations are intended to be included herein within the scope of this disclosure and protected by the following claims.

Claims (20)

At least the following is claimed:
1. A method implemented in a computing device, comprising:
obtaining an image of a user;
determining a fashion accessory category for each of at least one fashion accessory depicted in the image;
retrieving at least one candidate fashion accessory associated with the fashion accessory category from a data store;
determining attributes of the at least one fashion accessory and a replacement fashion accessory, wherein determining the attributes of the fashion accessory and the replacement fashion accessory comprises:
determining a first area occupied by the at least one fashion accessory depicted in the image;
determining a second area to be occupied by the replacement fashion accessory depicted in the image; and
determining a difference between the first area and the second area, wherein virtual application of the replacement fashion accessory on the user is performed based on the difference between the first area and the second area; and
performing virtual application of the replacement fashion accessory on the user based on the attributes of the at least one fashion accessory and the replacement fashion accessory, wherein performing virtual application of the replacement fashion accessory on the user comprises:
performing inpainting in an area around the replacement fashion accessory; and
performing virtual application of the replacement fashion accessory on the user after inpainting is performed in the area around the replacement fashion accessory.
2. The method of claim 1, further comprising obtaining a selection of one of the at least one candidate fashion accessory from the user as the replacement fashion accessory, wherein obtaining the selection comprises:
displaying the at least one fashion accessory according to a predefined order of priority; and
obtaining the selection among the displayed at least one fashion accessory.
3. The method of claim 1, wherein retrieving the at least one candidate fashion accessory associated with the fashion accessory category from the data store comprises:
determining a label for each of the at least one fashion accessory;
comparing metadata comprising a label of each candidate fashion accessory in the data store with each label of each of the at least one fashion accessory; and
determining the at least one candidate fashion accessory based on the comparison.
4. The method of claim 1, wherein determining the attributes of the fashion accessory and the replacement fashion accessory comprises utilizing depth and size information derived by a front-facing camera of the computing device to determine a size of the at least one fashion accessory.
5. The method of claim 1, wherein performing virtual application of the replacement fashion accessory on the user based on the difference between the first area and the second area comprises:
performing inpainting in an area around the replacement fashion accessory when the second area is less than the first area; and
covering the at least one fashion accessory with the replacement fashion accessory when the second area is greater than the first area.
6. The method of claim 1, wherein performing virtual application of the replacement fashion accessory on the user based on the difference between the first area and the second area comprises:
applying a covering technique and partial inpainting based on the difference of the first area and the second area when the second area is greater than the first area.
7. The method of claim 1, wherein the attributes of the fashion accessory and the replacement fashion accessory comprise each of: size and shape of the fashion accessory and the replacement fashion accessory, construction material of the fashion accessory and the replacement fashion accessory, geometry of the fashion accessory and the replacement fashion accessory, and light transmission properties of the fashion accessory and the replacement fashion accessory.
8. A method implemented in a computing device, comprising:
obtaining an image of a user;
determining a fashion accessory category for each of at least one fashion accessory depicted in the image;
retrieving at least one candidate fashion accessory associated with the fashion accessory category from a data store;
determining attributes of the at least one fashion accessory and a replacement fashion accessory, wherein determining the attributes of the fashion accessory and the replacement fashion accessory comprises:
determining lighting attributes of the at least one fashion accessory depicted in the image;
duplicating the lighting attributes on the replacement fashion accessory to generate a modified replacement fashion accessory; and
performing virtual application of the modified replacement fashion accessory on the user; and
performing virtual application of the replacement fashion accessory on the user based on the attributes of the at least one fashion accessory and the replacement fashion accessory.
9. The method of claim 8, wherein the lighting attributes comprise: angle of lighting incident on the at least one fashion accessory depicted in the image; a color of the lighting incident on the at least one fashion accessory depicted in the image; and shading on the at least one fashion accessory depicted in the image.
10. A system, comprising:
a memory storing instructions;
a processor coupled to the memory and configured by the instructions to at least:
obtain an image of a user;
determine a fashion accessory category for each of at least one fashion accessory depicted in the image;
retrieve at least one candidate fashion accessory associated with the fashion accessory category from a data store;
determine attributes of the at least one fashion accessory and a replacement fashion accessory, wherein the processor is configured to determine the attributes of the fashion accessory and the replacement fashion accessory by:
determining a first area occupied by the at least one fashion accessory depicted in the image;
determining a second area to be occupied by the replacement fashion accessory in the image; and
determining a difference between the first area and the second area, wherein virtual application of the replacement fashion accessory on the user is performed based on the difference between the first area and the second area; and
perform virtual application of the replacement fashion accessory on the user based on the attributes of the at least one fashion accessory and the replacement fashion accessory, wherein the processor is configured to perform virtual application of the replacement fashion accessory on the user by:
performing inpainting in an area around the replacement fashion accessory; and
performing virtual application of the replacement fashion accessory on the user after inpainting is performed in the area around the replacement fashion accessory.
11. The system of claim 10, wherein the processor is further configured to obtain a selection of one of the at least one candidate fashion accessory from the user as the replacement fashion accessory, wherein obtaining the selection comprises:
displaying the at least one fashion accessory according to a predefined order of priority; and
obtaining the selection among the displayed at least one fashion accessory.
12. The system of claim 10, wherein the processor is configured to retrieve the at least one candidate fashion accessory associated with the fashion accessory category from the data store by:
determining a label for each of the at least one fashion accessory;
comparing metadata comprising a label of each candidate fashion accessory in the data store with each label of each of the at least one fashion accessory; and
determining the at least one candidate fashion accessory based on the comparison.
13. The system of claim 10, further comprising:
a front-facing camera, wherein the processor is configured to determine the attributes of the at least one fashion accessory and the replacement fashion accessory by utilizing depth and size information derived by the front-facing camera to determine a size of the at least one fashion accessory.
14. The system of claim 10, wherein the processor is configured to determine the attributes of the fashion accessory and the replacement fashion accessory by:
determining a first area occupied by the at least one fashion accessory depicted in the image;
determining a second area to be occupied by the replacement fashion accessory in the image; and
determining a difference between the first area and the second area, wherein virtual application of the replacement fashion accessory on the user is performed based on the difference between the first area and the second area.
15. A non-transitory computer-readable storage medium storing instructions to be implemented by a computing device having a processor, wherein the instructions, when executed by the processor, cause the computing device to at least:
obtain an image of a user;
determine a fashion accessory category for each of the at least one detected fashion accessory depicted in the image;
retrieve at least one candidate fashion accessory associated with the fashion accessory category from a data store;
determine attributes of the at least one fashion accessory and a replacement fashion accessory, wherein the processor is configured by the instructions to determine the attributes of the fashion accessory and the replacement fashion accessory by:
determining a first area occupied by the fashion accessory in the image;
determining a second area to be occupied by the replacement fashion accessory in the image; and
determining a difference between the first area and the second area, wherein virtual application of the replacement fashion accessory on the user is performed based on the difference between the first area and the second area; and
perform virtual application of the replacement fashion accessory on the user based on the attributes of the at least one fashion accessory and the replacement fashion accessory, wherein the processor is configured by the instructions to perform virtual application of the replacement fashion accessory on the user by:
performing inpainting in an area around the replacement fashion accessory; and
performing virtual application of the replacement fashion accessory on the user after inpainting is performed in the area around the replacement fashion accessory.
16. The non-transitory computer-readable storage medium of claim 15, wherein the processor is further configured by the instructions to obtain a selection of one of the at least one candidate fashion accessory from the user as the replacement fashion accessory, wherein obtaining the selection comprises:
displaying the at least one fashion accessory according to a predefined order of priority; and
obtaining the selection among the displayed at least one fashion accessory.
17. The non-transitory computer-readable storage medium of claim 15, wherein the processor is configured by the instructions to retrieve the at least one candidate fashion accessory associated with the fashion accessory category from the data store by:
determining a label for each of the at least one fashion accessory;
comparing metadata comprising a label of each candidate fashion accessory in the data store with each label of each of the at least one fashion accessory; and
determining the at least one candidate fashion accessory based on the comparison.
18. The non-transitory computer-readable storage medium of claim 15, wherein the processor is configured by the instructions to determine the attributes of the fashion accessory and the replacement fashion accessory by utilizing depth and size information derived by a front-facing camera of the computing device to determine a size of the at least one fashion accessory.
19. A system, comprising:
a memory storing instructions;
a processor coupled to the memory and configured by the instructions to at least:
obtain an image of a user;
determine a fashion accessory category for each of the at least one fashion accessory depicted in the image;
retrieve at least one candidate fashion accessory associated with the fashion accessory category from a data store;
determine attributes of the at least one fashion accessory and a replacement fashion accessory, wherein the processor is configured to determine the attributes of the fashion accessory and the replacement fashion accessory by:
determining lighting attributes of the at least one fashion accessory depicted in the image;
duplicating the lighting attributes on the replacement fashion accessory to generate a modified replacement fashion accessory; and
performing virtual application of the modified replacement fashion accessory on the user; and
perform virtual application of the replacement fashion accessory on the user based on the attributes of the at least one fashion accessory and the replacement fashion accessory.
20. A non-transitory computer-readable storage medium storing instructions to be implemented by a computing device having a processor, wherein the instructions, when executed by the processor, cause the computing device to at least:
obtain an image of a user;
determine a fashion accessory category for each of the at least one fashion accessory depicted in the image;
retrieve at least one candidate fashion accessory associated with the fashion accessory category from a data store;
determine attributes of the fashion accessory and a replacement fashion accessory, wherein the processor is configured by the instructions to determine the attributes of the at least one fashion accessory and the replacement fashion accessory by:
determining lighting attributes of the at least one fashion accessory depicted in the image;
duplicating the lighting attributes on the replacement fashion accessory to generate a modified replacement fashion accessory; and
performing virtual application of the modified replacement fashion accessory on the user; and
perform virtual application of the replacement fashion accessory on the user based on the attributes of the at least one fashion accessory and the replacement fashion accessory.
US18/055,923 2022-02-10 2022-11-16 Systems and methods for fashion accessory evaluation Active 2043-05-18 US12175777B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US18/055,923 US12175777B2 (en) 2022-02-10 2022-11-16 Systems and methods for fashion accessory evaluation

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US202263308795P 2022-02-10 2022-02-10
US18/055,923 US12175777B2 (en) 2022-02-10 2022-11-16 Systems and methods for fashion accessory evaluation

Publications (2)

Publication Number Publication Date
US20230252805A1 US20230252805A1 (en) 2023-08-10
US12175777B2 true US12175777B2 (en) 2024-12-24

Family

ID=87521315

Family Applications (1)

Application Number Title Priority Date Filing Date
US18/055,923 Active 2043-05-18 US12175777B2 (en) 2022-02-10 2022-11-16 Systems and methods for fashion accessory evaluation

Country Status (1)

Country Link
US (1) US12175777B2 (en)

Citations (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9813693B1 (en) * 2014-06-27 2017-11-07 Amazon Technologies, Inc. Accounting for perspective effects in images
US9892561B2 (en) 2016-06-30 2018-02-13 Fittingbox Method of hiding an object in an image or video and associated augmented reality process
US10083521B1 (en) * 2015-12-04 2018-09-25 A9.Com, Inc. Content recommendation based on color match
JP2019046428A (en) 2017-09-06 2019-03-22 都子 松田 Accessory classification system
KR102060972B1 (en) 2019-01-31 2019-12-31 주식회사 일 System for virtual wearing of jewelry based on augmented reality and method thereof
KR102153409B1 (en) 2018-07-17 2020-09-08 주식회사 비주얼 Method and electric apparatus for ordering jewelry product
KR102153410B1 (en) 2018-08-22 2020-09-08 주식회사 비주얼 Method and electric apparatus for recommending jewelry product
US10810647B2 (en) 2017-06-09 2020-10-20 International Business Machines Corporation Hybrid virtual and physical jewelry shopping experience
US20200364839A1 (en) 2019-05-17 2020-11-19 Beijing Dajia Internet Information Technology Co., Ltd. Image processing method and apparatus, electronic device and storage medium
CN112084398A (en) 2020-07-28 2020-12-15 北京旷视科技有限公司 Accessory recommendation method, accessory virtual try-on method and device and electronic equipment
CN112102149A (en) 2019-06-18 2020-12-18 北京陌陌信息技术有限公司 Figure hair style replacing method, device, equipment and medium based on neural network
CN112102148A (en) 2019-06-18 2020-12-18 北京陌陌信息技术有限公司 Figure hair style replacing method, device, equipment and medium based on neural network
US20210064910A1 (en) 2019-08-26 2021-03-04 Apple Inc. Image-based detection of surfaces that provide specular reflections and reflection modification
CN112489184A (en) 2020-12-14 2021-03-12 深圳市小玩意智能科技有限公司 Jewelry customization system
KR102255404B1 (en) 2018-03-15 2021-05-24 주식회사 비주얼 Method and electric apparatus for recommending jewelry product
US11100560B2 (en) 2019-03-19 2021-08-24 Stitch Fix, Inc. Extending machine learning training data to generate an artificial intelligence recommendation engine
US20210366147A1 (en) 2020-05-20 2021-11-25 Styledotme Fashion And Lifestyle Private Limited Process for detection of position of body parts based on facial image
US20230230292A1 (en) * 2022-01-19 2023-07-20 Snap Inc. Object replacement system
US20230401460A1 (en) * 2021-12-13 2023-12-14 Samsung Electronics Co., Ltd. Method and electronic device for on-device lifestyle recommendations

Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
"Fashion Accessories Using Virtual Mirror;" Sep. 2020; pp. 1-10; https://www.seminarsonly.com/Engineering-Projects/Computer/fashion-accessories.php.
Shirkhani, S.; "Image-Based Fashion Recommender Systems: Considering Deep learning role in computer vision development;" Master Programme in Data Science: Lulea University of Technology; 2021; pp. 1-79.
Singh, V., et al.; "A Comparative Experiment In Classifying Jewelry Images Using Convolutional Neural Networks;" Science & Technology Asia; vol. 23; No. 4 Oct.-Dec. 2018; pp. 1-11; https://tci-thaijo.org/index.php/SciTechAsia.
Wei_CN112084398—EPO translated (Year: 2020). *
Yang, Y.I., et al.; "Virtual Try-On Of Footwear In Mixed Reality Using Depth Sensors;" VRCAI '13: Proceedings of the 12th ACM SIGGRAPH International Conference on Virtual-Reality Continuum and Its Applications in Industry; Nov. 2013; pp. 309-312; https://doi.org/10.1145/2534329.2534376.

Legal Events

Date Code Title Description
AS Assignment

Owner name: PERFECT MOBILE CORP., TAIWAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:CHEN, YA HSUAN;KO, MING-CHUN;REEL/FRAME:061795/0304

Effective date: 20221116

FEPP Fee payment procedure

Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED

STCF Information on status: patent grant

Free format text: PATENTED CASE