US20210073833A1 - Autonomous Item Generation - Google Patents
- Publication number
- US20210073833A1 (application Ser. No. 17/015,822)
- Authority
- US
- United States
- Prior art keywords
- item
- machine learning
- learning model
- listing
- user
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q30/00—Commerce
- G06Q30/06—Buying, selling or leasing transactions
- G06Q30/0601—Electronic shopping [e-shopping]
- G06Q30/0623—Item investigation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q30/00—Commerce
- G06Q30/06—Buying, selling or leasing transactions
- G06Q30/0601—Electronic shopping [e-shopping]
- G06Q30/0641—Shopping interfaces
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B33—ADDITIVE MANUFACTURING TECHNOLOGY
- B33Y—ADDITIVE MANUFACTURING, i.e. MANUFACTURING OF THREE-DIMENSIONAL [3-D] OBJECTS BY ADDITIVE DEPOSITION, ADDITIVE AGGLOMERATION OR ADDITIVE LAYERING, e.g. BY 3-D PRINTING, STEREOLITHOGRAPHY OR SELECTIVE LASER SINTERING
- B33Y50/00—Data acquisition or data processing for additive manufacturing
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B33—ADDITIVE MANUFACTURING TECHNOLOGY
- B33Y—ADDITIVE MANUFACTURING, i.e. MANUFACTURING OF THREE-DIMENSIONAL [3-D] OBJECTS BY ADDITIVE DEPOSITION, ADDITIVE AGGLOMERATION OR ADDITIVE LAYERING, e.g. BY 3-D PRINTING, STEREOLITHOGRAPHY OR SELECTIVE LASER SINTERING
- B33Y50/00—Data acquisition or data processing for additive manufacturing
- B33Y50/02—Data acquisition or data processing for additive manufacturing for controlling or regulating additive manufacturing processes
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/60—Protecting data
- G06F21/62—Protecting access to data via a platform, e.g. using keys or access control rules
- G06F21/6218—Protecting access to data via a platform, e.g. using keys or access control rules to a system of files or objects, e.g. local or distributed file system or database
- G06F21/6245—Protecting personal data, e.g. for financial or medical purposes
- G06F21/6254—Protecting personal data, e.g. for financial or medical purposes by anonymising data, e.g. decorrelating personal data from the owner's identification
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/0482—Interaction with lists of selectable items, e.g. menus
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/12—Digital output to print unit, e.g. line printer, chain printer
- G06F3/1201—Dedicated interfaces to print systems
- G06F3/1202—Dedicated interfaces to print systems specifically adapted to achieve a particular effect
- G06F3/1203—Improving or facilitating administration, e.g. print management
- G06F3/1204—Improving or facilitating administration, e.g. print management resulting in reduced user or operator actions, e.g. presetting, automatic actions, using hardware token storing data
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/12—Digital output to print unit, e.g. line printer, chain printer
- G06F3/1201—Dedicated interfaces to print systems
- G06F3/1202—Dedicated interfaces to print systems specifically adapted to achieve a particular effect
- G06F3/1203—Improving or facilitating administration, e.g. print management
- G06F3/1208—Improving or facilitating administration, e.g. print management resulting in improved quality of the output result, e.g. print layout, colours, workflows, print preview
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/12—Digital output to print unit, e.g. line printer, chain printer
- G06F3/1201—Dedicated interfaces to print systems
- G06F3/1223—Dedicated interfaces to print systems specifically adapted to use a particular technique
- G06F3/1237—Print job management
- G06F3/1253—Configuration of print job parameters, e.g. using UI at the client
- G06F3/1256—User feedback, e.g. print preview, test print, proofing, pre-flight checks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F9/00—Arrangements for program control, e.g. control units
- G06F9/06—Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
- G06F9/44—Arrangements for executing specific programs
- G06F9/451—Execution arrangements for user interfaces
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N20/00—Machine learning
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/045—Combinations of networks
-
- G06N3/0454—
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
- G06N3/088—Non-supervised learning, e.g. competitive learning
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q10/00—Administration; Management
- G06Q10/08—Logistics, e.g. warehousing, loading or distribution; Inventory or stock management
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q10/00—Administration; Management
- G06Q10/08—Logistics, e.g. warehousing, loading or distribution; Inventory or stock management
- G06Q10/083—Shipping
- G06Q10/0835—Relationships between shipper or supplier and carriers
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q30/00—Commerce
- G06Q30/02—Marketing; Price estimation or determination; Fundraising
- G06Q30/0201—Market modelling; Market analysis; Collecting market data
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q30/00—Commerce
- G06Q30/02—Marketing; Price estimation or determination; Fundraising
- G06Q30/0207—Discounts or incentives, e.g. coupons or rebates
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q30/00—Commerce
- G06Q30/02—Marketing; Price estimation or determination; Fundraising
- G06Q30/0207—Discounts or incentives, e.g. coupons or rebates
- G06Q30/0222—During e-commerce, i.e. online transactions
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q30/00—Commerce
- G06Q30/06—Buying, selling or leasing transactions
- G06Q30/0601—Electronic shopping [e-shopping]
- G06Q30/0631—Item recommendations
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q30/00—Commerce
- G06Q30/06—Buying, selling or leasing transactions
- G06Q30/0601—Electronic shopping [e-shopping]
- G06Q30/0633—Lists, e.g. purchase orders, compilation or processing
- G06Q30/0635—Processing of requisition or of purchase orders
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q30/00—Commerce
- G06Q30/06—Buying, selling or leasing transactions
- G06Q30/0601—Electronic shopping [e-shopping]
- G06Q30/0639—Item locations
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q30/00—Commerce
- G06Q30/06—Buying, selling or leasing transactions
- G06Q30/0601—Electronic shopping [e-shopping]
- G06Q30/0641—Shopping interfaces
- G06Q30/0643—Graphical representation of items or shoppers
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q50/00—Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
- G06Q50/04—Manufacturing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q50/00—Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
- G06Q50/10—Services
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B13/00—Adaptive control systems, i.e. systems automatically adjusting themselves to have a performance which is optimum according to some preassigned criterion
- G05B13/02—Adaptive control systems, i.e. systems automatically adjusting themselves to have a performance which is optimum according to some preassigned criterion electric
- G05B13/0265—Adaptive control systems, i.e. systems automatically adjusting themselves to have a performance which is optimum according to some preassigned criterion electric the criterion being a learning criterion
- G05B13/027—Adaptive control systems, i.e. systems automatically adjusting themselves to have a performance which is optimum according to some preassigned criterion electric the criterion being a learning criterion using neural networks only
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/12—Digital output to print unit, e.g. line printer, chain printer
- G06F3/1201—Dedicated interfaces to print systems
- G06F3/1278—Dedicated interfaces to print systems specifically adapted to adopt a particular infrastructure
- G06F3/1285—Remote printer device, e.g. being remote from client or server
- G06F3/1287—Remote printer device, e.g. being remote from client or server via internet
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02P—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
- Y02P90/00—Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
- Y02P90/30—Computing systems specially adapted for manufacturing
Definitions
- Virtual marketplaces such as network-based commerce systems are increasingly becoming a preferred mechanism by which vendors offer goods and services for sale, in contrast to conventional brick-and-mortar retail stores.
- Virtual marketplaces provide item listings with certain tools to facilitate the purchase of items, such as search interfaces for browsing item listings and controls for purchasing the subject item of a listing.
- However, virtual marketplaces require vendors to manually designate and design specific aspects of each item listing for publication at the virtual marketplace.
- Although virtual marketplaces connect interested buyers with vendors, it remains the vendor's responsibility to manually perform the various operations required to complete the sale of an item, such as processing financial transactions, contracting for shipment of the item to the buyer, and so forth.
- Due to the computational and network resources required to aggregate and analyze data describing potential buyer feedback on an item listing, virtual marketplaces provide vendors with such feedback only sporadically. Given this limited information, vendors cannot readily determine how an item listing, or the item itself, should be modified to account for changing trends and behaviors. Consequently, vendors face undue delay at each stage of an item's lifecycle, such as item conception, manufacture, listing, delivery, feedback analysis, and design modification.
- An autonomous item generation system receives at least one machine learning model trained to generate both fabrication instructions for producing an item and metadata describing the item, automatically and independent of user input.
- The autonomous item generation system causes the at least one machine learning model to generate the fabrication instructions and metadata for an item.
- The autonomous item generation system then transmits the fabrication instructions to a fabrication device, causing the fabrication device to generate the item.
- A listing for the item is generated from the item metadata output by the at least one machine learning model, and the autonomous item generation system publishes the listing to a virtual marketplace.
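- The generate-fabricate-publish flow described above can be sketched as follows. This is a minimal illustrative sketch, not an implementation from the disclosure; the names (`generate_item_data`, `fabricate`, `publish_listing`) and the toy model/device stand-ins are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class ItemData:
    """Output of the machine learning model: how to build the item and how to list it."""
    fabrication_instructions: str  # e.g., commands understood by a fabrication device
    metadata: dict                 # e.g., listing title, description, price

def generate_item_data(model) -> ItemData:
    """Run the trained model to produce fabrication instructions and listing metadata."""
    instructions, metadata = model()
    return ItemData(instructions, metadata)

def fabricate(device, item: ItemData) -> str:
    """Transmit the instructions to a fabrication device; returns an item identifier."""
    return device(item.fabrication_instructions)

def publish_listing(marketplace: dict, item_id: str, item: ItemData) -> None:
    """Generate a listing from the model-produced metadata and publish it."""
    marketplace[item_id] = {"title": item.metadata["title"], "price": item.metadata["price"]}

# Toy stand-ins for the trained model and the fabrication device.
model = lambda: ("STITCH A1 B2", {"title": "Knit scarf", "price": 19.99})
device = lambda instructions: "item-001"

marketplace = {}
item = generate_item_data(model)
item_id = fabricate(device, item)
publish_listing(marketplace, item_id, item)
```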
- The autonomous item generation system is configured to obtain analytics data describing one or more interactions with the item listing as published to the virtual marketplace, such as views of the listing, favorites, purchases of the item via the listing, navigation from the listing to a different item listing, shares, comments, user feedback, combinations thereof, and so forth.
- In some configurations, the autonomous item generation system itself serves as the virtual marketplace by publishing the item listing to a network (e.g., the Internet) and monitoring network traffic pertaining to the listing.
- In other configurations, the autonomous item generation system leverages an existing virtual marketplace platform, implementing one or more application programming interfaces (APIs) of the marketplace to cause publication of the item listing and to obtain analytics data pertaining to the listing from the marketplace.
- From this analytics data, the autonomous item generation system forms training data for refining the at least one machine learning model.
- The training data is provided as input to the at least one machine learning model, causing modification of one or more control parameters of the model.
- The autonomous item generation system then generates fabrication instructions and metadata for an additional item using the at least one machine learning model with its modified parameter(s), and repeats these operations to continuously refine the machine learning model(s) without requiring user input or intervention.
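- The refinement cycle described above, in which published-listing analytics feed back into model parameters, can be sketched as a simple loop. The engagement weighting and the single scalar "control parameter" below are illustrative stand-ins for an actual training procedure, not the disclosure's method.

```python
def engagement_score(analytics: dict) -> float:
    """Collapse raw interaction counts into one scalar signal (weights are illustrative)."""
    return (analytics.get("views", 0) * 0.1
            + analytics.get("favorites", 0) * 1.0
            + analytics.get("purchases", 0) * 5.0)

def refine(param: float, score: float, target: float = 10.0, rate: float = 0.01) -> float:
    """Nudge a control parameter in proportion to engagement; a stand-in for a training step."""
    return param + rate * (score - target)

# Each published listing yields analytics, which refine the parameter for the next item.
param = 0.5
for analytics in [{"views": 30, "purchases": 1},
                  {"views": 50, "favorites": 4, "purchases": 2}]:
    param = refine(param, engagement_score(analytics))
```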
- FIG. 1 is an illustration of an environment in an example implementation that is operable to employ the autonomous item generation techniques described herein.
- FIG. 2 depicts an example implementation in which an autonomous item generation system of FIG. 1 generates an item and an item listing for the item using at least one machine learning model and refines the at least one machine learning model based on feedback data for the item.
- FIG. 3 depicts an example implementation of a user interface displaying an item listing generated by the autonomous item generation system of FIG. 1.
- FIG. 4 depicts an example implementation of a user interface displaying an item listing generated by the autonomous item generation system of FIG. 1.
- FIG. 5 depicts an example implementation of a user interface for the autonomous item generation system of FIG. 1.
- FIG. 6 depicts an example implementation of a user interface for the autonomous item generation system of FIG. 1.
- FIG. 7 is a flow diagram depicting a procedure in an example implementation for the autonomous item generation system of FIG. 1 generating an item using at least one machine learning model and refining the machine learning model for generating additional items.
- FIG. 8 is a flow diagram depicting a procedure in an example implementation for outputting and modifying a user interface for the autonomous item generation system of FIG. 1 .
- FIG. 9 illustrates an example system including various components of an example device that can be implemented as any type of computing device as described and/or utilized with reference to FIGS. 1-8 to implement embodiments of the techniques described herein.
- Virtual marketplaces are increasingly used as a mechanism to publish item listings (e.g., offers of goods for sale). While such virtual marketplaces enable vendors to reach a wider audience than conventional brick-and-mortar storefronts, vendors continue to face undue burdens in bringing an initial concept design to a tangible item, determining a strategy for listing the item on a virtual marketplace, publishing the listing, entering into transactions with potential buyers, securing shipping for transporting the item to a buyer, and verifying that the item actually reaches the buyer.
- To address these burdens, an autonomous item generation system employs at least one machine learning model trained to generate, for a tangible item, item data that includes fabrication instructions for the item, a description of the item, tags describing various attributes of the item, and a recommended price for listing the item at a virtual marketplace.
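- The item data enumerated above (description, tags, recommended price) can be modeled as a simple record; the field names and the validation rule below are illustrative assumptions, not definitions from the disclosure.

```python
from dataclasses import dataclass

@dataclass
class ListingMetadata:
    """Model-generated fields for an item listing (field names are illustrative)."""
    description: str
    tags: list
    recommended_price: float

    def is_publishable(self) -> bool:
        """A listing needs a non-empty description, at least one tag, and a positive price."""
        return bool(self.description) and len(self.tags) > 0 and self.recommended_price > 0

meta = ListingMetadata("Hand-knit wool scarf", ["scarf", "wool", "winter"], 24.50)
```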
- The autonomous item generation system is further configured to interface with the virtual marketplace to automatically perform the operations involved in transporting the item from the fabrication device used to generate it to a purchasing entity, including financial transactions, shipping operations, and so forth.
- The autonomous item generation system is further configured to obtain feedback data describing one or more interactions with the item listing as published to the virtual marketplace, and to continuously modify control parameters of the machine learning model used to generate the item and item listing, all without human intervention or guidance.
- In this manner, the autonomous item generation system identifies virtual marketplace trends and behaviors in real time, well before even the most skilled human analyst could detect the same trends and behaviors from the same behavior data.
- The autonomous item generation system and techniques described herein thus generate items and item listings without human guidance, identify observed interactions with the item listings, and continuously adapt to those interactions in generating subsequent items and listings, at both a rate and a scale not possible with conventional systems.
- FIG. 1 illustrates a digital medium environment 100 in an example implementation that is operable to employ the autonomous item generation techniques described herein.
- The illustrated environment 100 includes computing device 102, which may be implemented according to a variety of different configurations.
- For example, the computing device 102 may be configured as a desktop computer, a laptop computer, a mobile device (e.g., assuming a handheld configuration such as a tablet or mobile phone), and so forth.
- The computing device 102 may range from full-resource devices with substantial memory and/or processing resources to devices with limited memory and/or processing resources (e.g., mobile devices).
- The computing device 102 may also be representative of a plurality of different devices, such as multiple servers utilized by a business to perform operations "over the cloud" as described in further detail below with respect to FIG. 9.
- The computing device 102 is illustrated as including an autonomous item generation system 104.
- The autonomous item generation system 104 is implemented at least partially in hardware of the computing device 102 and represents functionality of the computing device 102 to generate an item 106 and an item listing 108 for publication at a virtual marketplace, automatically and independent of user input, guidance, instruction, or other intervention that would otherwise facilitate generation of the item 106 or item listing 108.
- The item 106 is representative of a tangible good, product, and the like.
- Alternatively, the item 106 is representative of digital content, such as a digital graphic, an animation, a video, and so forth.
- The item listing 108 is representative of publication information describing the item 106, as described in further detail below.
- To perform these operations, the autonomous item generation system 104 employs an item generation module 110, a transaction module 112, a feedback module 114, and a training module 116.
- The item generation module 110, the transaction module 112, the feedback module 114, and the training module 116 are implemented at least partially in hardware of the computing device 102 (e.g., through use of a processing system and computer-readable storage media), as described in further detail below with respect to FIG. 9.
- The item generation module 110 is representative of functionality of the computing device 102 to generate the item 106 and item listing 108 without the user input otherwise required by conventional systems (e.g., input specifying design criteria for the item 106, input specifying information to be included or emphasized in the item listing 108, input specifying demographic targeting information for the item listing 108, and so forth). To do so, the item generation module 110 employs machine learning model 118.
- Machine learning model 118 is representative of one or more machine learning models, where each model represented by machine learning model 118 can be configured according to a range of different machine learning model architectures.
- machine learning model 118 is representative of a model having an architecture based on neural networks (e.g., fully-connected neural networks, convolutional neural networks, or recurrent neural networks), deep learning networks, generative adversarial networks (GANs), decision trees, support vector machines, linear regression, logistic regression, Bayesian networks, random forest learning, dimensionality reduction algorithms, boosting algorithms, combinations thereof, and so forth.
- machine learning model 118 is representative of one or more trained machine learning models that are configured to generate fabrication instructions 120 for the item 106 and metadata describing the item 106 , where the metadata is useable by the autonomous item generation system 104 to generate the item listing 108 .
- the machine learning model 118 may be representative of a GAN that is trained to generate fabrication instructions for a particular type of item (e.g., an article of clothing, a work of art, a three-dimensional model, and so forth).
- Such a trained GAN implementation of the machine learning model 118 may be generated by providing the machine learning model 118 with a plurality of training sets, where each training set includes information that is useable to guide the machine learning model towards producing a desired output.
- each training dataset may include fabrication instructions for an example article of clothing (e.g., instructions describing one or more fabrics or materials to be used in generating the article of clothing; cut patterns for the one or more fabrics or materials; adhesion instructions for combining the cuts of materials or fabrics such as sewing patterns, sewing thread types, fabric adhesive types, and the like; folding instructions for the article of clothing; and so forth).
- the training dataset may further include metadata describing the example article of clothing (e.g., a name for the article of clothing, a product category for the article of clothing, descriptive attributes of the article of clothing, a demographic audience for the article of clothing, and so forth).
- Each training dataset for the example machine learning model 118 trained to generate fabrication instructions and metadata for an article of clothing may further include information describing feedback pertaining to the subject article of clothing for the training dataset.
- feedback information may specify a number of times a listing for the article of clothing was viewed via a virtual marketplace, a number of purchases made of the article of clothing, a percentage of views that resulted in a share or favorite of the article of clothing, reviews for the article of clothing, information specifying comparative feedback data for different articles of clothing listed on the virtual marketplace, information describing an appearance of a listing for the article of clothing on the virtual marketplace, combinations thereof, and so forth.
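- The training dataset structure described above can be sketched as a simple record pairing fabrication instructions with descriptive metadata and observed feedback. This is an illustrative sketch only; the field names and values are assumptions, not part of the described system.

```python
from dataclasses import dataclass

@dataclass
class TrainingExample:
    fabrication_instructions: dict  # fabrics, cut patterns, sewing/adhesion instructions, ...
    metadata: dict                  # name, product category, descriptive attributes, audience
    feedback: dict                  # views, purchases, share rate, reviews, ...

# Hypothetical example of one training dataset entry for an article of clothing.
example = TrainingExample(
    fabrication_instructions={"fabric": "cotton twill", "cut_pattern": "pattern-07"},
    metadata={"name": "example jacket", "category": "outerwear"},
    feedback={"views": 1200, "purchases": 40, "share_rate": 0.03},
)
```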
- the machine learning model 118 is representative of a model trained on a generic dataset for one or more characteristics to be learned by the model during training. For instance, continuing the example of a machine learning model 118 trained to learn artwork characteristics, the machine learning model 118 is representative of a model trained on a training dataset that includes a collection of different works of art that share one or more common characteristics (e.g., being portraits of a human subject, depicting landscapes, being abstract art, being vector artwork, comprising a certain color palette, and so forth). As such, the characteristics learned by machine learning model 118 during training are dictated by the contents of the dataset used during training, such that the training dataset defines a style or theme of the machine learning model's 118 output following training.
- the machine learning model 118 may be representative of a model trained on a training dataset that consists of human portraits, such that the machine learning model 118 is configured to output artwork that depicts human portraits after training is complete.
- machine learning model 118 is representative of one or more models trained on a plurality of different training datasets. For instance, continuing the previous example of a training dataset consisting of human portraits, after training the machine learning model 118 to output artwork depicting human portraits, training may continue using an additional training dataset that includes artwork depicting landscapes, such that the machine learning model 118 is further trained to output artwork depicting human portraits against landscape backgrounds.
- the machine learning model 118 may be representative of a plurality of different machine learning models arranged in a stacked configuration, where the output of one model is provided as input to a different model of the stacked configuration.
- each model of the stacked configuration of machine learning models may be trained on a different training dataset, such as one trained to output landscape artwork, another trained to output abstract artwork, and another trained to output watercolor artwork.
- landscape artwork output by the initial model would be provided as input to the model trained to output abstract artwork, which would output an abstract artwork representation of the input landscape, which in turn would be provided to the watercolor artwork-trained model, such that the resulting output of the stacked configuration of machine learning models is an abstract watercolor landscape work of art.
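- The stacked configuration described above can be sketched as a pipeline in which each model's output becomes the next model's input. The `StyleModel` class and its `generate` interface are hypothetical stand-ins for trained models, not part of the described system.

```python
class StyleModel:
    """Stand-in for one trained model in the stack (e.g., landscape, abstract, watercolor)."""
    def __init__(self, style):
        self.style = style

    def generate(self, artwork):
        # A real model would transform the input artwork; this toy version
        # just records which style stage has been applied.
        return artwork + [self.style]

def run_stack(models, seed):
    # Feed the output of each model into the next model in the stack.
    artwork = seed
    for model in models:
        artwork = model.generate(artwork)
    return artwork

stack = [StyleModel("landscape"), StyleModel("abstract"), StyleModel("watercolor")]
print(run_stack(stack, []))  # ['landscape', 'abstract', 'watercolor']
```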
- the specific characteristics learned by the machine learning model 118 are dependent on the training dataset(s) used to generate the machine learning model 118 and are not restricted to the examples provided herein.
- each training dataset may include fabrication instructions for a specific piece of art (e.g., instructions designating one or more materials to use in generating the piece of art such as paint type and color, ink, paper, canvas, combinations thereof, and the like; printing instructions for generating the piece of art on a particular medium; dimensional constraints for the piece of art; and so forth).
- Individual training datasets supplement the fabrication instructions for the piece of art by including metadata describing the particular piece of art (e.g., a title for the piece of art, a description of the art, tags for listing the piece of art in a virtual marketplace, a recommended price for the piece of art, combinations thereof, and so forth).
- Each training dataset may further include feedback information for the piece of art, such as feedback information similar to that described above with respect to the example training dataset for training the machine learning model 118 to generate fabrication instructions for an article of clothing.
- the GAN may be trained by causing different portions of the GAN (e.g., a generator portion and a discriminator portion) to compete in an adversarial objective (e.g., a min-max game) that seeks to maximize positive feedback associated with a corresponding item 106 generated according to fabrication instructions 120 output by the GAN.
- the feedback data of the example training datasets may be normalized on a scale that indicates whether feedback data for an item is generally positive or negative (e.g., feedback data indicating numerous views, purchases, shares, positive reviews of the item may be characterized and quantified as indicating positive feedback for the subject item of the training dataset).
- Such positive feedback data can be contrasted with feedback data indicating few purchases, shares, or favorites of the item, feedback data indicating negative user reviews, and/or feedback data indicating a view of the item and subsequent purchase of a different similar item, which may be characterized and quantified as indicating negative feedback for the subject item of the training dataset.
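- The normalization described above can be sketched as a function that maps raw listing signals onto a single signed scale, where purchases, shares, and positive reviews push the score positive, while poor reviews and lost sales (a view followed by purchase of a similar item) push it negative. The weights below are illustrative assumptions, not values from the described system.

```python
def normalize_feedback(views, purchases, shares, avg_review, lost_sales):
    """Map raw feedback signals onto a rough [-1, 1] scale (weights are assumptions)."""
    if views == 0:
        return 0.0
    purchase_rate = purchases / views          # conversion signal
    share_rate = shares / views                # engagement signal
    review_signal = (avg_review - 3.0) / 2.0   # map 1..5 stars onto -1..1
    lost_sale_rate = lost_sales / views        # viewed, then bought a different similar item
    score = (0.4 * purchase_rate + 0.2 * share_rate
             + 0.3 * review_signal - 0.1 * lost_sale_rate)
    return max(-1.0, min(1.0, score))
```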
- the machine learning model 118 may be configured to generate fabrication instructions 120 and metadata for an item 106 in a manner that seeks to maximize positive feedback data for the item 106 .
- training the machine learning model 118 includes supplementing training data from the training datasets with noise (e.g., Gaussian input noise), which causes the generator portion of the GAN to generate samples that could potentially fool the discriminator portion in the min-max game objective example.
- the machine learning model 118 is representative of one or more machine learning models that are trained to identify different aspects of item fabrication instructions and/or descriptive metadata for the item that influences positive feedback associated with the item.
- the machine learning model 118 is instructed (e.g., via an objective function using convolutional neural networks) to generate an output defined by characteristics of a training dataset (e.g., articles of clothing with long sleeves). Outputs of the machine learning model 118 are then compared to the training dataset, and the model is trained to generate realistic articles of clothing with long sleeves based on a loss function determined from the comparison (e.g., F1 loss, visual perceptual quality loss, combinations thereof, and so forth in an implementation where the machine learning model 118 is configured as a GAN). Training continues until the machine learning model 118 converges and consistently generates outputs that are within a comparative threshold of the training dataset, at which point the machine learning model 118 is output, or is adapted to different characteristics using additional training datasets.
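- The train-until-convergence behavior described above can be sketched as a loop that stops once the loss falls within the comparative threshold. The `model_step` and `loss_fn` callables and the toy decaying loss are illustrative assumptions.

```python
def train_until_converged(model_step, loss_fn, threshold, max_steps=10_000):
    """model_step() returns the model's current output; loss_fn scores it
    against the training dataset. Both are hypothetical callables."""
    for step in range(max_steps):
        output = model_step()
        loss = loss_fn(output)
        if loss <= threshold:
            # Converged: outputs are within the comparative threshold.
            return step, loss
    raise RuntimeError("model did not converge within max_steps")

# Toy usage: a loss that decays over successive training steps.
losses = iter([1.0, 0.5, 0.2, 0.05])
step, loss = train_until_converged(lambda: None, lambda _: next(losses), threshold=0.1)
print(step, loss)  # 3 0.05
```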
- the machine learning model 118 is further trained to identify differences in feedback data associated with an item among different demographic segments. For instance, training data may indicate that a same article of clothing (e.g., a down jacket) is generally associated with positive feedback for a particular geographic location demographic segment during a three-month window (e.g., during a winter season for the particular geographic location) and is generally associated with negative feedback for that demographic segment at other times (e.g., during spring, summer, and fall seasons for the particular geographic location).
- the machine learning model 118 is further trained with the objective of maximizing positive feedback associated with an item at an audience-specific level, where the audience can be constrained according to any range of control parameters (e.g., geographic location, time of day, day(s) of week, audience age, audience gender, combinations thereof, and so forth).
- the machine learning model 118 may be received by the autonomous item generation system 104 together with an indication of control parameters in the machine learning model's 118 latent space(s).
- the training module 116 is configured to identify one or more control parameters in the machine learning model's 118 latent space(s). Such control parameters correlate to any aspect of the machine learning model's 118 output.
- one control parameter may affect a size of the sky in the resulting landscapes
- another control parameter may define a color palette (e.g., one or more colors) used in depicting mountains in the landscape
- another control parameter may describe a medium on which the landscape is depicted
- another control parameter may describe characteristics of a demographic audience for which the landscape is generated, and so forth.
- the training module 116 is configured to identify control parameters by adjusting individual control parameters of the machine learning model 118 and determining how the adjustment affects the resulting model output.
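- The probing procedure described above can be sketched as perturbing one latent dimension at a time and measuring how much the generator's output changes; dimensions that measurably steer the output are treated as control parameters. The toy generator and the sensitivity threshold below are illustrative assumptions.

```python
def find_control_parameters(generate, latent_dim, delta=1.0, sensitivity=1e-6):
    """Identify latent dimensions whose adjustment measurably changes the output."""
    baseline = generate([0.0] * latent_dim)
    controls = []
    for i in range(latent_dim):
        probe = [0.0] * latent_dim
        probe[i] = delta  # perturb a single latent dimension
        change = sum((a - b) ** 2 for a, b in zip(generate(probe), baseline)) ** 0.5
        if change > sensitivity:
            controls.append(i)  # this dimension acts as a usable control parameter
    return controls

# Toy generator whose output depends only on latent dimensions 0 and 2.
def toy_generate(z):
    return [1.0 * z[0] + 2.0 * z[2]]

print(find_control_parameters(toy_generate, latent_dim=3))  # [0, 2]
```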
- the machine learning model 118 is representative of one or more trained machine learning models that are configured to generate fabrication instructions 120 for the item 106 and metadata describing the item 106 that is useable by the autonomous item generation system 104 to generate the item listing 108 , in a manner that seeks to maximize positive feedback associated with the item 106 .
- Upon generating fabrication instructions 120 for the item 106 using the machine learning model 118, the item generation module 110 is configured to transmit the fabrication instructions 120 to a fabrication device 122, which is representative of one or more machines that are configured to generate the item 106 responsive to receipt of the fabrication instructions.
- the fabrication device 122 is representative of one or more textile machines, such as a textile sourcing machine, a textile spinning machine, a textile finishing machine, a cloth finishing machine, a knitting machine, a fabric seaming machine, a crochet machine, a quilting machine, a tufting machine, a weaving machine, a component (e.g., zipper, button, etc.) manufacturing machine, a measuring machine, a cutting machine, an embroidery machine, a sewing machine, a washing machine, a drying machine, a folding machine, a monogramming machine, an applique attachment machine, combinations thereof, and the like.
- the fabrication device 122 may be configured as one or more of a two-dimensional printer, a three-dimensional printer, a computer numerical control (CNC) machine, combinations thereof, and so forth.
- the fabrication device 122 may be representative of computer-aided design (CAD) software implemented at least partially in hardware of a computing device, such as in hardware of the computing device 102.
- the fabrication device 122 is representative of any one or combination of multiple devices that are capable of generating the item 106 based on the fabrication instructions 120 output by the machine learning model 118 .
- the transaction module 112 is representative of functionality of the computing device 102 to generate the item listing 108 for the item 106 , based on the metadata describing the item 106 as output by the machine learning model 118 .
- the item listing 108 generated by the transaction module 112 is representative of information that describes an appearance of the item listing 108 as published at a virtual marketplace 124, both as it visually appears to a viewing user of the virtual marketplace 124 and as it appears in data observed by a search engine (e.g., when indexing the virtual marketplace 124 or otherwise becoming aware of the item listing 108).
- the item listing 108 is representative of data specifying at least one of a title for the item 106 , a detailed description for the item 106 , a representative image for the item 106 , a suggested price for the item 106 , one or more different items that are similar to the item 106 , combinations thereof, and so forth.
- Example implementations of an item listing 108 generated by the transaction module 112 are illustrated in FIGS. 3 and 4 and described in further detail below.
- the transaction module 112 is further representative of functionality of the computing device to interface with the virtual marketplace 124 or directly implement the virtual marketplace as part of the autonomous item generation system 104 .
- the virtual marketplace 124 is representative of a service configured to publish item listings 108 where items (e.g., tangible goods) are offered for sale.
- the virtual marketplace 124 is representative of a social networking system or other type of informational system that is configured to output the item listing 108 for display to one or more users.
- the virtual marketplace 124 may be hosted on dedicated or shared server machines (not shown) that are communicatively coupled to enable communications between the server machines.
- the virtual marketplace 124 may be implemented over distributed servers as described in further detail below with respect to FIG. 9 .
- the virtual marketplace 124 is further representative of connected components that allow the components to share and access common data, such as data hosted on one or more databases.
- the virtual marketplace 124 is further representative of a platform that provides at least one of a publishing mechanism, a listing mechanism, or a price-setting mechanism that enables a seller to list or publish information pertaining to tangible goods for sale.
- the virtual marketplace 124 is representative of a platform that enables a buyer to express interest in, or indicate a desire to purchase, the tangible goods offered for sale.
- the virtual marketplace 124 may comprise at least one publication engine and at least one shopping engine.
- the publication engine of the virtual marketplace is associated with one or more Application Programming Interfaces (APIs) that enable the transaction module 112 to communicate the item listing 108 to the virtual marketplace 124 and cause the virtual marketplace 124 to publish the item listing 108 in a manner that can be observed and interacted with by a user of the virtual marketplace (e.g., a potential buyer of the item 106 ).
- the shopping engine of the virtual marketplace 124 is associated with one or more APIs that enable a user of the virtual marketplace 124 to accept an offer for sale of the item 106 by agreeing to pay a price associated with the item 106 .
- the APIs supported by the shopping engine of the virtual marketplace 124 support different price listing formats for publication of the item listing 108 .
- price listing formats include fixed-price listing formats (e.g., the traditional classified advertisement-type listing or a catalog listing), auction-type price listing formats, buyout-type listing formats (e.g., the Buy-It-Now (BIN) technology developed by eBay Inc., of San Jose, Calif.), combinations thereof, and so forth.
- the virtual marketplace 124 is further representative of a navigation engine that enables a user of the virtual marketplace to browse and inspect various item listings 108 published by the virtual marketplace 124 .
- the navigation engine of the virtual marketplace 124 enables a user to identify and discover various item listings by providing a search module that enables keyword and/or image searches of item listings 108 or other information published by the virtual marketplace 124 .
- the virtual marketplace 124 may organize item listings 108 according to various data structures (e.g., category, catalog, or other form of classification for differentiating and grouping item listings 108 , relative to one another).
- the virtual marketplace 124 may provide tools that enable users to browse published item listings 108 according to metadata tags that categorize the item listing 108 , rather than having to index through an entirety of item listings 108 published by the virtual marketplace.
- Various other navigation techniques and item listing classification and categorization approaches may be enabled by the virtual marketplace 124 without departing from the spirit and scope of the examples described herein.
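- The tag-based browsing described above can be sketched as filtering published listings by the metadata tags that categorize them, rather than indexing through every listing. The listing fields and tag values below are illustrative assumptions.

```python
def listings_with_tags(listings, required_tags):
    """Return only those listings whose tags include every required tag."""
    required = set(required_tags)
    return [listing for listing in listings if required <= set(listing["tags"])]

# Hypothetical published listings with categorizing metadata tags.
catalog = [
    {"title": "watercolor landscape print", "tags": {"art", "landscape", "watercolor"}},
    {"title": "down jacket", "tags": {"clothing", "outerwear"}},
    {"title": "abstract canvas", "tags": {"art", "abstract"}},
]
matches = listings_with_tags(catalog, {"art"})
print([m["title"] for m in matches])  # ['watercolor landscape print', 'abstract canvas']
```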
- the virtual marketplace 124 is further representative of a messaging system that enables generation and delivery of messages among various entities involved with facilitating a transaction via the virtual marketplace 124.
- the messaging system implemented by the virtual marketplace 124 may facilitate communications among a selling entity that published the item listing 108 to the virtual marketplace, a purchasing entity 126 that purchases the item 106 via interaction with the item listing 108 , a shipping entity (not shown) contracted to physically deliver the item 106 from the fabrication device 122 to the purchasing entity 126 , one or more financial institutions tasked with transferring funds among the various entities (e.g., the autonomous item generation system 104 , the virtual marketplace 124 , the fabrication device 122 , the shipping entity, the purchasing entity 126 , and so forth).
- the network 128 is representative of a real-time communication protocol that connects the autonomous item generation system 104 to one or more of the fabrication device 122, the virtual marketplace 124, the purchasing entity 126, one or more shipping entities, and one or more financial institutions involved in these example activities.
- the network 128 may represent functionality of a real-time communication protocol, such as a remote procedure call that enables a streaming, always-connected link among different entities.
- the network 128 may be representative of the Internet, a subscriber network such as a cellular or Wi-Fi network, combinations thereof, and so forth.
- the feedback module 114 is representative of functionality of the computing device 102 to obtain analytics data from the virtual marketplace 124 , such as information describing user feedback pertaining to the item listing 108 as published at the virtual marketplace 124 .
- Data obtained by the feedback module 114 is representative of any form of information that indicates a manner in which the item listing 108 was observed and/or interacted with by users of the virtual marketplace 124 .
- the feedback module 114 may include one or more APIs configured to obtain data describing a number of views (e.g., a number of impressions) of the item listing 108 , a number of purchases of the item via the item listing 108 , a number of favorites of the item listing 108 , a number of shares of the item listing 108 , user reviews submitted for the item listing 108 , combinations thereof, and so forth.
- the feedback module 114 is further representative of functionality of the autonomous item generation system 104 to obtain user profile information (e.g., age, gender, location, and so forth) pertaining to individual users that interacted with the item listing 108 , and metadata describing the interaction (e.g., a duration of the interaction, specific aspects of the item listing 108 with which a user interacted, an amount of time spent viewing certain portions of the item listing, a date and time of the interaction, combinations thereof, and so forth).
- the feedback module 114 is representative of functionality of the autonomous item generation system 104 to understand how the particular item listing 108 for the item 106 was received, relative to other item listings published on the virtual marketplace 124 .
- the training module 116 is representative of functionality of the autonomous item generation system 104 to modify the machine learning model 118 by modifying at least one control parameter of the machine learning model 118 based on data obtained by the feedback module 114 pertaining to the item listing 108 .
- the training module 116 is configured to generate additional training datasets for refining the machine learning model 118, where each additional training dataset includes data describing the item listing 108 and the fabrication instructions 120 for the item 106, along with at least one instance of analytics data obtained by the feedback module 114 describing a user interaction with the item listing 108 at the virtual marketplace 124.
- the training module 116 may generate a training dataset that includes the control parameters for the machine learning model 118 used in generating the fabrication instructions 120 and metadata for the item 106 (e.g., metadata used to generate the item listing 108 ).
- the training module 116 supplements the control parameters in the training dataset with predicted feedback data for the item 106, and provides the training dataset to the machine learning model 118 as input along with a loss function that penalizes differences between the predicted feedback data of the training dataset and the observed feedback data obtained from the feedback module 114 that indicate the control parameters resulted in negative feedback for the item 106.
- the loss function implemented by the training module 116 rewards differences indicating that feedback pertaining to the item 106 was more positive than predicted.
- control parameters output through this training process may be configured as a ranking of different combinations of control parameters for the machine learning model 118, where the ranking is ordered based on which control parameter combinations are likely to garner the most positive feedback via publication to the virtual marketplace 124.
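- The penalize/reward behavior described above can be sketched as an asymmetric loss: shortfalls of observed feedback relative to predicted feedback are penalized, while outperformance is rewarded (negative loss). The weights are illustrative assumptions, not values from the described system.

```python
def feedback_loss(predicted, observed, penalty_weight=2.0, reward_weight=1.0):
    """Penalize items whose observed feedback fell short of the prediction;
    reward (negative loss) items that outperformed it. Weights are assumptions."""
    diff = observed - predicted
    if diff < 0:
        return penalty_weight * -diff   # observed worse than predicted: penalize
    return -reward_weight * diff        # observed better than predicted: reward
```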
- the training module 116 is configured to refine control parameters of the machine learning model 118 using any number of different loss functions. For instance, loss functions may be layered, such that the loss function penalizing negative differences between predicted and observed user review average scores is layered with a loss function that penalizes negative feedback differences between predicted and observed user interactions with the item listing 108 (e.g., click-through rates, purchase rates, etc.).
- the training module 116 in addition to training the machine learning model 118 using loss functions that consider differences between predicted and observed feedback data, is configured to employ one or more multi-armed bandit approaches to explore novel control parameter combinations that differ from previously attempted control parameter combinations for the machine learning model 118 .
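- The multi-armed bandit exploration described above can be sketched with an epsilon-greedy policy: mostly exploit the control parameter combination with the best observed feedback, but occasionally try a novel combination. The combination labels, reward values, and epsilon are illustrative assumptions.

```python
import random

def choose_combination(mean_reward, epsilon=0.1, rng=random):
    """mean_reward maps each control-parameter combination (an arm) to its
    average observed feedback score."""
    if rng.random() < epsilon:
        return rng.choice(sorted(mean_reward))    # explore a novel arm
    return max(mean_reward, key=mean_reward.get)  # exploit the best-known arm

def update_reward(mean_reward, counts, combo, reward):
    # Incrementally update the running mean feedback for the chosen arm.
    counts[combo] = counts.get(combo, 0) + 1
    previous = mean_reward.get(combo, 0.0)
    mean_reward[combo] = previous + (reward - previous) / counts[combo]

# Toy usage: with epsilon=0 the policy always exploits the best-known arm.
rewards = {"combo-a": 0.2, "combo-b": 0.7}
print(choose_combination(rewards, epsilon=0.0))  # combo-b
```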
- the training module 116 is configured to continuously refine the machine learning model 118 to adapt its subsequent generation of fabrication instructions 120 and item metadata for use in generating the item listing 108 for additional items 106 , while accounting for user behaviors and trends at the virtual marketplace 124 .
- the autonomous item generation system 104 advantageously enables real-time adaptation to changes in user behavior and trends at the virtual marketplace 124 at a rate that is impossible to achieve by conventional systems that require human input or intervention.
- the advantages enabled by the autonomous item generation system 104 relative to conventional approaches are exponentially increased when generating a catalog of items 106 and corresponding item listings 108 , as the human hours required by conventional approaches prohibit generating fabrication instructions 120 and item listings 108 for an item 106 in real-time.
- FIG. 2 depicts a system 200 in an example implementation showing operation of the autonomous item generation system 104 of FIG. 1 in greater detail as generating an item 106 and an item listing 108 , automatically and independent of user input via the machine learning model 118 , and as refining the machine learning model 118 based on data describing one or more interactions with the item listing 108 as published to the virtual marketplace 124 .
- system 200 illustrates components of the autonomous item generation system 104 , including the item generation module 110 , the transaction module 112 , the feedback module 114 , and the training module 116 .
- the item generation module 110 is configured to cause the machine learning model 118 to output item data 202 for the item 106 , where a type of the item 106 depends on an objective and training dataset used to originally train the machine learning model 118 .
- the item data 202 includes the fabrication instructions 120 for the item 106 along with metadata for the item 106 including an item description 204 , at least one item tag 206 , and item pricing data 208 .
- the item description 204 is representative of a title for the item 106 to be included in the item listing 108 , a detailed textual description for the item listing 108 , and a digital rendering (e.g., an image, a video, an animation, combinations thereof, and so forth) of the item 106 to be represented in the item listing 108 .
- the item tags 206 included in the item data 202 are representative of metadata to be embedded in the item listing 108 that enables the virtual marketplace 124 and/or a search engine (not shown) to identify and categorize the item listing 108 (e.g., relative to other item listings published at the virtual marketplace 124 ).
- the item pricing data 208 specifies at least one suggested price to be associated with the item 106 (e.g., to be displayed as part of the item listing 108 ).
- the item tags 206 further specify audience information for the item listing 108 to define a particular manner in which the item listing 108 is published at the virtual marketplace 124 .
- the item tags 206 may include information specifying a particular demographic for the item listing 108 that restricts its publication to the particular demographic (e.g., specifying different item pricing data 208 for European and Asian markets, specifying different visual appearances for conveying the item description 204 in the item listing 108 at different times of the day, and so forth).
- the item data 202 generated by the machine learning model 118 is representative of information that is useable by the fabrication device 122 to fabricate the item 106 as well as information that is useable by the transaction module 112 to generate the item listing 108 .
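One minimal way to picture the item data 202 record is a container pairing the fabrication instructions 120 with the listing metadata (item description 204, item tags 206, item pricing data 208, and audience information). Every field name and type below is a hypothetical stand-in chosen for illustration.

```python
from dataclasses import dataclass, field

@dataclass
class ItemData:
    """Sketch of an item data 202 record; field names are illustrative,
    not taken from the patent."""
    fabrication_instructions: str                 # consumed by the fabrication device 122
    description: str                              # item description 204 (title/body)
    tags: list = field(default_factory=list)      # item tags 206 for categorization
    price: float = 0.0                            # item pricing data 208
    audience: dict = field(default_factory=dict)  # e.g., demographic restrictions
```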
- Upon receiving the item data 202 from the item generation module 110 , the transaction module 112 is configured to generate the item listing 108 , where a visual appearance of the item listing 108 as published to the virtual marketplace 124 is defined by one or more of the item description 204 , the item tags 206 , or the item pricing data 208 .
- FIG. 3 depicts an example interface 300 of the virtual marketplace 124 as displaying an item listing 302 .
- the item listing 302 represents an instance of the item listing 108 generated by the transaction module 112 , where the item listing 302 is created for an article of clothing item 106 generated by the machine learning model 118 .
- the item listing 302 includes an item title 304 for a “Men's Casual Button Down Shirt,” and a digital rendering 306 of the item 106 .
- the digital rendering 306 indicates how the item 106 would visually appear following fabrication by the fabrication device 122 , according to the fabrication instructions 120 .
- the item listing 302 further includes a price 308 for purchasing the item 106 depicted by the digital rendering 306 and a detailed description 310 that provides a viewing user with additional information describing the item 106 .
- the example item listing 302 further includes an item details portion 312 configured to display additional information not provided by the item title 304 , the digital rendering 306 of the item, the price 308 , or the detailed description 310 .
- the item details 312 may specify specific dimensions of the article of clothing, textiles used to construct the article of clothing, and any other information included in the item data 202 output by the machine learning model 118 .
- the item listing 302 is further illustrated as including a shipping options portion 314 , a user reviews portion 316 , and a similar items portion 318 .
- the shipping options portion 314 is representative of information displayed to a viewing user of the item listing 302 that informs the viewing user as to available choices for logistically transferring the subject item 106 of the item listing 302 from the fabrication device 122 that manufactures the item 106 to a location of the viewing user (e.g., the purchasing entity 126 ).
- the user reviews portion 316 is representative of explicit feedback information pertaining to the item 106 as received from one or more users of the virtual marketplace 124 that have previously purchased the item 106 .
- the similar items portion 318 is configured to display representations 320 , 322 , and 324 of different item listings published to the virtual marketplace 124 that are identified as being similar to the item 106 for which the item listing 302 is generated (e.g., based on comparison of the item tags 206 for the item 106 to tags associated with the representations 320 , 322 , and 324 ).
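The tag-based comparison used to populate the similar items portion 318 could, for example, rank published listings by tag overlap with the subject item. The Jaccard similarity measure and the `(listing_id, tags)` catalog shape below are assumptions for illustration, not details from the patent.

```python
def similar_items(tags, catalog, k=3):
    """Rank published listings by tag overlap with the subject item.
    `catalog` is a hypothetical list of (listing_id, tags) pairs."""
    def jaccard(a, b):
        a, b = set(a), set(b)
        return len(a & b) / len(a | b) if a | b else 0.0
    # Highest tag overlap first; keep the top k candidates.
    return sorted(catalog, key=lambda it: jaccard(tags, it[1]), reverse=True)[:k]
```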
- the shipping options portion 314 , the user reviews portion 316 , and the similar items portion 318 of the item listing 302 are defined by the virtual marketplace 124 to which the item listing is published, rather than being defined by the transaction module 112 .
- the transaction module 112 may be configured to control a visual appearance of one or more of the shipping options portion 314 , the user reviews portion 316 , or the similar items portion 318 by virtue of a communicative connection between the virtual marketplace 124 and the transaction module 112 (e.g., network 128 ), as represented by the double-headed arrow connecting the transaction module 112 and the virtual marketplace 124 in FIG. 2 .
- the item listing 302 further includes controls 326 , 328 , and 330 for interacting with the item listing 302 via the virtual marketplace 124 , where interaction with the controls 326 , 328 , and 330 is indicative of positive feedback to the item listing 302 .
- control 326 enables a viewing user to immediately purchase the subject item 106 of the item listing 302 .
- control 328 enables the viewing user to add the item 106 to a shopping cart during browsing of the virtual marketplace 124 .
- control 330 enables the viewing user to favorite the item 106 .
- controls 326 , 328 , and 330 are representative aspects of the item listing 302 from which interaction data may be gleaned to ascertain a positive or negative reaction to the item listing and used as feedback data for further refining control parameters of the machine learning model 118 used to generate the item listing 302 and its subject item 106 .
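One simple way to collapse interactions with controls such as 326, 328, and 330 into a single feedback signal is a weighted count per interaction type. The weights below are purely illustrative; the patent does not specify how interactions are quantified.

```python
# Hypothetical weights: a purchase is a stronger positive signal
# than an add-to-cart, which is stronger than a favorite.
INTERACTION_WEIGHTS = {"purchase": 1.0, "add_to_cart": 0.5, "favorite": 0.25}

def feedback_score(interactions):
    """Collapse counts of control interactions into one reward signal;
    unrecognized interaction types contribute nothing."""
    return sum(INTERACTION_WEIGHTS.get(k, 0.0) * v for k, v in interactions.items())
```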
- the transaction module 112 is illustrated as including a listing component 210 , a finance component 212 , and a logistics component 214 .
- the listing component 210 , the finance component 212 , and the logistics component 214 are representative of the ability of the transaction module 112 to enable functionality of the standalone virtual marketplace 124 , as described above with reference to FIG. 1 .
- the listing component 210 , the finance component 212 , and the logistics component 214 are representative of functionality of the autonomous item generation system 104 to automatically handle interactions with the virtual marketplace 124 that otherwise cannot be performed by conventional systems absent human user intervention.
- the listing component 210 is representative of functionality of the transaction module 112 to communicate and cause publication of the item listing 108 at the virtual marketplace 124 .
- the listing component 210 is representative of one or more APIs configured to interface with the virtual marketplace 124 and list the item 106 according to one or more shopping engines or price-listing platforms supported by the virtual marketplace 124 .
- the finance component 212 is representative of functionality of the transaction module 112 to interface with one or more financial institutions to transfer funds among the various entities involved in facilitating the fabrication of the item 106 , publishing the item listing 108 , purchasing the item 106 , and facilitating shipment of the item 106 to a purchasing entity 126 .
- the finance component 212 is configured to handle returns and process refunds in the event a purchasing entity 126 is dissatisfied with the item 106 and attempts to return the item 106 via the virtual marketplace 124 .
- the logistics component 214 is representative of functionality of the transaction module 112 to identify one or more shipping options for logistically transporting the item 106 to a purchasing entity 126 .
- the logistics component 214 is configured to identify geographic locations associated with a fabrication device 122 that manufactured the item 106 and the purchasing entity 126 to which the item 106 is to be transported. Given the geographic locations, the logistics component 214 is configured to interface with one or more shipping entities to obtain quotes for costs associated with transporting the item 106 to the purchasing entity 126 .
- the logistics component 214 is configured to update the item listing 108 to convey such shipping cost quotes for a particular purchasing entity 126 viewing the item listing (e.g., by updating information included in the shipping options portion 314 of the example item listing 302 illustrated in FIG. 3 ).
- Upon receiving an indication from the virtual marketplace 124 of the purchasing entity 126 purchasing the item 106 , the finance component 212 is configured to interface with a financial institution associated with the purchasing entity 126 to verify that the purchasing entity 126 has sufficient funds to purchase the item 106 and, if so, contracts with a shipping entity for transporting the item 106 to the purchasing entity 126 .
- the logistics component 214 is configured to select a particular shipping entity and shipping method for transporting the item 106 to the purchasing entity 126 based on various considerations, such as a price willing to be paid for shipping by the purchasing entity 126 , a shipping speed desired by the purchasing entity 126 , a cost for the autonomous item generation system 104 to transport the item 106 to the purchasing entity 126 , combinations thereof, and so forth.
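The shipping selection described above can be sketched as filtering quotes by the purchaser's price and speed constraints and then minimizing cost. The `(carrier, cost, days)` quote format is a hypothetical shape invented for this sketch.

```python
def choose_shipping(quotes, max_price, desired_days):
    """Pick the cheapest quote meeting the purchaser's constraints.
    `quotes` is a hypothetical list of (carrier, cost, days) tuples;
    returns None when no quote satisfies both constraints."""
    eligible = [q for q in quotes if q[1] <= max_price and q[2] <= desired_days]
    return min(eligible, key=lambda q: q[1]) if eligible else None
```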
- the transaction module 112 is configured to automatically handle interactions with the virtual marketplace 124 that otherwise cannot be performed by conventional systems absent human user intervention in facilitating the publication of the item listing 108 as well as sale activities involved with facilitating a sale of the subject item 106 for the item listing 108 .
- the feedback module 114 is configured to receive listing feedback data 216 from the virtual marketplace 124 , which is representative of analytics data provided by the virtual marketplace 124 describing one or more interactions with the item listing 108 .
- the listing feedback data 216 may specify different interactions with the item listing 302 such as a number of page views, or impressions of the item listing 302 , a number of different computing devices that viewed the item listing 302 , a number of favorites of the item listing 302 , a number of purchases of the subject item of the item listing 302 , a number of shares of the item listing 302 , and so forth.
- the listing feedback data 216 may further provide information describing a user profile associated with the interaction, such as a location of the user during the interaction, a date and time associated with the interaction, demographic information for the user profile (e.g., age, gender, etc.), historical user behavior data for the user profile relative to the virtual marketplace 124 , combinations thereof, and so forth.
- the listing feedback data 216 may provide additional levels of detail regarding interactions with the item listing 108 .
- the listing feedback data 216 may specify an amount of time spent viewing discrete portions of the item listing 302 , such as a duration spent reading the detailed description 310 , a number of user reviews displayed in navigating the user reviews portion 316 , a purchase of an item listed in the similar items portion 318 instead of the subject item of the item listing 302 , and so forth.
- the listing feedback data 216 is representative of any type and format of data that indicates a manner in which the item listing 108 was experienced or interacted with by users of the virtual marketplace 124 .
- the feedback module 114 is configured to generate at least one training dataset 218 for use in refining the machine learning model 118 . To do so, the feedback module 114 combines the listing feedback data 216 with the item data 202 in a format that corresponds to the training dataset format used to originally train the machine learning model 118 (e.g., the format of the predicted feedback data included in the training dataset generated by the training module 116 ). By virtue of the machine learning model 118 's initial training, the feedback module 114 does not need to annotate or otherwise label the training dataset 218 (e.g., as quantifying or otherwise classifying the listing feedback data 216 as indicating that the item listing 108 is associated with positive or negative feedback).
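Forming a training example from the item data 202 and the listing feedback data 216 can be pictured as pairing the generated features with the observed analytics, with no manual labeling. The dictionary records and field names below are illustrative assumptions.

```python
def build_training_example(item_data, feedback):
    """Combine item data 202 with observed listing feedback 216 into a
    (features, observed_feedback) pair matching the model's original
    training format. Field names are hypothetical."""
    features = {
        "description": item_data["description"],  # item description 204
        "tags": item_data["tags"],                # item tags 206
        "price": item_data["price"],              # item pricing data 208
    }
    # The raw analytics serve directly as the supervision signal;
    # no annotation or positive/negative labeling step is needed.
    return features, feedback
```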
- the machine learning model 118 is configured to infer relationships between different aspects of the item data 202 and the resulting interactions with the item listing 108 via the virtual marketplace. To do so, the training module 116 feeds the training dataset 218 as an input to the machine learning model 118 , which causes the machine learning model 118 to modify one or more control parameters (e.g., internal model node weights) according to a loss function for the model that penalizes negative differences between predicted and observed feedback data.
- the machine learning model 118 with its one or more modified parameters is output by the training module 116 as the refined machine learning model 220 , which is communicated to the item generation module 110 for use in place of the machine learning model 118 in subsequently generating item data 202 for a different item 106 .
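The parameter refinement above can be illustrated with a squared-error loss and a single gradient step on one linear weight. A real model would update many internal node weights at once, but the mechanics of penalizing the gap between predicted and observed feedback are the same.

```python
def squared_error_loss(predicted, observed):
    """Loss penalizing differences between predicted and observed feedback."""
    return (predicted - observed) ** 2

def refine_weight(w, x, observed, lr=0.01):
    """One gradient-descent step for a single linear weight,
    where the prediction is w * x."""
    predicted = w * x
    grad = 2 * (predicted - observed) * x   # d/dw of squared error
    return w - lr * grad
```

Iterating `refine_weight` drives the weight toward the value that makes predicted feedback match observed feedback, which is the refinement loop in miniature.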
- As an example of an item 106 and item listing 108 subsequently output by the refined machine learning model 220 , consider FIG. 4 .
- FIG. 4 depicts an example interface 400 of the virtual marketplace 124 as displaying an item listing 402 , representative of an instance of an item listing 108 generated from item data 202 output by the refined machine learning model 220 .
- item listing 402 represents example changes between item data 202 output by the machine learning model 118 and item data 202 output by the refined machine learning model 220 .
- item listing 402 includes an item title 404 for a "Men's Double Pocket Tailored Shirt," a digital rendering 406 of the subject item of the item listing 402 , a price 408 for the subject item, and a detailed description 410 for the subject item, which each differ from their counterpart aspects of the item listing 302 .
- Such changes may be indicative of the machine learning model 118 interpreting the listing feedback data 216 as indicating certain trends gleaned from interactions with the virtual marketplace 124 , such as that double pocketed men's collared shirts are currently more popular than collared shirts without pockets, that articles of clothing including tags noting that the clothing is “tailored” are associated with positive feedback, that item listings featuring multiple digital renderings of the subject item are associated with increased impression and purchase rates, that the revised layout of item listing 402 is preferred over the layout of listing 302 , and so forth.
- the training module 116 is configured to identify control parameters in the latent space(s) of the machine learning model 118 that correlate with different design aspects (e.g., sleeve length, pocket styles, listing tags, item fabric(s), and so forth). In this manner, by informing the machine learning model 118 of information included in the listing feedback data 216 via the training datasets 218 , the autonomous item generation system 104 is configured to adapt to changing trends and alter fabrication instructions 120 and item listing 108 characteristics for items subsequently generated by the refined machine learning model 220 automatically and without relying on guiding user intervention.
- the autonomous item generation system 104 is configured to continuously monitor activities associated with item listings 108 published to the virtual marketplace 124 and refine control parameters of the machine learning model 118 used to generate the item listing 108 to adapt to inferred trends and behaviors. Because the autonomous item generation system 104 is configured to perform its continuous cycle of operations independent of user input and identify trends and behaviors to consider in refining machine learning model parameters before such trends or behaviors can be identified by a user of the autonomous item generation system 104 , the autonomous item generation system 104 is configured to output a user interface that enables a user to glean insight into the system's operations.
- FIG. 5 depicts an example interface 500 of the autonomous item generation system 104 configured for output on a display device of the computing device implementing the autonomous item generation system 104 , such as a display device of computing device 102 .
- the interface 500 includes a model selection control 502 and an audience specification control 504 .
- the audience specification control 504 enables a user of the autonomous item generation system 104 to change input parameters considered by the model selected via model selection control 502 , such that the user can observe how the changed input parameters alter a resulting item 106 generated by the machine learning model 118 .
- the autonomous item generation system updates interface 500 to output item preview 506 , which includes a preview digital rendering 508 of an item 106 that would be generated by machine learning model 118 according to input parameters specified by the selection(s) made with respect to controls 502 and 504 .
- item preview 506 is configured to include a display of any information included in item data 202 , such as visual representations of the item data 202 , textual descriptions of the item data 202 , and combinations thereof.
- model selection control 502 includes options 510 , 512 , and 514 , where option 510 enables selection of an instance of machine learning model 118 trained to generate item data 202 for men's clothing items, option 512 enables selection of an instance of machine learning model 118 trained to generate item data 202 for works of art, and option 514 enables a user of the autonomous item generation system 104 to upload their own model (e.g., an instance of the machine learning model 118 trained to generate item data 202 for an item 106 not categorized as men's clothing or works of art).
- the audience specification control 504 in the illustrated example of FIG. 5 includes options 516 , 518 , and 520 , where option 516 enables specification of a “general public” audience segment (e.g., no constraints on the audience to be considered by the machine learning model 118 ), control 518 enables designation of custom demographic parameters to be considered by the machine learning model 118 (e.g., a specified geographic region for an audience of an item listing 108 , a specified audience age and gender combination, a specified time of day for publishing the item listing 108 , and so forth), and control 520 enables designation of a particular individual user to be considered as the audience for the machine learning model's 118 generation of the item data 202 .
- By interacting with the controls 502 and 504 of interface 500 , a user of the autonomous item generation system 104 is informed of considerations made by the autonomous item generation system 104 in performing its automatic operations. For instance, interface 500 indicates to a user of the autonomous item generation system 104 that an instance of the machine learning model 118 trained to generate art items, when considering the general public as an audience, will generate an item 106 that depicts a waterfront dock scene at sunset with certain nature aspects to achieve a realistic, photo-quality appearance, based on current parameters of the machine learning model 118 .
- the item preview 506 portion of the interface 500 may further include information that describes control parameters of the machine learning model 118 selected for the specified audience.
- the item preview portion 506 may specify that the same machine learning model 118 configured to generate landscape works of art, when targeting a Swiss audience, selects control parameters for the machine learning model 118 that promote inclusion of mountains in the landscape artwork.
- the item preview portion 506 may specify that control parameters emphasizing inclusion of beaches and oceans in the landscape artwork are to be utilized.
- the interface 500 provides a user of the autonomous item generation system 104 with insight as to what considerations are made when selecting control parameters for different machine learning models 118 , audience considerations, and combinations thereof.
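The audience-conditioned parameter selection that the interface surfaces can be sketched as a lookup with a general-public fallback. The mapping entries below (Swiss mountains, coastal beaches) mirror the examples above, but the keys and parameter names are invented for illustration.

```python
# Hypothetical mapping from audience segment to control parameters
# the landscape-art model would select.
AUDIENCE_PARAMS = {
    "general_public": {"style": "realistic", "scene": "waterfront_sunset"},
    "swiss": {"style": "realistic", "scene": "mountain_landscape"},
    "coastal": {"style": "realistic", "scene": "beach_ocean"},
}

def preview_params(audience):
    """Return the control parameters the model would use for an audience,
    falling back to the general-public defaults for unknown segments."""
    return AUDIENCE_PARAMS.get(audience, AUDIENCE_PARAMS["general_public"])
```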
- FIG. 6 depicts an example interface 600 of the autonomous item generation system 104 , where the selected option of the audience specification control 504 has been altered from option 516 to option 518 , indicating that specific audience demographic characteristics (not shown) are to be considered instead of the general public considered in the example interface 500 .
- content of the item preview 506 is altered to indicate how a resulting item 106 generated by the same machine learning model 118 would differ based on the specified audience demographic characteristics.
- interface 600 indicates to a user of the autonomous item generation system 104 that the same instance of the machine learning model 118 trained to generate art items as selected in FIG. 5 , when considering the updated audience demographic characteristics, would instead generate an item 106 that depicts a surreal mountain landscape scene.
- an interface of the autonomous item generation system 104 provides a user with tools to obtain insight regarding the ongoing revision of a particular machine learning model 118 implemented by the autonomous item generation system 104 in a manner that would not be possible by inspecting raw input and output data from the machine learning model 118 .
- FIG. 7 depicts a procedure 700 in an example implementation of autonomous item and item listing generation in accordance with aspects of the techniques described herein. Notably, each and every operation of procedure 700 is performed automatically and independent of user input or intervention.
- fabrication instructions for an item and metadata describing the item are generated (block 702 ).
- the item generation module 110 of the autonomous item generation system 104 , for instance, causes machine learning model 118 to generate item data 202 , which includes fabrication instructions 120 that are useable by the fabrication device 122 to fabricate a tangible item 106 .
- the item data 202 includes metadata describing the item 106 , such as item description 204 , item tags 206 , and item pricing data 208 .
- Fabrication of the item is caused by transmitting the fabrication instructions to the fabrication device (block 704 ).
- the item generation module 110 , for instance, transmits the fabrication instructions 120 to the fabrication device 122 in a manner that causes the fabrication device to fabricate, manufacture, or otherwise output the item 106 .
- a listing for the item is then created using the metadata describing the item (block 706 ).
- the transaction module 112 of the autonomous item generation system 104 obtains the item data 202 from the item generation module 110 and generates item listing 108 , such as the example item listings depicted in FIGS. 3 and 4 .
- the item listing is published to a virtual marketplace and analytics data describing one or more interactions with the item listing is obtained (block 708 ).
- the transaction module 112 employs listing component 210 to interface with the virtual marketplace 124 and publish the item listing 108 in a manner that makes the item listing discoverable on the virtual marketplace 124 (e.g., to a browsing user of the virtual marketplace 124 , to a search engine indexing the virtual marketplace 124 , and so forth).
- the feedback module 114 of the autonomous item generation system 104 obtains listing feedback data 216 , which is representative of information describing one or more interactions with the item listing 108 as published to the virtual marketplace 124 .
- Example interactions include a number of views (e.g., a number of impressions) of the item listing 108 , a number of purchases of the item via the item listing 108 , a number of favorites of the item listing 108 , a number of shares of the item listing 108 , user reviews submitted for the item listing 108 , combinations thereof, and so forth.
- the listing feedback data 216 may provide additional levels of detail regarding interactions with the item listing 108 .
- the listing feedback data 216 may specify an amount of time spent viewing discrete portions of the item listing 302 , such as a duration spent reading the detailed description 310 , a number of user reviews displayed in navigating the user reviews portion 316 , a purchase of an item listed in the similar items portion 318 instead of the subject item of the item listing 302 , and so forth.
- the listing feedback data 216 is representative of any type and format of data that indicates a manner in which the item listing 108 was experienced or interacted with by users of the virtual marketplace 124 .
- Training data is then formed based on the analytics data and one or more parameters of the at least one machine learning model are modified using the training data (block 710 ).
- the feedback module 114 , for instance, combines the listing feedback data 216 with the item data 202 generated by the machine learning model 118 as training dataset 218 .
- the format of training dataset 218 output by the feedback module 114 varies according to the machine learning model implemented by the item generation module 110 and depends on a format of training datasets used to originally train the machine learning model 118 .
- the training dataset is then passed to the training module 116 , which provides the training dataset 218 as input to the machine learning model 118 .
- Upon input of the training dataset 218 , the machine learning model 118 is configured to process the training dataset 218 according to one or more objective functions upon which the machine learning model 118 was initialized, together with one or more loss functions that penalize negative differences between predicted and observed feedback data, thereby causing the machine learning model 118 to refine one or more internal parameters via processing of the training dataset 218 .
- the machine learning model 118 with its one or more modified parameters is then output as refined machine learning model 220 .
- fabrication instructions for an additional item and metadata describing the additional item are generated (block 712 ).
- the autonomous item generation system 104 performs the operations as described above with respect to block 702 , using the refined machine learning model 220 instead of the machine learning model 118 . Operation of procedure 700 then optionally returns to block 704 , continuing to refine model parameters based on analytic data describing interactions with item listings 108 generated by the autonomous item generation system 104 .
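Procedure 700 as a whole can be summarized as one loop iteration: generate, fabricate, publish, observe, refine. Every callable below is a hypothetical stand-in for the corresponding module (item generation module 110, fabrication device 122, transaction module 112, feedback module 114, training module 116).

```python
def autonomous_generation_cycle(model, fabricate, publish, get_analytics, refine):
    """One pass of procedure 700 (blocks 702-710); all callables are
    illustrative stand-ins for the modules described in the text."""
    item_data = model.generate()                    # block 702: item data 202
    fabricate(item_data.fabrication_instructions)   # block 704: fabrication device 122
    listing = publish(item_data)                    # block 706/708: item listing 108
    analytics = get_analytics(listing)              # block 708: listing feedback 216
    return refine(model, item_data, analytics)      # block 710: refined model 220
```

Calling the function again with the refined model it returns corresponds to block 712 and the optional return to block 704.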
- FIG. 8 depicts a procedure 800 in an example implementation of outputting a user interface for an autonomous item generation system in accordance with aspects of the techniques described herein.
- a display of a user interface for an autonomous item generation system that includes controls for specifying a machine learning model to be used in generating an item and an audience for the machine learning model to consider in generating the item is output (block 802 ).
- the autonomous item generation system 104 , for instance, outputs interface 500 at a display of computing device 102 .
- the interface 500 includes model selection control 502 and audience specification control 504 .
- the model selection control 502 enables selection of a particular machine learning model 118 to be implemented by the autonomous item generation system 104 and the audience specification control 504 enables a user to change input parameters considered by the model selected via model selection control 502 and observe how the changed input parameters alter a resulting item 106 generated by the machine learning model 118 .
- Input is received at the user interface specifying at least one of the machine learning model to be used, or the audience to be considered, in generating the item (block 804 ).
- a selection of one or more of options 510 , 512 , or 514 offered by the model selection control 502 and/or one or more options 516 , 518 , or 520 of the audience specification control 504 is received.
- the user interface is then updated to display a preview of the item as generated by the selected machine learning model for the specified audience (block 806 ).
- the autonomous item generation system updates interface 500 to output item preview 506 , which includes a preview digital rendering 508 of an item 106 that would be generated by machine learning model 118 according to input parameters specified by the selection(s) made with respect to controls 502 and 504 .
- machine learning model 118 control parameters are alternatively or additionally output in the item preview 506 portion of the interface 500 .
- Operation of procedure 800 then optionally returns to block 804 , enabling selection of a different combination of the one or more of options 510 , 512 , or 514 offered by the model selection control 502 and/or one or more options 516 , 518 , or 520 of the audience specification control 504 .
- interface 600 depicts an update to the item preview 506 from that depicted in the illustrated example of FIG. 5 , responsive to a different option selected from the audience specification control 504 .
- the user interface output by procedure 800 enables a user of the autonomous item generation system 104 to glean insight into operations of the autonomous item generation system 104 that the user would otherwise be unable to ascertain from inspection of raw data inputs to, and outputs from, the machine learning model 118 .
- FIG. 9 illustrates an example system generally at 900 that includes an example computing device 902 that is representative of one or more computing systems and/or devices that may implement the various techniques described herein. This is illustrated through inclusion of the autonomous item generation system 104 .
- the computing device 902 may be, for example, a server of a service provider, a device associated with a client (e.g., a client device), an on-chip system, and/or any other suitable computing device or computing system.
- the example computing device 902 includes a processing system 904 , one or more computer-readable media 906 , and one or more I/O interfaces 908 that are communicatively coupled, one to another.
- the computing device 902 may further include a system bus or other data and command transfer system that couples the various components, one to another.
- a system bus can include any one or combination of different bus structures, such as a memory bus or memory controller, a peripheral bus, a universal serial bus, and/or a processor or local bus that utilizes any of a variety of bus architectures.
- a variety of other examples are also contemplated, such as control and data lines.
- the processing system 904 is representative of functionality to perform one or more operations using hardware. Accordingly, the processing system 904 is illustrated as including hardware elements 910 that may be configured as processors, functional blocks, and so forth. This may include implementation in hardware as an application specific integrated circuit or other logic device formed using one or more semiconductors.
- the hardware elements 910 are not limited by the materials from which they are formed or the processing mechanisms employed therein.
- processors may be comprised of semiconductor(s) and/or transistors (e.g., electronic integrated circuits (ICs)).
- processor-executable instructions may be electronically-executable instructions.
- the computer-readable storage media 906 is illustrated as including memory/storage 912 .
- the memory/storage 912 represents memory/storage capacity associated with one or more computer-readable media.
- the memory/storage component 912 may include volatile media (such as random access memory (RAM)) and/or nonvolatile media (such as read only memory (ROM), Flash memory, optical disks, magnetic disks, and so forth).
- the memory/storage component 912 may include fixed media (e.g., RAM, ROM, a fixed hard drive, and so on) as well as removable media (e.g., Flash memory, a removable hard drive, an optical disc, and so forth).
- the computer-readable media 906 may be configured in a variety of other ways as further described below.
- Input/output interface(s) 908 are representative of functionality to allow a user to enter commands and information to the example computing device 902 , and also allow information to be presented to the user and/or other components or devices using various input/output devices.
- input devices include a keyboard, a cursor control device (e.g., a mouse), a microphone, a scanner, touch functionality (e.g., capacitive or other sensors that are configured to detect physical touch), a camera (e.g., which may employ visible or non-visible wavelengths such as infrared frequencies to recognize movement as gestures that do not involve touch), and so forth.
- Examples of output devices include a display device (e.g., a monitor or projector), speakers, a printer, a network card, a tactile-response device, and so forth.
- the computing device 902 may be configured in a variety of ways as further described below to support user interaction.
- modules include routines, programs, objects, elements, components, data structures, and so forth that perform particular tasks or implement particular abstract data types.
- Modules generally represent software, firmware, hardware, or a combination thereof.
- the features of the techniques described herein are platform-independent, meaning that the techniques may be implemented on a variety of commercial computing platforms having a variety of processors.
- Computer-readable media may include a variety of media that may be accessed by the example computing device 902 .
- computer-readable media may include “computer-readable storage media” and “computer-readable signal media.”
- Computer-readable storage media may refer to media and/or devices that enable persistent and/or non-transitory storage of information, in contrast to mere signal transmission, carrier waves, or signals per se. Thus, computer-readable storage media refers to non-signal bearing media.
- the computer-readable storage media includes hardware such as volatile and non-volatile, removable and non-removable media and/or storage devices implemented in a method or technology suitable for storage of information such as computer readable instructions, data structures, program modules, logic elements/circuits, or other data.
- Examples of computer-readable storage media may include, but are not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage, hard disks, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or other storage device, tangible media, or article of manufacture suitable to store the desired information and which may be accessed by a computer.
- Computer-readable signal media may refer to a signal-bearing medium that is configured to transmit instructions to the hardware of the computing device 902 , such as via a network.
- Signal media typically may embody computer readable instructions, data structures, program modules, or other data in a modulated data signal, such as carrier waves, data signals, or other transport mechanism.
- Signal media also include any information delivery media.
- modulated data signal means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal.
- communication media include wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, RF, infrared, and other wireless media.
- hardware elements 910 and computer-readable media 906 are representative of modules, programmable device logic and/or fixed device logic implemented in a hardware form that may be employed in some embodiments to implement at least some aspects of the techniques described herein, such as to perform one or more instructions.
- Hardware may include components of an integrated circuit or on-chip system, an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA), a complex programmable logic device (CPLD), and other implementations in silicon or other hardware.
- hardware may operate as a processing device that performs program tasks defined by instructions and/or logic embodied by the hardware, as well as hardware utilized to store instructions for execution, e.g., the computer-readable storage media described previously.
- software, hardware, or executable modules may be implemented as one or more instructions and/or logic embodied on some form of computer-readable storage media and/or by one or more hardware elements 910 .
- the example computing device 902 may be configured to implement particular instructions and/or functions corresponding to the software and/or hardware modules. Accordingly, implementation of a module that is executable by the example computing device 902 as software may be achieved at least partially in hardware, e.g., through use of computer-readable storage media and/or hardware elements 910 of the processing system 904 .
- the instructions and/or functions may be executable/operable by one or more articles of manufacture (for example, one or more example computing devices 902 and/or processing systems 904 ) to implement techniques, modules, and examples described herein.
- the techniques described herein may be supported by various configurations of the computing device 902 and are not limited to the specific examples of the techniques described herein. This functionality may also be implemented all or in part through use of a distributed system, such as over a “cloud” 914 via a platform 916 as described below.
- the cloud 914 includes and/or is representative of a platform 916 for resources 918 .
- the platform 916 abstracts underlying functionality of hardware (e.g., servers) and software resources of the cloud 914 .
- the resources 918 may include applications and/or data that can be utilized while computer processing is executed on servers that are remote from the example computing device 902 .
- Resources 918 can also include services provided over the Internet and/or through a subscriber network, such as a cellular or Wi-Fi network.
- the platform 916 may abstract resources and functions to connect the example computing device 902 with other computing devices.
- the platform 916 may also serve to abstract scaling of resources to provide a corresponding level of scale to encountered demand for the resources 918 that are implemented via the platform 916 .
- implementation of functionality described herein may be distributed throughout the system 900 .
- the functionality may be implemented in part on the example computing device 902 as well as via the platform 916 that abstracts the functionality of the cloud 914 .
Abstract
Description
- This application claims the benefit of priority to U.S. Patent Application Ser. No. 62/899,060, filed on Sep. 11, 2019, which is hereby incorporated by reference in its entirety.
- Virtual marketplaces such as network-based commerce systems are increasingly becoming a preferred mechanism by which vendors offer goods and services for sale, in contrast to conventional brick-and-mortar retail stores. Although the proliferation of virtual marketplaces enables vendors to reach a wider audience and offer shopping experiences that are curated for individual users, virtual marketplaces still present significant disadvantages. For instance, while virtual marketplaces provide item listings with certain tools to facilitate purchase of items, such as search interfaces to browse item listings and controls for purchasing a subject item of an item listing, virtual marketplaces require vendors to manually designate and design specific aspects of each item listing for publication at the virtual marketplace.
- Additionally, while virtual marketplaces connect interested buyers with vendors, it remains the vendor's responsibility to manually perform various operations required to complete the sale of an item, such as processing financial transactions, contracting for shipment of the item to the buyer, and so forth. Furthermore, due to the computational and network resources required to aggregate and analyze data that describes potential buyer feedback pertaining to an item listing, virtual marketplaces only sporadically provide vendors with information describing item listing feedback. Due to this limited information, vendors are unable to instantly realize how an item listing, or the subject item, should be modified to account for changing trends and behaviors. Consequently, vendors are faced with undue delay at each stage in an item's lifecycle, such as item conception, item manufacture, item listing, item delivery, ascertaining item feedback, item design modification, and so forth.
- To overcome these problems, a system and techniques for autonomous item generation are described. An autonomous item generation system receives at least one machine learning model trained to generate both fabrication instructions for generating an item and metadata describing the item, automatically and independent of user input. The autonomous item generation system causes the at least one machine learning model to generate the fabrication instructions and metadata for an item. The autonomous item generation system then transmits the fabrication instructions to a fabrication device, which causes the fabrication device to generate the item. A listing for the item is generated from the item metadata output by the at least one machine learning model, and the autonomous item generation system publishes the listing to a virtual marketplace. The autonomous item generation system is configured to obtain analytics data describing one or more interactions with the item listing as published to the virtual marketplace, such as views of the item listing, favorites of the item listing, purchases of the item via the item listing, navigation from the item listing to a different item listing, shares of the item listing, comments on the item listing, user feedback to the item listing, combinations thereof, and so forth.
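A single pass of the pipeline described above might be sketched as follows, with stub classes standing in for the machine learning model, the fabrication device, and the virtual marketplace. All class, method, and field names here are illustrative assumptions rather than the disclosed implementation.

```python
class StubModel:
    """Stand-in for the trained machine learning model (e.g., a GAN)."""
    def generate(self):
        instructions = "cut pattern A; sew seam B"   # fabrication instructions
        metadata = {"title": "Scarf", "description": "A knit scarf",
                    "tags": ["textile", "winter"], "price": 19.99}
        return instructions, metadata

class StubFabricator:
    """Stand-in for the fabrication device (e.g., a textile printer)."""
    def fabricate(self, instructions):
        return "item-001"   # identifier of the produced item

class StubMarketplace:
    """Stand-in for the virtual marketplace publishing interface."""
    def __init__(self):
        self.listings = {}
    def publish(self, listing):
        listing_id = f"listing-{len(self.listings) + 1}"
        self.listings[listing_id] = listing
        return listing_id

def generate_and_list_item(model, fabricator, marketplace):
    """One autonomous pass: generate item data, fabricate, publish a listing."""
    instructions, metadata = model.generate()
    item_id = fabricator.fabricate(instructions)
    listing = dict(metadata, item_id=item_id)   # listing built from metadata
    return marketplace.publish(listing)
```

No user input appears anywhere in the pass: the model supplies both the fabrication instructions and every field of the published listing.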
- In some implementations, the autonomous item generation system is configured to serve as the virtual marketplace by publishing the item listing to a network (e.g., the Internet) and monitoring network traffic pertaining to the item listing. Alternatively or additionally, the autonomous item generation system is configured to leverage an existing virtual marketplace platform and implement one or more application programming interfaces (APIs) of the virtual marketplace to cause publication of the item listing on the virtual marketplace and obtain analytics data pertaining to the item listing from the virtual marketplace. Based on the analytics data, and optionally additional feedback data pertaining to the item, the autonomous item generation system forms training data for refining the at least one machine learning model. The training data is provided as input to the at least one machine learning model, which causes modification of one or more control parameters of the at least one machine learning model. The autonomous item generation system then generates fabrication instructions and metadata for an additional item using the at least one machine learning model with its modified parameter(s) and repeats its operations to continuously refine the machine learning model(s), without requiring user input or intervention to do so.
- This Summary introduces a selection of concepts in a simplified form that are further described below in the Detailed Description. As such, this Summary is not intended to identify essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.
- The detailed description is described with reference to the accompanying figures.
- FIG. 1 is an illustration of an environment in an example implementation that is operable to employ the autonomous item generation techniques described herein.
- FIG. 2 depicts an example implementation in which an autonomous item generation system of FIG. 1 generates an item and an item listing for the item using at least one machine learning model and refines the at least one machine learning model based on feedback data for the item.
- FIG. 3 depicts an example implementation of a user interface displaying an item listing generated by the autonomous item generation system of FIG. 1 .
- FIG. 4 depicts an example implementation of a user interface displaying an item listing generated by the autonomous item generation system of FIG. 1 .
- FIG. 5 depicts an example implementation of a user interface for the autonomous item generation system of FIG. 1 .
- FIG. 6 depicts an example implementation of a user interface for the autonomous item generation system of FIG. 1 .
- FIG. 7 is a flow diagram depicting a procedure in an example implementation for the autonomous item generation system of FIG. 1 generating an item using at least one machine learning model and refining the machine learning model for generating additional items.
- FIG. 8 is a flow diagram depicting a procedure in an example implementation for outputting and modifying a user interface for the autonomous item generation system of FIG. 1 .
- FIG. 9 illustrates an example system including various components of an example device that can be implemented as any type of computing device as described and/or utilized with reference to FIGS. 1-8 to implement embodiments of the techniques described herein.
- With advances in computing device technology, virtual marketplaces are increasingly used as a mechanism to publish item listings (e.g., offers of goods for sale). While such virtual marketplaces enable vendors to reach a wider audience than otherwise enabled by conventional brick-and-mortar storefronts, vendors continue to face undue burdens involved with bringing an initial concept design to a tangible item, determining a strategy for listing the item on a virtual marketplace, publishing the listing to the virtual marketplace, entering into transactions with potential buyers, securing shipping for transporting the item to a buyer, and verifying that the item actually reaches the buyer. For instance, bringing an item from conception to a tangible good requires designers tasked with creating item designs, manufacturers tasked with ensuring the resulting item matches the item's design, marketers tasked with creating listings for the item, and sales representatives tasked with ensuring the item reaches the buyer; each of these actions requires substantial human input and is thus prone to human error and delay.
- These inefficiencies are further compounded when considering the lifecycle of an item's development, which requires further time and resources to ascertain feedback describing the public's reaction to the item before determining modifications to be incorporated in subsequent item design iterations. For instance, given the computational and network resources required to collect, store, and analyze virtual marketplace user behavior information, virtual marketplaces are dissuaded from continuously analyzing and publishing user behavior information by the prohibitive amount of resources required to do so. This delay in aggregating and providing user behavior information forces vendors to wait for publication of the information, and then to spend significant time analyzing the behavior information in the hope of identifying certain characteristics of the item design, the item listing, or other aspects in the pipeline of item conception to delivery that can benefit from improvement. Furthermore, the accuracy of such vendor analysis is dependent on the expertise level of the individuals tasked with gleaning trends from user behavior information, and even the most skilled team of individuals is unable to simultaneously handle analysis for multiple different items, much less an entire item catalog.
- Accordingly, systems and techniques are described herein that enable autonomous item generation, in which generation of instructions for fabricating, manufacturing, or otherwise creating a tangible item and generation of a listing for the tangible item at a virtual marketplace are performed automatically, and independent of user input. To do so, an autonomous item generation system employs at least one machine learning model trained to generate, for a tangible item, item data that includes fabrication instructions for the item, a description for the item, tags describing various attributes of the item, and a recommended price for listing the item at a virtual marketplace. In addition to automatically generating the fabrication instructions and listing for the item, the autonomous item generation system is further configured to interface with the virtual marketplace to automatically perform operations involved with transporting the item from a fabrication device used to generate the item to a purchasing entity, including financial transactions, shipping operations, and so forth.
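The item data enumerated above (fabrication instructions, a description, descriptive tags, and a recommended listing price) can be pictured as a single record. The field names below are assumptions for illustration only; the disclosure does not prescribe a particular schema.

```python
from dataclasses import dataclass, field

@dataclass
class ItemData:
    """One generated item's data; field names are illustrative assumptions."""
    fabrication_instructions: str              # e.g., cut patterns or print steps
    description: str                           # listing description text
    tags: list = field(default_factory=list)   # descriptive attribute tags
    recommended_price: float = 0.0             # suggested listing price

def to_listing(item: ItemData) -> dict:
    """Shape the metadata portion into a listing record for the marketplace."""
    return {"description": item.description,
            "tags": item.tags,
            "price": item.recommended_price}
```

The fabrication instructions are routed to the fabrication device, while the remaining fields feed the generated item listing.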
- The autonomous item generation system is further configured to obtain feedback data describing one or more interactions with the item listing as published to the virtual marketplace, and to continuously modify control parameters of the machine learning model used to generate the item and item listing, all without human user intervention or guidance. In this manner, the autonomous item generation system is configured to identify virtual marketplace trends and behaviors in real time, well before even the most skilled human analyst could detect the same trends and behaviors when provided with the same behavior data. Thus, the autonomous item generation system and techniques described herein are configured to generate items and item listings without human guidance, identify observed interactions with the item listings, and continuously adapt to such observed interactions in generating subsequent items and item listings at both a rate and a scale that is not possible via conventional systems.
-
FIG. 1 illustrates a digital medium environment 100 in an example implementation that is operable to employ the autonomous item generation techniques described herein. The illustrated environment 100 includes computing device 102 , which may be implemented according to a variety of different configurations. The computing device 102 , for instance, may be configured as a desktop computer, a laptop computer, a mobile device (e.g., assuming a handheld configuration such as a tablet or mobile phone), and so forth. Thus, the computing device 102 may range from full resource devices with substantial memory and/or processing resources to devices with limited memory and/or processing resources (e.g., mobile devices). Additionally, although a single computing device 102 is shown, the computing device 102 may be representative of a plurality of different devices, such as multiple servers utilized by a business to perform operations “over the cloud” as described in further detail below with respect to FIG. 9 . - The
computing device 102 is illustrated as including an autonomous item generation system 104 . The autonomous item generation system 104 is implemented at least partially in hardware of the computing device 102 and represents functionality of the computing device 102 to generate an item 106 and an item listing 108 for publication at a virtual marketplace, automatically and independent of user input, guidance, instruction, or other form of intervention that facilitates generation of the item 106 or item listing 108 . In this manner, the item 106 is representative of a tangible good, product, and the like. In some implementations, the item 106 is representative of digital content, such as a digital graphic, an animation, a video, and so forth. The item listing 108 is representative of publication information describing the item 106 , as described in further detail below. - To enable generation of the
item 106 and the item listing 108 , the autonomous item generation system 104 employs an item generation module 110 , a transaction module 112 , a feedback module 114 , and a training module 116 . The item generation module 110 , the transaction module 112 , the feedback module 114 , and the training module 116 are implemented at least partially in hardware of the computing device 102 (e.g., through use of a processing system and computer-readable storage media), as described in further detail below with respect to FIG. 9 . - The
item generation module 110 is representative of functionality of the computing device 102 to generate the item 106 and item listing 108 , without user input otherwise required by conventional systems (e.g., input specifying design criteria for the item 106 , input specifying information to be included or emphasized in the item listing 108 , input specifying demographic targeting information for the item listing 108 , and so forth). To do so, the item generation module 110 employs machine learning model 118 . Machine learning model 118 is representative of one or more machine learning models, where each model represented by machine learning model 118 can be configured according to a range of different machine learning model architectures. For instance, machine learning model 118 is representative of a model having an architecture based on neural networks (e.g., fully-connected neural networks, convolutional neural networks, or recurrent neural networks), deep learning networks, generative adversarial networks (GANs), decision trees, support vector machines, linear regression, logistic regression, Bayesian networks, random forest learning, dimensionality reduction algorithms, boosting algorithms, combinations thereof, and so forth. - Regardless of an underlying machine learning model architecture,
machine learning model 118 is representative of one or more trained machine learning models that are configured to generate fabrication instructions 120 for the item 106 and metadata describing the item 106 , where the metadata is useable by the autonomous item generation system 104 to generate the item listing 108 . For instance, the machine learning model 118 may be representative of a GAN that is trained to generate fabrication instructions for a particular type of item (e.g., an article of clothing, a work of art, a three-dimensional model, and so forth). Such a trained GAN implementation of the machine learning model 118 may be generated by providing the machine learning model 118 with a plurality of training sets, where each training set includes information that is useable to guide the machine learning model towards producing a desired output. - For instance, in the context of a GAN trained to generate fabrication instructions and metadata for an article of clothing, each training dataset may include fabrication instructions for an example article of clothing (e.g., instructions describing one or more fabrics or materials to be used in generating the article of clothing; cut patterns for the one or more fabrics or materials; adhesion instructions for combining the cuts of materials or fabrics such as sewing patterns, sewing thread types, fabric adhesive types, and the like; folding instructions for the article of clothing; and so forth). In addition to fabrication instructions for the example article of clothing, the training dataset may further include metadata describing the example article of clothing (e.g., a name for the article of clothing, a product category for the article of clothing, descriptive attributes of the article of clothing, a demographic audience for the article of clothing, and so forth). Each training dataset for the example
machine learning model 118 trained to generate fabrication instructions and metadata for an article of clothing may further include information describing feedback pertaining to the subject article of clothing for the training dataset. For instance, such feedback information may specify a number of times a listing for the article of clothing was viewed via a virtual marketplace, a number of purchases made of the article of clothing, a percentage of views that resulted in a share or favorite of the article of clothing, reviews for the article of clothing, information specifying comparative feedback data for different articles of clothing listed on the virtual marketplace, information describing an appearance of a listing for the article of clothing on the virtual marketplace, combinations thereof, and so forth. - In this manner, the
machine learning model 118 is representative of a model trained on a generic dataset for one or more characteristics to be learned by the model during training. For instance, in an example where the machine learning model 118 is trained to learn artwork characteristics, the machine learning model 118 is representative of a model trained on a training dataset that includes a collection of different works of art that share one or more common characteristics (e.g., being portraits of a human subject, depicting landscapes, being abstract art, being vector artwork, comprising a certain color palette, and so forth). As such, the characteristics learned by machine learning model 118 during training are dictated by the contents of the dataset used during training, such that the training dataset defines a style or theme of the output of the machine learning model 118 following training. - Accordingly, the
machine learning model 118 may be representative of a model trained on a training dataset that consists of human portraits, such that the machine learning model 118 is configured to output artwork that depicts human portraits after training is complete. In some implementations, machine learning model 118 is representative of one or more models trained on a plurality of different training datasets. For instance, continuing the previous example of a training dataset consisting of human portraits, after training the machine learning model 118 to output artwork depicting human portraits, training may continue using an additional training dataset that includes artwork depicting landscapes, such that the machine learning model 118 is further trained to output artwork depicting human portraits against landscape backgrounds. - In a similar manner, the
machine learning model 118 may be representative of a plurality of different machine learning models arranged in a stacked configuration, where the output of one model is provided as input to a different model of the stacked configuration. In such an example implementation, each model of the stacked configuration of machine learning models may be trained on a different training dataset, such as one trained to output landscape artwork, another trained to output abstract artwork, and another trained to output watercolor artwork. Continuing this example stacked configuration, landscape artwork output by the initial model would be provided as input to the model trained to output abstract artwork, which would output an abstract artwork representation of the input landscape, which in turn would be provided to the watercolor artwork-trained model, such that the resulting output of the stacked configuration of machine learning models is an abstract watercolor landscape work of art. Thus, the specific characteristics learned by the machine learning model 118 are dependent on the training dataset(s) used to generate the machine learning model 118 and are not restricted to the examples provided herein. - As an additional example, in an implementation where the
machine learning model 118 represents a GAN trained to generate fabrication instructions and metadata for a piece of art, each training dataset may include fabrication instructions for a specific piece of art (e.g., instructions designating one or more materials to use in generating the piece of art such as paint type and color, ink, paper, canvas, combinations thereof, and the like; printing instructions for generating the piece of art on a particular medium; dimensional constraints for the piece of art; and so forth). Individual training datasets supplement the fabrication instructions for the piece of art by including metadata describing the particular piece of art (e.g., a title for the piece of art, a description of the art, tags for listing the piece of art in a virtual marketplace, a recommended price for the piece of art, combinations thereof, and so forth). Each training dataset may further include feedback information for the piece of art, such as feedback information similar to that described above with respect to the example training dataset for training the machine learning model 118 to generate fabrication instructions for an article of clothing. - Given such example training datasets, in an example implementation where the
machine learning model 118 is configured as a GAN, the GAN may be trained by causing different portions of the GAN (e.g., a generator portion and a discriminator portion) to compete in an adversarial objective (e.g., a min-max game) that seeks to maximize positive feedback associated with a corresponding item 106 generated according to fabrication instructions 120 output by the GAN. For instance, the feedback data of the example training datasets may be normalized on a scale that indicates whether feedback data for an item is generally positive or negative (e.g., feedback data indicating numerous views, purchases, shares, and positive reviews of the item may be characterized and quantified as indicating positive feedback for the subject item of the training dataset). - Such positive feedback data can be contrasted with feedback data indicating few purchases, shares, or favorites of the item, feedback data indicating negative user reviews, and/or feedback data indicating a view of the item and subsequent purchase of a different similar item, which may be characterized and quantified as indicating negative feedback for the subject item of the training dataset. Under a training objective function, the
machine learning model 118 may be configured to generate fabrication instructions 120 and metadata for an item 106 in a manner that seeks to maximize positive feedback data for the item 106. - In some implementations, training the
machine learning model 118 includes supplementing training data from the training datasets with noise (e.g., Gaussian input noise), which causes the generator portion of the GAN to generate samples that could potentially fool the discriminator portion in the min-max game objective example. In this manner, the machine learning model 118 is representative of one or more machine learning models that are trained to identify different aspects of item fabrication instructions and/or descriptive metadata for the item that influence positive feedback associated with the item. - During training on an initial dataset, the
machine learning model 118 is instructed (e.g., via an objective function using convolutional neural networks) to generate an output defined by characteristics of a training dataset (e.g., articles of clothing with long sleeves). Outputs of the machine learning model 118 are then compared to the training dataset, and the model is guided to generate realistic articles of clothing with long sleeves based on a loss function determined from the comparison (e.g., F1 loss, visual perceptual quality loss, combinations thereof, and so forth in an implementation where the machine learning model 118 is configured as a GAN). Training continues until the machine learning model 118 converges and consistently generates outputs that are within a comparative threshold to the training dataset, at which point the machine learning model 118 is output, or adapted to different characteristics using additional training datasets. - In some implementations, the
machine learning model 118 is further trained to identify differences in feedback data associated with an item among different demographic segments. For instance, training data may indicate that for a same article of clothing (e.g., a down jacket), the article of clothing is generally associated with positive feedback for a particular geographic location demographic segment during a three-month window (e.g., during a winter season for the particular geographic location) and is generally associated with negative feedback for that demographic segment at other times (e.g., during spring, summer, and fall seasons for the particular geographic location). In this manner, the machine learning model 118 is further trained with the objective of maximizing positive feedback associated with an item at an audience-specific level, where the audience can be constrained according to any range of control parameters (e.g., geographic location, time of day, day(s) of week, audience age, audience gender, combinations thereof, and so forth). - In some implementations, the
machine learning model 118 may be received by the autonomous item generation system 104 together with an indication of control parameters in the machine learning model's 118 latent space(s). Alternatively or additionally, the training module 116 is configured to identify one or more control parameters in the machine learning model's 118 latent space(s). Such control parameters correlate to any aspect of the machine learning model's 118 output. For instance, in an example where the machine learning model 118 is configured to output works of art depicting landscapes, one control parameter may affect a size of the sky in the resulting landscapes, another control parameter may define a color palette (e.g., one or more colors) used in depicting mountains in the landscape, another control parameter may describe a medium on which the landscape is depicted, another control parameter may describe characteristics of a demographic audience for which the landscape is generated, and so forth. In implementations where the machine learning model 118 is received by the autonomous item generation system 104 without an indication as to which control parameters correlate with specific output characteristics, the training module 116 is configured to identify control parameters by adjusting individual control parameters of the machine learning model 118 and determining how the adjustment affects the resulting model output. - Thus, regardless of architectural configuration of the
machine learning model 118, the machine learning model 118 is representative of one or more trained machine learning models that are configured to generate fabrication instructions 120 for the item 106 and metadata describing the item 106 that is useable by the autonomous item generation system 104 to generate the item listing 108, in a manner that seeks to maximize positive feedback associated with the item 106. - Upon generating
fabrication instructions 120 for the item 106 using the machine learning model 118, the item generation module 110 is configured to transmit the fabrication instructions 120 to a fabrication device 122, which is representative of one or more machines that are configured to generate the item 106, responsive to receipt of the fabrication instructions. For instance, in an example implementation where the item 106 is an article of clothing, the fabrication device 122 is representative of one or more textile machines, such as a textile sourcing machine, a textile spinning machine, a textile finishing machine, a cloth finishing machine, a knitting machine, a fabric seaming machine, a crochet machine, a quilting machine, a tufting machine, a weaving machine, a component (e.g., zipper, button, etc.) manufacturing machine, a measuring machine, a cutting machine, an embroidery machine, a sewing machine, a washing machine, a drying machine, a folding machine, a monogramming machine, an applique attachment machine, combinations thereof, and the like. - As another example, in an implementation where the
item 106 is a piece of art, the fabrication device 122 may be configured as one or more of a two-dimensional printer, a three-dimensional printer, a computer numerical control (CNC) machine, combinations thereof, and so forth. As yet another example, in an implementation where the item 106 is representative of digital content, the fabrication device may be representative of computer-aided design (CAD) software implemented at least partially in hardware of a computing device, such as in hardware of the computing device 102. Thus, the fabrication device 122 is representative of any one or combination of multiple devices that are capable of generating the item 106 based on the fabrication instructions 120 output by the machine learning model 118. - The
transaction module 112 is representative of functionality of the computing device 102 to generate the item listing 108 for the item 106, based on the metadata describing the item 106 as output by the machine learning model 118. The item listing 108 generated by the transaction module 112 is representative of information that describes an appearance of the item listing 108 as published at a virtual marketplace 124, both as visually appearing to a viewing user of the marketplace 124 and as appearing in data observed by a search engine (e.g., when indexing the virtual marketplace 124 or otherwise becoming aware of the item listing 108). For instance, the item listing 108 is representative of data specifying at least one of a title for the item 106, a detailed description for the item 106, a representative image for the item 106, a suggested price for the item 106, one or more different items that are similar to the item 106, combinations thereof, and so forth. Example implementations of an item listing 108 generated by the transaction module 112 are illustrated in FIGS. 3 and 4 and described in further detail below. - The
transaction module 112 is further representative of functionality of the computing device to interface with the virtual marketplace 124 or directly implement the virtual marketplace as part of the autonomous item generation system 104. For instance, the virtual marketplace 124 is representative of a service configured to publish item listings 108 where items (e.g., tangible goods) are offered for sale. In some implementations, the virtual marketplace 124 is representative of a social networking system or other type of informational system that is configured to output the item listing 108 for display to one or more users. The virtual marketplace 124 may be hosted on dedicated or shared server machines (not shown) that are communicatively coupled to enable communications between the server machines. In implementations where the virtual marketplace 124 is implemented at the computing device 102, the virtual marketplace 124 may be implemented over distributed servers as described in further detail below with respect to FIG. 9. In this manner, the virtual marketplace 124 is further representative of connected components that allow the components to share and access common data, such as data hosted on one or more databases. - The
virtual marketplace 124 is further representative of a platform that provides at least one of a publishing mechanism, a listing mechanism, or a price-setting mechanism that enables a seller to list, or publish, information pertaining to tangible goods for sale. In a similar manner, the virtual marketplace 124 is representative of a platform that enables a buyer to express interest in, or indicate a desire to purchase, the tangible goods offered for sale. In this manner, the virtual marketplace 124 may comprise at least one publication engine and at least one shopping engine. The publication engine of the virtual marketplace is associated with one or more Application Programming Interfaces (APIs) that enable the transaction module 112 to communicate the item listing 108 to the virtual marketplace 124 and cause the virtual marketplace 124 to publish the item listing 108 in a manner that can be observed and interacted with by a user of the virtual marketplace (e.g., a potential buyer of the item 106). The shopping engine of the virtual marketplace 124 is associated with one or more APIs that enable a user of the virtual marketplace 124 to accept an offer for sale of the item 106 by agreeing to pay a price associated with the item 106. In some implementations, the APIs supported by the shopping engine of the virtual marketplace 124 support different price listing formats for publication of the item listing 108. As an example, price listing formats include fixed-price listing formats (e.g., the traditional classified advertisement-type listing or a catalog listing), auction-type price listing formats, buyout-type listing formats (e.g., the Buy-It-Now (BIN) technology developed by eBay Inc., of San Jose, Calif.), combinations thereof, and so forth. - The
virtual marketplace 124 is further representative of a navigation engine that enables a user of the virtual marketplace to browse and inspect various item listings 108 published by the virtual marketplace 124. For instance, the navigation engine of the virtual marketplace 124 enables a user to identify and discover various item listings by providing a search module that enables keyword and/or image searches of item listings 108 or other information published by the virtual marketplace 124. In some implementations, the virtual marketplace 124 may organize item listings 108 according to various data structures (e.g., category, catalog, or other form of classification for differentiating and grouping item listings 108, relative to one another). In this manner, the virtual marketplace 124 may provide tools that enable users to browse published item listings 108 according to metadata tags that categorize the item listing 108, rather than having to index through an entirety of item listings 108 published by the virtual marketplace. Various other navigation techniques and item listing classification and categorization approaches may be enabled by the virtual marketplace 124 without departing from the spirit and scope of the examples described herein. - The
virtual marketplace 124 is further representative of a messaging system that enables generation and delivery of messages among various entities involved with facilitating a transaction via the virtual marketplace 124. For instance, the messaging system implemented by the virtual marketplace 124 may facilitate communications among a selling entity that published the item listing 108 to the virtual marketplace, a purchasing entity 126 that purchases the item 106 via interaction with the item listing 108, a shipping entity (not shown) contracted to physically deliver the item 106 from the fabrication device 122 to the purchasing entity 126, and one or more financial institutions tasked with transferring funds among the various entities (e.g., the autonomous item generation system 104, the virtual marketplace 124, the fabrication device 122, the shipping entity, the purchasing entity 126, and so forth). - Communication of data among these various entities involved in facilitating the fabrication of the
item 106, the publication of the item listing 108 to the virtual marketplace, the contracting for sale of the item 106 to the purchasing entity 126, the delivery of the item 106 to the purchasing entity 126, and the transfer(s) of funds among the various entities is enabled via the network 128. The network 128 is representative of a real-time communication protocol that connects the autonomous item generation system 104 to one or more of the fabrication device 122, the virtual marketplace 124, the purchasing entity 126, one or more shipping entities, and one or more financial institutions involved in these example activities. For instance, the network 128 may represent functionality of a real-time communication protocol, such as a remote procedure call that enables a streaming, always-connected link among different entities. Alternatively or additionally, the network 128 may be representative of the Internet, a subscriber network such as a cellular or Wi-Fi network, combinations thereof, and so forth. - The
feedback module 114 is representative of functionality of the computing device 102 to obtain analytics data from the virtual marketplace 124, such as information describing user feedback pertaining to the item listing 108 as published at the virtual marketplace 124. Data obtained by the feedback module 114 is representative of any form of information that indicates a manner in which the item listing 108 was observed and/or interacted with by users of the virtual marketplace 124. For instance, the feedback module 114 may include one or more APIs configured to obtain data describing a number of views (e.g., a number of impressions) of the item listing 108, a number of purchases of the item via the item listing 108, a number of favorites of the item listing 108, a number of shares of the item listing 108, user reviews submitted for the item listing 108, combinations thereof, and so forth. - The
feedback module 114 is further representative of functionality of the autonomous item generation system 104 to obtain user profile information (e.g., age, gender, location, and so forth) pertaining to individual users that interacted with the item listing 108, and metadata describing the interaction (e.g., a duration of the interaction, specific aspects of the item listing 108 with which a user interacted, an amount of time spent viewing certain portions of the item listing, a date and time of the interaction, combinations thereof, and so forth). Thus, the feedback module 114 is representative of functionality of the autonomous item generation system 104 to understand how the particular item listing 108 for the item 106 was received, relative to other item listings published on the virtual marketplace 124. - The
training module 116 is representative of functionality of the autonomous item generation system 104 to modify the machine learning model 118 by modifying at least one control parameter of the machine learning model 118 based on data obtained by the feedback module 114 pertaining to the item listing 108. In this manner, the training module 116 is configured to generate additional training datasets for refining the machine learning model 118, where each additional training dataset includes data describing the item listing 108 and the fabrication instructions 120 for the item 106 along with at least one instance of analytics data obtained by the feedback module 114 describing a user interaction with the item listing 108 at the virtual marketplace 124. - For instance, the
training module 116 may generate a training dataset that includes the control parameters for the machine learning model 118 used in generating the fabrication instructions 120 and metadata for the item 106 (e.g., metadata used to generate the item listing 108). The training module 116 supplements the control parameters in the training dataset with predicted feedback data for the item 106, and provides the training dataset to the machine learning model 118 as input along with a loss function that penalizes differences between the predicted feedback data of the training dataset and observed feedback data obtained from the feedback module 114 that indicate the control parameters resulted in negative feedback for the item 106. By penalizing differences indicating negative feedback relative to predicted and observed feedback data (e.g., differences indicating that a predicted average user review score for the item 106 is higher than an observed average user review score for the item 106), the loss function implemented by the training module 116 rewards differences indicating that feedback pertaining to the item 106 was more positive than predicted. - Through training based on such example training datasets compared to observed feedback data, the
training module 116 is configured to cause the machine learning model 118 to output ideal control parameters for use in generating fabrication instructions 120 and metadata useable to create an item listing 108 for a subsequent iteration of the item 106. In some implementations, control parameters output through this training process may be configured as a ranking of different combinations of control parameters for the machine learning model 118, where the ranking is ordered based on control parameter combinations that are likely to garner the most positive feedback via publication to the virtual marketplace 124. - The
training module 116 is configured to refine control parameters of the machine learning model 118 using any number of different loss functions. For instance, loss functions may be layered, such that the loss function penalizing negative differences between predicted and observed user review average scores is layered with a loss function that penalizes negative feedback differences between predicted and observed user interactions with the item listing 108 (e.g., click-through rates, purchase rates, etc.). In some implementations, in addition to training the machine learning model 118 using loss functions that consider differences between predicted and observed feedback data, the training module 116 is configured to employ one or more multi-armed bandit approaches to explore novel control parameter combinations that differ from previously attempted control parameter combinations for the machine learning model 118. - In this manner, the
training module 116 is configured to continuously refine the machine learning model 118 to adapt its subsequent generation of fabrication instructions 120 and item metadata for use in generating the item listing 108 for additional items 106, while accounting for user behaviors and trends at the virtual marketplace 124. By enabling performance of the item generation module 110, the transaction module 112, the feedback module 114, and the training module 116 to be accomplished independent of user input or other intervention that guides the generation of the fabrication instructions 120 or design of the item listing 108, the autonomous item generation system 104 advantageously enables real-time adaptation to changes in user behavior and trends at the virtual marketplace 124 at a rate that is impossible to achieve by conventional systems that require human input or intervention. The advantages enabled by the autonomous item generation system 104 relative to conventional approaches are exponentially increased when generating a catalog of items 106 and corresponding item listings 108, as the human hours required by conventional approaches prohibit generating fabrication instructions 120 and item listings 108 for an item 106 in real-time. - Having considered an example digital medium environment, consider now a discussion of example implementations of autonomously generating an item and an item listing for the item using the techniques described herein.
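The loss behavior described above, penalizing control parameters whose observed feedback falls short of the predicted feedback while rewarding better-than-predicted outcomes, can be expressed as a simple signed loss. This is an illustrative sketch only; the function name and penalty weight are assumptions, not part of the described system:

```python
def feedback_loss(predicted, observed, penalty=2.0):
    """Signed loss on predicted vs. observed feedback scores.

    When observed feedback falls short of the prediction (positive diff),
    the shortfall is penalized with extra weight; when the item outperforms
    its prediction, the negative value acts as a reward during refinement.
    The penalty weight of 2.0 is illustrative only.
    """
    diff = predicted - observed
    return penalty * diff if diff > 0 else diff
```

Multiple such losses could be layered, as described above, by summing one instance computed over review scores with another computed over interaction rates.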
-
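The multi-armed bandit exploration of control parameter combinations described above admits many concrete schemes; one common choice is epsilon-greedy selection, sketched below. This is a generic illustration under assumed names, not the patent's specific mechanism:

```python
import random

class EpsilonGreedyBandit:
    """Epsilon-greedy bandit over control-parameter combinations ("arms").

    Most of the time the best-scoring combination is exploited; with
    probability epsilon a combination is chosen at random, exploring novel
    parameter settings that differ from previously attempted ones.
    """

    def __init__(self, arms, epsilon=0.1, seed=None):
        self.arms = list(arms)
        self.epsilon = epsilon
        self.counts = {arm: 0 for arm in self.arms}
        self.values = {arm: 0.0 for arm in self.arms}
        self._rng = random.Random(seed)

    def select(self):
        if self._rng.random() < self.epsilon:
            return self._rng.choice(self.arms)  # explore a random combination
        return max(self.arms, key=lambda arm: self.values[arm])  # exploit

    def update(self, arm, reward):
        """Fold an observed feedback score into the arm's running mean."""
        self.counts[arm] += 1
        self.values[arm] += (reward - self.values[arm]) / self.counts[arm]
```

Here the reward fed to `update` would be a normalized feedback score for a listing generated with that parameter combination, so exploration and exploitation both track marketplace behavior.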
FIG. 2 depicts a system 200 in an example implementation showing operation of the autonomous item generation system 104 of FIG. 1 in greater detail as generating an item 106 and an item listing 108, automatically and independent of user input via the machine learning model 118, and as refining the machine learning model 118 based on data describing one or more interactions with the item listing 108 as published to the virtual marketplace 124. To do so, system 200 illustrates components of the autonomous item generation system 104, including the item generation module 110, the transaction module 112, the feedback module 114, and the training module 116. The item generation module 110 is configured to cause the machine learning model 118 to output item data 202 for the item 106, where a type of the item 106 depends on an objective and training dataset used to originally train the machine learning model 118. - The
item data 202 includes the fabrication instructions 120 for the item 106 along with metadata for the item 106 including an item description 204, at least one item tag 206, and item pricing data 208. The item description 204 is representative of a title for the item 106 to be included in the item listing 108, a detailed textual description for the item listing 108, and a digital rendering (e.g., an image, a video, an animation, combinations thereof, and so forth) of the item 106 to be represented in the item listing 108. The item tags 206 included in the item data 202 are representative of metadata to be embedded in the item listing 108 that enables the virtual marketplace 124 and/or a search engine (not shown) to identify and categorize the item listing 108 (e.g., relative to other item listings published at the virtual marketplace 124). - The item pricing data 208 specifies at least one suggested price to be associated with the item 106 (e.g., to be displayed as part of the item listing 108). In some implementations, the item tags 206 further specify audience information for the item listing 108 to define a particular manner in which the item listing 108 is published at the
virtual marketplace 124. For instance, the item tags 206 may include information specifying a particular demographic for the item listing 108 that restricts its publication to the particular demographic (e.g., specifying different item pricing data 208 for European and Asian markets, specifying different visual appearances for conveying the item description 204 in the item listing 108 at different times of the day, and so forth). Thus, the item data 202 generated by the machine learning model 118 is representative of information that is useable by the fabrication device 122 to fabricate the item 106 as well as information that is useable by the transaction module 112 to generate the item listing 108. - Upon receiving the
item data 202 from the item generation module 110, the transaction module 112 is configured to generate the item listing 108, where a visual appearance of the item listing 108 as published to the virtual marketplace 124 is defined by one or more of the item description 204, the item tags 206, or the item pricing data 208. For an example representation of an item listing 108 generated by the autonomous item generation system 104, consider FIG. 3. -
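One possible in-memory shape for the item data described above is a plain record with fields mirroring the fabrication instructions 120, the item description 204, the item tags 206, and the item pricing data 208. The field names and types here are illustrative assumptions, not a structure prescribed by the system:

```python
from dataclasses import dataclass, field

@dataclass
class ItemData:
    """Illustrative container mirroring the described item data components."""
    fabrication_instructions: str                        # cf. fabrication instructions 120
    title: str                                           # cf. item description 204 (title)
    description: str                                     # cf. item description 204 (detail)
    rendering_uri: str                                   # reference to the digital rendering
    tags: list = field(default_factory=list)             # cf. item tags 206
    price_by_market: dict = field(default_factory=dict)  # cf. item pricing data 208
```

A per-market pricing dict reflects the possibility, noted above, of specifying different pricing data for different demographic segments.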
FIG. 3 depicts an example interface 300 of the virtual marketplace 124 as displaying an item listing 302. In the illustrated example of FIG. 3, the item listing 302 represents an instance of the item listing 108 generated by the transaction module 112, where the item listing 302 is created for an article of clothing item 106 generated by the machine learning model 118. Specifically, the item listing 302 includes an item title 304 for a "Men's Casual Button Down Shirt," and a digital rendering 306 of the item 106. The digital rendering 306 indicates how the item 106 would visually appear following fabrication by the fabrication device 122, according to the fabrication instructions 120. The item listing 302 further includes a price 308 for purchasing the item 106 depicted by the digital rendering 306 and a detailed description 310 that provides a viewing user with additional information describing the item 106. - The example item listing 302 further includes an item details
portion 312 configured to display additional information not provided by the item title 304, the digital rendering 306 of the item, the price 308, or the detailed description 310. For instance, in the example context of the item 106 being an article of clothing, the item details portion 312 may specify specific dimensions of the article of clothing, textiles used to construct the article of clothing, and any other information included in the item data 202 output by the machine learning model 118. - The
item listing 302 is further illustrated as including a shipping options portion 314, a user reviews portion 316, and a similar items portion 318. The shipping options portion 314 is representative of information displayed to a viewing user of the item listing 302 that informs the viewing user as to available choices for logistically transferring the subject item 106 of the item listing 302 from the fabrication device 122 that manufactures the item 106 to a location of the viewing user (e.g., the purchasing entity 126). The user reviews portion 316 is representative of explicit feedback information pertaining to the item 106 as received from one or more users of the virtual marketplace 124 that have previously purchased the item 106. The similar items portion 318 is configured to display representations of one or more different items offered for sale on the virtual marketplace 124 that are identified as being similar to the item 106 for which the item listing 302 is generated (e.g., based on comparison of the item tags 206 for the item 106 to tags associated with the representations). - In some implementations, the
shipping options portion 314, the user reviews portion 316, and the similar items portion 318 of the item listing 302 are defined by the virtual marketplace 124 to which the item listing is published, rather than being defined by the transaction module 112. Alternatively, the transaction module 112 may be configured to control a visual appearance of one or more of the shipping options portion 314, the user reviews portion 316, or the similar items portion 318 by virtue of a communicative connection between the virtual marketplace 124 and the transaction module 112 (e.g., network 128), as represented by the double-headed arrow connecting the transaction module 112 and the virtual marketplace 124 in FIG. 2. - The item listing 302 further includes
controls 326, 328, and 330 that are selectable by a viewing user of the virtual marketplace 124, where interaction with the controls indicates the viewing user's interest in the item listing 302. For instance, control 326 enables a viewing user to immediately purchase the subject item 106 of the item listing 302, control 328 enables the viewing user to add the item 106 to a shopping cart during browsing of the virtual marketplace, and control 330 enables the viewing user to favorite the item 106. In this manner, controls 326, 328, and 330 are representative of aspects of the item listing 302 from which interaction data may be gleaned to ascertain a positive or negative reaction to the item listing and used as feedback data for further refining control parameters of the machine learning model 118 used to generate the item listing 302 and its subject item 106. - Returning to
FIG. 2, the transaction module 112 is illustrated as including a listing component 210, a finance component 212, and a logistics component 214. In implementations where the virtual marketplace 124 is implemented as part of the autonomous item generation system 104, the listing component 210, the finance component 212, and the logistics component 214 are representative of the transaction module's 112 ability to enable functionality of the standalone virtual marketplace 124, as described above with reference to FIG. 1. Alternatively, in implementations where the virtual marketplace 124 is implemented independently from the autonomous item generation system 104, the listing component 210, the finance component 212, and the logistics component 214 are representative of functionality of the autonomous item generation system to automatically handle interactions with the virtual marketplace 124 that otherwise cannot be performed by conventional systems absent human user intervention. - For instance, the
listing component 210 is representative of functionality of the transaction module 112 to communicate and cause publication of the item listing 108 at the virtual marketplace 124. In accordance with one or more implementations, the listing component 210 is representative of one or more APIs configured to interface with the virtual marketplace 124 and list the item 106 according to one or more shopping engines or price-listing platforms supported by the virtual marketplace 124. The finance component 212 is representative of functionality of the transaction module 112 to interface with one or more financial institutions to transfer funds among the various entities involved in facilitating the fabrication of the item 106, publishing the item listing 108, purchasing the item 106, and facilitating shipment of the item 106 to a purchasing entity 126. In some implementations, the finance component 212 is configured to handle returns and process refunds in the event a purchasing entity 126 is dissatisfied with the item 106 and attempts to return the item 106 via the virtual marketplace 124. - In a similar manner, the
logistics component 214 is representative of functionality of the transaction module 112 to identify one or more shipping options for logistically transporting the item 106 to a purchasing entity 126. For instance, the logistics component 214 is configured to identify geographic locations associated with a fabrication device 122 that manufactured the item 106 and the purchasing entity 126 to which the item 106 is to be transported. Given the geographic locations, the logistics component 214 is configured to interface with one or more shipping entities to obtain quotes for costs associated with transporting the item 106 to the purchasing entity 126. In some implementations, the logistics component 214 is configured to update the item listing 108 to convey such shipping cost quotes for a particular purchasing entity 126 viewing the item listing (e.g., by updating information included in the shipping options portion 314 of the example item listing 302 illustrated in FIG. 3). - Upon receiving an indication from the
virtual marketplace 124 of the purchasing entity 126 purchasing the item 106, the finance component 212 is configured to interface with a financial institution associated with the purchasing entity 126 to verify that the purchasing entity 126 has sufficient funds to purchase the item 106 and, if so, contracts with a shipping entity for transporting the item 106 to the purchasing entity 126. In some implementations, the logistics component 214 is configured to select a particular shipping entity and shipping method for transporting the item 106 to the purchasing entity 126 based on various considerations, such as a price the purchasing entity 126 is willing to pay for shipping, a shipping speed desired by the purchasing entity 126, a cost for the autonomous item generation system 104 to transport the item 106 to the purchasing entity 126, combinations thereof, and so forth. Thus, through inclusion of the listing component 210, the finance component 212, and the logistics component 214, the transaction module 112 is configured to automatically handle interactions with the virtual marketplace 124 that otherwise cannot be performed by conventional systems absent human user intervention in facilitating the publication of the item listing 108 as well as sale activities involved with facilitating a sale of the subject item 106 for the item listing 108. - The
feedback module 114 is configured to receive listing feedback data 216 from the virtual marketplace 124, which is representative of analytics data provided by the virtual marketplace 124 describing one or more interactions with the item listing 108. For instance, using the example item listing 302 of FIG. 3, the listing feedback data 216 may specify different interactions with the item listing 302, such as a number of page views, or impressions, of the item listing 302, a number of different computing devices that viewed the item listing 302, a number of favorites of the item listing 302, a number of purchases of the subject item of the item listing 302, a number of shares of the item listing 302, and so forth. For each of these example interactions, the listing feedback data 216 may further provide information describing a user profile associated with the interaction, such as a location of the user during the interaction, a date and time associated with the interaction, demographic information for the user profile (e.g., age, gender, etc.), historical user behavior data for the user profile relative to the virtual marketplace 124, combinations thereof, and so forth. - The listing feedback data 216 may provide additional levels of detail regarding interactions with the
item listing 108. For instance, the listing feedback data 216 may specify an amount of time spent viewing discrete portions of the item listing 302, such as a duration spent reading the detailed description 310, a number of user reviews displayed in navigating the user reviews portion 316, a purchase of an item listed in the similar items portion 318 instead of the subject item of the item listing 302, and so forth. In this manner, the listing feedback data 216 is representative of any type and format of data that indicates a manner in which the item listing 108 was experienced or interacted with by users of the virtual marketplace 124. - Given the listing feedback data 216, the
feedback module 114 is configured to generate at least one training dataset 218 for use in refining the machine learning model 118. To do so, the feedback module 114 combines the listing feedback data 216 with the item data 202 in a format that corresponds to the training dataset format used to originally train the machine learning model 118 (e.g., the format of the predicted feedback data included in the training dataset generated by the training module 116). By virtue of the machine learning model 118's initial training, the feedback module 114 does not need to annotate or otherwise label the training dataset 218 (e.g., as quantifying or otherwise classifying the listing feedback data 216 as indicating that the item listing 108 is associated with positive or negative feedback). - Instead, by being trained to identify aspects of information included in initial training dataset counterparts to the listing feedback data 216 represented in the
training dataset 218, the machine learning model 118 is configured to infer relationships between different aspects of the item data 202 and the resulting interactions with the item listing 108 via the virtual marketplace 124. To do so, the training module 116 feeds the training dataset 218 as an input to the machine learning model 118, which causes the machine learning model 118 to modify one or more control parameters (e.g., internal model node weights) according to a loss function for the model that penalizes differences between predicted and observed feedback data. The machine learning model 118 with its one or more modified parameters is output by the training module 116 as the refined machine learning model 220, which is communicated to the item generation module 110 for use in place of the machine learning model 118 in subsequently generating item data 202 for a different item 106. As an example of an item 106 and item listing 108 subsequently output by the refined machine learning model 220, consider FIG. 4.
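The parameter-modification step described above can be illustrated with a minimal sketch, in which a toy linear predictor stands in for the machine learning model 118 and one gradient step on a squared-error loss moves predicted feedback toward observed feedback. The feature names, loss function, and learning rate below are illustrative assumptions, not details from this disclosure.

```python
# Toy stand-in for the model's feedback prediction: a linear score
# over item features (hypothetical features, not from the disclosure).
def predict(weights, features):
    return sum(w * f for w, f in zip(weights, features))

def refine(weights, features, observed, lr=0.01):
    """One gradient-descent step on the loss L = (predicted - observed)^2."""
    error = predict(weights, features) - observed
    # dL/dw_i = 2 * error * f_i; step each weight against the gradient.
    return [w - lr * 2 * error * f for w, f in zip(weights, features)]

weights = [0.5, -0.2]   # e.g. (has_pockets, tagged_tailored)
features = [1.0, 1.0]   # a double-pocket "tailored" shirt
observed = 0.8          # engagement observed in listing feedback data

before = predict(weights, features)
after = predict(refine(weights, features, observed), features)
print(before, after)    # the refined prediction lies closer to 0.8
```

A full model would repeat this step over many (item data, feedback) pairs, but the direction of the update is the same: parameters move so that predicted feedback better matches what the virtual marketplace actually reported.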
FIG. 4 depicts an example interface 400 of the virtual marketplace 124 as displaying an item listing 402, representative of an instance of an item listing 108 generated from item data 202 output by the refined machine learning model 220. Specifically, item listing 402 represents example changes between item data 202 output by the machine learning model 118 and item data 202 output by the refined machine learning model 220. For instance, item listing 402 includes an item title 404 for a "Men's Double Pocket Tailored Shirt," a digital rendering 406 of the subject item of the item listing 402, a price 408 for the subject item, and a detailed description 410 for the subject item, each of which differs from its counterpart aspect of the item listing 302. Such changes may be indicative of the machine learning model 118 interpreting the listing feedback data 216 as indicating certain trends gleaned from interactions with the virtual marketplace 124, such as that double-pocketed men's collared shirts are currently more popular than collared shirts without pockets, that articles of clothing including tags noting that the clothing is "tailored" are associated with positive feedback, that item listings featuring multiple digital renderings of the subject item are associated with increased impression and purchase rates, that the revised layout of item listing 402 is preferred over the layout of listing 302, and so forth. - To configure the
machine learning model 118 to recognize and adapt to such changing trends, the training module 116 is configured to identify control parameters in the latent space(s) of the machine learning model 118 that correlate with different design aspects (e.g., sleeve length, pocket styles, listing tags, item fabric(s), and so forth). In this manner, by informing the machine learning model 118 of information included in the listing feedback data 216 via the training datasets 218, the autonomous item generation system 104 is configured to adapt to changing trends and alter fabrication instructions 120 and item listing 108 characteristics for items subsequently generated by the refined machine learning model 220 automatically and without relying on guiding user intervention. - Thus, the autonomous
item generation system 104 is configured to continuously monitor activities associated with item listings 108 published to the virtual marketplace 124 and refine control parameters of the machine learning model 118 used to generate the item listing 108 to adapt to inferred trends and behaviors. Because the autonomous item generation system 104 is configured to perform its continuous cycle of operations independent of user input, and to identify trends and behaviors to consider in refining machine learning model parameters before such trends or behaviors can be identified by a user of the autonomous item generation system 104, the autonomous item generation system 104 is configured to output a user interface that enables a user to glean insight into the system's operations.
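The continuous monitor-and-refine cycle can be compressed into a short sketch in which the "model" is a single preference value that drifts toward observed marketplace demand after each listing. Every component here is a hypothetical stub standing in for generation, fabrication, listing, and feedback collection; none of it reflects the actual model architecture.

```python
def run_cycles(preference, observe_demand, cycles=3, lr=0.5):
    """Generate an item, observe feedback, refine, repeat (no user input)."""
    history = []
    for _ in range(cycles):
        item = preference                   # generate item from current parameters
        demand = observe_demand(item)       # fabricate, list, and collect feedback
        preference += lr * (demand - item)  # refine parameters toward observed demand
        history.append(preference)
    return history

# Toy marketplace whose feedback always "wants" items near 1.0.
history = run_cycles(0.0, observe_demand=lambda item: 1.0)
print(history)  # [0.5, 0.75, 0.875]
```

The point of the sketch is the loop structure: each pass closes the feedback loop between what was generated and what the marketplace reported, so the parameters converge without any user intervening.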
FIG. 5 depicts an example interface 500 of the autonomous item generation system 104 configured for output on a display device of the computing device implementing the autonomous item generation system 104, such as a display device of computing device 102. The interface 500 includes a model selection control 502 and an audience specification control 504. The audience specification control 504 enables a user of the autonomous item generation system 104 to change input parameters considered by the model selected via model selection control 502, such that the user can observe how the changed input parameters alter a resulting item 106 generated by the machine learning model 118. For instance, responsive to receiving selection of one or more models via the model selection control 502 and a selection of one or more options for defining an audience via the audience specification control 504, the autonomous item generation system updates interface 500 to output item preview 506, which includes a preview digital rendering 508 of an item 106 that would be generated by machine learning model 118 according to input parameters specified by the selection(s) made with respect to controls 502 and 504. Beyond the preview digital rendering 508 being output in the item preview 506 portion of the interface 500, the item preview 506 is configured to include a display of any information included in item data 202, such as visual representations of the item data 202, textual descriptions of the item data 202, and combinations thereof. - In the illustrated example,
model selection control 502 includes options 510, 512, and 514, where option 510 enables selection of an instance of machine learning model 118 trained to generate item data 202 for men's clothing items, option 512 enables selection of an instance of machine learning model 118 trained to generate item data 202 for works of art, and option 514 enables a user of the autonomous item generation system 104 to upload their own model (e.g., an instance of the machine learning model 118 trained to generate item data 202 for an item 106 not categorized as men's clothing or works of art). - The
audience specification control 504 in the illustrated example of FIG. 5 includes options 516, 518, and 520, where option 516 enables specification of a "general public" audience segment (e.g., no constraints on the audience to be considered by the machine learning model 118), control 518 enables designation of custom demographic parameters to be considered by the machine learning model 118 (e.g., a specified geographic region for an audience of an item listing 108, a specified audience age and gender combination, a specified time of day for publishing the item listing 108, and so forth), and control 520 enables designation of a particular individual user to be considered as the audience for the machine learning model 118's generation of the item data 202. - By interacting with the
controls 502 and 504 of the interface 500, a user of the autonomous item generation system 104 is informed of considerations made by the autonomous item generation system 104 in performing its automatic operations. For instance, interface 500 indicates to a user of the autonomous item generation system 104 that an instance of the machine learning model 118 trained to generate art items, when considering the general public as an audience, will generate an item 106 that depicts a waterfront dock scene at sunset with certain nature aspects to achieve a realistic, photo-quality appearance, based on current parameters of the machine learning model 118. - In some implementations, the
item preview 506 portion of the interface 500 may further include information that describes control parameters of the machine learning model 118 selected for the specified audience. For instance, the item preview portion 506 may specify that the same machine learning model 118 configured to generate landscape works of art, when targeting a Swiss audience, selects control parameters for the machine learning model 118 that promote inclusion of mountains in the landscape artwork. Conversely, by interacting with the audience specification control 504 to change the geographic demographic audience from Switzerland to Hawaii, the item preview portion 506 may specify that control parameters emphasizing inclusion of beaches and oceans in the landscape artwork are to be utilized. In this manner, the interface 500 provides a user of the autonomous item generation system 104 with insight as to what considerations are made when selecting control parameters for different machine learning models 118, audience considerations, and combinations thereof.
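The audience-dependent selection of control parameters described above can be sketched as a lookup of latent-space offsets keyed by audience. The dimension names and offset values below are hypothetical illustrations of the Switzerland/Hawaii example, not parameters disclosed by the system.

```python
# Hypothetical latent dimensions: (mountains, beaches, surrealism)
AUDIENCE_OFFSETS = {
    "general": (0.0, 0.0, 0.0),
    "CH":      (0.9, 0.0, 0.0),   # Swiss audience: promote mountains
    "HI":      (0.0, 0.9, 0.0),   # Hawaiian audience: promote beaches/oceans
}

def control_parameters(base, audience):
    """Shift the base latent controls by the audience-specific offsets."""
    offsets = AUDIENCE_OFFSETS.get(audience, AUDIENCE_OFFSETS["general"])
    return [b + o for b, o in zip(base, offsets)]

base = [0.1, 0.1, 0.2]
print(control_parameters(base, "CH"))  # mountains dimension emphasized
print(control_parameters(base, "HI"))  # beaches dimension emphasized
```

In a real generative model the "offsets" would be learned directions in latent space rather than a hand-built table, but the interface behavior described above (same model, different audience, different preview) reduces to exactly this kind of conditioned parameter selection.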
FIG. 6 depicts an example interface 600 of the autonomous item generation system 104, where the selected option of the audience specification control 504 has been altered from option 516 to option 518, indicating that specific audience demographic characteristics (not shown) are to be considered instead of the general public considered in the example interface 500. In the illustrated example, content of the item preview 506 is altered to indicate how a resulting item 106 generated by the same machine learning model 118 would differ based on the specified audience demographic characteristics. Specifically, interface 600 indicates to a user of the autonomous item generation system 104 that the same instance of the machine learning model 118 trained to generate art items as selected in FIG. 5, when considering the updated audience demographic characteristics, would instead generate an item 106 that depicts a surreal mountain landscape scene. In this manner, an interface of the autonomous item generation system 104 provides a user with tools to obtain insight regarding the ongoing revision of a particular machine learning model 118 implemented by the autonomous item generation system 104 in a manner that would not be possible by inspecting raw input and output data from the machine learning model 118. - Having considered example details of automatically generating data useable to fabricate an
item 106 and generate a listing for the item to be published at a virtual marketplace, consider now example procedures to illustrate aspects of the techniques described herein. - The following discussion describes techniques that may be implemented utilizing the previously described systems and devices. Aspects of each of the procedures may be implemented in hardware, firmware, software, or a combination thereof. The procedures are shown as a set of blocks that specify operations performed by one or more devices and are not necessarily limited to the orders shown for performing the operations by the respective blocks. In portions of the following discussion, reference may be made to
FIGS. 1-6 . -
FIG. 7 depicts a procedure 700 in an example implementation of autonomous item and item listing generation in accordance with aspects of the techniques described herein. Notably, each and every operation of procedure 700 is performed automatically and independent of user input or intervention. Using at least one machine learning model, fabrication instructions for an item and metadata describing the item are generated (block 702). The item generation module 110 of the autonomous item generation system 104, for instance, causes machine learning model 118 to generate item data 202, which includes fabrication instructions 120 that are useable by the fabrication device 122 to fabricate a tangible item 106. In addition to the fabrication instructions, the item data 202 includes metadata describing the item 106, such as item description 204, item tags 206, and item pricing data 208. - Fabrication of the item is caused by transmitting the fabrication instructions to the fabrication device (block 704). The
item generation module 110, for instance, transmits the fabrication instructions 120 to the fabrication device 122 in a manner that causes the fabrication device to fabricate, manufacture, or otherwise output the item 106. A listing for the item is then created using the metadata describing the item (block 706). The transaction module 112 of the autonomous item generation system 104 obtains the item data 202 from the item generation module 110 and generates item listing 108, such as the example item listings depicted in FIGS. 3 and 4. - The item listing is published to a virtual marketplace and analytics data describing one or more interactions with the item listing is obtained (block 708). The
transaction module 112, for instance, employs listing component 210 to interface with the virtual marketplace 124 and publish the item listing 108 in a manner that makes the item listing discoverable on the virtual marketplace 124 (e.g., to a browsing user of the virtual marketplace 124, to a search engine indexing the virtual marketplace 124, and so forth). The feedback module 114 of the autonomous item generation system 104 obtains listing feedback data 216, which is representative of information describing one or more interactions with the item listing 108 as published to the virtual marketplace 124. Example interactions include a number of views (e.g., a number of impressions) of the item listing 108, a number of purchases of the item via the item listing 108, a number of favorites of the item listing 108, a number of shares of the item listing 108, user reviews submitted for the item listing 108, combinations thereof, and so forth. - Training data is then formed based on the analytics data and one or more parameters of the at least one machine learning model are modified using the training data (block 710). The
feedback module 114, for instance, combines the listing feedback data together with the item data 202 generated by the machine learning model 118 as training dataset 218. The format of training dataset 218 output by the feedback module 114 varies according to the machine learning model implemented by the item generation module 110 and depends on the format of the training datasets used to originally train the machine learning model 118. The training dataset is then passed to the training module 116, which provides the training dataset 218 as input to the machine learning model 118. Upon input of the training dataset 218, the machine learning model 118 is configured to process the training dataset 218 according to one or more objective functions upon which the machine learning model 118 was initialized, together with one or more loss functions that penalize differences between predicted and observed feedback data, thereby causing the machine learning model 118 to refine one or more internal parameters via processing of the training dataset 218. The machine learning model 118 with its one or more modified parameters is then output as refined machine learning model 220. - Using the at least one machine learning model with one or more modified parameters, fabrication instructions for an additional item and metadata describing the additional item are generated (block 712). The autonomous
item generation system 104, for instance, performs the operations as described above with respect to block 702, using the refined machine learning model 220 instead of the machine learning model 118. Operation of procedure 700 then optionally returns to block 704, continuing to refine model parameters based on analytics data describing interactions with item listings 108 generated by the autonomous item generation system 104.
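Block 710's pairing of generated item data with its observed analytics can be sketched as assembling one training record per listing, in the same shape as the records used for initial training. The field names below are illustrative assumptions about the unspecified dataset format.

```python
def to_training_record(item_data, feedback):
    """Combine generated item data with observed listing feedback into one
    training record, mirroring the (item, predicted-feedback) format assumed
    for initial training, with no manual annotation or labeling required."""
    return {
        "item": item_data,                       # counterpart of item data 202
        "feedback": {                            # counterpart of predicted feedback
            "impressions": feedback.get("impressions", 0),
            "purchases": feedback.get("purchases", 0),
            "favorites": feedback.get("favorites", 0),
        },
    }

record = to_training_record(
    {"title": "Men's Double Pocket Tailored Shirt", "price": 39.99},
    {"impressions": 1200, "purchases": 31},
)
print(record["feedback"])
```

Because the record carries observed counts rather than a hand-assigned positive/negative label, feeding a batch of such records back through the model's original objective and loss functions is all that the refinement step requires.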
FIG. 8 depicts a procedure 800 in an example implementation of outputting a user interface for an autonomous item generation system in accordance with aspects of the techniques described herein. A display of a user interface for an autonomous item generation system that includes controls for specifying a machine learning model to be used in generating an item and an audience for the machine learning model to consider in generating the item is output (block 802). The autonomous item generation system 104, for instance, outputs interface 500 at a display of computing device 102. The interface 500 includes model selection control 502 and audience specification control 504. The model selection control 502 enables selection of a particular machine learning model 118 to be implemented by the autonomous item generation system 104 and the audience specification control 504 enables a user to change input parameters considered by the model selected via model selection control 502 and observe how the changed input parameters alter a resulting item 106 generated by the machine learning model 118. - Input is received at the user interface specifying at least one of the machine learning model to be used, or the audience to be considered, in generating the item (block 804). A selection of one or more of
options 510, 512, or 514 from the model selection control 502 and/or one or more of options 516, 518, or 520 from the audience specification control 504 is received. The user interface is then updated to display a preview of the item as generated by the selected machine learning model for the specified audience (block 806). For instance, responsive to receiving selection of one or more models via the model selection control 502 and a selection of one or more options for defining an audience via the audience specification control 504, the autonomous item generation system updates interface 500 to output item preview 506, which includes a preview digital rendering 508 of an item 106 that would be generated by machine learning model 118 according to input parameters specified by the selection(s) made with respect to controls 502 and 504. In some implementations, information describing machine learning model 118 control parameters is alternatively or additionally output in the item preview 506 portion of the interface 500. - Operation of
procedure 800 then optionally returns to block 804, enabling selection of a different combination of one or more of options 510, 512, or 514 from the model selection control 502 and/or one or more of options 516, 518, or 520 from the audience specification control 504. For example, interface 600 depicts an update to the item preview 506 from that depicted in the illustrated example of FIG. 5, responsive to a different option selected from the audience specification control 504. In this manner, the user interface output by procedure 800 enables a user of the autonomous item generation system 104 to glean insight into operations of the autonomous item generation system 104 that the user would otherwise be unable to ascertain from inspection of raw data inputs to, and outputs from, the machine learning model 118. - Having described example procedures in accordance with one or more implementations, consider now an example system and device that can be utilized to implement the various techniques described herein.
-
FIG. 9 illustrates an example system generally at 900 that includes anexample computing device 902 that is representative of one or more computing systems and/or devices that may implement the various techniques described herein. This is illustrated through inclusion of the autonomousitem generation system 104. Thecomputing device 902 may be, for example, a server of a service provider, a device associated with a client (e.g., a client device), an on-chip system, and/or any other suitable computing device or computing system. - The
example computing device 902 includes aprocessing system 904, one or more computer-readable media 906, and one or more I/O interface 908 that are communicatively coupled, one to another. Although not shown, thecomputing device 902 may further include a system bus or other data and command transfer system that couples the various components, one to another. A system bus can include any one or combination of different bus structures, such as a memory bus or memory controller, a peripheral bus, a universal serial bus, and/or a processor or local bus that utilizes any of a variety of bus architectures. A variety of other examples are also contemplated, such as control and data lines. - The
processing system 904 is representative of functionality to perform one or more operations using hardware. Accordingly, theprocessing system 904 is illustrated as includinghardware elements 910 that may be configured as processors, functional blocks, and so forth. This may include implementation in hardware as an application specific integrated circuit or other logic device formed using one or more semiconductors. Thehardware elements 910 are not limited by the materials from which they are formed or the processing mechanisms employed therein. For example, processors may be comprised of semiconductor(s) and/or transistors (e.g., electronic integrated circuits (ICs)). In such a context, processor-executable instructions may be electronically-executable instructions. - The computer-
readable storage media 906 is illustrated as including memory/storage 912. The memory/storage 912 represents memory/storage capacity associated with one or more computer-readable media. The memory/storage component 912 may include volatile media (such as random access memory (RAM)) and/or nonvolatile media (such as read only memory (ROM), Flash memory, optical disks, magnetic disks, and so forth). The memory/storage component 912 may include fixed media (e.g., RAM, ROM, a fixed hard drive, and so on) as well as removable media (e.g., Flash memory, a removable hard drive, an optical disc, and so forth). The computer-readable media 906 may be configured in a variety of other ways as further described below. - Input/output interface(s) 908 are representative of functionality to allow a user to enter commands and information to the example
service provider device 902, and also allow information to be presented to the user and/or other components or devices using various input/output devices. Examples of input devices include a keyboard, a cursor control device (e.g., a mouse), a microphone, a scanner, touch functionality (e.g., capacitive or other sensors that are configured to detect physical touch), a camera (e.g., which may employ visible or non-visible wavelengths such as infrared frequencies to recognize movement as gestures that do not involve touch), and so forth. Examples of output devices include a display device (e.g., a monitor or projector), speakers, a printer, a network card, tactile-response device, and so forth. Thus, thecomputing device 902 may be configured in a variety of ways as further described below to support user interaction. - Various techniques may be described herein in the general context of software, hardware elements, or program modules. Generally, such modules include routines, programs, objects, elements, components, data structures, and so forth that perform particular tasks or implement particular abstract data types. The terms “module,” “functionality,” and “component” as used herein generally represent software, firmware, hardware, or a combination thereof The features of the techniques described herein are platform-independent, meaning that the techniques may be implemented on a variety of commercial computing platforms having a variety of processors.
- An implementation of the described modules and techniques may be stored on or transmitted across some form of computer-readable media. The computer-readable media may include a variety of media that may be accessed by the
example computing device 902. By way of example, and not limitation, computer-readable media may include “computer-readable storage media” and “computer-readable signal media.” - “Computer-readable storage media” may refer to media and/or devices that enable persistent and/or non-transitory storage of information, in contrast to mere signal transmission, carrier waves, or signals per se. Thus, computer-readable storage media refers to non-signal bearing media. The computer-readable storage media includes hardware such as volatile and non-volatile, removable and non-removable media and/or storage devices implemented in a method or technology suitable for storage of information such as computer readable instructions, data structures, program modules, logic elements/circuits, or other data. Examples of computer-readable storage media may include, but are not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage, hard disks, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or other storage device, tangible media, or article of manufacture suitable to store the desired information and which may be accessed by a computer.
- “Computer-readable signal media” may refer to a signal-bearing medium that is configured to transmit instructions to the hardware of the
computing device 902, such as via a network. Signal media typically may embody computer readable instructions, data structures, program modules, or other data in a modulated data signal, such as carrier waves, data signals, or other transport mechanism. Signal media also include any information delivery media. The term “modulated data signal” means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal. By way of example, and not limitation, communication media include wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, RF, infrared, and other wireless media. - As previously described,
hardware elements 910 and computer-readable media 906 are representative of modules, programmable device logic and/or fixed device logic implemented in a hardware form that may be employed in some embodiments to implement at least some aspects of the techniques described herein, such as to perform one or more instructions. Hardware may include components of an integrated circuit or on-chip system, an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA), a complex programmable logic device (CPLD), and other implementations in silicon or other hardware. In this context, hardware may operate as a processing device that performs program tasks defined by instructions and/or logic embodied by the hardware, as well as hardware utilized to store instructions for execution, e.g., the computer-readable storage media described previously.
more hardware elements 910. Theexample computing device 902 may be configured to implement particular instructions and/or functions corresponding to the software and/or hardware modules. Accordingly, implementation of a module that is executable by theexample computing device 902 as software may be achieved at least partially in hardware, e.g., through use of computer-readable storage media and/orhardware elements 910 of theprocessing system 904. The instructions and/or functions may be executable/operable by one or more articles of manufacture (for example, one or moreexample computing devices 902 and/or processing systems 904) to implement techniques, modules, and examples described herein. - The techniques described herein may be supported by various configurations of the
computing device 902 and are not limited to the specific examples of the techniques described herein. This functionality may also be implemented all or in part through use of a distributed system, such as over a “cloud” 914 via a platform 916 as described below. - The
cloud 914 includes and/or is representative of a platform 916 for resources 918. The platform 916 abstracts underlying functionality of hardware (e.g., servers) and software resources of the cloud 914. The resources 918 may include applications and/or data that can be utilized while computer processing is executed on servers that are remote from the example computing device 902. Resources 918 can also include services provided over the Internet and/or through a subscriber network, such as a cellular or Wi-Fi network. - The
platform 916 may abstract resources and functions to connect the example computing device 902 with other computing devices. The platform 916 may also serve to abstract scaling of resources to provide a corresponding level of scale to encountered demand for the resources 918 that are implemented via the platform 916. Accordingly, in an interconnected device embodiment, implementation of functionality described herein may be distributed throughout the system 900. For example, the functionality may be implemented in part on the example computing device 902 as well as via the platform 916 that abstracts the functionality of the cloud 914. - Although the invention has been described in language specific to structural features and/or methodological acts, it is to be understood that the invention defined in the appended claims is not necessarily limited to the specific features or acts described. Rather, the specific features and acts are disclosed as example forms of implementing the claimed invention.
Claims (20)
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US17/015,822 US20210073833A1 (en) | 2019-09-11 | 2020-09-09 | Autonomous Item Generation |
US18/061,740 US20230098794A1 (en) | 2019-09-11 | 2022-12-05 | Autonomous Item Fabrication Utilizing a Trained Machine Learning Model |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201962899060P | 2019-09-11 | 2019-09-11 | |
US17/015,822 US20210073833A1 (en) | 2019-09-11 | 2020-09-09 | Autonomous Item Generation |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US18/061,740 Division US20230098794A1 (en) | 2019-09-11 | 2022-12-05 | Autonomous Item Fabrication Utilizing a Trained Machine Learning Model |
Publications (1)
Publication Number | Publication Date |
---|---|
US20210073833A1 true US20210073833A1 (en) | 2021-03-11 |
Family
ID=74849579
Family Applications (4)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/872,908 Active 2040-12-01 US11430037B2 (en) | 2019-09-11 | 2020-05-12 | Real time item listing modification |
US17/015,822 Abandoned US20210073833A1 (en) | 2019-09-11 | 2020-09-09 | Autonomous Item Generation |
US17/869,520 Abandoned US20220358556A1 (en) | 2019-09-11 | 2022-07-20 | Real Time Item Listing Modification |
US18/061,740 Pending US20230098794A1 (en) | 2019-09-11 | 2022-12-05 | Autonomous Item Fabrication Utilizing a Trained Machine Learning Model |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/872,908 Active 2040-12-01 US11430037B2 (en) | 2019-09-11 | 2020-05-12 | Real time item listing modification |
Family Applications After (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/869,520 Abandoned US20220358556A1 (en) | 2019-09-11 | 2022-07-20 | Real Time Item Listing Modification |
US18/061,740 Pending US20230098794A1 (en) | 2019-09-11 | 2022-12-05 | Autonomous Item Fabrication Utilizing a Trained Machine Learning Model |
Country Status (2)
Country | Link |
---|---|
US (4) | US11430037B2 (en) |
KR (2) | KR20210031366A (en) |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20210078169A1 (en) * | 2019-09-13 | 2021-03-18 | Deepmind Technologies Limited | Data-driven robot control |
US20220405817A1 (en) * | 2021-06-16 | 2022-12-22 | Pod Foods Co. | Virtualized wholesaling |
US11934926B2 (en) * | 2019-08-27 | 2024-03-19 | Sap Se | Sensitivity in supervised machine learning with experience data |
CN117893244A (en) * | 2024-03-15 | 2024-04-16 | 中国海洋大学 | Comprehensive management and control system for seaweed hydrothermal carbonization application based on machine learning |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR102234497B1 (en) * | 2020-07-06 | 2021-04-01 | 쿠팡 주식회사 | Electronic device for providing product sale managing information and method thereof |
US11631092B2 (en) * | 2020-10-28 | 2023-04-18 | Shopify Inc. | Methods and apparatus for maintaining and/or updating one or more item taxonomies |
Citations (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20090037295A1 (en) * | 2007-07-31 | 2009-02-05 | Justin Saul | Fashion matching algorithm solution |
US20100153183A1 (en) * | 1996-09-20 | 2010-06-17 | Strategyn, Inc. | Product design |
US20150324490A1 (en) * | 2014-05-09 | 2015-11-12 | Autodesk, Inc. | User specific design customization for 3d printing |
US20180150947A1 (en) * | 2016-11-28 | 2018-05-31 | Adobe Systems Incorporated | Facilitating sketch to painting transformations |
US20190034976A1 (en) * | 2017-07-26 | 2019-01-31 | Jehan Hamedi | Systems and Methods for Automating Content Design Transformations Based on User Preference and Activity Data |
US10275909B2 (en) * | 2016-11-23 | 2019-04-30 | 3Dsystems, Inc. | Systems and methods for an integrated system for visualizing, simulating, modifying and 3D printing 3D objects |
US20190251612A1 (en) * | 2018-02-15 | 2019-08-15 | Adobe Inc. | Generating user-customized items using a visually-aware image generation network |
US20200167856A1 (en) * | 2018-11-26 | 2020-05-28 | Dot Bustelo | System and Method for Visual Art Streaming Runtime Platform |
US20210073576A1 (en) * | 2019-09-05 | 2021-03-11 | International Business Machines Corporation | Crowd Sourced Trends and Recommendations |
US11004128B1 (en) * | 2017-05-23 | 2021-05-11 | Amazon Technologies, Inc. | Automatically designing customized garments based on customer behavior |
US20210178697A1 (en) * | 2019-12-17 | 2021-06-17 | Northrop Grumman Systems Corporation | Machine-learning-based additive manufacturing using manufacturing data |
US20210390396A1 (en) * | 2018-07-11 | 2021-12-16 | The Board Of Trustees Of The Leland Stanford Junior University | Systems and Methods for Generative Models for Design |
Family Cites Families (27)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20080097830A1 (en) * | 1999-09-21 | 2008-04-24 | Interpols Network Incorporated | Systems and methods for interactively delivering self-contained advertisement units to a web browser |
WO2004104738A2 (en) * | 2003-05-16 | 2004-12-02 | Rite-Solutions, Et Al. | Method and apparatus for tracking customer purchases using a unique anonymous identifier |
US20040260767A1 (en) * | 2003-06-19 | 2004-12-23 | David Kedem | Dynamic web advertisement and content display system |
KR20060005153A (en) | 2004-07-12 | 2006-01-17 | 주식회사 마이엔진 | Real-time online selected-information provision method and system based on analysis for customer-priority information |
US8306859B2 (en) * | 2006-07-21 | 2012-11-06 | Say Media, Inc. | Dynamic configuration of an advertisement |
US20080052157A1 (en) * | 2006-08-22 | 2008-02-28 | Jayant Kadambi | System and method of dynamically managing an advertising campaign over an internet protocol based television network |
US8601386B2 (en) * | 2007-04-20 | 2013-12-03 | Ingenio Llc | Methods and systems to facilitate real time communications in virtual reality |
US20090055254A1 (en) * | 2007-08-23 | 2009-02-26 | Yahoo! Inc. | Dynamic and interactive advertisements |
US10692092B2 (en) * | 2007-12-21 | 2020-06-23 | Ebay Inc. | System and method for providing on-line advertising with dynamic content |
US7941340B2 (en) * | 2008-09-30 | 2011-05-10 | Yahoo! Inc. | Decompilation used to generate dynamic data driven advertisements |
WO2012060747A1 (en) * | 2010-11-03 | 2012-05-10 | Telefonaktiebolaget L M Ericsson (Publ) | Signalling gateway, method, computer program and computer program product for communication between http and sip |
US9984338B2 (en) * | 2011-05-17 | 2018-05-29 | Excalibur Ip, Llc | Real time e-commerce user interface for monitoring and interacting with consumers |
US20130046651A1 (en) * | 2011-08-17 | 2013-02-21 | Zachary James Edson | Gaming Marketplace Apparatuses, Methods and Systems |
US20130117099A1 (en) * | 2011-11-08 | 2013-05-09 | ChoozOn Corporation | System for providing offer groups to curate promotional offers |
US20170221093A1 (en) * | 2011-12-07 | 2017-08-03 | Google Inc. | Dynamically Generating Video / Animation, in Real-Time, in a Display or Electronic Advertisement Based on User Data |
US20140180834A1 (en) * | 2012-12-14 | 2014-06-26 | Auto Ads Today, LLC | Dynamic advertisement system |
KR101354624B1 (en) | 2013-05-28 | 2014-02-07 | 한상선 | Electronic commerce system and method for applying randomly discount rate to the cost of products when requesting a bargain |
US9665900B1 (en) | 2013-06-07 | 2017-05-30 | Amazon Technologies, Inc. | Item-specific sales ranking systems and methods |
US10122824B1 (en) * | 2013-09-13 | 2018-11-06 | Reflektion, Inc. | Creation and delivery of individually customized web pages |
US9691096B1 (en) | 2013-09-16 | 2017-06-27 | Amazon Technologies, Inc. | Identifying item recommendations through recognized navigational patterns |
US9871877B2 (en) * | 2014-08-08 | 2018-01-16 | Evergage, Inc. | Socially augmented browsing of a website |
US10825069B2 (en) * | 2014-11-14 | 2020-11-03 | The Joan and Irwin Jacobs Technion-Cornell Institute | System and method for intuitive content browsing |
US10410273B1 (en) * | 2014-12-05 | 2019-09-10 | Amazon Technologies, Inc. | Artificial intelligence based identification of item attributes associated with negative user sentiment |
US9519931B2 (en) | 2015-05-15 | 2016-12-13 | Ebay Inc. | System and method for personalized actionable notifications |
US20170098266A1 (en) * | 2015-10-02 | 2017-04-06 | Transcona Media Network Inc. | Real-time local marketplace information system and method |
KR101816296B1 (en) | 2016-02-26 | 2018-01-08 | 쿠팡 주식회사 | Method and apparatus for providing shopping service |
KR101893568B1 (en) | 2017-02-10 | 2018-08-30 | 주식회사 힐링애드 | System for providing open social market |
-
2020
- 2020-05-12 US US16/872,908 patent/US11430037B2/en active Active
- 2020-07-28 KR KR1020200093583A patent/KR20210031366A/en not_active Application Discontinuation
- 2020-09-09 US US17/015,822 patent/US20210073833A1/en not_active Abandoned
-
2022
- 2022-04-14 KR KR1020220046309A patent/KR102556299B1/en active IP Right Grant
- 2022-07-20 US US17/869,520 patent/US20220358556A1/en not_active Abandoned
- 2022-12-05 US US18/061,740 patent/US20230098794A1/en active Pending
Patent Citations (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20100153183A1 (en) * | 1996-09-20 | 2010-06-17 | Strategyn, Inc. | Product design |
US20090037295A1 (en) * | 2007-07-31 | 2009-02-05 | Justin Saul | Fashion matching algorithm solution |
US8103551B2 (en) * | 2007-07-31 | 2012-01-24 | Style Du Jour, Inc. | Fashion matching algorithm solution |
US8249941B2 (en) * | 2007-07-31 | 2012-08-21 | Style Du Jour | Fashion matching algorithm solution |
US20120265774A1 (en) * | 2007-07-31 | 2012-10-18 | Justin Saul | Fashion matching algorithm solution |
US20150324490A1 (en) * | 2014-05-09 | 2015-11-12 | Autodesk, Inc. | User specific design customization for 3d printing |
US10275909B2 (en) * | 2016-11-23 | 2019-04-30 | 3Dsystems, Inc. | Systems and methods for an integrated system for visualizing, simulating, modifying and 3D printing 3D objects |
US20180150947A1 (en) * | 2016-11-28 | 2018-05-31 | Adobe Systems Incorporated | Facilitating sketch to painting transformations |
US11004128B1 (en) * | 2017-05-23 | 2021-05-11 | Amazon Technologies, Inc. | Automatically designing customized garments based on customer behavior |
US20190034976A1 (en) * | 2017-07-26 | 2019-01-31 | Jehan Hamedi | Systems and Methods for Automating Content Design Transformations Based on User Preference and Activity Data |
US10380650B2 (en) * | 2017-07-26 | 2019-08-13 | Jehan Hamedi | Systems and methods for automating content design transformations based on user preference and activity data |
US20200034887A1 (en) * | 2017-07-26 | 2020-01-30 | Adhark, Inc. | Systems and methods for automating content design transformations based on user preference and activity data |
US20190251612A1 (en) * | 2018-02-15 | 2019-08-15 | Adobe Inc. | Generating user-customized items using a visually-aware image generation network |
US20210390396A1 (en) * | 2018-07-11 | 2021-12-16 | The Board Of Trustees Of The Leland Stanford Junior University | Systems and Methods for Generative Models for Design |
US20200167856A1 (en) * | 2018-11-26 | 2020-05-28 | Dot Bustelo | System and Method for Visual Art Streaming Runtime Platform |
US20210073576A1 (en) * | 2019-09-05 | 2021-03-11 | International Business Machines Corporation | Crowd Sourced Trends and Recommendations |
US20210178697A1 (en) * | 2019-12-17 | 2021-06-17 | Northrop Grumman Systems Corporation | Machine-learning-based additive manufacturing using manufacturing data |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11934926B2 (en) * | 2019-08-27 | 2024-03-19 | Sap Se | Sensitivity in supervised machine learning with experience data |
US20210078169A1 (en) * | 2019-09-13 | 2021-03-18 | Deepmind Technologies Limited | Data-driven robot control |
US11712799B2 (en) * | 2019-09-13 | 2023-08-01 | Deepmind Technologies Limited | Data-driven robot control |
US20220405817A1 (en) * | 2021-06-16 | 2022-12-22 | Pod Foods Co. | Virtualized wholesaling |
CN117893244A (en) * | 2024-03-15 | 2024-04-16 | 中国海洋大学 | Comprehensive management and control system for seaweed hydrothermal carbonization application based on machine learning |
Also Published As
Publication number | Publication date |
---|---|
US11430037B2 (en) | 2022-08-30 |
US20230098794A1 (en) | 2023-03-30 |
KR20220052885A (en) | 2022-04-28 |
KR102556299B1 (en) | 2023-07-18 |
KR20210031366A (en) | 2021-03-19 |
US20210073887A1 (en) | 2021-03-11 |
US20220358556A1 (en) | 2022-11-10 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20230098794A1 (en) | Autonomous Item Fabrication Utilizing a Trained Machine Learning Model | |
US20200387763A1 (en) | Item recommendations based on image feature data | |
US20180218435A1 (en) | Systems and methods for customizing search results and recommendations | |
KR101852581B1 (en) | Image evaluation | |
US11301939B2 (en) | System for generating shareable user interfaces using purchase history data | |
US10303728B2 (en) | Personalized landing pages | |
US11593864B2 (en) | Shopping trip planner | |
US10275818B2 (en) | Next generation improvements in recommendation systems | |
US10672015B2 (en) | Streaming events modeling for information ranking to address new information scenarios | |
US10991028B1 (en) | Product collections grouped under a single product identifier | |
US11687582B2 (en) | Automated image-based inventory record generation systems and methods | |
US20220398608A1 (en) | Application program interfaces for order and delivery service recommendations | |
US20220343275A1 (en) | Production and logistics management | |
KR102308445B1 (en) | Electronic commercial transaction system based on 3D modeling data | |
US11663645B2 (en) | Methods and apparatuses for determining personalized recommendations using customer segmentation | |
KR20230137861A (en) | Method and apparatus for providing offline purchase service providing convenience of purchase through customized preparation | |
CN112991001A (en) | System and method for recommending 2D images | |
US11423471B2 (en) | Methods and systems for automated selection and ordering of hair products | |
KR20220044715A (en) | Method, apparatus and computer program for fashion item recommendation | |
US11741528B1 (en) | Application program interfaces for vendor recommendations | |
US20140244428A1 (en) | Dynamic presentation of recommended products to users | |
US20230351654A1 (en) | METHOD AND SYSTEM FOR GENERATING IMAGES USING GENERATIVE ADVERSARIAL NETWORKS (GANs) | |
US20230316387A1 (en) | Systems and methods for providing product data on mobile user interfaces | |
US20240161258A1 (en) | System and methods for tuning ai-generated images | |
US20140180876A1 (en) | System and method for providing access to product information and related functionalities |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: EBAY INC., CALIFORNIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:MANCO, MAXIM;FANG, FANG;SRINIVASAN, NATRAJ;AND OTHERS;SIGNING DATES FROM 20200902 TO 20200904;REEL/FRAME:053725/0191 |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: APPLICATION DISPATCHED FROM PREEXAM, NOT YET DOCKETED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: FINAL REJECTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |