WO2024192347A1 - System for automatically performing marketing experimentation and analysis - Google Patents
- Publication number: WO2024192347A1 (PCT/US2024/020143)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- customer
- server
- experimentation
- audience
- marketing
- Prior art date
Classifications
- G—PHYSICS; G06—COMPUTING; CALCULATING OR COUNTING; G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; G06Q30/00—Commerce; G06Q30/02—Marketing; Price estimation or determination; Fundraising
- G06Q30/0241—Advertisements
- G06Q30/0201—Market modelling; Market analysis; Collecting market data
- G06Q30/0207—Discounts or incentives, e.g. coupons or rebates
Definitions
- the present disclosure generally relates to systems for performing marketing experimentation and more specifically, to systems for automatically performing marketing experimentation and analysis in accordance with a plurality of criteria.
- a system for automatically performing marketing experimentation and analysis including a storefront server in communication with a plurality of customer devices, the storefront server configured to transmit to the customer devices instructions to render a customer-facing user interface (UI) on the customer device, the customer-facing UI including a plurality of user-interactable UI elements, a database in communication with the storefront server, the database having stored thereon customer specific data corresponding to the plurality of customer devices, an audience server in communication with the database and storefront server, the audience server configured to monitor customer interactions with the customer facing UI and determine a plurality of customer audiences based on one or more customer interactions with the customer facing UI, each customer audience being associated with a similar customer interaction, a marketing experimentation (ME) server in communication with the audience server, storefront server, and an admin device, the ME server configured to generate an experimentation UI and transmit the experimentation UI to the admin device for display, the experimentation UI including a menu rendering of the plurality of customer audiences, detect, at the UI
- the ME server is further configured to detect, at the experimentation UI, a user selection of a marketing experimentation type, and cause the storefront server to render, at the customer facing UI, the experimentation content according to the selected marketing experimentation type.
- the marketing experimentation type is one of an A/B test and a multi-armed bandit test.
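As a rough illustration of how a multi-armed bandit differs from a fixed A/B split, the sketch below uses an epsilon-greedy policy: the next render usually goes to the content with the best observed interaction rate, with occasional random exploration. This is a generic textbook policy offered for context, not the claimed implementation; the function and variable names are assumptions.

```python
import random

def epsilon_greedy_pick(rates, epsilon=0.1, rng=None):
    """Choose which experimentation content to render next.

    With probability epsilon, explore a random variant; otherwise
    exploit the variant with the best observed interaction rate.
    """
    rng = rng or random.Random()
    if rng.random() < epsilon:
        return rng.choice(list(rates))
    return max(rates, key=rates.get)

# Observed interaction rates per content (illustrative numbers only).
observed = {"content_a": 0.12, "content_b": 0.08}

# epsilon=0 reduces to always exploiting the current best arm.
best = epsilon_greedy_pick(observed, epsilon=0.0)
```

In contrast, a plain A/B test would keep its allocation amounts fixed for the life of the experiment regardless of interim results.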
- the ME server is further configured to detect, at the experimentation UI, a selection of a second experimentation content, a first allocation amount, and a second allocation amount, and in response to detecting the user selection, cause the storefront server to execute the marketing test including, rendering, at the customer facing UI, the experimentation content at the one of the user-interactable UI element on a number of customer devices corresponding to the target customer audience and the first allocation amount, and rendering, at the customer facing UI, the second experimentation content at the one of the user-interactable UI element on a number of customer devices corresponding to the target customer audience and the second allocation amount.
- the audience server is configured to automatically determine customer audiences via machine learning.
- one or more of the customer audiences is comprised of customers having corresponding customer specific data including an indication of a similar geographical area.
- one or more of the customer audiences is comprised of customers having corresponding customer specific data including an indication of a similar type of customer device used to interact with the customer facing UI.
- the experimentation content is a plurality of different experimentation contents.
- the ME server is configured to, in response to detecting the user selection, cause the storefront server to execute the marketing test including, for each experimentation content included in the plurality thereof, rendering, at the customer facing UI, the corresponding experimentation content at the one of the user-interactable UI elements on customer devices corresponding to a subset of the target customer audience.
- the ME server is configured to detect one or more user defined priorities at the experimentation UI and cause the storefront server to execute the marketing experiment in accordance with one or more user defined priorities.
- a method of automatically performing marketing experimentation and analysis including, at a storefront server in communication with a plurality of customer devices, transmitting to the customer devices instructions to render a customer-facing user interface (UI) on the customer device, the customer-facing UI including a plurality of user- interactable UI elements, at an audience server in communication with the storefront server, monitoring customer interactions with the customer facing UI and determining a plurality of customer audiences based on one or more customer interactions with the customer facing UI, each customer audience being associated with a similar customer interaction, at a marketing experimentation (ME) server in communication with the audience server, storefront server, and an admin device, generating an experimentation UI and transmitting the experimentation UI to the admin device for display, the experimentation UI including a menu rendering of the plurality of customer audiences, detecting, at the experimentation UI, a user selection of: one of the user- interactable UI elements for the customer-facing UI, a target customer audience from the menu, and experimentation content to
- the method further includes, at the ME server detecting, at the experimentation UI, a user selection of a marketing experimentation type, and causing the storefront server to render, at the customer facing UI, the experimentation content according to the selected marketing experimentation type.
- the marketing experimentation type is one of an A/B test and a multi-armed bandit test.
- the method further includes, at the ME server detecting, at the experimentation UI, a selection of a second experimentation content, a first allocation amount, and a second allocation amount, in response to detecting the user selection, causing the storefront server to execute the marketing test including rendering, at the customer facing UI, the experimentation content at the one of the user-interactable UI element on a number of customer devices corresponding to the target customer audience and the first allocation amount, and rendering, at the customer facing UI, the second experimentation content at the one of the user- interactable UI element on a number of customer devices corresponding to the target customer audience and the second allocation amount.
- the method further includes, at the audience server automatically determining customer audiences via machine learning.
- one or more of the customer audiences is comprised of customers having corresponding customer specific data including an indication of a similar geographical area.
- one or more of the customer audiences is comprised of customers having corresponding customer specific data including an indication of a similar type of customer device used to interact with the customer facing UI.
- the experimentation content is a plurality of different experimentation contents.
- the method further includes, at the ME server, in response to detecting the user selection, causing the storefront server to execute the marketing test including, for each experimentation content included in the plurality thereof, rendering, at the customer facing UI, the corresponding experimentation content at the one of the user-interactable UI elements on customer devices corresponding to a subset of the target customer audience.
- the method further includes, at the ME server, detecting one or more user defined priorities at the experimentation UI and causing the storefront server to execute the marketing experiment in accordance with one or more user defined priorities.
- Fig. 1 is a block diagram illustrating an implementation of a system for automatically performing marketing experimentation and analysis in accordance with an exemplary embodiment of the present disclosure;
- FIG. 2 is an illustration of a customer facing user interface generated by the storefront server of the system of Fig. 1;
- FIGs. 3A-3P illustrate exemplary user interfaces for creating and editing a marketing experiment;
- FIG. 4 is a flowchart illustrating a method of automatically performing a marketing experimentation via the system of Fig. 1 and in accordance with an exemplary embodiment of the present disclosure.
- FIGS. 5A-5E illustrate exemplary user interfaces illustrating automatically performing a marketing experiment via the system of Fig. 1 and in accordance with an exemplary embodiment of the present disclosure.
- Figs. 5A-5B illustrate exemplary experimentation user interfaces, and Figs. 5C-5E illustrate corresponding customer facing user interfaces rendered at different client devices.

DETAILED DESCRIPTION
- Referring to FIGs. 1-4, there is shown a system for automatically performing marketing experimentation and analysis, alternatively referred to as system 100 for short, in accordance with an exemplary embodiment of the present disclosure.
- the system 100 includes one or more computers or computing devices having one or more processors and memory (e.g., one or more nonvolatile storage devices).
- memory or computer readable storage medium(s) of memory store programs, modules and data structures, or a subset thereof, for a processor to control and run the various systems and methods disclosed herein.
- a non-transitory computer readable storage medium having stored thereon computer-executable instructions which, when executed by a processor, performs one or more of any combination of the methods or steps disclosed herein.
- one or more of the computers or computing devices (e.g., servers) included in the system 100 may include a collection of networked computing devices, servers and/or processing units in communication with one another.
- one or more of the networked computing devices, servers and/or processing units may be vertically and/or horizontally scaled to accommodate for increases in processing requests (e.g., increasing network traffic).
- one or more of the servers included in the system 100 may be configured to execute one or more different collections of executable code, referred to as “microservices” for short, in accordance with a microservice architecture.
- a “microservice” as referenced herein may refer to a collection of computer executable code configured to perform a predetermined function that is executed by a respective server or via a cloud-based infrastructure.
- the functionality of a server or a microservice thereof may be accessible at another server via one or more application programming interfaces (APIs) and/or networks.
- one or more computers or computing devices included in the system 100 may be referred to as servers.
- the storefront server 102 is in communication with a plurality of customer devices 110.
- the storefront server 102 may be configured to transmit to the customer devices 110 instructions to render a customer-facing UI 200 on the customer device 110.
- the customer facing UI 200 may be configured to display one or more interactable digital representations of products for purchase therefrom.
- the customer facing UI 200 is configured to facilitate digital transactions between customer devices 110 and the storefront server 102 (e.g., purchases of products by customers via the customer facing UI 200).
- a customer audience refers to a dataset representative of a grouping of customers based on similar interactions and/or customer data.
- one customer audience may be cat owners and customers grouped into that audience may be 1) customers having data stored on the database 104 indicating that they own a cat and/or 2) customers interacting with the customer facing UI 200 generated by the storefront server 102 in a manner that indicates they own a cat (e.g., the purchase of cat food, toys, medications).
- Non-limiting examples of customer data and/or data used to determine and/or assign customers into a customer audience may include: type of pet, amount of money spent within a given time period, geolocation information, type of device used to interact with the customer facing UI, time spent with items selected for purchase without proceeding to finalize the purchase (e.g., time spent with items in a ‘cart’).
- Customer audience datasets may be stored on the database 104 and the storefront server 102, audience server 106 and/or marketing experimentation server 108 may be configured to automatically query the database 104 to retrieve a customer audience dataset.
- Customer interaction data may relate to aspects of customer interactions with the interactable UI elements on the customer facing UI 200.
- aspects of customer interactions include: selecting (e.g., clicking, highlighting) UI elements corresponding to products or services for a specific type of pet or animal, the duration that a user reads or reviews UI elements (e.g., articles, product descriptions, product reviews), the time between consecutive user selections of UI elements, and the number of interactions with one or more UI elements within a predetermined amount of time.
- Other aspects of customer interactions and customer data may be, in some embodiments, related to one another.
- the audience server 106 may be configured to determine that a customer interaction is with a UI element corresponding to a cat food product and may automatically generate customer data indicating that the customer owns a cat.
- some further customer audiences may include, but are not limited to: cat owners, dog owners, multi-pet owners, customers achieving a threshold spending level, customers displaying uncertainty in purchases, and so on.
- the system 100 is configured to enable a user to input one or more search parameters or rules corresponding to a desired set of shared customer characteristics for a new customer audience and automatically retrieve a listing of customers having those shared characteristics.
- the system 100 is configured to receive the user defined rules in a format that requires little to no programming experience on the part of the user.
- the user defined rules may be received by the system 100 as plain text (e.g., “search for customers who started purchasing cat food within the last three months”).
- the system 100 is configured to receive the user defined rules and automatically convert the rules into a structured query language (SQL) format or any other standard data management language.
- the audience server 106 may be configured to detect or receive the input plain text command discussed in the preceding example and automatically convert the text into an SQL command and execute the SQL command to retrieve a listing of customers that first purchased a cat food product via the customer facing UI 200 within three months prior to the current date.
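As a toy illustration of this plain-text-to-SQL conversion, the sketch below handles only the "started purchasing X within the last N months" pattern from the preceding example using a regular expression. A real audience server would need a far more general parser (possibly a learned model), and the table and column names here are assumptions made for illustration.

```python
import re
from datetime import date, timedelta

def rule_to_sql(rule, today):
    """Convert one narrow family of plain-text audience rules into SQL.

    Months are approximated as 30 days; the `purchases` table and its
    columns are assumed names, not part of the disclosed system.
    """
    m = re.search(
        r"started purchasing (?P<product>[\w ]+?) within the last "
        r"(?P<n>\w+) months", rule)
    if not m:
        raise ValueError("unrecognized rule")
    months = {"one": 1, "two": 2, "three": 3}[m.group("n")]
    cutoff = today - timedelta(days=30 * months)
    # Customers whose earliest matching purchase falls inside the window.
    return (
        "SELECT customer_id FROM purchases "
        f"WHERE product_category = '{m.group('product')}' "
        "GROUP BY customer_id "
        f"HAVING MIN(purchase_date) >= '{cutoff.isoformat()}'"
    )

sql = rule_to_sql(
    "search for customers who started purchasing cat food "
    "within the last three months",
    date(2022, 11, 22))
```

The `HAVING MIN(purchase_date)` clause is what restricts the result to customers who *first* purchased within the window, rather than anyone with a recent purchase.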
- the audience server 106 is configured to automatically execute the SQL command on the database 104 or another database in communication with the audience server 106 where relevant customer data may be found.
- the system 100 may be configured to enable a user, with little to no computer programming experience, to easily input criteria for a desired customer audience and automatically generate a list of customers within said audience.
- the audience server 106 is configured to monitor customer interactions with the customer facing UI 200 and determine a plurality of customer audiences based on the customer interactions with the customer facing UI 200. In some embodiments, determining a customer audience via the audience server 106 includes detecting one or more interactions with the customer facing UI 200 at a plurality of different customer devices 110 that are similar to one another and establishing or defining a customer audience based on those similar interactions.
- An example of a similar interaction may include, but is not limited to, viewing one or more products for sale for a predetermined amount of time without purchasing said products (e.g., viewing products frequently and/or for an extended period of time without purchasing).
- the audience server 106 may be configured to establish a customer audience specific to that type of interaction (e.g., interactions indicating uncertainty in purchases) and automatically assign customers exhibiting that type of interaction with the established customer audience.
- the audience server 106 may be configured to monitor the customer facing UI 200 that is generated by the storefront server 102.
- the audience server 106 may be configured to automatically associate a customer with at least one customer audience, based on a record of the interactions between a client device 110 and UI 200. Assigning a customer to a customer audience may include, at the audience server 106, causing customer specific data associated with the customer and stored on database 104 to be updated to include an indication that the customer is included in the specific customer audience.
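The record update described above can be pictured with an in-memory stand-in for the customer records on database 104. The record layout, function names, and example customer below are assumptions for illustration only, not the disclosed storage schema.

```python
# Minimal stand-in for customer-specific data stored on database 104.
customers = {
    "bob": {"audiences": set(), "pet": "cat"},
}

def assign_to_audience(customer_id, audience):
    """Mirror the audience server's update: mark the stored customer
    record as belonging to the given customer audience."""
    customers[customer_id]["audiences"].add(audience)

def audience_members(audience):
    """Listing a requesting server (e.g., the ME server) might receive
    back for a given audience."""
    return sorted(cid for cid, rec in customers.items()
                  if audience in rec["audiences"])

# E.g., an interaction indicating cat ownership triggers the update.
assign_to_audience("bob", "cat_owners")
```

Because the audience tag lives on the customer record, a single customer can accumulate membership in many audiences, which matches the many-to-many relationship described below.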
- the audience server 106 has stored thereon data for a plurality of different customer audiences and an indication of which specific customers are included in each.
- Customers may be entities that, via a client device 110, interact with the customer facing UI 200 generated by the storefront server 102.
- the client device(s) 110 may alternatively be referred to as customer device(s) 110.
- a customer interacting with the customer facing UI 200 via a client device 110 may have data associated with their interaction. For example, a customer interacting with the customer facing UI 200 via a client device 110 may do so while logged in to a customer account having associated customer specific account data stored on database 104.
- the customer specific account data may include, but is not limited to: customer name, shipping address, type of pet owned, similar geographical area or location, and/or similar type of customer device(s) used to interact with the customer facing UI 200.
- a customer interacting with the customer facing UI 200 via a client device 110 may do so while not logged in to any customer specific account.
- the system 100 may be configured to use unique identifying data specific to the client device 110 (e.g., internet protocol (IP) address) to associate customer interactions with the customer specific account data stored on the database 104.
- a “customer” as referenced herein may refer to an entity that interacts with the customer facing UI 200, via a client device 110, and that is identifiable by the storefront server 102 and/or audience server 106.
- Customer interactions with the customer facing UI 200 may include inputs at a client device 110 that cause the client device 110 to interact with interactable UI elements 202 displayed on the customer facing UI 200.
- the audience server 106 may be configured to determine the customer interacting with the customer facing UI 200 and, based on the interactions, associate the customer with one or more customer audiences. For sake of brevity, it will be assumed herein that customers interacting with the customer facing UI 200 are doing so while logged in to a customer specific account having associated data stored on database 104.
- a customer may access the customer facing UI 200 via a client device 110 and input their login credentials to associate their interactions on the customer facing UI 200 with their customer specific account data stored on database 104.
- the storefront server 102 and/or audience server 106 may be configured to automatically associate any interactions with the customer facing UI 200 via that specific client device 110 with the customer specific account data for said customer (e.g., Bob).
- the audience server 106 may be configured to monitor customer interactions with different interactable UI elements 202 and automatically associate the customer with one or more customer audiences. For example, the audience server 106 may be configured to, in response to a customer interacting with the UI element 202 corresponding to “cat deals”, automatically associate the customer with a cat owner customer audience.
- the audience server 106 may be configured to associate a plurality of different customers with a plurality of different customer audiences. In some embodiments, a single customer may be associated with a plurality of customer audiences. For example, and referring to table 1 below, there is shown a listing of customers and the audiences to which they are associated. Although only four audiences are shown, it should be understood that a customer may be associated with more or fewer than four audiences.
- a single customer may be associated with up to about one-hundred different customer audiences.
- the audience server 106 is configured to generate and maintain a record of customers included in each determined customer audience.
- another server (e.g., the ME server 108) is configured to transmit a request for customer audience data to the audience server 106, and the audience server 106 may be configured to transmit a listing of customers included in that audience back to the server that transmitted the request.
- the audience server 106 may be configured to automatically monitor customer interactions and associate customers with different customer audiences independent of one or more other servers included in the system 100 (e.g., the ME server 108).
- Table 1: Examples of customers and their associated audiences
- the customer facing UI may include any number of different UI elements 202 arranged in any number of ways.
- the customer facing UI 200 shown in Fig. 2 will be referenced herein so as not to obscure pertinent aspects of the present disclosure.
- the ME server 108 is configured to alter or change the interactable UI elements 202 included in the customer facing UI 200 in order to conduct a marketing experiment.
- the ME server 108 is configured to alter or change interactable UI elements 202 based on specified zones 204 or areas of the customer facing UI 200.
- a zone 204 may include an area or section of the customer facing UI 200 where one or more interactable UI elements 202 are located.
- the customer facing UI 200 is illustrated as including three zones 204.
- the uppermost zone 204 includes an interactable UI element 202 configured to act as a banner that automatically cycles through different interactable elements (e.g., a carousel banner) at a predetermined interval.
- the zones 204 located below the topmost zone 204 in Fig. 2 include a plurality of different interactable UI elements 202 displayed simultaneously. It should be understood, though, that each zone 204 illustrated in Fig. 2 is an example, and that zones 204 may encompass different portions or sections of the customer facing UI 200.
- the ME server 108 is configured to enable a user (e.g., an administrator) to create, edit, delete, and/or view data associated with one or more marketing experimentations.
- the ME server 108 may be configured to generate an experimentation UI (discussed in more detail below) and transmit the experimentation UI to an admin device 112, or a client device 110.
- the admin device 112 may be in communication with the ME server 108 such that the experimentation UI may be rendered thereon.
- the admin device 112 may be any suitable computing device such as, but not limited to, a laptop, desktop computer, smart phone, or tablet.
- the ME server 108 may be configured to automatically perform a marketing experiment on customers included in one or more customer audiences having associated customer audience data that is maintained (e.g., edited, created, deleted) by the audience server 106. For example, in some instances it may be desirable to execute an A/B type test at the customer facing UI 200 directed toward a “dog owner” customer audience. As such, the ME server 108 may be configured to receive customer specific account data associated with the “dog owner” customer audience from the audience server 106 and automatically perform the A/B type test at client devices 110 associated with the customer specific account data (e.g., where the customer account is logged in). In this manner, the system 100 of the present disclosure may enable for marketing experiments to be performed within one or more desired customer audiences more accurately than in conventional systems and methods.
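One common way to split a target audience between variants (deterministically, so the same customer always sees the same variant without storing per-customer assignments) is to hash the customer and experiment identifiers. This is a generic sketch under assumed names, not the patented mechanism.

```python
import hashlib

def assign_variant(customer_id, experiment_id, allocations):
    """Deterministically bucket a customer into a variant according
    to allocation amounts (shares summing to 1.0)."""
    digest = hashlib.sha256(
        f"{experiment_id}:{customer_id}".encode()).hexdigest()
    point = int(digest[:8], 16) / 0xFFFFFFFF  # uniform in [0, 1]
    cumulative = 0.0
    for variant, share in allocations.items():
        cumulative += share
        if point <= cumulative:
            return variant
    return variant  # guard against floating-point rounding at the top

# First and second allocation amounts for two experimentation contents.
allocations = {"content_a": 0.5, "content_b": 0.5}
variant = assign_variant("customer-42", "exp-12345", allocations)
```

Hashing the pair of identifiers (rather than the customer alone) also decorrelates assignments across concurrent experiments on different zones.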
- FIGs. 3A-3P there are shown example experimentation user interfaces 300 generated by the ME server 108 and displayed on an admin device 112.
- the example user interfaces 300 shown in Figs. 3A-3P illustrate the creation of a new marketing experiment.
- the experimentation UIs 300 illustrated in Figs. 3A-3P include various input fields related to customer audiences and marketing experiments. However, in Figs. 3A-3P, the terms “experiment” and “customer audience” are not illustrated. Instead, in Figs. 3A-3P, the terms “experience” and “segments” are illustrated.
- the term “experience” as appearing in the embodiment of Figs. 3A-3P refers to marketing experiments and the term “segment” refers to customer audiences.
- Figs. 3A-3P will be described with reference to marketing experiments and customer audiences.
- Figs. 3A-3P will be described with reference to inputs at the experimentation UI 300 transmitted from the admin device 112 to the ME server 108 (e.g., clicks at different interactable elements, data inputs).
- the ME server 108 is configured to detect the inputs and cause the rendering of the experimentation UI 300 at the admin device 112 to be updated accordingly.
- a selection at a graphical element displayed at an exemplary UI may refer to an input received at the admin device 112 corresponding to that graphical element.
- a selection by a user may be a user input at a peripheral input device (e.g., a mouse, a keyboard, touch screen, voice command received via microphone) resulting in an interaction with elements of the experimentation UI 300.
- Marketing experiments may refer to a procedure in which one or more interactable UI elements 202 included at the customer facing UI 200 are replaced with one or more different interactable UI elements and interaction with the different interactable UI elements is monitored.
- one or more of the interactable UI elements 202 may be replaced with a different interactable UI element (e.g., a variation).
- the variations may be different for different subsets of customer devices 110.
- a first subset of customer devices 110 displaying a rendering of the customer facing UI 200 may include a first variation of a specific interactable UI element 202 whereas a second subset of customer devices 110 may include a second variation of the specific interactable UI element 202.
- the ME server 108 may be configured to cause the storefront server 102 to render different variations at the customer facing UI 200 for one or more different customer devices 110.
- the ME server 108 is configured to automatically monitor and/or record interactions with the variations. In this manner, the ME server 108 may be configured to automatically analyze the outcome of marketing experiments performed on the customer facing UI 200.
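The monitoring and analysis described above can be sketched as per-variant tallies of impressions and interactions, from which a simple outcome metric is computed. The counter structure and metric name are illustrative assumptions; the disclosure does not specify how the ME server stores or aggregates this data.

```python
from collections import defaultdict

# Per-variant tallies of renders and interactions (assumed storage).
impressions = defaultdict(int)
clicks = defaultdict(int)

def record_impression(variant):
    """Count one rendering of a variant at a customer facing UI."""
    impressions[variant] += 1

def record_click(variant):
    """Count one customer interaction with a rendered variant."""
    clicks[variant] += 1

def interaction_rate(variant):
    """A simple outcome metric the ME server might report per variant."""
    shown = impressions[variant]
    return clicks[variant] / shown if shown else 0.0

# Illustrative experiment traffic: 200 renders, 30 interactions.
for _ in range(200):
    record_impression("variation_1")
for _ in range(30):
    record_click("variation_1")
```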
- the ME server 108 may be configured to detect one or more user inputs at the admin device 112 indicating that the experimentation UI 300 is requested and cause the admin device 112 to render the experimentation UI 300 thereon.
- the experimentation UI 300 in Fig. 3A illustrates a visual representation of a table 302 of one or more marketing experiments 304 grouped by date (e.g., columns in table 302) and zone (e.g., rows in table 302).
- the table 302 may generally be a visual representation of a schedule including indications of different marketing experiments 304.
- the zones indicated in the left most column may correspond to the zones 204 of the customer facing UI 200 and shown and described above with reference to Fig. 2.
- each row in table 302 may correspond to a different zone 204 of the customer facing UI 200.
- the columns may represent dates for which a marketing experiment 304 is scheduled to be active.
- the ME server 108 is configured to prevent more than one marketing experiment 304 from being run on the same zone at the same time.
- the ME server 108 may be configured to ensure that only one marketing experiment is scheduled to be performed at zone ID 12345 on any given day.
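The zone-blocking rule above amounts to a date-range overlap check. The following sketch is illustrative only; the schedule layout, field names, and function name are assumptions rather than details from the disclosure.

```python
from datetime import date

def conflicts(existing, zone_id, start, end):
    """Return True if any experiment already scheduled on zone_id
    overlaps the proposed [start, end] date range (inclusive)."""
    for exp in existing:
        if exp["zone_id"] != zone_id:
            continue
        # Two date ranges overlap unless one ends before the other begins.
        if not (end < exp["start"] or start > exp["end"]):
            return True
    return False

# Assumed schedule entry mirroring the "live" experiment shown in Fig. 3A.
schedule = [{"zone_id": 12345,
             "start": date(2022, 11, 22), "end": date(2022, 11, 25)}]

# A follow-up experiment starting Nov 26 may be scheduled on the same zone;
# one overlapping the live dates would be rejected.
print(conflicts(schedule, 12345, date(2022, 11, 26), date(2022, 11, 27)))  # False
print(conflicts(schedule, 12345, date(2022, 11, 24), date(2022, 11, 28)))  # True
```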
- data for each of the marketing experiments 304 is stored on the ME server 108.
- data for each of the marketing experiments 304 is stored on the database 104.
- the table 302 may include an indication as to the status of one or more of the marketing experiments 304. For example, in the third row there is shown an indication of two marketing experiments 304 where the first marketing experiment 304 scheduled to run from November 22, 2022 to November 25, 2022 is “live” and the second marketing experiment 304 scheduled to begin on November 26 is listed as “scheduled”. As such, the table 302 may provide a visual indication that the first marketing experiment 304 is currently running whereas the second marketing experiment has yet to begin.
- the ME server 108 is configured to receive one or more user inputs indicating a request to create a new marketing experiment.
- the experimentation UI 300 may include an interactable button 306 that when clicked may cause the ME server 108 to render at the admin device 112 an experimentation UI 300 similar to what is shown in Fig. 3B.
- the ME server 108 may be configured to cause the admin device 112 to render an experimentation UI 300 for inputting information for a new marketing experiment.
- the experimentation UI 300 may include an experiment name input field 308, zone input field 310 and one or more date input fields 312.
- the experiment name input field 308 may enable the ME server 108 to receive a name for the experiment (e.g., “experiment 1”) and the zone input field 310 may enable the ME server 108 to receive selection of the zone 204 of the customer facing UI 200 where an experiment is to take place.
- the zone input field 310 is a drop-down list including a listing of different zones 204 included in the customer facing UI 200.
- the date input fields 312 may enable the ME server 108 to receive an input of a desired start date and end date for the marketing experiment.
- the ME server 108 may be configured to detect, at the experimentation UI 300, a selection of one of the user-interactable UI elements for the customer facing UI 200. For example, the ME server 108 may detect a selection at the input field 310 of a zone 204, thereby including one or more user-interactable UI elements 202 where a marketing experimentation variation may be used. In Fig. 3C, the ME server 108 has detected inputs at fields 308, 310, and 312 and updated the rendering of the experimentation UI 300 accordingly.
- Further to this example, the input at the name input field 308 is “Dog Cat Multi Pet Experience 2022-11-21” and the input at the zone input field 310 indicates the zone 204 is “GoodyBox_Hero_Promo_Food_Supplement”.
- the name of the new marketing experiment and the zone 204 of the customer facing UI 200 has been specified.
- the inputs at the date input fields 312 indicate that the marketing experiment will start on November 21, 2022 at 12:00 AM and end on November 27, 2022 at 11:59 PM.
- the ME server 108 may be configured to receive selections for the name, zone placement and/or start and end dates of a marketing experiment.
- the ME server 108 is configured to receive a selection of one or more customer audiences to be included in the marketing experiment.
- the experimentation UI may include a menu rendering of the plurality of customer audiences received from the audience server 106.
- the experimentation UI 300 may include a customer audience selection menu 314 configured to receive selection of a customer audience from a plurality of different customer audiences.
- the ME server 108 may be configured to detect, at the experimentation UI 300, a selection of a target customer audience from the menu 314.
- in response to detecting an input at the selection menu 314, the ME server 108 is configured to transmit a request to the audience server 106 for a listing of all determined customer audiences and cause the experimentation UI 300 to display the listing.
- the ME server 108 may be configured to communicate with the audience server 106 to receive a listing of all different customer audiences (e.g., cat owners, dog owners, multi-pet owners) determined by the audience server 106.
- the audience server 106 may be configured to transmit data corresponding to a listing of different customer audiences to the ME server 108.
- the ME server 108 may be configured to display the listing of customer audiences at the experimentation UI 300 at the customer audience selection menu 314.
- the ME server 108 is configured to enable a user to allocate one or more types or forms of marketing experiment to the customer audience selected at the customer audience selection menu 314.
- the experimentation UI 300 may include an allocation selection button 316 configured to receive an allocation of a marketing experiment to the customer audience selected at the audience selection menu 314.
- as illustrated in Fig. 3C, the ME server 108 has detected an input at button 316.
- the ME server 108 may be configured to cause the experimentation UI 300 to display an experimentation selection menu 318.
- the selection menu 318 may include one or more interactable elements to receive selection of one or more different types of marketing experiments.
- the different types of marketing experiments may include, but are not limited to: 1) simple targeting/share of voice, 2) A/B test, and 3) a multi-armed bandit (MAB) test.
- the ME server 108 may be configured to detect a selection of the type of marketing experiment to be performed and automatically associate the selection with the customer audience selected at the selection menu 314. In some embodiments, the ME server 108 may be configured to provide a default customer audience selection in response to not detecting a selection at the customer audience selection menu 314. For example, in Fig. 3D, the selected customer audience is set to a default state indicating that no selection was made. In some embodiments, the ME server 108 is configured to set the default customer audience to be all customer audiences not specifically selected. For example, the ME server 108 may be configured to receive selection of a plurality of different customer audiences to include in a marketing experiment (as discussed in more detail below). In some embodiments, the ME server 108 is configured to set the default when no selection is received to be all non-specified customer audiences. In other embodiments, the ME server 108 may not be configured to provide a default customer audience selection.
- the experimentation selection menu 318 may include a variation selection field 320 configured to receive selection of one or more variations for interactable UI elements 202 to display during the marketing experiment.
- the ME server 108 may be configured to detect, at the experimentation UI 300, a selection of experimentation content to be rendered in association with the user-interactable UI element.
- the ME server 108 may be configured to detect an input at the variation selection field 320 and, in response, display a listing of available variations at the experimentation UI 300 and receive a selection from the displayed listing (e.g., as shown in Fig. 3E).
- the ME server 108 and/or database 104 stores data relating to interactable UI element variations.
- the interactable UI element variations are preloaded into the database 104 and/or ME server 108.
- the database 104 and/or ME server 108 may store data relating to a plurality of different interactable UI element variations that may be displayed at the customer facing UI 200 during a marketing experiment executed by the ME server 108.
- the experimentation selection menu 318 is selected and, in response, the ME server 108 has caused the experimentation UI 300 to render a listing of available interactable UI element variations for selection.
- an interactable UI element variation is a UI element that has at least some data that is different than a UI element 202 being currently displayed on the customer facing UI 200. For example, if a UI element 202 displayed at the customer facing UI 200 has data corresponding to a specific dog food product (e.g., dog food product A), then an interactable UI element variation may include data corresponding to a different dog food product (e.g., dog food product B).
- a difference between the UI element 202 and a selected variation thereof may be a difference in how the UI element is rendered on the customer facing UI 200.
- an existing UI element 202 may have a first appearance whereas a variation thereof may have a second appearance that differs in at least one of, size, shape, color, and/or indicia.
- Each of the different variations may have data that is different from each other and/or from an existing UI element 202 rendered on the customer facing UI 200.
- an existing UI element 202 may be for a dog chew toy and the variation labeled “Dog Content 1 2022-11-21” may include data relating to a flea and tick medication for a dog.
- an interactable UI element variation is selected at the variation selection field 320.
- the ME server 108 may be configured to determine if sufficient information and/or selections have been provided for an allocated experiment based on the type of experiment selected in the experimentation selection menu 318. For example, in Fig. 3F, the simple targeting/share of voice type of experiment is selected, which may require only one variation be selected in order for the ME server 108 to execute the simple targeting/share of voice experiment. As such, the ME server 108 may be configured to determine that a single variation (e.g., “Not Dog Content 11 2022-11-21”) has been selected at the variation selection field 320 and therefore that sufficient, or required, information to perform the experiment has been provided.
- the type of experiment to be performed (e.g., a simple targeting/share of voice test) has been selected and sufficient information to perform the selected type of experiment on the default customer audience has been received at the ME server 108.
- the ME server 108 may be configured to receive a request to save the information and selections for the marketing experiment input so far.
- the ME server 108 may be configured to detect an input at the “save” button displayed below the variation selection field 320 and store data related to the experiment allocated in Figs. 3B-3F in the database 104 and/or locally on the ME server 108.
- the ME server 108 has received a request to save the information and selections for the type of allocated experiment.
- the ME server 108, in response to detecting an indication that the allocated experiment is to be saved, may cause the experimentation UI 300 to be updated similar to what is shown in Fig. 3G.
- the experimentation UI 300 may include an allocated experiment drop-down menu 322 that includes a visual indication of the experiment allocated by the user (e.g., as illustrated in Figs. 3D-3F).
- the ME server 108 is configured to detect an input at the allocated experiment drop-down menu 322 and cause the experimentation UI 300 to display details such as, but not limited to, experimentation type and selected variations for that allocated experiment.
- the ME server 108 is configured to split customer account specific data included in the selected customer audience datasets based on a specified traffic percentage.
- the experimentation UI 300 may include a traffic percentage field 324.
- the traffic percentage field 324 may correspond to the allocated experiment displayed at the menu 322.
- the experiment allocated in Figs. 3D-3F has a traffic percentage value of 100% indicating that there is no split in traffic.
- Traffic refers to customers and/or client devices 110 that access the customer facing UI 200.
- As such, the ME server 108 is configured to cause 100% of the customer devices 110 accessing the customer facing UI 200 that are included in the audience selected at field 314 (e.g., the default customer audience) to display the interactable UI element variation (e.g., “Not Dog Content 11 2022-11-21”) selected in Fig. 3F at the zone 204 corresponding to the zone selected in field 310 during the period of time input at field 312.
- the ME server 108 is configured to perform a marketing experiment on more than one customer audience.
- the ME server 108 may be configured to detect an input at UI element 326 indicating that a second customer audience is to be added to the marketing experiment.
- the ME server 108 may be configured to, in response to detecting the input at UI element 326, cause the experimentation UI 300 to include a customer audience selection menu 314 associated with the second segment (e.g., second customer audience) for the marketing experiment.
- the audience selection menu 314 displayed in Fig. 3H may be generally the same as discussed above with reference to Figs. 3C-3D.
- the ME server 108 may be configured to detect a selection of a customer audience at the selection menu 314 and cause the experimentation UI 300 to display an indication of the selection as shown in Fig. 3I.
- the user has selected the “Dog Owner” customer audience and the experimentation UI 300 includes the experimentation selection menu 318.
- the ME server 108 receives selection of the A/B test type of marketing experiment and causes the experimentation selection menu 318 to display a plurality of fields associated with the A/B test type.
- An A/B test as referenced herein may include a type of marketing experiment in which different variations of an interactable UI element are displayed to different subsets of customers.
- control variation A may be displayed to 10% of customers interacting with the customer facing UI 200
- non-control variation B may be displayed to 60% of customers
- non-control variation C may be displayed to 30% of customers.
- control variation A may be a banner that is currently being rendered at the customer facing UI 200
- non-control variations B-C may each be different versions of that banner (e.g., including different text, indicia, images, hyperlinks).
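The percentage-based split described above could be realized by mapping each customer's traffic bucket value onto cumulative percentage ranges. A minimal sketch, with assumed names and the 10%/60%/30% example split:

```python
def assign_variation(bucket, split):
    """Map a traffic bucket value in [0, 100) onto a variation according
    to each variation's share of traffic (percentages summing to 100)."""
    cumulative = 0
    for name, pct in split:
        cumulative += pct
        if bucket < cumulative:
            return name
    raise ValueError("traffic percentages must sum to 100")

# Control variation A: 10%, non-control B: 60%, non-control C: 30%.
split = [("A", 10), ("B", 60), ("C", 30)]
print(assign_variation(5, split))   # A (buckets 0-9)
print(assign_variation(50, split))  # B (buckets 10-69)
print(assign_variation(95, split))  # C (buckets 70-99)
```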
- the ME server 108 may be configured to cause the experimentation selection menu 318 to include a number of interactable fields associated with the A/B test type.
- the fields associated with the A/B test type include, but are not limited to, a winner rollout selection field (e.g., immediate, scheduled, manual), a control selection field, and a variation selection field.
- the winner rollout selection field may correspond to options for updating the customer facing UI 200 in response to the results of the A/B test.
- a variation performing better than other variations may refer to the number of interactions said variation receives during the marketing experiment being greater than the number of interactions the remaining variations receive.
- the ME server 108 may be configured to determine that the variation that has the highest number of interactions to be the winning variation.
- the ME server 108 may be configured to monitor interactions with the different variations over the course of the marketing experiment and automatically determine the winning variation. For example, in an instance where the marketing experiment is an MAB test with a click through rate key metric, the ME server 108 may be configured to detect mouse click inputs or touch screen tap inputs at a rendered UI element variation 202 included in the customer facing UI 200 rendered on a client device 110. The ME server 108 may be configured to automatically generate and store a record of those interactions. For example, the ME server 108 is configured to detect the inputs at the UI element variations defined by the marketing experiment and generate a record indicating a number of times and/or frequency with which each UI element variation was interacted with.
- the ME server 108 may be configured to store the generated record in a local non-transitory computer readable storage medium and/or database 104.
- the record of interactions may be referred to as click stream data and the ME server 108 may be configured to transmit the click stream data to an external server or database (e.g., database 104) for storage.
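As a sketch of the record generation described above, per-variation interaction counts might be accumulated from a click stream as follows (the stream contents and data layout are illustrative assumptions):

```python
from collections import Counter

# Each entry names the variation that received a detected interaction
# (e.g., a mouse click or touch screen tap); the stream is illustrative.
click_stream = ["B", "A", "B", "C", "B", "C"]

# The generated record: number of times each variation was interacted with.
record = Counter(click_stream)
print(dict(record))  # {'B': 3, 'A': 1, 'C': 2}
```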
- the ME server 108 and/or a server in communication with the ME server 108 may be configured to execute a machine learning (ML) algorithm on the click stream data and automatically adjust the traffic allocation to the different UI element variations based on the click stream data.
- the ME server 108 may be configured to, based on the generated record, automatically perform an analysis of the marketing experiment.
- the record may include, for each variation in an experiment, a number of times the variation was interacted with (e.g., variation A was interacted with 10,000 times, variation B was interacted with 3,000 times, and variation C was interacted with 30,000 times).
- the ME server 108 may be configured to cause the customer facing UI 200 to display the winning variation at the zone 204 corresponding to what is selected in the zone input field following the completion of the marketing experiment (e.g., immediately, at a scheduled date and/or time) to customers in the customer audience specified at input field 314.
- a winning variation as referenced herein may refer to a variation that during a marketing experiment most closely achieves a desired customer response.
- a desired customer response may be related to a number of instances of interactions with a variation (e.g., the variation with the highest number of interactions being the winning variation). Referring to the example in the preceding paragraph, variation C would be the winning variation.
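Under that definition, determining the winning variation reduces to taking the variation with the maximum interaction count. A minimal sketch using the example counts above (the dictionary layout is assumed):

```python
# Per-variation interaction counts from the example record above.
interactions = {"A": 10_000, "B": 3_000, "C": 30_000}

# The winning variation is the one with the most recorded interactions.
winner = max(interactions, key=interactions.get)
print(winner)  # C
```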
- an A/B test executed on the customer facing UI 200 by the ME server 108 may include a control A and two variations B and C displayed at a specific zone 204 on the customer facing UI 200 for customers included in the Dog Owner customer audience, where some segments of the customer audience receive the control display and other segments of the customer audience receive one or more variations of the display.
- variation B is the winning variation for the Dog Owner customer audience.
- the ME server 108 may be configured to automatically transmit an indication of the winning variation and corresponding zone placement to the storefront server 102.
- the storefront server 102 may be configured to cause the customer facing UI 200 to display the winning variation at the specified zone for customers included in the Dog Owner customer audience.
- the system 100 of the present disclosure may be configured to automatically update the customer facing UI 200 based on an outcome of a marketing experiment.
- the ME server 108 may be configured to cause the customer facing UI 200 to render the winning variation for customers displaying the customer facing UI 200 on a client device regardless of whether those customers were involved in the corresponding marketing experiment or not.
- the ME server 108 has detected a selection of an MAB marketing test type.
- An MAB test, as referenced herein, is a type of marketing experiment similar to an A/B test except that for an MAB test the ME server 108 is configured to dynamically allocate customer traffic to variations that have a higher number of interactions than other variations.
- the ME server 108 may be configured to, while executing an MAB test, automatically adjust the traffic allocation to variations that better achieve a desired customer response than those that do not.
- the ME server 108 may be configured to automatically adjust the traffic allocation such that variation A receives 60% of the traffic and variation B receives 40% of the traffic.
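One simple way to realize such an adjustment is to reallocate traffic in proportion to recorded interactions; the disclosure does not specify which MAB policy the ME server 108 uses, so the following is only an illustrative sketch:

```python
def reallocate(interactions):
    """Reallocate traffic percentages in proportion to recorded
    interactions. This is one of many possible MAB policies; the
    disclosure does not specify the algorithm used."""
    total = sum(interactions.values())
    return {name: round(100 * count / total)
            for name, count in interactions.items()}

# Variation A has drawn more interactions, so it receives more traffic.
print(reallocate({"A": 600, "B": 400}))  # {'A': 60, 'B': 40}
```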
- the ME server 108 may be configured to cause the experimentation selection menu 318 to display a plurality of fields associated with the selected type of marketing experiment.
- the selected type of marketing experiment is an MAB test.
- the ME server 108 may be configured to cause the experimentation UI 300 to display a plurality of interactable fields at the selection menu 318 associated with the MAB test type.
- the interactable fields displayed at the selection menu 318 include, but are not limited to, optimization selection fields (e.g., click through rate, revenue/1000 impressions), winner rollout selection fields (e.g., immediate, scheduled, manual), and at least two variation selection fields 320 (e.g., variation 1 and variation 2).
- the ME server 108 receives selections of interactable UI element variations at the selection fields 320 in generally the same manner as described above with reference to Figs. 3E-3F.
- the ME server 108 is configured to perform an A/B test and/or MAB test that includes more than two variations of interactable UI elements.
- the experimentation UI 300 may include an “add variation” interactable element 328 that when interacted with causes the selection menu 318 to include another variation selection field 320.
- the ME server 108 may be configured to detect an input at element 328 and cause the selection menu 318 to render an additional variation selection field 320.
- the ME server 108 may be configured to include any number of variations of interactable UI elements in a marketing experiment.
- the ME server 108 is configured to limit the number of variations to be less than a predetermined number of variations selected to prevent any one variation included in a marketing experiment from receiving too little traffic to yield statistically significant results. In some embodiments, the maximum number of variations is less than or equal to about ten variations.
- the ME server 108 receives a plurality of inputs and selections generally associated with repeating the process of adding variation selection fields 320 and selecting variations such that there are five different variations included in the MAB test as outlined above.
- the selected variations include “Dog Content # 2022-11-21” where # in this instance is between 1-5.
- Each of the variations selected in Fig. 3M may be different from one another in at least one way.
- Dog Content 1 2022-11-21 may correspond to a weight loss dog food product and Dog Content 2 2022-11-21 may correspond to a dog food product for improving digestion, and so on for each of the different variations selected.
- the user may save the allocated MAB marketing experiment type with the five variations.
- the ME server 108 may, in response to detecting an indication that information and selections for the allocated MAB test have been provided (e.g., detecting input at the “save” button in the selection menu 318 in Fig. 3M) cause the experimentation UI to display an indication of the allocated experiment.
- the experimentation UI 300 may include the allocated experiment dropdown menu 322 and traffic percentage input field 324 as described above.
- the ME server 108 is configured to perform one or more different experiments on a customer audience (e.g., customer audience input at audience selection menu 314).
- the ME server 108 may be configured to detect an input at the “add allocation” button 330 displayed at the experimentation UI 300. In response to detecting an input at the button 330, the ME server 108 may cause the experimentation UI 300 to display a new selection menu 318 in combination with the allocated experiment drop-down menu 322 (as shown in Fig. 3O) corresponding to the previously allocated experiment.
- the ME server 108 receives an allocation of a second experiment to the marketing experiment for the customer audience input at customer audience selection field 314 (e.g., Dog Owners).
- the ME server 108 may be configured to automatically adjust the percentage values in the traffic percentage input fields 324 according to the number of allocated experiments. For example, in Fig. 3O, the traffic percentage for the allocated MAB test created in Figs. 3J-3N changed from 100% to 50%, corresponding to there being a total of two allocated experiments. As such, in some embodiments, the ME server 108 may be configured to require that the sum of the percentages in the traffic percentage input fields 324 equals one hundred.
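The automatic adjustment described above could be sketched as an even split of 100% across the allocations, with any remainder assigned to one allocation so the total stays at one hundred (an assumed rule, consistent with the 33%/33%/34% distribution discussed elsewhere in this disclosure):

```python
def default_split(n_allocations):
    """Evenly split 100% of traffic across n allocations, assigning any
    remainder to the last allocation so the total is always 100."""
    base = 100 // n_allocations
    split = [base] * n_allocations
    split[-1] += 100 - base * n_allocations
    return split

print(default_split(2))  # [50, 50]
print(default_split(3))  # [33, 33, 34]
```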
- the ME server 108 may be configured to detect inputs at the selection menu 318 in generally the same manner as described above. For example, a user may repeat the process of allocating an experiment by interacting with the selection menu 318 shown in Fig. 3O in generally the same manner as discussed above such that a second experiment is allocated to the Dog Owner customer audience in accordance with the marketing experiment being created.
- the ME server 108 is configured to limit the number of allocations to be less than or equal to a predetermined maximum number of allocations.
- the maximum number of allocations, similar to the maximum number of variations, is a number of allocations selected to ensure that for a given experiment no variation receives too little traffic to yield statistically significant results.
- the maximum number of allocations and/or variations may be a function of expected or recorded levels of customer traffic at the customer facing UI 200. For example, as expected or historical customer traffic increases, the maximum number of allocations and/or variations may also increase. In some embodiments, the maximum number of allocations is about ten allocations.
- the ME server 108 receives the addition of a third customer audience (e.g., segment) to the marketing experiment, where the third customer audience is Multi Pet Owners, and allocates three different marketing tests to those customer audiences, in generally the same manner as described above with reference to Figs. 3B-3O.
- the three allocated experiments include two different A/B tests and an MAB test.
- the percentage of traffic allocation is set to be 33% for each of the A/B tests and 34% for the MAB test.
- the ME server 108 may be configured to detect a value input at one or more of the traffic percentage fields 324 and update the traffic percentage for the corresponding allocated test accordingly.
- the ME server 108 may be configured to detect an input at the field 324 corresponding to the MAB test including a value of sixty and update the traffic percentage displayed therein to be 60%.
- the ME server 108 may be configured to repeat this process to alter the traffic percentage for one of the A/B tests as well.
- the ME server 108 may be configured to perform multiple different experiments on a customer audience in accordance with any distribution of traffic.
- the marketing experiment data for the marketing experiment generated in Figs. 3A-3P may include an indication that the name of the experiment is “Dog Cat Multi Pet Experience 2022-11-21” and that the zone of the customer facing UI where the experiment is to be performed is the zone “GoodyBox_Hero_Promo_Food_Supplement”.
- the marketing experimentation data may indicate that the start date is November 21, 2022 at 12:00 AM and the end date is November 27, 2022 at 11:59 PM.
- the marketing experimentation data may include an indication that there is an MAB test with five variations at a 50% traffic allocation and a simple targeting/share of voice test with one variation at a 50% traffic allocation.
- the marketing experimentation data may include an indication that there is a first A/B test with two variations at a 33% traffic allocation, an MAB test with 4 variations at a 34% traffic allocation, and a second A/B test with two variations at a 33% traffic allocation.
- Referring to FIG. 4, there is shown a flowchart illustrating a method 400 of automatically performing a marketing experiment via the system 100 in accordance with an exemplary embodiment of the present disclosure.
- the method 400 is described with reference to a single customer (e.g., “Bob”) accessing the customer facing UI 200 via a client device 110 while at least one marketing experiment is actively running.
- the methods discussed herein may be applied to a plurality of customers accessing the customer facing UI 200 either simultaneously or at different times.
- a customer may submit a request to access the customer facing UI 200 via client device 110 and log in to their customer specific account.
- the storefront server 102 may be configured to detect the request and the log in credentials and automatically associate the client device 110 with the customer specific account data stored on database 104.
- the storefront server 102 may be configured to transmit an indication to the ME server 108 that a client device 110 is requesting access to the customer facing UI 200.
- the indication transmitted from the storefront server 102 to the ME server 108 includes a unique visitor and/or customer identification number (e.g., a customer ID).
- the storefront server 102 may transmit the unique customer ID for Bob to the ME server 108.
- the method 400 may include the step 402 of, at the ME server 108, determining customer specific data.
- the ME server 108 is configured to perform step 402 in response to a client device 110 displaying the customer facing UI 200 or requesting to display the customer facing UI 200.
- the determined customer specific data includes, but is not limited to, the customer name, a unique customer identifier (e.g., customer ID), and one or more audiences in which the customer is included.
- the customer specific data includes the customer’s name “Bob”, customer ID “1234”, and a listing of the audiences in which the customer is included, which, in this example, are cat_owner, autoship_customer, and high_roller.
- the customer name, customer ID, and listing of audiences are stored in database 104 and the ME server 108 is configured to search or query the database 104 for that data.
- the storefront server 102 and/or audience server 106 are configured to transmit the customer data to the ME server 108.
- the step 402 includes determining a unique traffic bucket value for the customer.
- the ME server 108 may be configured to determine the unique traffic bucket value based on a determination of an experiment being run on the customer facing UI 200 that the client device 110 is currently displaying. For example, the ME server 108 may be configured to determine that there is a zone 204 displayed at the customer facing UI 200 that has a corresponding marketing experiment being currently run (e.g., see Fig. 3A above). The ME server 108, in response to determining that there is a marketing experiment being run at a zone 204 included in the customer facing UI 200 displayed on the client device 110, may determine a unique ID for the experiment.
- each experiment having associated experiment data stored in the database 104 and/or ME server 108 may include a unique identifier (e.g., a unique numeric value).
- the ME server 108 may be configured to generate the traffic bucket value.
- the ME server 108 is configured to generate the traffic bucket value by combining the customer ID with the experiment ID and performing a hashing function on the combination. For example, if the customer ID is 1234 and the experiment ID is 5476, the ME server 108 may be configured to add the two values together and perform a hashing function.
- the hashing function is a modulo operation, as in the example of Fig. 4.
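The bucketing described above can be sketched as a small function. This is a minimal, hypothetical sketch that assumes the "combination" is simple addition and that the modulus is 10,000; the disclosure does not fully specify the combination (the bucket value of 3921 shown in Fig. 4 for these same example IDs suggests a richer hash than plain addition).

```python
def traffic_bucket(customer_id: int, experiment_id: int, modulus: int = 10_000) -> int:
    """Deterministic traffic bucket value: combine the customer ID and the
    experiment ID, then reduce the combination with a modulo operation
    (the 'hashing function' described in the text). Addition is assumed here."""
    return (customer_id + experiment_id) % modulus

# Example IDs from the text: customer 1234, experiment 5476.
bucket = traffic_bucket(1234, 5476)  # always the same value for this ID pair
```

Because the inputs are fixed per customer and experiment, the bucket value is stable across visits, which is what later makes the traffic allocation deterministic.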
- the method 400 may include the step 404 of, at the ME server 108, assigning the customer to a customer audience included in the experiment.
- the marketing experiment determined by the ME server 108 includes three different customer audiences: dog_owner, cat_owner, and high_roller.
- the ME server 108 may be configured to compare the customer account data with the customer audiences indicated in the marketing experiment data to determine if there are any matches.
- the customer “Bob” is included in the cat_owner and high_roller customer audiences, both of which are included in the marketing experiment.
- the ME server 108 may be configured to determine which of the matching customer audiences to assign the specific customer to.
- the customer audiences included in the marketing experiment may have associated priority data indicating a priority of one audience over the others. For example, in Fig. 4, the cat_owner audience has a higher priority (e.g., priority: 10) than the high_roller audience (e.g., priority: 1).
- the ME server 108 may be configured to assign a customer to a corresponding audience based on the associated priority value. For example, as illustrated in Fig. 4, the customer is automatically assigned to the cat_owner customer audience instead of the high_roller customer audience because the cat_owner customer audience has a higher priority value.
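The priority-based audience assignment can be sketched as follows. The cat_owner (10) and high_roller (1) priorities come from the Fig. 4 example; the dog_owner priority of 5 is a hypothetical value added only so that the experiment has all three audiences:

```python
# Priorities of the audiences included in the experiment (higher value wins).
# dog_owner's priority of 5 is hypothetical; the other two follow Fig. 4.
experiment_audiences = {"dog_owner": 5, "cat_owner": 10, "high_roller": 1}

# Audiences the customer ("Bob") belongs to, per the customer account data.
customer_audiences = ["cat_owner", "high_roller"]

# Keep only the audiences present in both, then pick the highest priority.
matches = [a for a in customer_audiences if a in experiment_audiences]
assigned = max(matches, key=lambda a: experiment_audiences[a])
print(assigned)  # cat_owner — its priority of 10 beats high_roller's 1
```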
- the ME server 108 is configured to detect input at the experimentation user interfaces 300 indicating priority values for one or more customer audiences included in a marketing experiment to be performed during creation of a new marketing experiment (e.g., as shown and described above with reference to Figs. 3A-3P) or the editing of a pending experiment.
- the method 400 may include the step 406 of, at the ME server 108, automatically assigning the customer to an allocated experiment included in the audience to which they are assigned.
- each audience included in the marketing experiment may include one or more allocated experiments, as discussed above with reference to Figs. 3O-3P.
- there are three allocated experiments for the cat_owner audience (e.g., a targeted placement experiment, an A/B test with three variations, and an MAB test with three variations).
- the ME server 108 may be configured to determine which of the allocated experiments the customer should be assigned to.
- the ME server 108 is configured to determine the assignment based on the traffic bucket value determined at step 402.
- the ME server 108 is configured to assign a number of customers to each allocated experiment corresponding to the traffic percentage of each allocated experiment. For example, in Fig. 4 the traffic percentages for the allocated experiments are 20%, 40% and 40% from top to bottom and the ME server 108 may be configured to assign 20% of the customers accessing the customer facing UI 200 that are included in the cat_owner customer audience to the targeted placement experiment, 40% to the A/B test, and 40% to the MAB test.
- the ME server 108 is configured to achieve the desired traffic allocation based on the hashing function.
- each allocated experiment includes a value range corresponding to the traffic percentage (e.g., values between 0-1999 for the targeted placement, values between 2000-5999 for the A/B test, and values between 6000-9999 for the MAB test).
- the traffic bucket value for the customer “Bob” determined by the ME server 108 is equal to 3921.
- the ME server 108 may automatically determine which value range for the different allocated experiments the traffic bucket value falls within. In this example, the value 3921 falls within the range of values corresponding to the A/B test.
- the ME server 108 may be configured to perform the modulo operation in step 402 based on the value range of the allocated experiments (e.g., (customer ID + experiment ID) % 10000). In this manner, as the volume of traffic (e.g., the number of customers accessing the customer facing UI) increases, the likelihood of meeting the desired traffic allocation percentages may be increased.
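Mapping a bucket value onto the allocated experiments' value ranges can be sketched as a simple lookup. The ranges are taken from the Fig. 4 example; the experiment names are shorthand:

```python
# Value ranges from Fig. 4: 20% targeted placement, 40% A/B, 40% MAB.
ALLOCATIONS = [
    (range(0, 2_000), "targeted_placement"),
    (range(2_000, 6_000), "ab_test"),
    (range(6_000, 10_000), "mab_test"),
]

def allocate(bucket_value: int) -> str:
    """Return the allocated experiment whose value range contains the bucket."""
    for value_range, experiment in ALLOCATIONS:
        if bucket_value in value_range:
            return experiment
    raise ValueError(f"bucket value {bucket_value} outside the modulus range")

print(allocate(3921))  # ab_test — 3921 falls within 2000-5999, as in the text
```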
- two or more customers having a different customer ID may result in the same modulo operation output and be assigned to the same experiment allocation and/or variation (as discussed below).
- the modulus value in Fig. 4 is 10,000; however, more than 10,000 customers (e.g., 1,000,000 customers) may access the customer facing UI, each of which has a unique customer ID.
- because an allocation of 40% to the A/B test in Fig. 4 is associated with modulo operation outputs having a value of 2000-5999, the number of customers assigned to that allocation may be generally equal to 40% of 1,000,000 (i.e., 400,000 customers).
- the actual number of customers assigned to a given allocation may be about +/- 1% of the target allocation percentage.
- the actual allocation may be between about 39% and about 41% of the total number of customers accessing the customer facing UI 200 at the same or different times.
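The tendency toward the target percentage can be illustrated with a small, hypothetical simulation. Assuming perfectly sequential customer IDs (real IDs are less uniform, which is why the actual allocation varies by roughly one percentage point), the modulo arithmetic spreads customers evenly across the value ranges:

```python
EXPERIMENT_ID = 5476            # experiment ID from the earlier example
AB_RANGE = range(2_000, 6_000)  # the A/B test's 40% slice of the 10,000 values

# Count how many of 100,000 hypothetical sequential customer IDs land
# in the A/B test's value range.
hits = sum(1 for cid in range(100_000) if (cid + EXPERIMENT_ID) % 10_000 in AB_RANGE)
print(hits / 100_000)  # 0.4 — exactly 40% for perfectly uniform IDs
```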
- the method 400 may include the step 408 of, at the ME server 108, determining that the allocated experiment is an A/B test and assigning the customer to a variation included in the A/B test.
- the ME server 108 is configured to perform another modulo operation based on the customer ID and experiment ID and based on the number of variations included in the A/B test. For example, in Fig. 4, the A/B test includes three variations A, B, and C.
- the ME server 108 may be configured to perform a modulo operation based on the following: (customer ID + experiment ID) % 3.
- the ME server 108 may be configured to, based on the outcome of the modulo operation (e.g., a value of 0, 1, or 2) determine which of the variations to assign to the customer. In Fig. 4, the outcome of the modulo operation at step 408 is 1 thereby resulting in the customer being assigned to variation B.
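Variation assignment at step 408 can be sketched the same way. This sketch again assumes simple addition of the IDs; with the literal example IDs (1234 + 5476) the remainder would be 2, whereas Fig. 4 shows an outcome of 1, so the server's actual combination evidently differs. The point of the sketch is the deterministic mapping of remainders to variations:

```python
VARIATIONS = ["A", "B", "C"]  # the three variations in the Fig. 4 A/B test

def assign_variation(customer_id: int, experiment_id: int) -> str:
    """Map a customer deterministically onto one variation. Because the
    inputs never change, the same customer always receives the same
    variation, even if traffic percentages are altered mid-experiment."""
    return VARIATIONS[(customer_id + experiment_id) % len(VARIATIONS)]
```

A modulo outcome of 1 indexes variation B, matching the example outcome in Fig. 4.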
- the ME server 108 may be configured to ensure that the same customer is always assigned to the same variation for an A/B test. This may be particularly beneficial in ensuring the accuracy of the results of the A/B test.
- the ME server 108 may be configured to ensure that the same customer is assigned to the same variation regardless of whether the traffic allocation percentage is altered during the run of the experiment. This may enable the system 100 to be capable of performing accurate A/B tests while simultaneously allowing for the desired traffic percentages to be altered. For example, if the traffic percentage for the A/B test were altered during the time in which the marketing experimentation is being performed, the ME server 108 is configured to still assign the customer “Bob” to the B variation in the A/B test.
- the ME server 108 may be configured to omit the step 408 for MAB tests and targeted placement tests.
- the ME server 108 may be configured to, in an instance where the allocated experiment is an MAB test, automatically adjust the variations assigned to different customers in a dynamic manner during the course of the experiment. For example, when executing an MAB test, the ME server 108 may be configured to automatically adjust the allocation of customers to variations included in the MAB test based on the monitored performance of the different variations. For example, in an instance where there are three variations in an MAB test, the ME server 108 may be configured to determine that at a point in time following the start of the experiment, a first variation is performing better than the remaining two variations and allocate a larger percentage of the customers to that first variation.
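The dynamic reallocation described for MAB tests could follow any standard bandit policy; the disclosure does not specify which one. The epsilon-greedy sketch below is purely illustrative, with hypothetical per-variation click/impression counts:

```python
import random

# Hypothetical per-variation performance: [successes, trials].
stats = {"A": [12, 100], "B": [30, 100], "C": [9, 100]}

def choose_variation(epsilon: float = 0.1) -> str:
    """Epsilon-greedy: usually exploit the best-performing variation,
    occasionally explore another one at random."""
    if random.random() < epsilon:
        return random.choice(list(stats))
    return max(stats, key=lambda v: stats[v][0] / stats[v][1])

def record_outcome(variation: str, success: bool) -> None:
    """Update the running stats after observing a customer interaction."""
    stats[variation][0] += int(success)
    stats[variation][1] += 1
```

With epsilon set to 0 the policy always returns the current leader (variation B above); as outcomes are recorded, the leader, and hence the bulk of the traffic, can shift over the course of the experiment.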
- the system 100 may be configured to automatically perform, for each customer accessing the customer facing UI 200, a marketing experimentation based on one or more customer audiences and one or more allocated experiments.
- the ME server 108 may be configured to, for each customer accessing the customer facing UI 200, and for each marketing experiment at a different zone of the customer facing UI 200, assign each customer to a specific allocated experiment and, in some instances, a specific variation.
- the ME server 108 may be configured to transmit an indication related to that determination to the storefront server 102.
- the storefront server 102 may transmit instructions to the client device 110 to render the customer facing UI 200 including the determined variation.
- the ME server 108 may be configured to cause the storefront server 102 to execute a marketing test including rendering, at the customer facing UI 200, the experimentation content at one of the user-interactable UI elements on customer devices 110 corresponding to the target customer audience.
- the ME server 108 may cause the storefront server 102 to execute the marketing experiment such that the customer device 110 associated with the customer “Bob” renders variation B for a specific interactable UI element 202 on the customer facing UI 200.
- Referring to Figs. 5A-5E, there are shown exemplary user interfaces illustrating automatic performance of a marketing experiment via the system 100 for one or more different users and/or customer audiences, in accordance with an exemplary embodiment of the present disclosure.
- Figs. 5A-5B illustrate an experimentation UI 300 rendered on an admin device 112 and including a display of one or more input parameters and/or selections at least partially defining a marketing experiment that is scheduled to be performed by the ME server 108.
- Figs. 5C-5E illustrate customer facing user interfaces 200 as rendered on three different client devices 110a-110c corresponding to three different users during execution of the marketing experiment established in Figs. 5A-5B.
- the experimentation UI 300 is shown as depicting settings and/or details of a pending, or currently executing, marketing experiment.
- the ME server 108 may be configured to detect a selection of a scheduled or pending marketing experiment at the table 302 of marketing experiments 304 as shown in Fig. 3A and cause the details of the selected marketing experiment to be rendered at an admin device 112, similar to what is illustrated in Figs. 5A-5B.
- the marketing experiment shown in Figs. 5A-5B may have been created in a similar manner as described above with regards to Figs. 3A-3P. In Figs.
- the marketing experiment is titled “hp-zone-14-specialty-test” and is scheduled to run from January 23, 2023 at 3:12:00 PM to January 31, 2023 at 7:00:00 PM.
- the ME server 108 may be configured to display variations of one or more interactable elements 202 rendered at a selected zone 204 of the customer facing UI 200.
- the zone selected at the zone selection field 310 is the “new-deliver-hp-zone-14” of the customer facing UI 200, which is outlined in Figs. 5C-5E.
- the ME server 108 may be configured to perform a marketing experiment that includes a plurality of different customer audiences (e.g., traffic allocation segments) selected at the customer audience selection menu 314.
- the marketing experiment titled “hp-zone-14-specialty-test” includes at least eight different customer audiences (e.g., segments 1-8) included therein.
- the first customer audience selected at the audience selection menu 314 illustrated in Fig. 5A (e.g., segment 1) is a horse owner audience (e.g., “horse_1172021”) corresponding to customers and/or users who are included in a horse owner customer audience.
- segment 2 is a farm animal owner audience (e.g., “Farm_1172021”) corresponding to customers and/or users who are included in a farm owner customer audience.
- the ME server 108 may be configured to perform one or more specific types of marketing experiment selected at the experimentation selection menu 318 and in accordance with a traffic allocation percentage. For example, for each of the two customer audiences illustrated in Figs. 5A-5B, there is a single A/B test to be performed by the ME server 108 with a 100% traffic allocation. Each A/B test, as illustrated in Figs. 5A-5B, includes two interactable element variations selected at fields 320.
- the two selected variations are 1) a control variation (e.g., “hp-shop-by-pet_control”) and 2) a non-control variation corresponding to popular horse customer products (e.g., “hp-horse- customer-favorites”).
- the two selected variations are 1) the control variation (e.g., “hp-shop-by-pet_control”) and 2) a non-control variation corresponding to popular farm animal customer products (e.g., “hp-farm-animal-customer- favorites”).
- the ME server 108 may be configured to automatically determine a customer audience associated with the client device 110 and cause a variation of one or more interactable UI elements to be rendered thereon in accordance with a currently running marketing experiment.
- for example, and as illustrated in Figs. 5C-5E, three different users (“Tom”, “Jerry”, “Pinky”) access the customer facing UI 200 via different client devices 110a-110c while the marketing experiment “hp-zone-14-specialty-test” discussed above with reference to Figs. 5A-5B is currently running.
- the ME server 108 may be configured to detect the users accessing the customer facing UI 200 via the respective client devices 110a-110c and automatically determine customer specific data for each user in generally the same manner as discussed above with reference to step 402 in Fig. 4.
- the user Tom at the first client device 110a and the user Jerry at the second client device 110b are both included in the horse owner (e.g., horse_1172021) customer audience.
- the ME server 108 may be configured to automatically assign each of the first and second client devices 110a, 110b to the A/B test for the first segment of the marketing experiment “hp-zone-14-specialty-test”.
- the user Pinky at the third client device 110c may be included in the farm animal owner (e.g., Farm_1172021) customer audience.
- the ME server 108 may be configured to automatically assign the third client device 110c to the second segment of the marketing experiment “hp-zone-14-specialty-test”.
- the ME server 108 may assign each of the users Tom, Jerry, and Pinky to the respective A/B tests for the first and second segments of the marketing experiment in generally the same manner as discussed above with reference to steps 404 and 406 in Fig. 4. It should be noted that because there is only a single A/B test for each of the first and second segments (e.g., a 100% traffic allocation to the A/B tests for each segment) of the marketing experiment being performed in Figs. 5A-5E, the ME server 108 may be configured to automatically assign client devices 110a-110c to the respective A/B tests regardless of the calculated traffic bucket values.
- for example, in Fig. 4, the determined traffic bucket value of 3921 for user Bob resulted in the client device being assigned to an A/B test having a 40% traffic allocation.
- in Figs. 5A-5E, however, the A/B tests have 100% traffic allocation, and the ME server 108 is configured to assign all customers included in the respective customer audience segments (e.g., horse owners, farm animal owners) to the corresponding A/B tests.
- the ME server 108 may be configured to, in response to determining that a customer device 110 is assigned to an A/B test, render at the customer facing UI 200 displayed on the customer device 110 an interactable UI element variation included in the A/B test.
- the ME server 108 may be configured to calculate an A/B bucket value in a manner generally the same as step 408 described above with reference to Fig. 4 for each user assigned to an A/B test.
- the ME server 108 may be configured to render at the customer facing UI 200 an interactable UI element variation based on the calculated A/B bucket value. For example, in Fig. 5C, the user Tom is assigned to the A/B test for horse owners and the calculated A/B bucket value is zero.
- the ME server 108 may be configured to render the first variation (e.g., the control variation) at the zone 204 including one or more interactable UI element variations 202a included in the first variation.
- the user Jerry is assigned to the A/B test for horse owners and the calculated A/B bucket value is one.
- the ME server 108 may be configured to render the second variation (e.g., the non-control variation) at the same zone 204 including one or more interactable UI element variations 202b included in the second variation.
- the user Pinky is assigned to the A/B test for farm animal owners and the calculated A/B bucket value is one.
- the ME server 108 may be configured to render the second variation (e.g., the non-control variation) at the same zone 204 including one or more interactable UI element variations 202c included in the second variation.
- the system 100 and/or method 400 discussed above may be configured to improve the performance of automated marketing experiments.
- System 100 and/or method 400 of the present disclosure may be configured to perform marketing experiments at the server-side (e.g., the ME server 108, audience server 106, and/or storefront server 102). In some instances, by performing the marketing experiments at the server-side, the risk of skewed experiment results and/or increases in rendering time may be eliminated or at least reduced.
Abstract
A system for automatically performing marketing experimentation and analysis, the system includes a storefront server that transmits to customer devices instructions to render a customer-facing user interface (UI) that includes user-interactable UI elements. A database has stored thereon customer specific data corresponding to the customer devices. An audience server monitors customer interactions with the customer facing UI and determines customer audiences based on customer interactions with the customer facing UI. A marketing experimentation (ME) server: 1) generates an experimentation UI that includes a rendering of customer audiences and transmits it to the admin device for display, 2) detects, at the experimentation UI, a selection of: user-interactable UI elements, a target customer audience, and experimentation content, 3) causes the storefront server to execute a marketing test, and 4) automatically generates a user interaction record based on detected interactions with experimentation content displayed at the customer facing UI.
Description
TITLE
[0001] System for Automatically Performing Marketing Experimentation and Analysis
CROSS-REFERENCE TO RELATED APPLICATIONS
[0002] This application claims the benefit of U.S. Provisional Patent Application No. 63/490,648 filed March 16, 2023 entitled “System for Automatically Performing Marketing Experimentation and Analysis”, which is incorporated by reference herein in its entirety.
TECHNICAL FIELD
[0003] The present disclosure generally relates to systems for performing marketing experimentation and more specifically, to systems for automatically performing marketing experimentation and analysis in accordance with a plurality of criteria.
SUMMARY
[0004] In one embodiment there is a system for automatically performing marketing experimentation and analysis, the system including a storefront server in communication with a plurality of customer devices, the storefront server configured to transmit to the customer devices instructions to render a customer-facing user interface (UI) on the customer device, the customer-facing UI including a plurality of user-interactable UI elements, a database in communication with the storefront server, the database having stored thereon customer specific data corresponding to the plurality of customer devices, an audience server in communication with the database and storefront server, the audience server configured to monitor customer interactions with the customer facing UI and determine a plurality of customer audiences based on one or more customer interactions with the customer facing UI, each customer audience being associated with a similar customer interaction, a marketing experimentation (ME) server in communication with the audience server, storefront server, and an admin device, the ME server configured to, generate an experimentation UI and transmit the experimentation UI to the admin device for display, the experimentation UI including a menu rendering of the plurality of customer audiences, detect, at the experimentation UI, a user selection of one of the user-interactable UI elements for the customer-facing UI, a target customer audience from the menu, and experimentation content to be rendered in association with the selected user-interactable UI element, in response to detecting the user selection, cause the storefront server to execute a marketing test including rendering, at the customer facing UI, the experimentation content at the selected one of the user-interactable UI elements on the customer
devices corresponding to the target customer audience, automatically generate a user interaction record based on detected interactions with the experimentation content displayed at the customer facing UI.
[0005] In some embodiments, the ME server is further configured to detect, at the experimentation UI, a user selection of a marketing experimentation type, and cause the storefront server to render, at the customer facing UI, the experimentation content according to the selected marketing experimentation type. In some embodiments, the marketing experimentation type is one of an A/B test and a multi-armed bandit test. In some embodiments, the ME server is further configured to detect, at the experimentation UI, a selection of a second experimentation content, a first allocation amount, and a second allocation amount, and in response to detecting the user selection, cause the storefront server to execute the marketing test including rendering, at the customer facing UI, the experimentation content at the one of the user-interactable UI elements on a number of customer devices corresponding to the target customer audience and the first allocation amount, and rendering, at the customer facing UI, the second experimentation content at the one of the user-interactable UI elements on a number of customer devices corresponding to the target customer audience and the second allocation amount.
[0006] In some embodiments, the audience server is configured to automatically determine customer audiences via machine learning. In some embodiments, one or more of the customer audiences is comprised of customers having corresponding customer specific data including an indication of a similar geographical area. In some embodiments, one or more of the customer audiences is comprised of customers having corresponding customer specific data including an indication of a similar type of customer device used to interact with the customer facing UI. In some embodiments, the experimentation content is a plurality of different experimentation contents, and the ME server is configured to in response to detecting the user selection, cause the storefront server to execute the marketing test including, for each experimentation content included in the plurality thereof, rendering, at the customer facing UI, the corresponding experimentation content at the one of the user-interactable UI elements on customer devices corresponding to a subset of the target customer audience. In some embodiments, the ME server is configured to detect one or more user defined priorities at the experimentation UI and cause the storefront server to execute the marketing experiment in accordance with one or more user defined priorities.
[0007] In another embodiment there is a method of automatically performing marketing experimentation and analysis including, at a storefront server in communication with a plurality of customer devices, transmitting to the customer devices instructions to render a customer-facing user
interface (UI) on the customer device, the customer-facing UI including a plurality of user-interactable UI elements, at an audience server in communication with the storefront server, monitoring customer interactions with the customer facing UI and determining a plurality of customer audiences based on one or more customer interactions with the customer facing UI, each customer audience being associated with a similar customer interaction, at a marketing experimentation (ME) server in communication with the audience server, storefront server, and an admin device, generating an experimentation UI and transmitting the experimentation UI to the admin device for display, the experimentation UI including a menu rendering of the plurality of customer audiences, detecting, at the experimentation UI, a user selection of: one of the user-interactable UI elements for the customer-facing UI, a target customer audience from the menu, and experimentation content to be rendered in association with the selected user-interactable UI element, in response to detecting the user selection, causing the storefront server to execute a marketing test including rendering, at the customer facing UI, the experimentation content at the selected one of the user-interactable UI elements on the customer devices corresponding to the target customer audience, automatically generating a user interaction record based on detected interactions with the experimentation content displayed at the customer facing UI.
[0008] In some embodiments, the method further includes, at the ME server detecting, at the experimentation UI, a user selection of a marketing experimentation type, and causing the storefront server to render, at the customer facing UI, the experimentation content according to the selected marketing experimentation type. In some embodiments, the marketing experimentation type is one of an A/B test and a multi-armed bandit test. In some embodiments, the method further includes, at the ME server detecting, at the experimentation UI, a selection of a second experimentation content, a first allocation amount, and a second allocation amount, in response to detecting the user selection, causing the storefront server to execute the marketing test including rendering, at the customer facing UI, the experimentation content at the one of the user-interactable UI elements on a number of customer devices corresponding to the target customer audience and the first allocation amount, and rendering, at the customer facing UI, the second experimentation content at the one of the user-interactable UI elements on a number of customer devices corresponding to the target customer audience and the second allocation amount.
[0009] In some embodiments, the method further includes, at the audience server automatically determining customer audiences via machine learning. In some embodiments, one or more of the customer audiences is comprised of customers having corresponding customer specific data including an indication of a similar geographical area. In some embodiments, one or more of the
customer audiences is comprised of customers having corresponding customer specific data including an indication of a similar type of customer device used to interact with the customer facing UI. In some embodiments, the experimentation content is a plurality of different experimentation contents, and the method further includes, at the ME server in response to detecting the user selection, causing the storefront server to execute the marketing test including, for each experimentation content included in the plurality thereof, rendering, at the customer facing UI, the corresponding experimentation content at the one of the user-interactable UI elements on customer devices corresponding to a subset of the target customer audience. In some embodiments, the method further includes, at the ME server, detecting one or more user defined priorities at the experimentation UI and causing the storefront server to execute the marketing experiment in accordance with one or more user defined priorities.
BRIEF DESCRIPTION OF THE DRAWINGS
[0010] The following detailed description of embodiments of the system and method will be better understood when read in conjunction with the appended drawings of exemplary embodiments. It should be understood, however, that the invention is not limited to the precise arrangements and instrumentalities shown.
[0011] In the drawings:
[0012] Fig. 1 is a block diagram illustrating an implementation of a system for automatically performing marketing experimentation and analysis in accordance with an exemplary embodiment of the present disclosure;
[0013] Fig. 2 is an illustration of a customer facing user interface generated by the storefront server of the system of Fig. 1;
[0014] Figs. 3A-3P illustrate exemplary user interfaces for creating and editing a marketing experiment;
[0015] Fig. 4 is a flowchart illustrating a method of automatically performing a marketing experimentation via the system of Fig. 1 and in accordance with an exemplary embodiment of the present disclosure; and
[0016] Figs. 5A-5E illustrate exemplary user interfaces illustrating automatically performing a marketing experiment via the system of Fig. 1 and in accordance with an exemplary embodiment of the present disclosure. Figs. 5A-5B illustrate exemplary experimentation user interfaces and Figs. 5C-5E illustrate corresponding customer facing user interfaces rendered at different client devices.
DETAILED DESCRIPTION
[0017] With the increasing popularity of e-commerce (e.g., business-to-consumer) in recent years it has become increasingly valuable to retailers offering products for sale via online marketplaces to validate existing sales campaigns and discover new sales strategies for future campaigns. As such, it is desirable for retailers to conduct marketing experimentation to investigate opportunities for increasing their sales. However, online retailers in particular are faced with high traffic volume including large numbers of customers (e.g., on the order of hundreds of thousands, millions, or tens of millions) having a wide variety of shopping habits accessing the online retail platform. This often increases the complexity and difficulty of implementing marketing experimentations given the large number of customers and variety in shopping habits. Furthermore, conventional systems and methods for performing marketing experimentation on online retail platforms often fail to meet desired traffic splits for certain types of experiments (e.g., A/B tests) and/or accurately account for a variety of shopping habits of the customers accessing the online retail platform, and often require time consuming manual edits to underlying computer executable code in order to create and/or edit marketing experiments. As such, it is desirable to provide a system for automatically performing marketing experimentation and analysis that is capable of performing a plurality of experiments on a large number of customers accurately, in accordance with a plurality of different shared shopping habits or customer characteristics, and that reduces the time required to make, edit, and/or execute said marketing experiments.
[0018] Numerous details are described herein in order to provide a thorough understanding of the example embodiments illustrated in the accompanying drawings. However, some embodiments may be practiced without any of the specific details, and the scope of the claims is only limited by those features and aspects specifically recited in the claims. Furthermore, well-known methods, components, and circuits have not been described in exhaustive detail so as not to unnecessarily obscure pertinent aspects of the embodiments described herein.
[0019] Referring to the drawings in detail, wherein like reference numerals indicate like elements throughout, there is shown in Figs. 1-4 a system for automatically performing marketing experimentation and analysis, and alternatively referred to as system 100 for short, in accordance with an exemplary embodiment of the present disclosure.
[0020] In one embodiment, the system 100 includes one or more computers or computing devices having one or more processors and memory (e.g., one or more nonvolatile storage devices). In some embodiments, memory or computer readable storage medium(s) of memory store programs,
modules and data structures, or a subset thereof, for a processor to control and run the various systems and methods disclosed herein. In one embodiment, a non-transitory computer readable storage medium has stored thereon computer-executable instructions which, when executed by a processor, perform one or more of any combination of the methods or steps disclosed herein. In some embodiments, one or more of the computers or computing devices (e.g., servers) included in the system 100 may include a collection of networked computing devices, servers and/or processing units in communication with one another. In some embodiments, one or more of the networked computing devices, servers and/or processing units may be vertically and/or horizontally scaled to accommodate increases in processing requests (e.g., increasing network traffic). In some embodiments, one or more of the servers included in the system 100 may be configured to execute one or more different collections of executable code, referred to as “microservices” for short, in accordance with a microservice architecture. A “microservice” as referenced herein may refer to a collection of computer executable code configured to perform a predetermined function that is executed by a respective server or via a cloud-based infrastructure. In some embodiments, the functionality of a server or a microservice thereof may be accessible at another server via one or more application programming interfaces (APIs) and/or networks. For sake of brevity, one or more computers or computing devices included in the system 100 may be referred to as servers.
[0021] Referring to Figs. 1-2, there is shown a block diagram illustrating an implementation of the system 100 and an example of a customer facing UI 200. The system 100 may include a storefront server 102, database 104, an audience server 106, and a marketing experimentation (ME) server 108. The storefront server 102 may be in communication with the database 104, audience server 106 and/or ME server 108. In some embodiments, storefront server 102, the ME server 108 and audience server 106 are in communication via respective application programming interfaces (APIs). In some embodiments, the storefront server 102 is configured to generate a customer facing UI (e.g., customer facing UI 200 shown in Fig. 2) such that it is accessible via the internet and viewable by a customer on a respective customer device 110 in communication with the storefront server 102. In some embodiments, the storefront server 102 is in communication with a plurality of customer devices 110. The storefront server 102 may be configured to transmit to the customer devices 110 instructions to render a customer-facing UI 200 on the customer device 110. The customer facing UI 200 may be configured to display one or more interactable digital representations of products for purchase therefrom. In some embodiments, the customer facing UI 200 is configured to facilitate digital transactions between customer devices 110 and the storefront server 102 (e.g., purchases of products by customers via the customer facing UI 200). The products
for purchase at the customer facing UI 200 may have associated data stored in the database 104 and in some instances there may be a plurality of different products having associated data stored in the database 104. In some embodiments, the database 104 has stored thereon customer specific data corresponding to the plurality of customer devices 110.
[0022] In one aspect of the invention, the audience server 106 is in communication with the database 104 and storefront server 102, and/or is configured to monitor customer interactions with the customer facing UI 200. The audience server 106 may be configured to determine a plurality of customer audiences based on one or more customer interactions with the customer facing UI 200, with each customer audience being associated with a similar customer interaction. For example, the audience server 106 may be configured to determine customer audiences and assign customers interacting with the customer facing UI 200 to one or more of the determined audiences. In some embodiments, the audience server 106 is configured to determine customer audiences via a machine learning algorithm.
[0023] Various aspects of the present disclosure are described with reference to customer audiences and as such, it should be understood that a customer audience refers to a dataset representative of a grouping of customers based on similar interactions and/or customer data. For example, one customer audience may be cat owners and customers grouped into that audience may be 1) customers having data stored on the database 104 indicating that they own a cat and/or 2) customers interacting with the customer facing UI 200 generated by the storefront server 102 in a manner that indicates they own a cat (e.g., the purchase of cat food, toys, medications). It should be understood though that there may be a plurality of different customer audiences in accordance with a plurality of different interactions and/or customer data. Non-limiting examples of customer data and/or data used to determine and/or assign customers into a customer audience may include: type of pet, amount of money spent within a given time period, geolocation information, type of device used to interact with the customer facing UI, time spent with items selected for purchase without proceeding to finalize the purchase (e.g., time spent with items in a ‘cart’). Customer audience datasets may be stored on the database 104 and the storefront server 102, audience server 106 and/or marketing experimentation server 108 may be configured to automatically query the database 104 to retrieve a customer audience dataset.
[0024] Customer interaction data may relate to aspects of customer interactions with the interactable UI elements on the customer facing UI 200. Non-limiting examples of aspects of customer interactions include: selecting (e.g., clicking, highlighting) UI elements corresponding to products or services for a specific type of pet or animal, duration that a user reads or reviews UI
elements (e.g., articles, product descriptions, product reviews), time between consecutive user selections of UI elements, and number of selections of one or more UI elements within a predetermined amount of time. Other aspects of customer interactions and customer data may be, in some embodiments, related to one another. For example, the audience server 106 may be configured to determine that a customer interaction is with a UI element corresponding to a cat food product and may automatically generate customer data indicating that the customer owns a cat. As such, some further customer audiences may include, but are not limited to: cat owners, dog owners, multi-pet owners, customers achieving a threshold spending level, customers displaying uncertainty in purchases, and so on.
[0025] In some embodiments, the system 100 is configured to enable a user to input one or more search parameters or rules corresponding to a desired set of shared customer characteristics for a new customer audience and automatically retrieve a listing of customers having those shared characteristics. In some embodiments, the system 100 is configured to receive the user defined rules in a format that requires little to no programming experience on the part of the user. For example, the user defined rules may be received by the system 100 as plain text (e.g., “search for customers who started purchasing cat food within the last three months”). In some embodiments, the system 100 is configured to receive the user defined rules and automatically convert the rules into a structured query language (SQL) format or any other standard data management language. For example, the audience server 106 may be configured to detect or receive the input plain text command discussed in the preceding example and automatically convert the text into an SQL command and execute the SQL command to retrieve a listing of customers that first purchased a cat food product via the customer facing UI 200 within three months prior to the current date. In some embodiments, the audience server 106 is configured to automatically execute the SQL command on the database 104 or another database in communication with the audience server 106 where relevant customer data may be found. In this manner, the system 100 may be configured to enable a user, with little to no computer programming experience, to easily input criteria for a desired customer audience and automatically generate a list of customers within said audience.
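The plain-text-to-SQL conversion described above can be sketched as follows. This is an illustrative approximation only: the rule grammar, the `purchases` table, and its column names are assumptions for the example rather than part of the disclosure, and a production implementation would parameterize the query rather than interpolate strings.

```python
import re

# Number words the illustrative rule grammar understands (an assumption,
# not part of the disclosure).
_NUMBER_WORDS = {"one": 1, "two": 2, "three": 3, "six": 6, "twelve": 12}

def rule_to_sql(rule_text: str) -> str:
    """Convert one plain-text audience rule into a SQL query string.

    Recognizes only a single rule shape for illustration; the table and
    column names (purchases, customer_id, product_category, purchase_date)
    are hypothetical. A real system should use parameterized queries.
    """
    match = re.search(
        r"customers who started purchasing ([\w ]+?) within the last (\w+) months",
        rule_text,
        re.IGNORECASE,
    )
    if match is None:
        raise ValueError(f"unrecognized rule format: {rule_text!r}")
    product = match.group(1)
    word = match.group(2).lower()
    months = _NUMBER_WORDS[word] if word in _NUMBER_WORDS else int(word)
    # Customers whose earliest purchase of the product category falls
    # within the last `months` months (SQLite-style date arithmetic).
    return (
        "SELECT customer_id FROM purchases "
        f"WHERE product_category = '{product}' "
        "GROUP BY customer_id "
        f"HAVING MIN(purchase_date) >= DATE('now', '-{months} months')"
    )
```

For the example rule in the preceding paragraph, the function produces a query restricted to customers whose first cat food purchase occurred within the prior three months.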
[0026] In some embodiments, the audience server 106 is configured to monitor customer interactions with the customer facing UI 200 and determine a plurality of customer audiences based on the customer interactions with the customer facing UI 200. In some embodiments, determining a customer audience via the audience server 106 includes detecting one or more interactions with the customer facing UI 200 at a plurality of different customer devices 110 that are similar to one another and establishing or defining a customer audience based on those similar interactions. An
example of a similar interaction may include, but is not limited to, viewing one or more products for sale for a predetermined amount of time without purchasing said products (e.g., viewing products frequently and/or for an extended period of time without purchasing). The audience server 106 may be configured to establish a customer audience specific to that type of interaction (e.g., interactions indicating uncertainty in purchases) and automatically assign customers exhibiting that type of interaction with the established customer audience.
[0027] The audience server 106 may be configured to monitor the customer facing UI 200 that is generated by the storefront server 102. The audience server 106 may be configured to automatically associate a customer with at least one customer audience, based on a record of the interactions between a client device 110 and UI 200. Assigning a customer to a customer audience may include, at the audience server 106, causing customer specific data associated with the customer and stored on database 104 to be updated to include an indication that the customer is included in the specific customer audience. In some embodiments, the audience server 106 has stored thereon data for a plurality of different customer audiences and an indication of which specific customers are included in each.
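The audience bookkeeping described above — maintaining per-audience membership on the audience server 106 while mirroring each assignment onto the customer-specific record (standing in here for the update written to database 104) — might be sketched as follows; the class and field names are illustrative assumptions:

```python
from collections import defaultdict

class AudienceRegistry:
    """Minimal, in-memory stand-in for the audience server's records.

    Real storage would live in database 104; the names here are hypothetical.
    """

    def __init__(self):
        self._audiences = defaultdict(set)          # audience name -> customer IDs
        self._customer_records = defaultdict(dict)  # customer ID -> record

    def assign(self, customer_id: str, audience: str) -> None:
        # Record membership on the audience side...
        self._audiences[audience].add(customer_id)
        # ...and mirror it onto the customer-specific record.
        record = self._customer_records[customer_id]
        record.setdefault("audiences", set()).add(audience)

    def members(self, audience: str) -> set:
        """Listing returned to a requesting server (e.g., the ME server 108)."""
        return set(self._audiences[audience])
```

A single customer may appear in several audiences at once, consistent with the many-to-many grouping described above.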
[0028] Various aspects of the present disclosure are discussed with reference to customers and customer interactions with the customer facing UI 200. Customers may be entities that, via a client device 110, interact with the customer facing UI 200 generated by the storefront server 102. The client device(s) 110 may alternatively be referred to as customer device(s) 110. A customer interacting with the customer facing UI 200 via a client device 110 may have data associated with their interaction. For example, a customer interacting with the customer facing UI 200 via a client device 110 may do so while logged in to a customer account having associated customer specific account data stored on database 104. The customer specific account data may include, but is not limited to: customer name, shipping address, type of pet owned, similar geographical area or location, and/or similar type of customer device(s) used to interact with the customer facing UI 200. In other instances, a customer interacting with the customer facing UI 200 via a client device 110 may do so while not logged in to any customer specific account. In instances where the customer is not logged in to any customer specific account, the system 100 may be configured to use unique identifying data specific to the client device 110 (e.g., internet protocol (IP) address) to associate customer interactions with the customer specific account data stored on the database 104. A “customer” as referenced herein may refer to an entity that interacts with the customer facing UI 200, via a client device 110, and that is identifiable by the storefront server 102 and/or audience server 106.
[0029] Customer interactions with the customer facing UI 200 may include inputs at a client device 110 that cause the client device 110 to interact with interactable UI elements 202 displayed on the customer facing UI 200. The audience server 106 may be configured to determine the customer interacting with the customer facing UI 200 and, based on the interactions, associate the customer with one or more customer audiences. For sake of brevity, it will be assumed herein that customers interacting with the customer facing UI 200 are doing so while logged in to a customer specific account having associated data stored on database 104. For example, a customer (e.g., Bob) may access the customer facing UI 200 via a client device 110 and input their login credentials to associate their interactions on the customer facing UI 200 with their customer specific account data stored on database 104. The storefront server 102 and/or audience server 106 may be configured to automatically associate any interactions with the customer facing UI 200 via that specific client device 110 with the customer specific account data for said customer (e.g., Bob).
[0030] The audience server 106 may be configured to monitor customer interactions with different interactable UI elements 202 and automatically associate the customer with one or more customer audiences. For example, the audience server 106 may be configured to, in response to a customer interacting with the UI element 202 corresponding to “cat deals”, automatically associate the customer with a cat owner customer audience. The audience server 106 may be configured to associate a plurality of different customers with a plurality of different customer audiences. In some embodiments, a single customer may be associated with a plurality of customer audiences. For example, and referring to table 1 below, there is shown a listing of customers and the audiences with which they are associated. Although only four audiences are shown, it should be understood that a customer may be associated with more or fewer than four audiences. In some embodiments, a single customer may be associated with up to about one-hundred different customer audiences. In some embodiments, the audience server 106 is configured to generate and maintain a record of customers included in each determined customer audience. In some embodiments, another server (e.g., the ME server 108) is configured to transmit a request for customer audience data to the audience server 106 and the audience server 106 may be configured to transmit a listing of customers included in that audience back to the server that transmitted the request. The audience server 106 may be configured to automatically monitor customer interactions and associate customers with different customer audiences independent of one or more other servers included in the system 100 (e.g., the ME server 108).
Table 1: Examples of Customers and their associated audiences
[0031] It should be understood that what is shown in Fig. 2 is an example user interface and that the customer facing UI 200 may include any number of different UI elements 202 arranged in any number of ways. For sake of brevity though, the customer facing UI 200 shown in Fig. 2 will be referenced herein so as not to obscure pertinent aspects of the present disclosure. In some embodiments, the ME server 108 is configured to alter or change the interactable UI elements 202 included in the customer facing UI 200 in order to conduct a marketing experiment. In some embodiments, the ME server 108 is configured to alter or change interactable UI elements 202 based on specified zones 204 or areas of the customer facing UI 200. A zone 204 may include an area or section of the customer facing UI 200 where one or more interactable UI elements 202 are located. In Fig. 2, the customer facing UI 200 is illustrated as including three zones 204. The uppermost zone 204 includes an interactable UI element 202 configured to act as a banner that automatically cycles through different interactable elements (e.g., a carousel banner) at a predetermined interval. The zones 204 located below the topmost zone 204 in Fig. 2 include a plurality of different interactable UI elements 202 displayed simultaneously. It should be understood though that each zone 204 illustrated in Fig. 2 is an example, and that zones 204 may encompass different portions or sections of the customer facing UI 200.
[0032] In some embodiments, the ME server 108 is configured to enable a user (e.g., an administrator) to create, edit, delete, and/or view data associated with one or more marketing experimentations. The ME server 108 may be configured to generate an experimentation UI (discussed in more detail below) and transmit the experimentation UI to an admin device 112, or a client device 110. The admin device 112 may be in communication with the ME server 108 such that the experimentation UI may be rendered thereon. The admin device 112 may be any suitable computing device such as, but not limited to, a laptop, desktop computer, smart phone, or tablet. The ME server 108 may be configured to automatically perform a marketing experiment on customers included in one or more customer audiences having associated customer audience data that is
maintained (e.g., edited, created, deleted) by the audience server 106. For example, in some instances it may be desirable to execute an A/B type test at the customer facing UI 200 directed toward a “dog owner” customer audience. As such, the ME server 108 may be configured to receive customer specific account data associated with the “dog owner” customer audience from the audience server 106 and automatically perform the A/B type test at client devices 110 associated with the customer specific account data (e.g., where the customer account is logged in). In this manner, the system 100 of the present disclosure may enable marketing experiments to be performed within one or more desired customer audiences more accurately than in conventional systems and methods.
[0033] Referring to Figs. 3A-3P there are shown example experimentation user interfaces 300 generated by the ME server 108 and displayed on an admin device 112. The example user interfaces 300 shown in Figs. 3A-3P illustrate the creation of a new marketing experiment. The experimentation UIs 300 illustrated in Figs. 3A-3P include various input fields related to customer audiences and marketing experiments. However, in Figs. 3A-3P, the terms “experiment” and “customer audience” are not illustrated. Instead, in Figs. 3A-3P, the terms “experience” and “segments” are illustrated. The term “experience” as appearing in the embodiment of Figs. 3A-3P refers to marketing experiments and the term “segment” refers to customer audiences. For sake of brevity, the following description of Figs. 3A-3P will be described with reference to marketing experiments and customer audiences. Figs. 3A-3P will be described with reference to inputs at the experimentation UI 300 transmitted from the admin device 112 to the ME server 108 (e.g., clicks at different interactable elements, data inputs). For sake of brevity, not every transmission of inputs will be described in detail, however it should be understood that in one embodiment the ME server 108 is configured to detect the inputs and cause the rendering of the experimentation UI 300 at the admin device 112 to be updated accordingly.
[0034] Various aspects of the exemplary system and methods discussed herein are illustrated with reference to the exemplary user interfaces discussed herein. Furthermore, the exemplary user interfaces are described with reference to users (e.g., administrators) selecting graphical elements or inputting data into fields displayed at the user interfaces. It will be understood that a selection at a graphical element (e.g., an input field or selection field) displayed at an exemplary UI may refer to an input received at the admin device 112 corresponding to that graphical element. For example, a selection by a user may be a user input at a peripheral input device (e.g., a mouse, a keyboard, touch screen, voice command received via microphone) resulting in an interaction with elements of the experimentation UI 300.
[0035] Marketing experiments, as referenced herein, may refer to a procedure in which one or more interactable UI elements 202 included at the customer facing UI 200 are replaced with one or more different interactable UI elements and interaction with the different interactable UI elements is monitored. For example, and referring back to Fig. 2, during execution of a marketing experiment, one or more of the interactable UI elements 202 may be replaced with a different interactable UI element (e.g., a variation). The variations may be different for different subsets of customer devices 110. For example, a first subset of customer devices 110 displaying a rendering of the customer facing UI 200 may include a first variation of a specific interactable UI element 202 whereas a second subset of customer devices 110 may include a second variation of the specific interactable UI element 202. The ME server 108 may be configured to cause the storefront server 102 to render different variations at the customer facing UI 200 for one or more different customer devices 110. In some embodiments, the ME server 108 is configured to automatically monitor and/or record interactions with the variations. In this manner, the ME server 108 may be configured to automatically analyze the outcome of marketing experiments performed on the customer facing UI 200.
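One common way to split customer devices 110 into the variation subsets described above — the disclosure does not specify the split mechanism, so this is an illustrative assumption — is to hash the (experiment, customer) pair so that each customer is deterministically and stably assigned to one variation in proportion to the desired traffic split:

```python
import hashlib

def assign_variation(customer_id: str, experiment_id: str, weights: dict) -> str:
    """Deterministically map a customer to one variation of an experiment.

    `weights` maps variation name -> traffic share (shares sum to 1.0).
    Hashing keeps the assignment stable across visits, so a customer sees
    the same variation every time, while the overall split converges on
    the requested shares.
    """
    digest = hashlib.sha256(f"{experiment_id}:{customer_id}".encode()).hexdigest()
    point = int(digest[:8], 16) / 0xFFFFFFFF  # roughly uniform in [0, 1]
    cumulative = 0.0
    for variation, weight in weights.items():
        cumulative += weight
        if point <= cumulative:
            return variation
    return variation  # guard against floating-point rounding at 1.0
```

Under a 50/50 split, for example, roughly half of the customers in a target audience would see the first variation and half the second, and repeat visits by the same customer would render the same variation.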
[0036] Referring to Fig. 3A, the ME server 108 may be configured to detect one or more user inputs at the admin device 112 indicating that the experimentation UI 300 is requested and cause the admin device 112 to render the experimentation UI 300 thereon. The experimentation UI 300 in Fig. 3A illustrates a visual representation of a table 302 of one or more marketing experiments 304 grouped by date (e.g., columns in table 302) and zone (e.g., rows in table 302). The table 302 may generally be a visual representation of a schedule including indications of different marketing experiments 304. The zones indicated in the left most column may correspond to the zones 204 of the customer facing UI 200 shown and described above with reference to Fig. 2. For example, each row in table 302 may correspond to a different zone 204 of the customer facing UI 200. The columns may represent dates for which a marketing experiment 304 is scheduled to be active. For example, there are two marketing experiments 304 located in the fourth row corresponding to zone ID 12345, the first one being scheduled from November 21, 2022 to November 22, 2022 and the second one being scheduled from November 23, 2022 to November 27, 2022. In some embodiments, the ME server 108 is configured to prevent more than one marketing experiment 304 from being run on the same zone at the same time. For example, the ME server 108 may be configured to ensure that only one marketing experiment is scheduled to be performed at zone ID 12345 on any given day. In some embodiments, data for each of the marketing experiments 304 is
stored on the ME server 108. In other embodiments, data for each of the marketing experiments 304 is stored on the database 104.
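The scheduling constraint described above — at most one marketing experiment live per zone 204 on any given day — reduces to an interval-overlap check that the ME server 108 could run before accepting a new experiment. The schedule representation below is an assumption for illustration:

```python
from datetime import date

def overlaps(start_a: date, end_a: date, start_b: date, end_b: date) -> bool:
    """Two inclusive date ranges overlap iff each starts on or before the
    other's end."""
    return start_a <= end_b and start_b <= end_a

def can_schedule(schedule, zone_id: str, start: date, end: date) -> bool:
    """True if no experiment already scheduled for `zone_id` overlaps the
    requested [start, end] range. `schedule` is a list of
    (zone_id, start, end) tuples (an illustrative structure)."""
    return not any(
        zone == zone_id and overlaps(s, e, start, end)
        for zone, s, e in schedule
    )

# The two experiments shown for zone ID 12345 in Fig. 3A:
schedule = [
    ("12345", date(2022, 11, 21), date(2022, 11, 22)),
    ("12345", date(2022, 11, 23), date(2022, 11, 27)),
]
```

With this schedule, a request for zone ID 12345 spanning November 22-24, 2022 would be rejected, while the same dates on an unoccupied zone, or later dates on zone ID 12345, would be accepted.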
[0037] In some embodiments, the table 302 may include an indication as to the status of one or more of the marketing experiments 304. For example, in the third row there is shown an indication of two marketing experiments 304 where the first marketing experiment 304 scheduled to run from November 22, 2022 to November 25, 2022 is “live” and the second marketing experiment 304 scheduled to begin on November 26 is listed as “scheduled”. As such, the table 302 may provide a visual indication that the first marketing experiment 304 is currently running whereas the second marketing experiment has yet to begin. In some embodiments, the ME server 108 is configured to receive one or more user inputs indicating a request to create a new marketing experiment. For example, the experimentation UI 300 may include an interactable button 306 that when clicked may cause the ME server 108 to render at the admin device 112 an experimentation UI 300 similar to what is shown in Fig. 3B.
[0038] Referring to Fig. 3B, in response to receiving a request to create a new marketing experiment, the ME server 108 may be configured to cause the admin device 112 to render an experimentation UI 300 for inputting information for a new marketing experiment. The experimentation UI 300 may include an experiment name input field 308, zone input field 310 and one or more date input fields 312. The experiment name input field 308 may enable the ME server 108 to receive a name for the experiment (e.g., “experiment 1”) and the zone input field 310 may enable the ME server 108 to receive selection of the zone 204 of the customer facing UI 200 where an experiment is to take place. In some embodiments, the zone input field 310 is a drop-down list including a listing of different zones 204 included in the customer facing UI 200. The date input fields 312 may enable the ME server 108 to receive an input of a desired start date and end date for the marketing experiment.
[0039] Referring to Fig. 3C, the ME server 108 may be configured to detect, at the experimentation UI 300, a selection of one of the user-interactable UI elements for the customer facing UI 200. For example, the ME server 108 may detect a selection at the zone input field 310 of a zone 204 that includes one or more user-interactable UI elements 202 where a marketing experimentation variation may be used. In Fig. 3C, the ME server 108 has detected inputs at fields 308, 310, and 312 and updated the rendering of the experimentation UI 300 accordingly. Further to this example, in Fig. 3C, the input at the name input field 308 is “Dog Cat Multi Pet Experience 2022-11-21” and the input at the zone input field 310 indicates the zone 204 is “GoodyBox_Hero_Promo_Food_Supplement”. As such, in this example the name of the new
marketing experiment and the zone 204 of the customer facing UI 200 have been specified. Furthermore, the inputs at the date input fields 312 indicate that the marketing experiment will start on November 21, 2022 at 12:00 AM and end on November 27, 2022 at 11:59 PM. The ME server 108 may be configured to receive selections for the name, zone placement and/or start and end dates of a marketing experiment.
[0040] In some embodiments, the ME server 108 is configured to receive a selection of one or more customer audiences to be included in the marketing experiment. The experimentation UI may include a menu rendering of the plurality of customer audiences received from the audience server 106. For example, the experimentation UI 300 may include a customer audience selection menu 314 configured to receive selection of a customer audience from a plurality of different customer audiences. The ME server 108 may be configured to detect, at the experimentation UI 300, a selection of a target customer audience from the menu 314. In some embodiments, in response to detecting an input at the selection menu 314, the ME server 108 is configured to transmit a request to the audience server 106 for a listing of all determined customer audiences and cause the experimentation UI 300 to display the listing. For example, the ME server 108 may be configured to communicate with the audience server 106 to receive a listing of all different customer audiences (e.g., cat owners, dog owners, multi-pet owners) determined by the audience server 106. In response to receiving the request, the audience server 106 may be configured to transmit data corresponding to a listing of different customer audiences to the ME server 108. The ME server 108 may be configured to display the listing of customer audiences at the experimentation UI 300 at the customer audience selection menu 314.
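The request/response exchange in this paragraph can be sketched in-process as one function per server side; in the disclosure this exchange would occur over an API between the ME server 108 and audience server 106, and the audience data below is hypothetical:

```python
# In-memory stand-in for the audience server's records (hypothetical data).
AUDIENCES = {
    "cat owners": ["bob", "dana"],
    "dog owners": ["alice"],
    "multi-pet owners": ["dana"],
}

def list_audiences() -> list:
    """Audience-server side: respond with the names of all determined
    customer audiences."""
    return sorted(AUDIENCES)

def populate_selection_menu() -> list:
    """ME-server side: request the listing and hand it to the customer
    audience selection menu 314 of the experimentation UI 300."""
    return list_audiences()
```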
[0041] In some embodiments, the ME server 108 is configured to enable a user to allocate one or more types or forms of marketing experiment to the customer audience selected at the customer audience selection menu 314. For example, the experimentation UI 300 may include an allocation selection button 316 configured to receive an allocation of a marketing experiment to the customer audience selected at the audience selection menu 314.
[0042] Referring to Fig. 3D, the ME server 108 has detected an input at the button 316 of Fig. 3C. In response to receiving the input at button 316 in Fig. 3C, the ME server 108 may be configured to cause the experimentation UI 300 to display an experimentation selection menu 318. The selection menu 318 may include one or more interactable elements to receive selection of one or more different types of marketing experiments. For example, in Fig. 3D, the different types of marketing experiments may include, but are not limited to: 1) simple targeting/share of voice, 2) A/B test, and 3) a multi-armed bandit (MAB) test. The ME server 108 may be configured to detect
a selection of the type of marketing experiment to be performed and automatically associate the selection with the customer audience selected at the selection menu 314. In some embodiments, the ME server 108 may be configured to provide a default customer audience selection in response to not detecting a selection at the customer audience selection menu 314. For example, in Fig. 3D, the selected customer audience is set to a default state indicating that no selection was made. In some embodiments, the ME server 108 is configured to set the default customer audience to be all customer audiences not specifically selected. For example, the ME server 108 may be configured to receive selection of a plurality of different customer audiences to include in a marketing experiment (as discussed in more detail below). In some embodiments, the ME server 108 is configured to set the default when no selection is received to be all non-specified customer audiences. In other embodiments, the ME server 108 may not be configured to provide a default customer audience selection.
[0043] In some embodiments, the experimentation selection menu 318 may include a variation selection field 320 configured to receive selection of one or more variations for interactable UI elements 202 to display during the marketing experiment. The ME server 108 may be configured to detect, at the experimentation UI 300, a selection of experimentation content to be rendered in association with the user-interactable UI element. The ME server 108 may be configured to detect an input at the variation selection field 320 and, in response, display a listing of available variations at the experimentation UI 300 and receive a selection from the displayed listing (e.g., as shown in Fig. 3E). In some embodiments, the ME server 108 and/or database 104 stores data relating to interactable UI element variations. In some embodiments, the interactable UI element variations are preloaded into the database 104 and/or ME server 108. As such, the database 104 and/or ME server 108 may store data relating to a plurality of different interactable UI element variations that may be displayed at the customer facing UI 200 during a marketing experiment executed by the ME server 108. For example, and as shown in Fig. 3E, upon selection at the experimentation selection menu 318, the ME server 108 has caused the experimentation UI 300 to render a listing of available interactable UI element variations for selection.
[0044] In some embodiments, an interactable UI element variation is a UI element that has at least some data that is different than a UI element 202 being currently displayed on the customer facing UI 200. For example, if a UI element 202 displayed at the customer facing UI 200 has data corresponding to a specific dog food product (e.g., dog food product A), then an interactable UI element variation may include data corresponding to a different dog food product (e.g., dog food product B). In some embodiments, a difference between the UI element 202 and a selected variation
thereof may be a difference in how the UI element is rendered on the customer facing UI 200. For example, an existing UI element 202 may have a first appearance whereas a variation thereof may have a second appearance that differs in at least one of size, shape, color, and/or indicia. Further to this example, and referring to the variations displayed in Fig. 3E, there are a plurality of variations labeled as “Dog Content # 2022-11-21”, where the “#” represents a different integer. Each of the different variations may have data that is different from each other and/or from an existing UI element 202 rendered on the customer facing UI 200. For example, an existing UI element 202 may be for a dog chew toy and the variation labeled “Dog Content 1 2022-11-21” may include data relating to a flea and tick medication for a dog.
[0045] Referring to Fig. 3F, an interactable UI element variation is selected at the variation selection field 320. In some embodiments, the ME server 108 may be configured to determine if sufficient information and/or selections have been provided for an allocated experiment based on the type of experiment selected in the experimentation selection menu 318. For example, in Fig. 3F, the simple targeting/share of voice type of experiment is selected, which may require only one variation be selected in order for the ME server 108 to execute the simple targeting/share of voice experiment. As such, the ME server 108 may be configured to determine that a single variation (e.g., “Not Dog Content 11 2022-11-21”) has been selected at the variation selection field 320 and therefore that sufficient, or required, information to perform the experiment has been provided. In Fig. 3F, the type of experiment to be performed (e.g., a simple targeting/share of voice test) has been selected and sufficient information to perform the selected type of experiment on the default customer audience has been received at the ME server 108. In response to receiving the selection of experiment type and sufficient information, the ME server 108 may be configured to receive a request to save the information and selections for the marketing experiment input so far. For example, the ME server 108 may be configured to detect an input at the “save” button displayed below the variation selection field 320 and store data related to the experiment allocated in Figs. 3B-3F in the database 104 and/or locally on the ME server 108.
[0046] Referring to Fig. 3G, the ME server 108 has received a request to save the information and selections for the type of allocated experiment. The ME server 108, in response to detecting an indication that the allocated experiment is to be saved, may cause the experimentation UI 300 to be updated similar to what is shown in Fig. 3G. For example, the experimentation UI 300 may include an allocated experiment drop-down menu 322 that includes a visual indication of the experiment allocated by the user (e.g., as illustrated in Figs. 3D-3F). In some embodiments, the ME server 108 is configured to detect an input at the allocated experiment drop-down menu 322 and cause the
experimentation UI 300 to display details such as, but not limited to, experimentation type and selected variations for that allocated experiment.
[0047] In some embodiments, the ME server 108 is configured to split customer account specific data included in the selected customer audience datasets based on a specified traffic percentage. For example, the experimentation UI 300 may include a traffic percentage field 324. The traffic percentage field 324 may correspond to the allocated experiment displayed at the menu 322. For example, in Fig. 3G, the experiment allocated in Figs. 3D-3F has a traffic percentage value of 100% indicating that there is no split in traffic. Traffic, as used herein, refers to customers and/or client devices 110 that access the customer facing UI. As such, in Fig. 3G, the ME server 108 is configured to cause 100% of the customer devices 110 accessing the customer facing UI 200 that are included in the audience selected at field 314 (e.g., the default customer audience) to display the interactable UI element variation (e.g., “Not Dog Content 11 2022-11-21”) selected in Fig. 3F at the zone 204 corresponding to the zone selected in field 310 during the period of time input at field 312. An example of splitting customers based on a specified traffic percentage is discussed in more detail below with reference to Figs. 3N-3P.
[0048] In some embodiments, the ME server 108 is configured to perform a marketing experiment on more than one customer audience. For example, in Fig. 3G, the ME server 108 may be configured to detect an input at UI element 326 indicating that a second customer audience is to be added to the marketing experiment.
[0049] Referring to Fig. 3H, the user has provided input at UI element 326 in Fig. 3G indicating that a second customer audience is to be added to the marketing experiment. For example, the ME server 108 may be configured to, in response to detecting the input at UI element 326, cause the experimentation UI 300 to include a customer audience selection menu 314 associated with the second segment (e.g., second customer audience) for the marketing experiment. The audience selection menu 314 displayed in Fig. 3H may be generally the same as discussed above with reference to Figs. 3C-3D. The ME server 108 may be configured to detect a selection of a customer audience at the selection menu 314 and cause the experimentation UI 300 to display an indication of the selection as shown in Fig. 3I.
[0050] Referring to Fig. 3I, the user has selected the “Dog Owner” customer audience and the experimentation UI 300 includes the experimentation selection menu 318. In Fig. 3I, the ME server 108 receives selection of the A/B test type of marketing experiment and causes the experimentation selection menu 318 to display a plurality of fields associated with the A/B test type. An A/B test as referenced herein may include a type of marketing experiment in which different variations of an
interactable UI element are displayed to different subsets of customers. For example, in an A/B test consisting of three variations (e.g., a control variation (“variation A”), and two non-control variations (“variations B-C”)), the control variation A may be displayed to 10% of customers interacting with the customer facing UI 200, the non-control variation B may be displayed to 60% of customers and the non-control variation C may be displayed to 30% of customers. Each variation may be different from each other in at least one aspect. For example, the control variation A may be a banner that is currently being rendered at the customer facing UI 200 and the non-control variations B-C may each be different versions of that banner (e.g., including different text, indicia, images, hyperlinks).
[0051] In response to detecting a selection of the A/B test type, the ME server 108 may be configured to cause the experimentation selection menu 318 to include a number of interactable fields associated with the A/B test type. For example, in Fig. 3I, the fields associated with the A/B test type include, but are not limited to, a winner rollout selection field (e.g., immediate, scheduled, manual), a control selection field, and a variation selection field. The winner rollout selection field may correspond to options for updating the customer facing UI 200 in response to the results of the A/B test. For example, following completion of an A/B test that includes two or more variations (e.g., at least one control variation and one or more non-control variations) there may be a variation that performs better than the others. A variation performing better than other variations may refer to the number of interactions that variation receives during the marketing experiment being greater than the number of interactions the remaining variations receive. The ME server 108 may be configured to determine the variation that has the highest number of interactions to be the winning variation.
[0052] The ME server 108 may be configured to monitor interactions with the different variations over the course of the marketing experiment and automatically determine the winning variation. For example, in an instance where the marketing experiment is an MAB test with a click through rate key metric, the ME server 108 may be configured to detect mouse click inputs or touch screen tap inputs at a rendered UI element variation 202 included in the customer facing UI 200 rendered on a client device 110. The ME server 108 may be configured to automatically generate and store a record of those interactions. For example, the ME server 108 is configured to detect the inputs at the UI element variations defined by the marketing experiment and generate a record indicating a number of times and/or frequency with which each UI element variation was interacted with. The ME server 108 may be configured to store the generated record in a local non-transitory computer readable storage medium and/or database 104. In some embodiments, the record of
interactions may be referred to as click stream data and the ME server 108 may be configured to transmit the click stream data to an external server or database (e.g., database 104) for storage. In instances where the marketing experiment is an MAB test, the ME server 108 and/or a server in communication with the ME server 108 may be configured to execute a machine learning (ML) algorithm on the click stream data and automatically adjust the traffic allocation to the different UI element variations based on the click stream data. The ME server 108 may be configured to, based on the generated record, automatically perform an analysis of the marketing experiment. For example, the record may include, for each variation in an experiment, a number of times the variation was interacted with (e.g., variation A was interacted with 10,000 times, variation B was interacted with 3000 times, and variation C was interacted with 30,000 times).
[0053] As such, the ME server 108 may be configured to cause the customer facing UI 200 to display the winning variation at the zone 204 corresponding to what is selected in the zone input field following the completion of the marketing experiment (e.g., immediately, at a scheduled date and/or time) to customers in the customer audience specified at input field 314. A winning variation, as referenced herein, may refer to a variation that during a marketing experiment most closely achieves a desired customer response. In some embodiments, a desired customer response may be related to a number of instances of interactions with a variation (e.g., the variation with the highest number of interactions being the winning variation). Referring to the example in the preceding paragraph, variation C would be the winning variation. As a further example, an A/B test executed on the customer facing UI 200 by the ME server 108 may include a control A and two variations B and C displayed at a specific zone 204 on the customer facing UI 200 for customers included in the Dog Owner customer audience, where some segments of the customer audience receive a control display and other segments of the customer audience receive one or more variations of the display. As a result of the marketing experiment, it may be determined by the ME server 108 that variation B is the winning variation for the Dog Owner customer audience.
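Reduced to code, the winner-selection rule described above is an argmax over recorded interaction counts; a minimal sketch (the function name is illustrative, not taken from the patent):

```python
def winning_variation(interactions):
    """Return the variation with the most recorded interactions.

    `interactions` maps a variation label to the number of times
    customers interacted with that variation during the experiment.
    """
    return max(interactions, key=interactions.get)

# Counts from the example record in paragraph [0052].
record = {"A": 10_000, "B": 3_000, "C": 30_000}
print(winning_variation(record))  # -> C
```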
[0054] As such, following the marketing experimentation the ME server 108 may be configured to automatically transmit an indication of the winning variation and corresponding zone placement to the storefront server 102. The storefront server 102 may be configured to cause the customer facing UI 200 to display the winning variation at the specified zone for customers included in the Dog Owner customer audience. In this manner, the system 100 of the present disclosure may be configured to automatically update the customer facing UI 200 based on an outcome of a marketing experimentation. For example, following a marketing experiment, the ME server 108 may be configured to cause the customer facing UI 200 to render the winning variation for customers
displaying the customer facing UI 200 on a client device regardless of whether those customers were involved in the corresponding marketing experiment or not.
[0055] Referring to Figs. 3J-3K, the ME server 108 has detected a selection of an MAB marketing test type. An MAB test, as referenced herein, is a type of marketing experiment similar to an A/B test except that for an MAB test the ME server 108 is configured to dynamically allocate customer traffic to variations that have a higher number of interactions than other variations. For example, the ME server 108 may be configured to, while executing an MAB test, automatically adjust the traffic allocation to variations that better achieve a desired customer response than those that do not. Further to this example, if during an MAB test a first variation A is achieving a higher number of interactions than a second variation B while the traffic allocation is split evenly at 50%, the ME server 108 may be configured to automatically adjust the traffic allocation such that variation A receives 60% of the traffic and variation B receives 40% of the traffic. As discussed above, the ME server 108 may be configured to cause the experimentation selection menu 318 to display a plurality of fields associated with the selected type of marketing experiment. In Figs. 3J-3K the selected type of marketing experiment is an MAB test. As such, the ME server 108 may be configured to cause the experimentation UI 300 to display a plurality of interactable fields at the selection menu 318 associated with the MAB test type. In some embodiments, the interactable fields displayed at the selection menu 318 include, but are not limited to, optimization selection fields (e.g., click through rate, revenue/1000 impressions), winner rollout selection fields (e.g., immediate, scheduled, manual), and at least two variation selection fields 320 (e.g., variation 1 and variation 2).
[0056] Referring to Fig. 3L, the ME server 108 receives a selection of interactable UI element variations at the selection fields 320 in generally the same manner as described above with reference to Figs. 3E-3F.
In some embodiments, the ME server 108 is configured to perform an A/B test and/or MAB test that includes more than two variations of interactable UI elements. For example, the experimentation UI 300 may include an “add variation” interactable element 328 that, when interacted with, causes the selection menu 318 to include another variation selection field 320. For example, the ME server 108 may be configured to detect an input at element 328 and cause the selection menu 318 to render an additional variation selection field 320. In some embodiments, the ME server 108 may be configured to include any number of variations of interactable UI elements in a marketing experiment. In some embodiments, the ME server 108 is configured to limit the number of variations to a predetermined maximum selected to prevent any one variation included in a marketing experiment from receiving too little traffic to yield statistically
significant results. In some embodiments, the maximum number of variations is less than or equal to about ten variations.
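The MAB test's dynamic reallocation described above (e.g., nudging an even 50/50 split to 60/40 toward the better-performing variation) can be sketched as a simple step-based policy. This is a stand-in for the ML-driven adjustment the disclosure mentions; production MAB systems typically use epsilon-greedy or Thompson sampling, and the step size and traffic floor below are assumed values:

```python
def rebalance(alloc, clicks, step=10.0, floor=10.0):
    """Shift `step` percentage points of traffic from the worst-performing
    variation to the best-performing one, never dropping any variation
    below `floor` percent so every variation keeps receiving data.

    `alloc` maps variation labels to traffic percentages (summing to 100);
    `clicks` maps the same labels to observed interaction counts.
    """
    best = max(clicks, key=clicks.get)
    worst = min(clicks, key=clicks.get)
    if best == worst:
        return dict(alloc)
    moved = min(step, alloc[worst] - floor)
    new_alloc = dict(alloc)
    new_alloc[worst] -= moved
    new_alloc[best] += moved
    return new_alloc

# The 50/50 -> 60/40 adjustment from paragraph [0055]:
print(rebalance({"A": 50.0, "B": 50.0}, {"A": 1200, "B": 800}))
# -> {'A': 60.0, 'B': 40.0}
```

Calling `rebalance` periodically on fresh click-stream data approximates the continuous reallocation behavior without ever starving a variation entirely.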
[0057] Referring to Fig. 3M, the ME server 108 receives a plurality of inputs and selections generally associated with repeating the process of adding variation selection fields 320 and selecting variations such that there are five different variations included in the MAB test as outlined above. For example, in Fig. 3M, the selected variations include “Dog Content # 2022-11-21” where # in this instance is between 1-5. Each of the variations selected in Fig. 3M may be different from one another in at least one way. For example, Dog Content 1 2022-11-21 may correspond to a weight loss dog food product and Dog Content 2 2022-11-21 may correspond to a dog food product for improving digestion, and so on for each of the different variations selected. As such, the user may save the allocated MAB marketing experiment type with the five variations. Referring to Fig. 3N, the ME server 108 may, in response to detecting an indication that information and selections for the allocated MAB test have been provided (e.g., detecting input at the “save” button in the selection menu 318 in Fig. 3M) cause the experimentation UI to display an indication of the allocated experiment. For example, the experimentation UI 300 may include the allocated experiment dropdown menu 322 and traffic percentage input field 324 as described above. In some embodiments, the ME server 108 is configured to perform one or more different experiments on a customer audience (e.g., customer audience input at audience selection menu 314).
[0058] For example, the ME server 108 may be configured to detect an input at the “add allocation” button 330 displayed at the experimentation UI 300. In response to detecting an input at the button 330, the ME server 108 may cause the experimentation UI 300 to display a new selection menu 318 in combination with the allocated experiment drop-down menu 322 (as shown in Fig. 3O) corresponding to the previously allocated experiment.
[0059] Referring to Fig. 3O, the ME server 108 receives an allocation of a second experiment to the marketing experiment for the customer audience input at customer audience selection field 314 (e.g., Dog Owners). The ME server 108 may be configured to automatically adjust the percentage values in the traffic percentage input fields 324 according to the number of allocated experiments. For example, in Fig. 3O, the traffic percentage for the allocated MAB test created in Figs. 3J-3N changed from 100% to 50% corresponding to there being a total of two allocated experiments. As such, in some embodiments, the ME server 108 may be configured to require that the sum total of percentages in the traffic percentage input fields 324 is equal to one hundred. The ME server 108 may be configured to detect inputs at the selection menu 318 in generally the same manner as described above. For example, the user may repeat the process of allocating an experiment by
interacting with the selection menu 318 shown in Fig. 3O in generally the same manner as discussed above such that a second experiment is allocated to the Dog Owner customer audience in accordance with the marketing experiment being created. In some embodiments, the ME server 108 is configured to limit the number of allocations to be less than or equal to a predetermined maximum number of allocations. In some embodiments, the maximum number of allocations, similar to the maximum number of variations, is a number of allocations selected to ensure that for a given experiment no variation receives too little traffic to yield statistically significant results. In some embodiments, the maximum number of allocations and/or variations may be a function of expected or recorded levels of customer traffic at the customer facing UI 200. For example, as expected or historical customer traffic increases, the maximum number of allocations and/or variations may also increase. In some embodiments, the maximum number of allocations is about ten allocations.
[0060] Referring to Fig. 3P, the ME server 108 receives the addition of a third customer audience (e.g., segment) to the marketing experiment, where the third customer audience is Multi Pet Owners, and three different marketing tests are allocated to those customer audiences, in generally the same manner as described above with reference to Figs. 3B-3O. In Fig. 3P, the three allocated experiments include two different A/B tests and an MAB test. The percentage of traffic allocation is set to be 33% for each of the A/B tests and 34% for the MAB test. However, the ME server 108 may be configured to detect a value input at one or more of the traffic percentage fields 324 and update the traffic percentage for the corresponding allocated test accordingly. For example, the ME server 108 may be configured to detect an input at the field 324 corresponding to the MAB test including a value of sixty and update the traffic percentage displayed therein to be 60%. The ME server 108 may be configured to automatically adjust the remaining percentages such that the sum total of all the percentages is 100%. For example, if the MAB traffic percentage is 60%, then each A/B traffic percentage may be automatically updated by the ME server 108 to be 20% (e.g., 60% + 20% + 20% = 100%). The ME server 108 may be configured to repeat this process to alter the traffic percentage for one of the A/B tests as well. The ME server 108 may be configured to perform multiple different experiments on a customer audience in accordance with any distribution of traffic.
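One reading of the automatic percentage adjustment described above (set one field, rebalance the rest so the total stays at 100%) is an even redistribution across the remaining allocations, sketched below; the server could equally rebalance proportionally, and the allocation labels are illustrative:

```python
def redistribute(percentages, changed, new_value):
    """Set one allocation's traffic percentage and spread the remainder
    evenly across the other allocations so the total stays at 100."""
    others = [k for k in percentages if k != changed]
    share = (100.0 - new_value) / len(others) if others else 0.0
    out = {k: share for k in others}
    out[changed] = new_value
    return out

# Fig. 3P: setting the MAB test to 60% leaves 20% for each A/B test.
print(redistribute({"A/B #1": 33, "A/B #2": 33, "MAB": 34}, "MAB", 60))
# -> {'A/B #1': 20.0, 'A/B #2': 20.0, 'MAB': 60}
```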
[0061] Still referring to Fig. 3P, the ME server 108 receives a selection to publish the marketing experiment generated in Figs. 3A-3P such that it is set to a “scheduled” or “pending” status. The ME server 108 may be configured to detect the indication that a marketing experiment has been created (e.g., the selection of the “publish” button in Fig. 3P) and generate data corresponding to the created marketing experiment. In some embodiments, the ME server 108 is configured to store the generated marketing experiment data in a non-transitory storage medium included in the ME server 108 and/or
in the database 104. The marketing experiment data may include an indication of one or more of: an experiment name, zone, start date, end date, customer audiences, and one or more allocated experiments corresponding to the customer audiences.
[0062] For example, the marketing experiment data for the marketing experiment generated in Figs. 3A-3P may include an indication that the name of the experiment is “Dog Cat Multi Pet Experience 2022-11-21” and that the zone of the customer facing UI where the experiment is to be performed is the zone “GoodyBox_Hero_Promo_Food_Supplement”. Continuing this example, the marketing experimentation data may indicate that the start date is November 21, 2022 at 12:00 AM and the end date is November 27, 2022 at 11:59 PM. Further to this example, the marketing experimentation data may include an indication that there are three customer audiences (e.g., segments) to be included in the experiment, those three audiences being Dog Owners (e.g., segment 2), Multi Pet Owners (e.g., segment 3), and the default audience (e.g., segment 1) which includes all customer audiences other than Dog Owners and Multi Pet Owners. Still referring to the example illustrated in Figs. 3A-3P, the marketing experimentation data may include for each customer audience, or segment, an indication of the allocated experiments and the variations and traffic percentages associated therewith. For example, the marketing experimentation data for the default customer audience may include an indication that there is a simple targeting/share of voice test with one variation at 100% traffic allocation. Similarly, for the Dog Owner customer audience, the marketing experimentation data may include an indication that there is an MAB test with five variations at a 50% traffic allocation and a simple targeting/share of voice test with one variation at a 50% traffic allocation. Furthermore, for the Multi Pet Owner customer audience, the marketing experimentation data may include an indication that there is a first A/B test with two variations at a 33% traffic allocation, an MAB test with 4 variations at a 34% traffic allocation, and a second A/B test with two variations at a 33% traffic allocation.
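The marketing experiment record described in this paragraph might be represented roughly as follows. The field names and structure are illustrative only, not the patent's actual schema:

```python
# Sketch of the experiment record from paragraph [0062]; keys are assumed.
experiment = {
    "name": "Dog Cat Multi Pet Experience 2022-11-21",
    "zone": "GoodyBox_Hero_Promo_Food_Supplement",
    "start": "2022-11-21T00:00",
    "end": "2022-11-27T23:59",
    "segments": [
        {"audience": "default", "allocations": [
            {"type": "share_of_voice", "variations": 1, "traffic_pct": 100}]},
        {"audience": "dog_owner", "allocations": [
            {"type": "MAB", "variations": 5, "traffic_pct": 50},
            {"type": "share_of_voice", "variations": 1, "traffic_pct": 50}]},
        {"audience": "multi_pet_owner", "allocations": [
            {"type": "A/B", "variations": 2, "traffic_pct": 33},
            {"type": "MAB", "variations": 4, "traffic_pct": 34},
            {"type": "A/B", "variations": 2, "traffic_pct": 33}]},
    ],
}

# Every segment's allocations must total 100% of that segment's traffic.
for seg in experiment["segments"]:
    assert sum(a["traffic_pct"] for a in seg["allocations"]) == 100
```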
[0063] Referring to Fig. 4, there is shown a flowchart illustrating a method 400 of automatically performing a marketing experiment via the system 100 in accordance with an exemplary embodiment of the present disclosure. The method 400 is described with reference to a single customer (e.g., “Bob”) accessing the customer facing UI 200 via a client device 110 while at least one marketing experiment is actively running. However, it should be understood that the methods discussed herein may be applied to a plurality of customers accessing the customer facing UI 200 either simultaneously or at different times.
[0064] Prior to the method 400, a customer (e.g., “Bob”) may submit a request to access the customer facing UI 200 via client device 110 and log in to their customer specific account. The
storefront server 102 may be configured to detect the request and the log in credentials and automatically associate the client device 110 with the customer specific account data stored on database 104. In response to a successful log in, the storefront server 102 may be configured to transmit an indication to the ME server 108 that a client device 110 is requesting access to the customer facing UI 200. In some embodiments, the indication transmitted from the storefront server 102 to the ME server 108 includes a unique visitor and/or customer identification number (e.g., a customer ID). For example, in response to Bob logging in at the client device 110 the storefront server 102 may transmit the unique customer ID for Bob to the ME server 108.
[0065] The method 400 may include the step 402 of, at the ME server 108, determining customer specific data. In some embodiments, the ME server 108 is configured to perform step 402 in response to a client device 110 displaying the customer facing UI 200 or requesting to display the customer facing UI 200. In some embodiments, the determined customer specific data includes, but is not limited to, the customer name, a unique customer identifier (e.g., customer ID), and one or more audiences in which the customer is included. For example, and as illustrated in Fig. 4, the customer specific data includes the customer’s name “Bob”, customer ID “1234”, and a listing of the audiences in which the customer is included, which in this example is cat_owner, autoship_customer, and high_roller. In some embodiments, the customer name, customer ID, and listing of audiences are stored in database 104 and the ME server 108 is configured to search or query the database 104 for that data. In other embodiments, the storefront server 102 and/or audience server 106 are configured to transmit the customer data to the ME server 108.
[0066] In some embodiments, the step 402 includes determining a unique traffic bucket value for the customer. The ME server 108 may be configured to determine the unique traffic bucket value based on a determination of an experiment being run on the customer facing UI 200 that the client device 110 is currently displaying. For example, the ME server 108 may be configured to determine that there is a zone 204 displayed at the customer facing UI 200 that has a corresponding marketing experiment being currently run (e.g., see Fig. 3A above). The ME server 108, in response to determining that there is a marketing experiment being run at a zone 204 included in the customer facing UI 200 displayed on the client device 110 may determine a unique ID for the experiment. For example, each experiment having associated experiment data stored in the database 104 and/or ME server 108 (as discussed above) may include a unique identifier (e.g., a unique numeric value). In response to determining the unique experiment ID, the ME server 108 may be configured to generate the traffic bucket value.
[0067] In some embodiments, the ME server 108 is configured to generate the traffic bucket value by combining the customer ID with the experiment ID and performing a hashing function on the combination. For example, if the customer ID is 1234 and the experiment ID is 5476, the ME server 108 may be configured to add the two values together and perform a hashing function. In some embodiments, the hashing function is a modulo operation. For example, in Fig. 4 the customer ID “1234” and experiment ID “5476” are added together and the resulting sum is used in a modulo operation where the modulus is equal to 10,000 (e.g., (experiment ID + customer ID) % X, where X=10,000). The modulus value may be any predetermined value. In Fig. 4, the determined traffic bucket value is 3921. In this manner, the ME server 108 may be configured to generate the same traffic bucket value for a specific customer and specific marketing experiment each time the customer facing UI 200 is displayed to the customer device 110.
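The bucketing formula as written is a plain sum-then-modulo, sketched below. Note that Fig. 4 reports a bucket value of 3921 for these inputs, while (1234 + 5476) % 10,000 is 6710, which suggests an additional hashing step the text does not spell out; this sketch implements the formula as literally stated:

```python
def traffic_bucket(customer_id, experiment_id, modulus=10_000):
    """Combine the customer and experiment IDs and reduce modulo a
    fixed value, per the formula (experiment ID + customer ID) % X.
    The same customer/experiment pair always yields the same bucket,
    so a returning customer keeps seeing the same variation."""
    return (customer_id + experiment_id) % modulus

print(traffic_bucket(1234, 5476))  # (1234 + 5476) % 10000 -> 6710
```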
[0068] The method 400 may include the step 404 of, at the ME server 108, assigning the customer to a customer audience included in the experiment. For example, in Fig. 4, the marketing experiment determined by the ME server 108 includes three different customer audiences: dog_owner, cat_owner, and high_roller. The ME server 108 may be configured to compare the customer account data with the customer audiences indicated in the marketing experiment data to determine if there are any matches. For example, in Fig. 4, the customer “Bob” is included in the cat_owner and high_roller customer audiences, both of which are included in the marketing experiment. The ME server 108 may be configured to determine which of the matching customer audiences to assign the specific customer to. In some embodiments, the customer audiences included in the marketing experiment may have associated priority data indicating a priority of one audience over the others. For example, in Fig. 4, the cat_owner audience has a higher priority (e.g., priority: 10) than the high_roller audience (e.g., priority: 1). The ME server 108 may be configured to assign a customer to a corresponding audience based on the associated priority value. For example, as illustrated in Fig. 4, the customer is automatically assigned to the cat_owner customer audience instead of the high_roller customer audience because the cat_owner customer audience has a higher priority value. In some embodiments, the ME server 108 is configured to detect input at the experimentation user interface 300 indicating priority values for one or more customer audiences included in a marketing experiment during creation of a new marketing experiment (e.g., as shown and described above with reference to Figs. 3A-3P) or the editing of a pending experiment.
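The priority-based assignment in step 404 amounts to picking the highest-priority match between the customer's audiences and the experiment's audiences; a minimal sketch (the dog_owner priority value is assumed, since Fig. 4 only gives priorities for cat_owner and high_roller):

```python
def assign_audience(customer_audiences, experiment_audiences):
    """Pick the matching experiment audience with the highest priority.

    Returns None when nothing matches, i.e. the customer falls into
    the default segment.
    """
    matches = [a for a in experiment_audiences if a["name"] in customer_audiences]
    if not matches:
        return None
    return max(matches, key=lambda a: a["priority"])["name"]

# Fig. 4: Bob belongs to cat_owner, autoship_customer, and high_roller;
# the experiment targets dog_owner, cat_owner, and high_roller.
experiment_audiences = [
    {"name": "dog_owner", "priority": 5},   # assumed value; Fig. 4 only
                                            # gives cat_owner (10) and
                                            # high_roller (1)
    {"name": "cat_owner", "priority": 10},
    {"name": "high_roller", "priority": 1},
]
print(assign_audience({"cat_owner", "autoship_customer", "high_roller"},
                      experiment_audiences))  # -> cat_owner
```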
[0069] The method 400 may include the step 406 of, at the ME server 108, automatically assigning the customer to an allocated experiment included in the audience to which they are
assigned. For example, each audience included in the marketing experiment may include one or more allocated experiments, as discussed above with reference to Figs. 3O-3P. In Fig. 4, there are three allocated experiments for the cat_owner audience (e.g., a targeted placement experiment, an A/B test with three variations, and an MAB test with three variations). The ME server 108 may be configured to determine which of the allocated experiments the customer should be assigned to. In some embodiments, the ME server 108 is configured to determine the assignment based on the traffic bucket value determined at step 402. In some embodiments, the ME server 108 is configured to assign a number of customers to each allocated experiment corresponding to the traffic percentage of each allocated experiment. For example, in Fig. 4 the traffic percentages for the allocated experiments are 20%, 40%, and 40% from top to bottom and the ME server 108 may be configured to assign 20% of the customers accessing the customer facing UI 200 that are included in the cat_owner customer audience to the targeted placement experiment, 40% to the A/B test, and 40% to the MAB test.
[0070] In some embodiments, the ME server 108 is configured to achieve the desired traffic allocation based on the hashing function. For example, in Fig. 4, each allocated experiment includes a value range corresponding to the traffic percentage (e.g., values between 0-1999 for the targeted placement, values between 2000-5999 for the A/B test, and values between 6000-9999 for the MAB test). In Fig. 4, the traffic bucket value for the customer “Bob” determined by the ME server 108 is equal to 3921. The ME server 108 may automatically determine which value range for the different allocated experiments the traffic bucket value falls within. In this example, the value 3921 falls within the range of values corresponding to the A/B test. The ME server 108 may be configured to perform the modulo operation in step 402 based on the value range of the allocated experiments (e.g., (customer ID + experiment ID) % 10000). In this manner, as the volume of traffic (e.g., the number of customers accessing the customer facing UI) increases, the likelihood of meeting the desired traffic allocation percentages may be increased.
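The hashing and range-lookup scheme of this paragraph might be implemented as sketched below, using the Fig. 4 values (a modulus of 10,000 and the three bucket ranges); the function names are assumptions, and the customer/experiment IDs shown are hypothetical values chosen so the bucket works out to Bob's 3921.

```python
def traffic_bucket(customer_id, experiment_id, modulus=10_000):
    # Deterministic hash: the same customer always lands in the same
    # bucket for a given experiment (step 402).
    return (customer_id + experiment_id) % modulus


def allocate(bucket, ranges):
    """ranges maps experiment name -> inclusive (low, high) bucket range."""
    for name, (low, high) in ranges.items():
        if low <= bucket <= high:
            return name
    return None


# Value ranges from Fig. 4, sized to the traffic percentages.
ranges = {
    "targeted_placement": (0, 1_999),   # 20% of traffic
    "ab_test": (2_000, 5_999),          # 40%
    "mab_test": (6_000, 9_999),         # 40%
}

# A bucket value of 3921 (as computed for "Bob") falls in the A/B range.
assert allocate(traffic_bucket(3_900, 21), ranges) == "ab_test"
```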
[0071] In some instances, two or more customers having different customer IDs may result in the same modulo operation output and be assigned to the same experiment allocation and/or variation (as discussed below). For example, the modulus value in Fig. 4 is 10,000; however, more than 10,000 customers (e.g., 1,000,000 customers) may access the customer facing UI, each of which has a unique customer ID. Even though the allocation of 40% to the A/B test in Fig. 4 corresponds to only 4,000 modulo operation outputs (values 2000-5999), the number of customers assigned to that allocation may be generally equal to 40% of 1,000,000 (400,000 customers). In some instances, the actual number of customers assigned to a given allocation may be within about +/- 1% of the target allocation percentage. For example, for the A/B test with a target allocation of 40%, the actual allocation may be between about 39% and about 41% of the total number of customers accessing the customer facing UI 200 at the same or different times.
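The roughly +/- 1% tolerance can be illustrated with a quick simulation over randomly drawn customer IDs. The experiment ID of 42, the random seed, and the sample size are arbitrary assumptions for the sketch.

```python
import random

random.seed(0)  # deterministic for illustration
n = 100_000
ab_count = 0
for _ in range(n):
    customer_id = random.randrange(10**9)   # arbitrary large ID space
    bucket = (customer_id + 42) % 10_000    # hypothetical experiment ID 42
    if 2_000 <= bucket <= 5_999:            # the 40% A/B-test range of Fig. 4
        ab_count += 1

share = ab_count / n
# With uniform IDs the observed share lands near 0.40, comfortably
# within the +/- 1% band described in the text.
assert abs(share - 0.40) < 0.01
```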
[0072] In some embodiments, the method 400 may include the step 408 of, at the ME server 108, determining that the allocated experiment is an A/B test and assigning the customer to a variation included in the A/B test. In some embodiments, the ME server 108 is configured to perform another modulo operation based on the customer ID and experiment ID and based on the number of variations included in the A/B test. For example, in Fig. 4, the A/B test includes three variations A, B, and C. The ME server 108 may be configured to perform a modulo operation based on the following: (customer ID + experiment ID) % 3. The ME server 108 may be configured to, based on the outcome of the modulo operation (e.g., a value of 0, 1, or 2), determine which of the variations to assign to the customer. In Fig. 4, the outcome of the modulo operation at step 408 is 1, thereby resulting in the customer being assigned to variation B. By determining the variation assignment based on a hashing function (e.g., a modulo operation), the ME server 108 may be configured to ensure that the same customer is always assigned to the same variation for an A/B test. This may be particularly beneficial in ensuring the accuracy of the results of the A/B test. Furthermore, by performing the modulo operation at step 408 for A/B tests, the ME server 108 may be configured to ensure that the same customer is assigned to the same variation regardless of whether the traffic allocation percentage is altered during the run of the experiment. This may enable the system 100 to perform accurate A/B tests while simultaneously allowing the desired traffic percentages to be altered. For example, if the traffic percentage for the A/B test were altered while the marketing experiment is being performed, the ME server 108 is configured to still assign the customer “Bob” to the B variation in the A/B test.
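Step 408 reduces to a second deterministic hash over the variation count, which is why the assignment survives changes to the traffic allocation. A minimal sketch (the customer and experiment IDs below are hypothetical, chosen so the modulo outcome is 1 as in Fig. 4):

```python
def ab_variation(customer_id, experiment_id, variations):
    # Depends only on the IDs and the number of variations, so the same
    # customer gets the same variation even if the traffic allocation
    # percentage is altered mid-experiment.
    return variations[(customer_id + experiment_id) % len(variations)]


variations = ["A", "B", "C"]
# (7 + 3) % 3 == 1, so this hypothetical customer lands on variation B,
# mirroring the "Bob" example in Fig. 4.
assert ab_variation(7, 3, variations) == "B"
```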
[0073] In some embodiments, the ME server 108 may be configured to omit the step 408 for MAB tests and targeted placement tests. The ME server 108 may be configured to, in an instance where the allocated experiment is an MAB test, automatically adjust the variations assigned to different customers in a dynamic manner during the course of the experiment. For example, when executing an MAB test, the ME server 108 may be configured to automatically adjust the allocation of customers to variations included in the MAB test based on the monitored performance of the different variations. For example, in an instance where there are three variations in an MAB test, the ME server 108 may be configured to determine that at a point in time following the start of the experiment, a first variation is performing better than the remaining two variations and allocate a larger percentage of the customers to that first variation.
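The dynamic MAB reallocation could take many forms, and the disclosure does not prescribe a specific bandit algorithm; the following is only a toy weighting scheme, under that assumption, that shifts traffic toward the better-performing variation while keeping an exploration floor so weaker variations still receive some traffic.

```python
def mab_weights(successes, trials, floor=0.1):
    """Weight each variation by its observed success rate, blended with
    a uniform floor so under-performing variations are not starved."""
    rates = [s / t if t else 0.0 for s, t in zip(successes, trials)]
    total = sum(rates) or 1.0
    proportional = [r / total for r in rates]
    k = len(proportional)
    return [(1 - floor) * w + floor / k for w in proportional]


# The first variation converts best (120/1000), so it receives the
# largest share of traffic at this point in the experiment.
weights = mab_weights(successes=[120, 40, 30], trials=[1000, 1000, 1000])
```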
[0074] In this manner, the system 100 may be configured to automatically perform, for each customer accessing the customer facing UI 200, a marketing experiment based on one or more customer audiences and one or more allocated experiments. For example, the ME server 108 may be configured to, for each customer accessing the customer facing UI 200, and for each marketing experiment at a different zone of the customer facing UI 200, assign each customer to a specific allocated experiment and, in some instances, a specific variation. The ME server 108 may be configured to transmit an indication related to that determination to the storefront server 102. In response to receiving the indication, the storefront server 102 may transmit instructions to the client device 110 to render the customer facing UI 200 including the determined variation. In this manner, the ME server 108 may be configured to cause the storefront server 102 to execute a marketing test including rendering, at the customer facing UI 200, the experimentation content at one of the user-interactable UI elements on customer devices 110 corresponding to the target customer audience. For example, and still referring to Fig. 4, the ME server 108 may cause the storefront server 102 to execute the marketing experiment such that the customer device 110 associated with the customer “Bob” renders variation B for a specific interactable UI element 202 on the customer facing UI 200.
[0075] Referring to Figs. 5A-5E, there are shown exemplary user interfaces illustrating automatic performance of a marketing experiment via the system 100 for one or more different users and/or customer audiences, in accordance with an exemplary embodiment of the present disclosure. Figs. 5A-5B illustrate an experimentation UI 300 rendered on an admin device 112 and including a display of one or more input parameters and/or selections at least partially defining a marketing experiment that is scheduled to be performed by the ME server 108. Figs. 5C-5E illustrate customer facing user interfaces 200 as rendered on three different client devices 110a-110c corresponding to three different users and during execution of the marketing experiment established in Figs. 5A-5B.
[0076] Referring to Figs. 5A-5B, the experimentation UI 300 is shown as depicting settings and/or details of a pending, or currently executing, marketing experiment. For example, the ME server 108 may be configured to detect a selection of a scheduled or pending marketing experiment at the table 302 of marketing experiments 304 as shown in Fig. 3A and cause the details of the selected marketing experiment to be rendered at an admin device 112, similar to what is illustrated in Figs. 5A-5B. The marketing experiment shown in Figs. 5A-5B may have been created in a similar manner as described above with regard to Figs. 3A-3P. In Figs. 5A-5B, the marketing experiment is titled “hp-zone-14-specialty-test” and is scheduled to run from January 23, 2023 at 3:12:00 PM to January 31, 2023 at 7:00:00 PM.
[0077] As discussed above, the ME server 108 may be configured to display variations of one or more interactable elements 202 rendered at a selected zone 204 of the customer facing UI 200. In Figs. 5A-5B, the zone selected at the zone selection field 310 is the “new-deliver-hp-zone-14” zone of the customer facing UI 200, which is outlined in Figs. 5C-5E. As discussed above, the ME server 108 may be configured to perform a marketing experiment that includes a plurality of different customer audiences (e.g., traffic allocation segments) selected at the customer audience selection menu 314. For example, in Figs. 5A-5B, the marketing experiment titled “hp-zone-14-specialty-test” includes at least eight different customer audiences (e.g., segments 1-8). The first customer audience selected at the audience selection menu 314 illustrated in Fig. 5A (e.g., segment 1) is a horse owner audience (e.g., “horse_1172021”) corresponding to customers and/or users who are included in a horse owner customer audience. The second customer audience selected at the audience selection menu 314 illustrated in Fig. 5B (e.g., segment 2) is a farm animal owner audience (e.g., “Farm_1172021”) corresponding to customers and/or users who are included in a farm animal owner customer audience. For the sake of brevity, only the first two segments of the total number of segments included in the marketing experiment will be referenced when describing aspects of the present disclosure.
[0078] For each selected customer audience, the ME server 108 may be configured to perform one or more specific types of marketing experiments selected at the experimentation selection menu 318 and in accordance with a traffic allocation percentage. For example, for each of the two customer audiences illustrated in Figs. 5A-5B, there is a single A/B test to be performed by the ME server 108 with a 100% traffic allocation. Each A/B test, as illustrated in Figs. 5A-5B, includes two interactable element variations selected at fields 320. For the first customer audience (e.g., horse owners), the two selected variations are 1) a control variation (e.g., “hp-shop-by-pet_control”) and 2) a non-control variation corresponding to popular horse customer products (e.g., “hp-horse-customer-favorites”). For the second customer audience (e.g., farm animal owners), the two selected variations are 1) the control variation (e.g., “hp-shop-by-pet_control”) and 2) a non-control variation corresponding to popular farm animal customer products (e.g., “hp-farm-animal-customer-favorites”).
[0079] Referring to Figs. 5C-5E, in response to the ME server 108 detecting the rendering of the customer facing UI 200 at a client device 110, the ME server 108 may be configured to automatically determine a customer audience associated with the client device 110 and cause a variation of one or more interactable UI elements to be rendered thereon in accordance with a currently running marketing experiment. For example, and as illustrated in Figs. 5C-5E, three different users (“Tom”, “Jerry”, “Pinky”) access the customer facing UI 200 via different client devices 110a-110c while the marketing experiment “hp-zone-14-specialty-test” discussed above with reference to Figs. 5A-5B is currently running. The ME server 108 may be configured to detect the users accessing the customer facing UI 200 via the respective client devices 110a-110c and automatically determine customer specific data for each user in generally the same manner as discussed above with reference to step 402 in Fig. 4. The user Tom at the first client device 110a and the user Jerry at the second client device 110b are both included in the horse owner (e.g., horse_1172021) customer audience. The ME server 108 may be configured to automatically assign each of the first and second client devices 110a, 110b to the A/B test for the first segment of the marketing experiment “hp-zone-14-specialty-test”. Similarly, the user Pinky at the third client device 110c may be included in the farm animal owner (e.g., Farm_1172021) customer audience. The ME server 108 may be configured to automatically assign the third client device 110c to the second segment of the marketing experiment “hp-zone-14-specialty-test”.
[0080] The ME server 108 may assign each of the users Tom, Jerry, and Pinky to the respective A/B tests for the first and second segments of the marketing experiment in generally the same manner as discussed above with reference to steps 404 and 406 in Fig. 4. It should be noted that, because there is only a single A/B test for each of the first and second segments (e.g., a 100% traffic allocation to the A/B tests for each segment) of the marketing experiment being performed in Figs. 5A-5E, the ME server 108 may be configured to automatically assign client devices 110a-110c to the respective A/B tests regardless of the calculated traffic bucket values. For example, in Fig. 4, the determined traffic bucket value of 3921 for the user Bob resulted in the client device being assigned to an A/B test having a 40% traffic allocation. However, in Figs. 5A-5B the A/B tests have a 100% traffic allocation, and the ME server 108 is configured to assign all customers included in the respective customer audience segments (e.g., horse owners, farm animal owners) to the corresponding A/B tests.
[0081] In some embodiments, the ME server 108 may be configured to, in response to determining that a customer device 110 is assigned to an A/B test, render at the customer facing UI 200 displayed on the customer device 110 an interactable UI element variation included in the A/B test. The ME server 108 may be configured to calculate an A/B bucket value, in a manner generally the same as step 408 described above with reference to Fig. 4, for each user assigned to an A/B test. Furthermore, the ME server 108 may be configured to render at the customer facing UI 200 an interactable UI element variation based on the calculated A/B bucket value. For example, in Fig. 5C, the user Tom is assigned to the A/B test for horse owners and the calculated A/B bucket value is
zero. The ME server 108 may be configured to render the first variation (e.g., the control variation) at the zone 204 including one or more interactable UI element variations 202a included in the first variation. In Fig. 5D, the user Jerry is assigned to the A/B test for horse owners and the calculated A/B bucket value is one. The ME server 108 may be configured to render the second variation (e.g., the non-control variation) at the same zone 204 including one or more interactable UI element variations 202b included in the second variation. Similarly, in Fig. 5E, the user Pinky is assigned to the A/B test for farm animal owners and the calculated A/B bucket value is one. The ME server 108 may be configured to render the second variation (e.g., the non-control variation) at the same zone 204 including one or more interactable UI element variations 202c included in the second variation. [0082] In some embodiments, the system 100 and/or method 400 discussed above may be configured to improve the performance of automated marketing experiments. System 100 and/or method 400 of the present disclosure may be configured to perform marketing experiments at the server-side (e.g., the ME server 108, audience server 106, and/or storefront server 102). In some instances, by performing the marketing experiments at the server-side, the risk of skewed experiment results and/or increases in rendering time may be eliminated or at least reduced. This may be achieved, in some instances, by avoiding rendering web pages and/or marketing experiment content at the client device and/or by avoiding content blocking software running on the client device such as, but not limited to, advertisement blocking software and cookie blocking software. [0083] It will be appreciated by those skilled in the art that changes could be made to the exemplary embodiments shown and described above without departing from the broad inventive concepts thereof. 
It is to be understood that the embodiments and claims disclosed herein are not limited in their application to the details of construction and arrangement of the components set forth in the description and illustrated in the drawings. Rather, the description and the drawings provide examples of the embodiments envisioned. The embodiments and claims disclosed herein are further capable of other embodiments and of being practiced and carried out in various ways.
[0084] Specific features of the exemplary embodiments may or may not be part of the claimed invention and various features of the disclosed embodiments may be combined. Unless specifically set forth herein, the terms “a”, “an” and “the” are not limited to one element but instead should be read as meaning “at least one”. Finally, unless specifically set forth herein, a disclosed or claimed method should not be limited to the performance of its steps in the order written, and one skilled in the art can readily appreciate that the steps may be performed in any practical order.
Claims
1. A system for automatically performing marketing experimentation and analysis, the system comprising: a storefront server in communication with a plurality of customer devices, the storefront server configured to transmit to the customer devices instructions to render a customer-facing user interface (UI) on the customer device, the customer-facing UI including a plurality of user-interactable UI elements; a database in communication with the storefront server, the database having stored thereon customer specific data corresponding to the plurality of customer devices; an audience server in communication with the database and storefront server, the audience server configured to monitor customer interactions with the customer facing UI and determine a plurality of customer audiences based on one or more customer interactions with the customer facing UI, each customer audience being associated with a similar customer interaction; and a marketing experimentation (ME) server in communication with the audience server, storefront server, and an admin device, the ME server configured to: generate an experimentation UI and transmit the experimentation UI to the admin device for display, the experimentation UI including a menu rendering of the plurality of customer audiences; detect, at the experimentation UI, a user selection of: one of the user-interactable UI elements for the customer-facing UI, a target customer audience from the menu, and experimentation content to be rendered in association with the selected user-interactable UI element; in response to detecting the user selection, cause the storefront server to execute a marketing test including rendering, at the customer facing UI, the experimentation content at the selected one of the user-interactable UI elements on the customer devices corresponding to the target customer audience; and automatically generate a user interaction record based on detected interactions with the experimentation content displayed at
the customer facing UI.
2. The system of claim 1, wherein the ME server is further configured to: detect, at the experimentation UI, a user selection of a marketing experimentation type; and cause the storefront server to render, at the customer facing UI, the experimentation content according to the selected marketing experimentation type.
3. The system of claim 2, wherein the marketing experimentation type is one of an A/B test, and a multi-armed bandit test.
4. The system of any one of claims 1-3, wherein the ME server is further configured to: detect, at the experimentation UI, a selection of a second experimentation content, a first allocation amount, and a second allocation amount; and in response to detecting the user selection, cause the storefront server to execute the marketing test including: rendering, at the customer facing UI, the experimentation content at the one of the user-interactable UI elements on a number of customer devices corresponding to the target customer audience and the first allocation amount; and rendering, at the customer facing UI, the second experimentation content at the one of the user-interactable UI elements on a number of customer devices corresponding to the target customer audience and the second allocation amount.
5. The system of any one of claims 1-4, wherein the audience server is configured to automatically determine customer audiences via machine learning.
6. The system of any one of claims 1-5, wherein one or more of the customer audiences is comprised of customers having corresponding customer specific data including an indication of a similar geographical area.
7. The system of any one of claims 1-6, wherein one or more of the customer audiences is comprised of customers having corresponding customer specific data including an indication of a similar type of customer device used to interact with the customer facing UI.
8. The system of claim 1, wherein the experimentation content is a plurality of different experimentation contents, and the ME server is configured to: in response to detecting the user selection, cause the storefront server to execute the marketing test including, for each experimentation content included in the plurality thereof:
rendering, at the customer facing UI, the corresponding experimentation content at the one of the user-interactable UI elements on customer devices corresponding to a subset of the target customer audience.
9. The system of any one of claims 1-8, wherein the ME server is configured to detect one or more user defined priorities at the experimentation UI and cause the storefront server to execute the marketing experiment in accordance with one or more user defined priorities.
10. A method of automatically performing marketing experimentation and analysis, the method comprising: at a storefront server in communication with a plurality of customer devices, transmitting to the customer devices instructions to render a customer-facing user interface (UI) on the customer device, the customer-facing UI including a plurality of user-interactable UI elements; at an audience server in communication with the storefront server, monitoring customer interactions with the customer facing UI and determining a plurality of customer audiences based on one or more customer interactions with the customer facing UI, each customer audience being associated with a similar customer interaction; and at a marketing experimentation (ME) server in communication with the audience server, storefront server, and an admin device: generating an experimentation UI and transmitting the experimentation UI to the admin device for display, the experimentation UI including a menu rendering of the plurality of customer audiences; detecting, at the experimentation UI, a user selection of: one of the user-interactable UI elements for the customer-facing UI, a target customer audience from the menu, and experimentation content to be rendered in association with the selected user-interactable UI element; in response to detecting the user selection, causing the storefront server to execute a marketing test including rendering, at the customer facing UI, the experimentation content at the selected one of the user-interactable UI elements on the customer devices corresponding to the target customer audience; and automatically generating a user interaction record based on detected interactions with the experimentation content displayed at the customer facing UI.
11. The method of claim 10 further comprising, at the ME server:
detecting, at the experimentation UI, a user selection of a marketing experimentation type; and causing the storefront server to render, at the customer facing UI, the experimentation content according to the selected marketing experimentation type.
12. The method of claim 11, wherein the marketing experimentation type is one of an A/B test, and a multi-armed bandit test.
13. The method of any one of claims 10-12 further comprising, at the ME server: detecting, at the experimentation UI, a selection of a second experimentation content, a first allocation amount, and a second allocation amount; and in response to detecting the user selection, causing the storefront server to execute the marketing test including: rendering, at the customer facing UI, the experimentation content at the one of the user-interactable UI elements on a number of customer devices corresponding to the target customer audience and the first allocation amount; and rendering, at the customer facing UI, the second experimentation content at the one of the user-interactable UI elements on a number of customer devices corresponding to the target customer audience and the second allocation amount.
14. The method of any one of claims 10-13, further comprising, at the audience server: automatically determining customer audiences via machine learning.
15. The method of any one of claims 10-14, wherein one or more of the customer audiences is comprised of customers having corresponding customer specific data including an indication of a similar geographical area.
16. The method of any one of claims 10-15, wherein one or more of the customer audiences is comprised of customers having corresponding customer specific data including an indication of a similar type of customer device used to interact with the customer facing UI.
17. The method of claim 10, wherein the experimentation content is a plurality of different experimentation contents, and the method further comprises, at the ME server:
in response to detecting the user selection, causing the storefront server to execute the marketing test including, for each experimentation content included in the plurality thereof: rendering, at the customer facing UI, the corresponding experimentation content at the one of the user-interactable UI elements on customer devices corresponding to a subset of the target customer audience.
18. The method of any one of claims 10-17 further comprising, at the ME server: detecting one or more user defined priorities at the experimentation UI and causing the storefront server to execute the marketing experiment in accordance with the one or more user defined priorities.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US202363490648P | 2023-03-16 | 2023-03-16 | |
US63/490,648 | 2023-03-16 | | |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2024192347A1 (en) | 2024-09-19 |
Family
ID=92756105
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/US2024/020143 WO2024192347A1 (en) | 2023-03-16 | 2024-03-15 | System for automatically performing marketing experimentation and analysis |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2024192347A1 (en) |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20070033105A1 (en) * | 2005-07-29 | 2007-02-08 | Yahoo! Inc. | Architecture for distribution of advertising content and change propagation |
US20210200943A1 (en) * | 2019-12-31 | 2021-07-01 | Wix.Com Ltd. | Website improvements based on native data from website building system |
US20220129285A1 (en) * | 2020-10-28 | 2022-04-28 | International Business Machines Corporation | Modifying user interface layout based on user focus |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 24771806; Country of ref document: EP; Kind code of ref document: A1 |