CN113454668A - Techniques configured to enable monitoring of user engagement with physically printed material via an augmented reality delivery system - Google Patents

Info

Publication number
CN113454668A
Authority
CN
China
Prior art keywords
user
data
animation
image
content
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201980089989.9A
Other languages
Chinese (zh)
Inventor
伊雷妮·易卜拉欣 (Irene Ibrahim)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hybrid Reality Solutions Pte Ltd
Original Assignee
Hybrid Reality Solutions Pte Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hybrid Reality Solutions Pte Ltd filed Critical Hybrid Reality Solutions Pte Ltd
Publication of CN113454668A

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00Commerce
    • G06Q30/02Marketing; Price estimation or determination; Fundraising
    • G06Q30/0241Advertisements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013Eye tracking input arrangements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/0304Detection arrangements using opto-electronic means
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00Systems or methods specially adapted for specific business sectors, e.g. utilities or tourism
    • G06Q50/10Services
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/006Mixed reality
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00Network arrangements or protocols for supporting network services or applications
    • H04L67/01Protocols
    • H04L67/131Protocols for games, networked simulations or virtual reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00Commerce
    • G06Q30/02Marketing; Price estimation or determination; Fundraising
    • G06Q30/0241Advertisements
    • G06Q30/0242Determining effectiveness of advertisements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00Commerce
    • G06Q30/06Buying, selling or leasing transactions
    • G06Q30/0601Electronic shopping [e-shopping]
    • G06Q30/0633Lists, e.g. purchase orders, compilation or processing
    • G06Q30/0635Processing of requisition or of purchase orders
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2215/00Indexing scheme for image rendering
    • G06T2215/16Using real world measurements to influence rendering
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2219/00Indexing scheme for manipulating 3D models or images for computer graphics
    • G06T2219/024Multi-user, collaborative environment

Abstract

Described herein are techniques configured to enable monitoring of user engagement with physical printed material via an augmented reality delivery system. Although this application of the techniques is described as a primary example, it should be understood that the techniques have additional applications. Some example embodiments relate to using AR technology to improve the effectiveness of print-based advertisements, primarily by allowing analytics to be collected regarding a user's engagement with the print-based advertisements and, in some cases, by allowing a user to purchase goods and/or services directly via engagement with the print-based advertisements. A wide variety of forms of AR experience delivery and engagement monitoring may be used.

Description

Techniques configured to enable monitoring of user engagement with physically printed material via an augmented reality delivery system
Technical Field
In various embodiments, the present invention relates to techniques configured to enable monitoring of user engagement with physical printed material via an augmented reality delivery system. Although some embodiments are described herein with particular reference to those applications, it will be appreciated that the invention is not limited to such fields of use and is applicable in broader contexts.
Background
Any discussion of the background art throughout the specification should in no way be considered as an admission that such art is widely known or forms part of common general knowledge in the field.
It is known to monitor user engagement with digital content, such as digital advertisements delivered via the internet. Such monitoring is of significant value in the context of performing analysis to determine relationships between user engagement and other factors. In this sense, there are the following technical problems: existing participation monitoring methods are limited to content distributed digitally and cannot extend their application to content distributed via physical printed material.
Disclosure of Invention
It is a primary object of the present invention to overcome or ameliorate at least one of the disadvantages of the prior art or to provide a useful alternative.
Example embodiments are described in what is referred to as the "claims".
One embodiment provides a computer-implemented method that enables monitoring of user engagement with printed content, including printed content distributed via magazines, newspapers, packaging, and manuals, the method comprising:
distributing a plurality of items carrying respective copies of a common printed content item, wherein each copy of the printed content item includes a common graphical artifact, wherein the common graphical artifact is associated with an Augmented Reality (AR) content dataset in a cloud-hosted system;
executing an engagement application on a plurality of user mobile devices, wherein each executing instance of the application is configured to:
processing image data captured via a camera module of the mobile device to identify a presence of the graphical artifact when the printed content item enters a field of view of the camera module;
obtaining, from the cloud-hosted system, at least a subset of the Augmented Reality (AR) content dataset associated with the graphical artifact;
causing display, on a display screen of the mobile device and based on the obtained AR content data, of interactive AR content superimposed on an image capture-based display of the physical substrate, wherein the presentation animation superimposed on the image capture-based display of the physical substrate is displayed at a defined position and orientation relative to the image capture-based display of the physical substrate;
collecting, at a server device and from the plurality of user mobile devices, data representing user engagement with interactive content rendered from the Augmented Reality (AR) content data associated with the graphical artifact, thereby generating analytics representing user engagement with the plurality of copies of the common printed content item.
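The claimed client-side flow can be illustrated as a short sketch: a frame is processed for the graphical artifact, the associated AR dataset is fetched from the cloud-hosted system, and an engagement record is reported back for analytics. All names (`CloudStore`, `handle_frame`, etc.) are illustrative assumptions; no real AR SDK or network layer is implied.

```python
from dataclasses import dataclass, field

@dataclass
class CloudStore:
    """Stands in for the cloud-hosted system mapping artifacts to AR data."""
    ar_datasets: dict                      # artifact id -> AR content payload
    engagement_log: list = field(default_factory=list)

    def fetch_ar_content(self, artifact_id):
        return self.ar_datasets.get(artifact_id)

    def collect_engagement(self, record):
        self.engagement_log.append(record)

def detect_artifact(frame):
    """Placeholder for image processing; here a frame is simply a dict
    that may carry a pre-decoded artifact id."""
    return frame.get("artifact_id")

def handle_frame(frame, store, device_id):
    """Detect the artifact, fetch its AR content, and log the engagement."""
    artifact_id = detect_artifact(frame)
    if artifact_id is None:
        return None
    content = store.fetch_ar_content(artifact_id)
    if content is None:
        return None
    # Rendering of the AR overlay would happen here; the engagement
    # event is reported to the server for analytics.
    store.collect_engagement({"device": device_id, "artifact": artifact_id})
    return content

store = CloudStore(ar_datasets={"AD-001": {"animation": "product_demo"}})
result = handle_frame({"artifact_id": "AD-001"}, store, device_id="phone-1")
```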
An embodiment provides an apparatus configured to enable monitoring of user engagement with a physical substrate bearing printed material, the apparatus comprising:
a camera module configured to capture image data of a capture area;
a display screen;
a microprocessor;
a memory module coupled to the microprocessor, wherein the memory module maintains computer executable code that, when executed via the microprocessor, configures the apparatus to provide:
an image identifier extraction module configured to process image data captured via the camera module to: (i) identifying the presence of a graphic article having predefined attributes printed on the physical substrate; and (ii) extracting an image identifier code from the graphic article;
an Augmented Reality (AR) data access module configured to retrieve an executable AR dataset associated with the extracted image identifier code from an AR data storage module;
an AR presentation module configured to cause display of a presentation animation superimposed on the image capture-based display of the physical substrate on the display screen based on the retrieved AR data, wherein the presentation animation superimposed on the image capture-based display of the physical substrate is displayed at a defined position and orientation relative to the image capture-based display of the physical substrate;
a participation monitoring module configured to: (i) monitoring user interactions associated with the display of the presentation animation; (ii) compiling data representing the user interactions; and (iii) transmitting the data representing the user interactions to a server device such that the data representing the user interactions is associated with an image identifier code, thereby associating the participating interactions with the particular printed material.
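The participation monitoring module's three duties — monitor interactions, compile them, and transmit them tagged with the image identifier code — can be sketched as follows. The class name, record format, and `transmit` callable are assumptions for illustration, not an API defined by this document.

```python
import json

class ParticipationMonitor:
    """Buffers user interactions, then compiles and transmits them
    associated with an image identifier code."""

    def __init__(self, image_identifier_code, transmit):
        self.image_identifier_code = image_identifier_code
        self.transmit = transmit          # callable standing in for the uplink
        self.interactions = []

    def record(self, event_type, **details):
        # (i) monitor: capture one interaction with the presentation animation
        self.interactions.append({"event": event_type, **details})

    def flush(self):
        # (ii) compile and (iii) transmit, tagged with the identifier code
        payload = json.dumps({
            "image_identifier_code": self.image_identifier_code,
            "interactions": self.interactions,
        })
        self.transmit(payload)
        self.interactions = []
        return payload

sent = []
monitor = ParticipationMonitor("IMG-42", transmit=sent.append)
monitor.record("view_start", duration_s=12.5)
monitor.record("tap", target="buy_button")
payload = monitor.flush()
```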
One embodiment provides a device wherein the presentation animation is an interactive animation that is variably presented in response to input by a user of the device.
One embodiment provides a device, wherein the AR data storage module maintains a plurality of AR data sets associated with a given image identifier code, and wherein a particular one of the AR data sets is selected for delivery to the device based on one or more attributes of a user associated with the device.
One embodiment provides a device, wherein the one or more attributes of a user associated with the device are defined in a user profile, wherein the user profile is stored in the memory module of the device and/or in a cloud-hosted data store.
One embodiment provides a device wherein the data storage module is maintained by a cloud-hosted server AR data management system.
One embodiment provides a system configured to enable monitoring of user engagement with a physical substrate bearing printed material, the system comprising:
a communication module that enables communication between the system and a plurality of user devices, wherein each user device comprises:
a camera module configured to capture image data of a capture area;
a display screen;
a microprocessor;
a memory module coupled to the microprocessor, wherein the memory module maintains computer executable code that, when executed via the microprocessor, configures the device to provide:
an image identifier extraction module configured to process image data captured via the camera module to: (i) identifying the presence of a graphic article having predefined attributes printed on the physical substrate; and (ii) extracting an image identifier code from the graphic article;
an Augmented Reality (AR) data access module configured to retrieve an executable AR dataset associated with the extracted image identifier code from an AR data storage module;
an AR presentation module configured to cause display of a presentation animation superimposed on the image capture-based display of the physical substrate on the display screen based on the retrieved AR data, wherein the presentation animation superimposed on the image capture-based display of the physical substrate is displayed at a defined position and orientation relative to the image capture-based display of the physical substrate;
a participation monitoring module configured to: (i) monitoring user interactions associated with the display of the presentation animation; (ii) compiling data representing the user interactions; and (iii) transmitting the data representing the user interactions to a server device such that the data representing the user interactions is associated with an image identifier code, thereby associating the participating interactions with the particular printed material.
wherein the system provides the AR data storage module.
One embodiment provides a system wherein the presentation animation is an interactive animation that is variably presented in response to input by a user of the device.
One embodiment provides a system wherein the AR data storage module maintains a plurality of AR data sets associated with a given image identifier code, and wherein a particular one of the AR data sets is selected for delivery to the device based on one or more attributes of a user associated with the device.
One embodiment provides a system wherein the one or more attributes of a user associated with the device are defined in a user profile, wherein the user profile is stored in the memory module of the device and/or in a cloud-hosted data store.
One embodiment provides a system wherein the one or more attributes of the user are updated in response to monitoring user interaction associated with the display of the presentation animation.
One embodiment provides a system wherein the updating of the one or more attributes of the user causes a customized selection and/or execution of AR data presented in relation to a given image ID code, which results in a change in the selection and/or execution of AR data upon continuous interaction with a common image ID code.
One embodiment provides a system, wherein the system includes a module that provides access to an e-commerce platform to enable a user of the user device to purchase goods and/or services based on interaction with the presented AR data.
One embodiment provides a system wherein the data storage module is maintained by a cloud-hosted server AR data management system.
One embodiment provides a method for enabling monitoring of user engagement with a physical substrate bearing printed material, the method comprising:
computer executable code defining a plurality of sets of interactive augmented reality animation data, wherein each set of data is executable by an AR presentation module of a user device to cause a presentation animation superimposed on an image capture-based display of a physical substrate to be displayed on a display screen of the user device, wherein the presentation animation superimposed on the image capture-based display of the physical substrate is displayed at a defined position and orientation relative to the image capture-based display of the physical substrate;
associating each AR data set with an identifier code;
printing a given one of the identifier codes onto a plurality of physical substrates, wherein each of the physical substrates is applied with print media designed to complement the AR data set associated with the identifier code;
identifying that a user device executing a prescribed software application is capturing image data of a physical substrate and that a given one of the identifier codes has been extracted from the image data;
causing the user device to display a presentation animation of the AR data set associated with the identifier code such that the animation is presented superimposed on the image capture-based display of the physical substrate, wherein the presentation animation superimposed on the image capture-based display of the physical substrate is displayed at a defined position and orientation relative to the image capture-based display of the physical substrate;
monitoring user interactions associated with the display of the presentation animation, such that data representing the user interactions is associated with the image identifier code, thereby associating the engagement interactions with the particular printed material; and
compiling data representing the user interactions.
One embodiment provides a method for enabling monitoring of user engagement with a physical substrate bearing printed material, the method comprising:
distributing a plurality of physical substrates carrying a common printed content item, wherein the common printed content item is associated in a database with a particular set of AR content;
at a first mobile device, reading a graphical artifact from a first one of the physical substrates bearing the printed content item, in response obtaining data from the particular set of AR content from a server, and rendering an AR interactive game based on the data;
at a second mobile device, reading a graphical artifact from a second one of the physical substrates bearing the printed content item, in response obtaining data from the particular set of AR content from the server, and rendering an AR interactive game based on the data;
a cloud-hosted game server device communicates with the first mobile device and the second mobile device such that the server receives data representing an interaction with an instance of the AR interactive game at one of the two mobile devices and causes a change in a state of the AR interactive game executing at the other of the two mobile devices in response to the data representing the interaction.
One embodiment provides a method that includes maintaining a record of interactions with the AR content at the first mobile device and the second mobile device to enable analysis of participation in the printed content items.
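The server behaviour in the claim above — an interaction at one device changes the state of the AR game executing at the other — can be sketched as a minimal state relay. A turn-based exchange is assumed, and `Device`/`SharedGameServer` are illustrative stubs rather than a real networking layer.

```python
class Device:
    """Stub for a mobile device running an instance of the AR game."""
    def __init__(self, name):
        self.name = name
        self.local_state = {}

    def receive_state(self, state):
        self.local_state = dict(state)

class SharedGameServer:
    """Cloud-hosted game server holding the shared game state."""
    def __init__(self, devices):
        self.devices = devices
        self.state = {"moves": []}

    def on_interaction(self, sender, move):
        # Record the interaction in the central state...
        self.state["moves"].append((sender.name, move))
        # ...and cause a state change at every other device in the session.
        for device in self.devices:
            if device is not sender:
                device.receive_state(self.state)

a, b = Device("phone-A"), Device("phone-B")
server = SharedGameServer([a, b])
server.on_interaction(a, "place_tile")   # A's move updates B's game state
```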
One embodiment provides a method for enabling a user to engage a physical substrate bearing printed material, the method comprising:
distributing a plurality of physical substrates carrying a common printed content item, wherein the common printed content item is associated in a database with a particular set of AR content;
at the first mobile device, reading a graphical artifact from a first one of the physical substrates bearing the printed content item and in response obtaining data from the particular set of AR content from the server and rendering an AR interactive animation based on the data;
presenting, via the presentation of the AR interactive animation, an interactive object representing functionality for purchasing goods and/or services;
in response to a predefined interaction with the interactive object representing a function for purchasing goods and/or services, a process is triggered that causes the user to conduct a purchase transaction with respect to the goods and/or services.
One embodiment provides a method wherein the process of causing the user to conduct a purchase transaction with respect to the goods and/or services comprises a process of completing the purchase transaction using pre-stored user data, the pre-stored user data comprising payment information and delivery information.
One embodiment provides a method wherein the process of causing the user to conduct a purchase transaction with respect to the goods and/or services includes a process of redirecting the user to a display that presents a user interface for facilitating completion of the purchase transaction.
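The two purchase paths in the embodiments above — completing the transaction directly with pre-stored payment and delivery data, or redirecting to a checkout interface — might be dispatched as follows. All field names and the checkout URL are illustrative assumptions.

```python
def trigger_purchase(user, item):
    """Return an action describing how the purchase proceeds for a user
    who performed the predefined interaction with the purchase object."""
    has_stored_data = user.get("payment_info") and user.get("delivery_info")
    if has_stored_data:
        # Direct path: complete the transaction with pre-stored data.
        return {"action": "direct_purchase",
                "item": item,
                "charged_to": user["payment_info"],
                "ship_to": user["delivery_info"]}
    # Indirect path: redirect to a UI that facilitates completion.
    return {"action": "redirect", "url": f"/checkout?item={item}"}

full_profile = {"payment_info": "card-on-file", "delivery_info": "home"}
direct = trigger_purchase(full_profile, "sneakers")
redirect = trigger_purchase({}, "sneakers")
```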
Reference throughout this specification to "one embodiment," "some embodiments," or "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the present invention. Thus, appearances of the phrases "in one embodiment," "in some embodiments," or "in an embodiment" in various places throughout this specification are not necessarily all referring to the same embodiment, though they may be. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments, as will be apparent to one of ordinary skill in the art in view of this disclosure.
As used herein, unless otherwise specified the use of the ordinal adjectives "first", "second", "third", etc., to describe a common object, merely indicate that different instances of like objects are being referred to, and are not intended to imply that the objects so described must be in a given sequence, either temporally, spatially, in ranking, or in any other manner.
In the claims and in the description herein, any one of the terms "comprising", "comprised of" or "which comprises" is an open term that means including at least the elements/features that follow, but not excluding others. Thus, the term "comprising", when used in a claim, should not be interpreted as being limited to the means or elements or steps listed thereafter. For example, the scope of the expression "a device comprising A and B" should not be limited to devices consisting only of elements A and B. Any one of the terms "including" or "which includes" as used herein is also an open term that includes at least the elements/features that follow the term, but does not exclude others. Thus, "including" is synonymous with and means "comprising".
As used herein, the term "exemplary" is used in the sense of providing an example, as opposed to indicating quality. That is, an "exemplary embodiment" is an embodiment provided as an example, and is not necessarily an embodiment of exemplary quality.
Drawings
Embodiments of the invention will now be described, by way of example only, with reference to the accompanying drawings, in which:
FIG. 1A illustrates a technical framework according to one embodiment.
FIG. 1B illustrates a technical framework in accordance with one embodiment.
FIG. 1C illustrates a technical framework in accordance with one embodiment.
FIG. 2A illustrates a method according to one embodiment.
FIG. 2B illustrates a method according to one embodiment.
FIG. 2C illustrates a method according to one embodiment.
FIG. 2D illustrates a method according to one embodiment.
Detailed Description
Described herein are techniques configured to enable monitoring of user engagement with physical printed material via an augmented reality delivery system. Although this application of the techniques is described as a primary example, it should be understood that the techniques have additional applications. Some example embodiments relate to using AR technology to improve the effectiveness of print-based advertisements, primarily by allowing analytics to be collected regarding a user's engagement with the print-based advertisements and, in some cases, by allowing a user to purchase goods and/or services directly via engagement with the print-based advertisements. A wide variety of forms of AR experience delivery and engagement monitoring may be used.
In general, some embodiments of the technology discussed herein are configured to enable monitoring of user engagement with physical printed material and, in some cases, delivery of interactive content (including "shared experience" interactive content) to people experiencing the printed material. For example, this may include engagement with any one or more of the following:
advertisements printed on newspapers or magazines (e.g., to enable analysis in a manner similar to internet-delivered advertisements);
advertising material printed on leaflets, brochures, etc. (again, e.g., to enable analysis in a manner similar to internet-delivered advertisements);
business cards (e.g., for delivering multimedia information);
non-advertising content on newspapers or magazines (e.g., to enhance the content provided);
packaging for one or more items (e.g., for providing multimedia content and/or for providing "terms and conditions" files for soliciting user consent);
other physical substrates that carry printed material.
In some embodiments, the techniques are applied to virtual replication of printed material, e.g., 2D content displayed on a computer screen. One example is a magazine that is experienced in digital form via a tablet computer or the like. It will be appreciated that the techniques described herein that are operable on printed material can work equally well with digital versions of printed material presented on a two-dimensional display screen.
Engagement is tracked by attracting users to interact with the physical substrates via AR-enabled devices (e.g., mobile devices such as smartphones or tablets, AR headsets/glasses, etc.) so as to experience augmented reality animations designed to complement the printed material. User interactions (e.g., time spent viewing the animation, interactions with an animation in the form of a video game, interactions with the device during the animation, etc.) are monitored and recorded for analytics purposes. This is optionally used to update data in a user profile. These interactions may include interactions performed via input devices of the smartphone/tablet (e.g., via buttons, touch screen, microphone, camera, etc.) and/or interactions observed via external input units, such as smartwatches, biometric monitors, game controllers, and so forth.
As used herein, the term "augmented reality animation" refers to any augmented reality content presented on a screen via augmented reality technology, which in effect means that the content is displayed in a defined position relative to an object or objects identified in real world space (e.g., identified and tracked using a camera module). "animation" may include three-dimensional non-interactive content, interactive content (e.g., a video game rendered as a three-dimensional augmented reality object), three-dimensional mapped objects rendered as augmented reality objects, and other forms of content. In some embodiments, the generation of augmented reality content is accomplished with a Unity augmented reality platform (and in further embodiments alternative techniques and/or platforms are used).
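The defining property above — content displayed at a defined position and orientation relative to a tracked real-world object — reduces to composing the tracked marker's pose with a content offset expressed in the marker's local frame. Real AR platforms (e.g., Unity) do this in 3D; the 2D sketch below keeps the idea visible with hypothetical names.

```python
import math

def anchor_content(marker_pos, marker_angle_deg, local_offset):
    """Transform a content offset from marker-local into world space, so
    the overlay tracks the marker's position and orientation."""
    theta = math.radians(marker_angle_deg)
    ox, oy = local_offset
    # Rotate the local offset by the marker's orientation, then translate.
    wx = marker_pos[0] + ox * math.cos(theta) - oy * math.sin(theta)
    wy = marker_pos[1] + ox * math.sin(theta) + oy * math.cos(theta)
    return (wx, wy), marker_angle_deg   # content inherits the marker's rotation

# Content defined to sit one unit "above" the marker in its own frame;
# the marker is at (10, 5) and rotated 90 degrees in world space.
pos, angle = anchor_content((10.0, 5.0), 90.0, (0.0, 1.0))
```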
In some embodiments, the augmented reality animation is customized based on data in a user profile. The profile may include demographic characteristics of the user, known interests/habits, and other information. The designer preferably takes these factors into account when designing the AR animation, thereby enabling a customized experience to be delivered. In some embodiments, the user profile is updated in response to monitoring of user engagement. In some cases, the update is implemented such that, for a given instance of printed material (e.g., a given advertisement), the AR data presented to a given user may differ between successive interactions. For example, in one embodiment, a user engages with a print advertisement via AR, and the user profile is updated in response to that engagement. When the user later engages with the same printed advertisement via AR (optionally printed in a different location, e.g., in a different magazine or a different media form), a different AR experience may be presented due to changes in the user profile data. This is optionally implemented to provide ongoing delivery of AR-based material in a manner similar to an automated email tracking engine: the material presented to the user via the AR data is customized based on previous engagement by the user, delivering content along defined logical channels in response to user engagement actions.
In some embodiments, the augmented reality animation is configured to deliver a "shared experience" whereby a given user's experience of content is influenced by one or more other users' experience of that content. For example, in some embodiments, the augmented reality content is an interactive multiplayer game in which interaction with the augmented reality content by one player on their device affects the augmented reality content experience of another player on their device (or the experience of multiple other players on their respective devices). This may include real-time multiplayer online games (e.g., sports games, first-person shooter games, etc.) in which a game server maintains a central simulation based on input from player devices and players see corresponding local simulations based on data from the server, or it may include turn-based multiplayer games (e.g., a word game or a chess game) in which input from one player is provided to the server and, in response, the server provides data to the other player.
In some embodiments, the augmented reality animation is configured to display or provide access to content available from one or more network servers, such as content presented in the form of an electronic commerce function (e.g., to enable purchasing of products via interaction with the animation), social media content (e.g., to allow comments and/or view comments), and so forth. Example embodiments are discussed below:
In an example embodiment, the AR content includes an interactive object that facilitates a purchase transaction. This may be a direct purchase transaction (e.g., the user interacts with an object presented as an AR component or in conjunction with the AR animation, and by doing so a purchase is made using pre-stored payment information), or it may be an indirect purchase transaction (e.g., the user interacts with an object presented as an AR component or in conjunction with the AR animation and, by doing so, is directed to a web page that allows the purchase transaction to be completed).
In another example, the AR animation or objects presented in conjunction with the AR animation provide functionality to view third party comments and/or enter user-defined comments. This may utilize a third party content review platform.
User interactions monitored to generate analytics data representing user engagement include any one or more of the following:
time spent participating. For example, this may include the length of time that the user viewed a particular AR animation, the length of time that the user interacted with a particular AR animation component in a particular manner, the number of repeated access events (or other attribute (s)) for the AR animation, and so forth.
The location of participation. This may be determined, for example, based on a reading of the smartphone's GPS module (or another location module).
Characteristics of participation in the interactive module (e.g., participation in the AR video game, such as the number of games played, the time spent playing the game, game performance characteristics, etc.).
Participation preference (e.g., playing a game or viewing information).
Repeated participation (and the nature of repeated participation).
A digital heat map of interactions with the AR objects and/or the virtual environment (e.g., to help determine whether the user engaged with a particular advertisement).
A digital heat map of viewing of portions of the printed material and/or AR content (e.g., based on retinal tracking via a front-facing camera).
Asset interactions, e.g., input to interactive objects such as a "like" (in the social media sense), identification of unconverted sales (allowing downstream e-marketing methods to focus on seeking future conversions), and "dwell-based" merchandising of goods/services (determining potential interest in purchasing goods and/or services based on dwelling on objects representing those goods and/or services, optionally derived from interaction heat maps, AR content status/location analysis, and/or retinal tracking).
Conversion rate (e.g., users accessing the e-commerce platform and/or purchasing products as a result of engagement with the AR data).
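By way of illustration only, the engagement metrics listed above might be compiled on the device into a single analytics record before transmission to the server. The following sketch is not part of the disclosed system; all class, field, and value names are assumptions.

```python
from dataclasses import dataclass, field

@dataclass
class EngagementSession:
    """One user's engagement with a single AR animation (names illustrative)."""
    ar_uid: str
    view_seconds: float = 0.0
    games_played: int = 0
    repeat_visits: int = 0
    interactions: list = field(default_factory=list)

    def record_interaction(self, kind):
        # e.g. "like", "dwell:product_x", "game_start"
        self.interactions.append(kind)

    def to_analytics_record(self, location=None):
        # Compiled payload transmitted to the analytics server.
        return {
            "ar_uid": self.ar_uid,
            "time_engaged_s": self.view_seconds,
            "games_played": self.games_played,
            "repeat_visits": self.repeat_visits,
            "interactions": list(self.interactions),
            "location": location,  # e.g. from the device's GPS module
        }

session = EngagementSession("AD-001", view_seconds=42.5, repeat_visits=2)
session.record_interaction("like")
record = session.to_analytics_record(location=(1.29, 103.85))
```

The compiled record bundles duration, repeat-engagement, interaction, and location metrics so that a single upstream transmission covers the categories enumerated above.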
This is used, for example, to enable analytical monitoring of engagement with particular instances of printed material (e.g., advertisements delivered via one or more distribution media types), thereby gathering information, such as demographic characteristics, that is useful in understanding the effectiveness of advertising campaigns and the like delivered via print media.
An additional advantage is that advertising campaigns and the like delivered in physical print form can be improved and updated based on intelligence gathered (via engagement monitoring) to optimize effectiveness even after physical distribution. Additionally, the AR data delivery logic optionally implements split testing, whereby multiple AR experiences are tested simultaneously across a user group.
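A minimal sketch of one way the split testing mentioned above could be implemented, assuming a deterministic hash-based bucketing scheme (the function and variant names are illustrative, not from the disclosure):

```python
import hashlib

def assign_variant(user_id, ar_uid, variants):
    """Deterministically bucket a user into one AR experience variant so that
    repeat scans of the same printed code yield a consistent experience."""
    digest = hashlib.sha256(f"{user_id}:{ar_uid}".encode()).hexdigest()
    return variants[int(digest, 16) % len(variants)]

variant = assign_variant("user-42", "AD-001", ["experience_a", "experience_b"])
```

Because the assignment is a pure function of the user and image identifiers, no per-user assignment state needs to be stored server-side, and engagement analytics can later be grouped by variant.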
According to one embodiment, a method for enabling monitoring of user engagement with a physical substrate bearing printed material includes the processes set forth below.
First, the method includes defining computer executable code representing a plurality of interactive Augmented Reality (AR) animation datasets. Each data set may be executed by an AR rendering module of the user device to cause display of AR data. In this regard, each data set is defined with reference to: (i) a printed material on the substrate on which the AR animation is to be superimposed; and (ii) AR experience goals (e.g., based on viewer profiles, etc.).
The AR presentation module of the user device is configured to cause presentation, on a display screen of the user device, of a presentation animation superimposed on an image capture-based display of the physical substrate, wherein the presentation animation is displayed at a defined position and orientation relative to the image capture-based display of the physical substrate. Preferably, the presentation animation is an interactive animation that is variably presented in response to input by a user of the device (e.g., to provide a video game in conjunction with game performance logic).
Each AR data set is associated with an identifier code. These codes are printed on the physical substrates and allow the computer system to identify the correct AR animation to be displayed. That is, each of these physical substrates carries printed media designed to complement the AR data set associated with its identifier code.
The method includes printing a given one of the identifier codes onto a plurality of physical substrates and distributing the substrates to users. The method then includes identifying that a user device executing a prescribed software application is capturing image data of the physical substrate and has extracted a given one of the identifier codes from the image data, and causing the user device to display a presentation animation of the AR data set associated with the identifier code. User interactions associated with the display of the presentation animation are monitored and data representing the user interactions is compiled. Preferably, the data is maintained in a server for subsequent analysis.
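The server-side portion of the method above — resolving an extracted identifier code to its AR data set and compiling interaction data for subsequent analysis — could be sketched as follows. This is a hypothetical implementation; the registry contents and all names are assumptions.

```python
# Hypothetical server-side registry: printed identifier codes -> AR data sets.
AR_DATASETS = {
    "ID-7731": {"animation": "soccer_minigame", "target": "sports_readers"},
}
ENGAGEMENT_LOG = []  # compiled interaction data maintained for later analysis

def handle_scan(device_id, extracted_code):
    """Resolve a code extracted from captured image data to its AR data set
    and log the engagement event for subsequent analysis."""
    dataset = AR_DATASETS.get(extracted_code)
    if dataset is not None:
        ENGAGEMENT_LOG.append({"device": device_id, "code": extracted_code})
    return dataset

dataset = handle_scan("phone-1", "ID-7731")
```

An unknown code simply yields no AR content, while every successful resolution leaves an analysable trace tying a device to a specific printed substrate.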
In some embodiments, data derived from the monitoring of user participation is used to update data in the maintenance user profile. This allows the AR data presented to a given user to be customized in response to data derived from previous interactions. This may include giving offers based on the identified historical user engagement attributes, and so on.
FIG. 1A illustrates a framework according to one embodiment. It will be appreciated that various components shown in this example may be replaced with other components for purposes of other embodiments.
The example of FIG. 1A and other examples herein are described with reference to a number of "modules". The term "module" refers to a logically separable software component (computer program) or hardware component. Modules in the embodiments refer not only to modules in a computer program but also to modules in a hardware configuration. The discussion of the embodiments therefore also serves as a discussion of computer programs for causing the modules to function (including a program for causing a computer to execute each step, a program for causing a computer to function as a plurality of apparatuses, and a program for causing a computer to implement each function), and also serves as a discussion of systems and methods. For ease of explanation, the phrases "storing information," "causing information to be stored," and other equivalent phrases are used; where the embodiment is a computer program, these phrases mean "causing a memory device to store information" or "controlling a memory device so as to cause it to store information".

Modules may correspond to functions in a one-to-one correspondence. In a software implementation, one module may form one program, or a plurality of modules may form one program; one module may likewise form a plurality of programs. Multiple modules may be executed by a single computer, and a single module may be executed by multiple computers in a distributed or parallel environment. One module may comprise another module.

In the discussion that follows, the term "coupled" refers not only to physical connections but also to logical connections (such as the exchange of data, instructions, and data reference relationships). The term "predetermined" means that something has been decided before the process of interest; thus, the term refers to something decided prior to the process of interest in an embodiment.
Even after processing in an embodiment has started, the term "predetermined" still refers to something decided before the process of interest, according to the condition or state of the embodiment at the present point in time, or according to the condition or state lasting up to the present point in time. Where there are plural "predetermined values", the predetermined values may be different from each other, or two or more of them (including all of them) may be equal to each other. The statement "if A, then perform B" means "determine whether something is A and, if it is determined to be A, perform action B"; this statement is meaningless where no determination is made as to whether something is A.
The term "system" refers to an arrangement of multiple computers, hardware configurations, and devices interconnected via a communications network (including one-to-one communication connections). The terms "system" and "device" also encompass a single computer, hardware configuration, or device. The term "system" does not include arrangements that are merely social constructs established among humans.
In each process executed by a module, or in each of a plurality of processes executed by a module, information targeted for the process is read from a memory device, the information is processed, and the process result is written back to the memory device. Descriptions of reading information from the memory device before a process and writing processed information to the memory device after the process may be omitted where appropriate. The memory device may include a hard disk, random access memory (RAM), an external storage medium, a memory device connected via a communications network, or a register within a CPU (central processing unit).
The framework of fig. 1 includes an example user device in the form of a smartphone 120 (in other embodiments there may be other forms of devices, including AR headsets/glasses). The smartphone 120 includes a display screen 121 (preferably a touch screen, e.g. a capacitive or resistive touch screen) and an image capture device (such as a digital camera, in this example pointing in the opposite direction to the display screen, commonly referred to as a rear-facing camera), not shown, which captures image data in a region 122.
The smartphone 120 also includes a memory module coupled to the microprocessor, where the memory module maintains computer-executable code that, when executed via the microprocessor, configures the smartphone to execute a mobile app (shown as mobile app module 130). The mobile app module 130 includes:
an image processing module 131 configured to process image data collected by the image capture device to identify an image artifact. As used herein, the term "image artifact" describes a graphic artifact that can be identified in an image that allows a determination that a particular image (or class of images) is being viewed.
An image identifier extraction module 132 configured to process image data captured via the camera module such that: (i) identifying the presence of a graphic article having predefined attributes printed on the physical substrate; and (ii) extracting the image identifier code of the graphic article. This may include reading a graphic code containing data representing the identifier code, or using data derived from the graphic article to identify the identifier code from a database.
An Augmented Reality (AR) data access module 133 configured to retrieve an executable AR data set associated with the extracted image identifier code from an AR data storage module 141 of the AR file management system 140. The AR dataset includes code that enables presentation of AR animation (e.g., AR video game).
An AR presentation module 138 configured to cause display of a presentation animation superimposed on the image capture-based display of the physical substrate on the display screen based on the retrieved AR data. As will be discussed in further detail below, the animation is rendered superimposed at a defined position and orientation. In some embodiments, this makes the animation appear as if floating in three-dimensional space, over, or resting on top of the physical substrate.
A participation monitoring module 135 configured to: (i) monitoring user interactions associated with the display of the presentation animation; (ii) compiling data representing the user interactions; and (iii) transmit data representing these user interactions to server device 160. The compiled data may include data representative of the participation location, based for example on data retrieved from a mobile device GPS module.
A user profile data module 134 that maintains data defining attributes of the user, including demographic characteristics, historical activity, and the like. In some embodiments, the user profile data is instead contained in a cloud-hosted storage repository and/or supplemented by other profile data contained in the cloud-hosted storage repository.
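The two resolution paths described for the image identifier extraction module above — reading a graphic code that directly carries the identifier, or matching data derived from the graphic artifact against a database — might be sketched as follows. The fingerprint values and function names are illustrative assumptions only.

```python
# Sketch of the two resolution paths of the identifier extraction module:
# either the graphic artifact directly encodes the identifier (e.g. a
# QR-style code), or a fingerprint derived from the artifact is matched
# against a database. All names and values are hypothetical.
FINGERPRINT_DB = {"artwork_hash_9f3a": "ID-7731"}

def resolve_identifier(decoded_payload=None, fingerprint=None):
    if decoded_payload is not None:          # path (1): code read directly
        return decoded_payload
    return FINGERPRINT_DB.get(fingerprint)   # path (2): database lookup

direct = resolve_identifier(decoded_payload="ID-7731")
indirect = resolve_identifier(fingerprint="artwork_hash_9f3a")
```

The second path is what allows an identifier to be embedded invisibly in artwork (e.g., a photograph) rather than printed as an overt code.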
Fig. 1 also shows AR-enhanced image file management system 140 with which smartphone 120 interacts (via module 130) via the internet. In general, system 140 is configured to facilitate delivery of AR data for an animation to be displayed to module 130 in response to an identified image. In some embodiments (such as described below), the system 140 transmits the AR data itself; in other embodiments, system 140 transmits a link or other data that enables module 130 to download AR data from another network location.
The system 140 includes an AR data storage module 143 that includes AR data sets defined by the AR data generation system 170. These data sets include animations and associated logic for interactive animations (e.g., interactive animation-based video games). System 140 also includes an image data storage module 144 that maintains data associated with printable media associated with AR datasets (e.g., each print media dataset includes a graphic article that embeds an image ID code associated with a given one or more of the AR datasets).
The AR data generation system may be substantially any system or group of systems for generating AR data. For example, in some embodiments, this includes a system that provides AR authoring tools (e.g., Vuforia) and interactive content authoring tools (e.g., Unity, Unreal, etc.). An interactive content authoring tool provides tooling for logically programming an interactive experience involving three-dimensional objects in a three-dimensional virtual space; the AR authoring tool enables spatial transformation into a frame of reference defined by the physical substrate, so that AR data is provided at a predefined position and orientation relative to the physical substrate.
The system 140 also includes an AR delivery rules module that allows for customization of AR data to be delivered in a particular instance based on rules that define AR dataset selection criteria in addition to image ID codes. This may include, for example, user attributes maintained in the user profile data, enabling customized delivery of AR content based on the known attributes of the viewer. This may include any one or more of the following operations: selecting an AR dataset tailored to the profile; selecting logic customized for the profile; and identifying additional data (e.g., stored user values, etc.) for display with the AR data.
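The rule-based selection described above — where the printed image ID alone does not determine the AR data set, and profile attributes are consulted first — could look like the following sketch. The rule format and all identifiers are assumptions for illustration.

```python
# Illustrative sketch of the delivery-rules idea: profile-based rules are
# consulted before falling back to a default data set for the image ID.
def select_ar_dataset(image_id, profile, rules):
    for attribute, expected, dataset in rules.get(image_id, []):
        if profile.get(attribute) == expected:
            return dataset          # first matching rule wins
    return f"{image_id}:default"    # fall back to the generic data set

rules = {
    "ID-7731": [
        ("age_band", "18-25", "ID-7731:youth_game"),
        ("language", "fr", "ID-7731:fr_variant"),
    ],
}
chosen = select_ar_dataset("ID-7731", {"age_band": "18-25"}, rules)
```

Under this design choice, the same printed advertisement yields different AR experiences for different viewer profiles while remaining a single physical artifact.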
In some embodiments, customization based on a user profile is achieved by a process comprising: (i) monitoring user engagement; (ii) updating the user profile based on that engagement; and (iii) subsequently delivering AR content in a modified manner based on the updated user profile. This allows AR customization based on the updated user profile.
In some embodiments, a logical temporal structure is defined that causes particular AR data for a given image ID to be delivered based on the identified historical user engagement data. In this way, the AR experience is modified based on the logical temporal structure whenever the user participates in a particular printed material (e.g., an advertisement) (rather than delivering the same experience whenever the user participates in the same advertisement). This is optionally used to virtually duplicate the techniques known from automated email marketing that also uses a logical event structure to automatically send an email containing specific content based on the user's interaction with previous emails. In the context of the present technology, by way of example, a logical event structure allows for control of AR material delivered to a given user based on past interactions with a given advertisement (and/or other advertisements, such as advertisements for common products/services). This control may be achieved by delivering different selections of AR data or modifying execution of code implementing the interactive AR experience (e.g., modifying execution of a video game delivered via AR data).
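One plausible shape for the logical temporal structure described above, modelled on a drip-style content sequence (the sequence contents and names below are illustrative assumptions, not from the disclosure):

```python
# Sketch of a logical temporal structure: each repeat engagement with the
# same printed image ID advances the user along a predefined content
# sequence, analogous to an automated e-mail-marketing drip campaign.
SEQUENCES = {"ID-7731": ["intro_animation", "minigame", "discount_offer"]}

def next_ar_content(image_id, prior_engagements):
    steps = SEQUENCES[image_id]
    # Engagements beyond the end of the sequence repeat the final step.
    return steps[min(prior_engagements, len(steps) - 1)]
```

For example, a first scan of the advertisement plays an introduction, a second scan launches the minigame, and every later scan presents the offer.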
The printing system 150 causes image data to be printed from the storage module 144 onto a plurality of physical substrates. An example physical substrate 110 is shown. This may include a main print area 111 (which includes artwork, white space, etc.) and one or more graphic articles 112A and 112B embedded with an image ID code. In some embodiments, multiple artifacts are printed at defined locations to facilitate determining the location and orientation of the substrate by module 131 so that the AR animation is displayed at the correct location and orientation. In this regard, in fig. 1, the display screen 121 shows an image capture based display 123 of the physical substrate (essentially a real-time feed of images captured by the image capture device) and an AR animation 124 superimposed on the display 123 of the physical substrate at a controlled/defined position and orientation. The AR animation may be 3-dimensional (i.e. extending beyond the substrate, e.g. using the substrate as a ground plane relative to which the animated object moves) or two-dimensional (e.g. providing a presentation showing additional material on the surface of the substrate 111, as shown by 223 on screen 121).
Although two graphic artifacts 112A and 112B are shown, in some embodiments there may be fewer or more graphic artifacts providing access to the image ID code. In some embodiments, there is a single artifact and that artifact is capable of informing the mobile device of the position and orientation of the physical substrate. In some embodiments, the graphic artifact is embedded in artwork (e.g., a photograph).
In some embodiments, as shown in fig. 1B, the mobile app module 130 is configured to communicate with one or more other network systems, such as an e-commerce platform 191, a social network platform 192, and other online data sources 193. Although the example of FIG. 1B shows modules 130 accessing these platforms directly, in some embodiments, access is via modules of system 140.
In the case of e-commerce platform 191, in some embodiments the system is additionally coupled to one or more e-commerce platforms, enabling users to purchase goods and/or services associated with the AR experience either directly through the software application or via external hyperlinks invoked from within the software application. This provides a technique whereby users are able to make online purchases arising from engagement with printed material (e.g., printed advertisements). Additionally, in some embodiments, interactions with the e-commerce platform are monitored to update the user's profile, allowing customization of AR data in future interactions. In a preferred embodiment, module 130 (or system 140, or platform 191) is preconfigured to store payment and shipping information for the user so as to allow a simplified purchase process (e.g., enabling the user to purchase desired goods and/or services via direct interaction with the AR content, or with other on-screen objects displayed in conjunction with the AR content, through a single interaction or an optimized interaction process). It should be appreciated that using AR content as a means of facilitating digital engagement (including engagement in the form of online purchases of goods and/or services) with physically printed material, such as newspaper and magazine advertisements, allows printed advertisements to drive direct online sales in a more efficient manner. Many advertisers have overlooked print media relative to online advertising because of the click-to-purchase advantages achievable through online advertising; the techniques disclosed herein enable click purchases from print media advertisements.
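The direct ("single interaction") and indirect purchase paths described above might be distinguished as in the following sketch, assuming pre-stored payment and shipping details; all field names are hypothetical.

```python
# Illustrative sketch of the simplified purchase path: pre-stored payment
# and shipping details let a single tap on an AR object complete an order,
# while users without stored details fall back to the indirect path
# (redirection to a web checkout). Names are assumptions.
STORED_PROFILE = {"payment_token": "tok_abc123", "ship_to": "12 Example St"}

def one_tap_purchase(sku, price_cents, profile):
    if "payment_token" not in profile:
        return {"status": "redirect_to_checkout", "sku": sku}
    return {
        "status": "ordered",
        "sku": sku,
        "amount_cents": price_cents,
        "ship_to": profile["ship_to"],
    }

order = one_tap_purchase("SKU-1", 1999, STORED_PROFILE)
```

Either branch produces a record that can also feed the engagement monitoring described earlier (e.g., counting conversions versus checkout redirects).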
In the case of social network platform 192, this may include any social network platform, including those that provide the user with functionality to input comments and/or "likes", and to read comments and/or "likes" input by other users. The module 130 downloads content from the platform, allowing comments and/or likes to be entered and/or viewed via user interface objects displayed as part of, or in conjunction with, the AR animation.
Other online data sources may include the following:
data values (e.g., weather values, temperature values, sporting event scores, etc.).
Map data presented as AR animations (e.g., via google maps, etc.), e.g., for providing location and/or directions based on the user's current location. This may be used, for example, to provide an AR map about the advertisement, showing the viewer the nearest location where the advertiser's product or service is available.
In various embodiments, map data within the AR data (or initiated in response to interaction with it) is used to show the closest location at which goods and/or services are available for purchase and/or post-purchase collection, to support e-commerce functionality. It will be appreciated that one key advantage of the technology described herein is enabling advertisers to promote online sales (e.g., via AR content with e-commerce capabilities) through print media advertising; some embodiments additionally/alternatively seek to support brick-and-mortar retail stores through a "buy and collect" approach and/or by driving store visits using map data.
FIG. 1C illustrates a further embodiment representing a "shared experience" AR environment. In general, a "shared experience" AR environment is one in which two (or more) users experience common AR content, typically triggered by the same printed material, and in which one user's interaction with the content has an effect on the experience of the other user (or users).
In the example of fig. 1C, there are two smartphones (120 and 120') that each execute a respective instance of the mobile app module (130 and 130'). In this example, the two smartphones have been used to capture image data that includes the same printed material (e.g., the same advertisement printed at different locations in two different physical magazines). Both instances of the mobile app module obtain AR data from a common cloud-hosted AR data delivery module 194. The delivery module 194 is configured to send data and instructions to each smartphone to control delivery of AR content by the smartphone, and to receive data representing interactions with the AR content (and, optionally, concurrently displayed content). The module 194 communicates with a shared experience execution module 195 configured to coordinate sharing of AR experiences among multiple users. In a further embodiment, the smartphones communicate with module 195 in a manner that bypasses module 194 (e.g., module 194 provides data to configure local execution of AR content, and execution of the content includes processes of sending data to module 195 and receiving data from module 195 to facilitate the shared experience functionality).
Some example use cases for shared AR content are considered below. It will be appreciated that these provide new ways of enabling users to engage with physically printed content (e.g., newspaper and magazine advertisements), for which the ability to share the content experience with users at remote locations was previously inherently limited.
In one example, the module 195 executes a simulation in which the users of smartphone 120 and smartphone 120' (and optionally one or more additional players) play a multiplayer AR game at the same time. In this manner, the module 195 maintains, via upstream data packets, a global game context representing the current state of the players within the game, and provides downstream data sets such that the AR simulation presented on each smartphone mirrors the global game context. It will be appreciated that this is a conventional approach to managing online multiplayer games, and various known techniques for managing online multiplayer games may be used. In a practical sense, however, within the current technical framework this allows multiple users to have a shared interactive experience with print media (e.g., engagement among readers of the same magazine or advertisement).
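A minimal sketch of the global-game-context pattern just described, reduced to its essentials (class and identifier names are illustrative assumptions):

```python
# Minimal sketch of the global-game-context pattern: the server folds
# upstream input packets from each player into one authoritative state,
# and the same downstream snapshot is sent to every device so that the
# local AR simulations agree.
class GameServer:
    def __init__(self):
        self.state = {}  # player_id -> last reported position

    def upstream(self, player_id, position):
        self.state[player_id] = position

    def downstream_snapshot(self):
        return dict(self.state)

server = GameServer()
server.upstream("player_1", (0, 1))
server.upstream("player_2", (3, 4))
snapshot = server.downstream_snapshot()
```

Each smartphone would render this snapshot anchored to its own printed substrate, so two readers of the same advertisement see a consistent shared game.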
In another example, the module 195 maintains current game states for a plurality of turn-based games (including AR turn-based games played on smartphone 120 and smartphone 120'). This may be, for example, an AR game based on a board game such as a word game, chess, and the like. In this example, module 195 is configured to facilitate transfer of player turn data (e.g., chess moves) between players in a common game, thereby enabling turn-based interaction between the players. One practical example is a game board printed on a physical substrate, where the AR animation causes player and opponent moves to be superimposed on the board throughout the course of the competitive game. Preferably, the game is coordinated by a matching module that identifies a group of players (typically a pair of players) who have initiated the AR content and are ready to join a game.
In another example, module 195 manages a collaborative game such that multiple players make respective contributions to the content in the AR animation, and each player's additions are visible to all players. In some embodiments, this includes an AR animation involving tasks that require multiple players to collaborate as a team to complete. In some cases, a matching module is provided to help coordinate the process, whereby groups of users are dynamically managed so that each group has at most a predefined maximum number of users.
In a further example, the module 195 provides a collaboration module whereby multiple players at remote locations can collaborate in common AR content, for example by entering data and/or changing graphical objects in a manner observable by other users. In some cases, a matching module is provided to help coordinate the process by which groups of users are dynamically managed so that each group has at most a predefined maximum number of users.
In a further example, module 195 provides a communication module that enables a user to communicate (e.g., via text-based messages and/or audio-based messages). This may also utilize a matching module that, in some embodiments, matches users based on an analysis of their demographic information based on a matching rules engine (e.g., to match users having shared characteristics, e.g., in terms of language, interests, age, etc.). In some embodiments, communication is limited to users who are already connected via an existing social network platform, such that the user only communicates with people in their existing social network.
In a further example, the module 195 provides a collaborative purchase module by which multiple users (e.g., a group configured by a matching module that generates the group based on criteria such as stored demographic information and/or social network commonality) can reach a favorable purchase price for goods and/or services subject to a threshold number of users making purchases together during a common shared AR experience.
It will be appreciated that these are merely examples and that other approaches may be used.
FIG. 2A illustrates a method 200 according to one embodiment. This method describes an example embodiment in which AR content is used as a means to enable monitoring of engagement with printed material (e.g., print advertisements in magazines, newspapers, etc.).
Block 201 represents a process that includes defining printable content to be applied to a physical substrate, for example, using a static content authoring application, such as Adobe Photoshop or the like. The content is defined to include an AR-identifying artifact (a graphical artifact identified by an AR software application that processes image data of the captured image to enable determination that AR content is to be displayed), wherein the AR-identifying artifact represents an AR Unique Identifier (UID) uniquely associated with the printable content.
Block 202 represents a process that includes generating AR content, which is associated with an AR UID at block 203. This includes defining code, animations, rules, game execution software, network data binding properties, etc. for the AR content. At block 204, the AR content is made available on the server for download (e.g., so that the mobile device can request download of the AR content based on the AR UID).
Block 205 represents printing and distributing a physical substrate, for example printing and distributing a magazine, newspaper, or the like whose pages contain the printable content defined at block 201.
Block 206 represents a process by which a mobile device of a user (e.g., a smartphone, AR headset, smartphone connected to an AR headset, etc.) is used to identify the AR graphical artifact, download AR content based on the AR UID, enable execution of and interaction with the AR content, and perform monitoring of the interaction such that a server device can receive analytics of the user's interaction with the printed content.
Fig. 2B illustrates a method 210 associated with block 206 of fig. 2A. At 211, the user launches a particular smartphone app, which triggers an AR artifact identification process at 212. The process of block 212 includes processing image data collected via a camera module of the smartphone (or a connected AR headset) to search for an AR artifact in the camera's field of view. Such an artifact is identified at block 213, which triggers transmission to a server of a request for AR content data (or identification of locally stored AR content) based on the AR UID extracted from the image. At 214, the AR content is received and executed, and user engagement (i.e., user interaction with the AR animation, which serves as a proxy for monitoring user interaction with the printed content) is monitored during execution, such as engagement with advertisements as described further above. At 216, data resulting from the monitoring is transmitted to a server.
Fig. 2C illustrates a method 230 associated with block 214 of method 210. Block 231 represents the process by which AR content is received from a server and includes objects bound to hyperlinked data (e.g., the AR content includes objects that act as containers that present content downloaded from a network-based location, including data values, images, website content, etc.). Block 232 represents configuring the access module to acquire (pull) and/or receive (via push) data from a network location. The configuration of these modules may allow control of refresh times that are shorter for dynamically changing web content or longer for generally static web content. Then, at block 233, the AR content is rendered with real-time (i.e., current) data from the network source. In one embodiment, the web content includes objects that facilitate the purchase of goods and/or services.
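The refresh-time control mentioned at block 232 might be expressed as a per-binding interval check, as in this sketch (binding names and interval values are illustrative assumptions):

```python
# Sketch of the refresh-time control at block 232: each network-bound
# object carries a refresh interval, short for volatile data (scores,
# weather) and long for effectively static web content.
def needs_refresh(last_fetch_ts, now_ts, refresh_interval_s):
    return (now_ts - last_fetch_ts) >= refresh_interval_s

bindings = {"score_ticker": 5, "store_locator": 3600}  # intervals in seconds
stale = [name for name, interval in bindings.items()
         if needs_refresh(last_fetch_ts=0, now_ts=30, refresh_interval_s=interval)]
```

On each render pass, only the bindings reported as stale would be re-fetched (pulled) from their network locations, keeping the displayed values current without polling static content unnecessarily.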
Fig. 2D illustrates a method 240 also associated with block 214 of method 210. In this example, block 241 represents a process by which a user's interaction with an AR object presented on their device (e.g., a smartphone, but in other examples an AR headset) triggers a request to participate in a multiplayer online game. Block 242 represents the step by which the game server performs a matching process to place the user in an existing game having a free player position, or to generate a new game in which the user occupies one of the player positions. Block 243 represents the process by which the AR content provides a multiplayer game with continuous bi-directional data exchange with the game server.
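The matching process of block 242 can be sketched as follows. The `Matchmaker` and `Game` names, the four-player capacity, and the sequential game IDs are assumptions for illustration; the disclosure does not fix a player count or matching policy.

```python
class Game:
    """One multiplayer game instance with a bounded number of player positions."""

    def __init__(self, game_id, max_players=4):
        self.game_id = game_id
        self.max_players = max_players
        self.players = []

    def has_free_slot(self):
        return len(self.players) < self.max_players

class Matchmaker:
    """Game-server matching process (block 242): place a joining user into an
    existing game with a free player position, or generate a new game."""

    def __init__(self):
        self.games = []
        self._next_id = 1

    def place(self, user_id):
        """Return the game the user was placed in."""
        for game in self.games:
            if game.has_free_slot():
                game.players.append(user_id)
                return game
        # No existing game has a free position: generate a new game in
        # which the joining user occupies one of the player positions.
        game = Game(self._next_id)
        self._next_id += 1
        game.players.append(user_id)
        self.games.append(game)
        return game
```

Once placed, each client maintains the continuous bi-directional exchange of block 243 (e.g., over a persistent socket) so state changes propagate between players.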
It will be appreciated that the above described techniques provide a means by which a user can interact with a physical substrate containing printed material via a smartphone device or the like, and that participation will be monitored and measured for subsequent analysis purposes.
As will be appreciated by one skilled in the art, aspects of the present invention may be embodied as a system, method or computer program product. Accordingly, aspects of the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, micro-code, etc.) or an embodiment combining software and hardware aspects that may all generally be referred to herein as a "circuit," "module" or "system." Furthermore, aspects of the present invention may take the form of a computer program product embodied in one or more computer-readable media having computer-readable program code embodied therein.
Any combination of one or more computer-readable media may be used. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein (e.g., in baseband or as part of a carrier wave). Such a propagated signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, Radio Frequency (RF), etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language (e.g., Java, Smalltalk, C++, etc.) and conventional procedural programming languages, such as the "C" programming language or similar programming languages, a scripting language (e.g., Perl, VBS or similar languages), and/or functional languages, such as Lisp and ML, and a logic oriented language, such as Prolog. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
Aspects of the present invention are described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the invention. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer readable medium that can direct a computer, other programmable data processing apparatus, or other devices to function in a particular manner, such that the instructions stored in the computer readable medium produce an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
The computer program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide processes for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The computer program product may comprise all the features enabling the implementation of the methods described herein, and may be capable of executing the methods when loaded in a computer system. Computer program, software program, or software, in the present context mean any expression, in any language, code or notation, of a set of instructions intended to cause a system having an information processing capability to perform a particular function either directly or after either or both of the following: (a) conversion to another language, code or notation; and/or (b) replication in a different material form.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms "comprises" and/or "comprising," when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
The corresponding structures, materials, acts, and equivalents of all means or step plus function elements in the claims, if any, are intended to include structures, materials, or acts for performing the functions in combination with other claimed elements as specifically claimed. The description of the present invention has been presented for purposes of illustration and description, but is not intended to be exhaustive or limited to the invention in the form disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the invention. The embodiment was chosen and described in order to best explain the principles of the invention and the practical application, and to enable others of ordinary skill in the art to understand the invention for various embodiments with various modifications as are suited to the particular use contemplated.
Various aspects of the disclosure may be embodied as programs, software, or computer instructions embodied in a computer-or machine-usable or readable medium, which when executed on a computer, processor, and/or machine, cause the computer or machine to perform the steps of the method. A program storage device readable by a machine, tangibly embodying a program of instructions executable by the machine to perform the various functions and methods described in this disclosure, is also provided.
The systems and methods of the present disclosure may be implemented and performed in a general purpose computer or a special purpose computer system. The terms "computer system" and "computer network" as may be used in this application may include various combinations of fixed and/or portable computer hardware, software, peripheral devices, and storage devices. The computer system may include a number of separate components that are networked or otherwise linked for cooperative execution, or may include one or more separate components. The hardware and software components of the computer system of the present application may include and may be included in fixed and portable devices, such as desktop computers, laptop computers, and/or servers. A module may be a component of a device, software, program, or system that implements some "functionality," which may be implemented as software, hardware, firmware, electronic circuitry, or the like.
While specific embodiments of the invention have been described, those skilled in the art will appreciate that there are other embodiments equivalent to the described embodiments. Accordingly, it is to be understood that the invention is not to be limited by the specifically illustrated embodiments, but only by the scope of the appended claims.
It should be appreciated that in the foregoing description of exemplary embodiments of the invention, various features of the invention are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure and aiding in the understanding of one or more of the various inventive aspects. However, the methods of the present disclosure should not be construed as reflecting the intent: the claimed invention requires more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive aspects lie in less than all features of a single foregoing disclosed embodiment. Thus, the following claims are hereby expressly incorporated into this detailed description, with each claim standing on its own as a separate embodiment of this invention.
Furthermore, although some embodiments described herein include some but not other features included in other embodiments, as will be appreciated by those of skill in the art, combinations of features of different embodiments are intended to fall within the scope of the invention and form different embodiments. For example, in the claims, any of the claimed embodiments may be used in any combination.
Furthermore, some embodiments are described herein as a method or combination of method elements that can be performed by a processor of a computer system or by other means of performing a function. A processor having the necessary instructions for carrying out such a method or method element thus forms a means for carrying out the method or method element. Furthermore, the elements of an apparatus embodiment described herein are examples of elements that perform the functions that the elements perform for the purpose of implementing the invention.
In the description provided herein, numerous specific details are set forth. However, it is understood that embodiments of the invention may be practiced without these specific details. In other instances, well-known methods, structures and techniques have not been shown in detail in order not to obscure an understanding of this description.
Similarly, it is to be noticed that the term coupled, when used in the claims, should not be interpreted as being restricted to direct connections only. The terms "coupled" and "connected," along with their derivatives, may be used. It should be understood that these terms are not intended as synonyms for each other. Thus, the scope of the expression "device a is coupled to device B" should not be limited to devices or systems in which the output of device a is directly connected to the input of device B. This means that there is a path between the output of a and the input of B, which may be a path including other devices or means. "coupled" may mean that two or more elements are in direct physical or electrical contact, or that two or more elements are not in direct contact with each other but yet still co-operate or interact with each other.
Accordingly, while there has been described what are believed to be the preferred embodiments of the invention, those skilled in the art will recognize that other and further modifications may be made thereto without departing from the spirit of the invention, and it is intended to claim all such changes and modifications as fall within the scope of the invention. For example, any of the formulas given above are merely representative of programs that may be used. Functions may be deleted and added from the block diagrams and operations may be interchanged among the functional blocks. Steps may be added to or deleted from the described methods within the scope of the invention.

Claims (20)

1. A computer-implemented method that enables monitoring of user engagement with printed content, including printed content distributed via magazines, newspapers, packaging, and brochures, the method comprising:
distributing a plurality of items carrying respective copies of a common printed content item, wherein each copy of the printed content item includes a common graphical artifact, wherein the common graphical artifact is associated with an Augmented Reality (AR) content dataset in a cloud-hosted system;
executing a participating application on a plurality of user mobile devices, wherein each executing instance in the participating application is configured to:
processing image data captured via a camera module of the mobile device to identify a presence of the graphic article when the printed content item enters a field of view of the camera module;
obtaining, from the cloud hosting system, at least a subset of an Augmented Reality (AR) content dataset associated with the graphical artifact;
causing display, on a display screen of the mobile device, of a presentation of interactive AR content superimposed on the image capture-based display of the physical substrate based on the obtained AR content data, wherein the presentation animation superimposed on the image capture-based display of the physical substrate is displayed at a defined position and orientation relative to the image capture-based display of the physical substrate;
collecting, at a server device, from the plurality of user mobile devices, data representing user engagement with interactive content rendered from the Augmented Reality (AR) content data associated with the graphic artifact, to generate an analysis representing user engagement with the plurality of copies of the common printed content item.
2. A device configured to enable monitoring of user engagement with a physical substrate bearing printed material, the device comprising:
a camera module configured to capture image data of a capture area;
a display screen;
a microprocessor;
a memory module coupled to the microprocessor, wherein the memory module maintains computer executable code that, when executed via the microprocessor, configures the apparatus to provide:
an image identifier extraction module configured to process image data captured via the camera module to: (i) identifying the presence of a graphic article having predefined attributes printed on the physical substrate; and (ii) extracting an image identifier code from the graphic article;
an Augmented Reality (AR) data access module configured to retrieve an executable AR dataset associated with the extracted image identifier code from an AR data storage module;
an AR presentation module configured to cause display of a presentation animation superimposed on the image capture-based display of the physical substrate on the display screen based on the retrieved AR data, wherein the presentation animation superimposed on the image capture-based display of the physical substrate is displayed at a defined position and orientation relative to the image capture-based display of the physical substrate;
a participation monitoring module configured to: (i) monitoring user interactions associated with the display of the presentation animation; (ii) compiling data representing the user interactions; and (iii) transmitting the data representing the user interactions to a server device such that the data representing the user interactions is associated with an image identifier code, thereby associating the participating interactions with the particular printed material.
3. The device of claim 2, wherein the presentation animation is an interactive animation variably presented in response to an input by a user of the device.
4. The device of claim 2 or claim 3, wherein the AR data storage module maintains multiple sets of AR data associated with a given image identifier code, and wherein a particular one of the sets of AR data is selected for delivery to the device based on one or more attributes of a user associated with the device.
5. The device of claim 4, wherein the one or more attributes of the user associated with the device are defined in a user profile, wherein the user profile is stored in the memory module of the device and/or in a cloud-hosted data store.
6. The device of any of claims 2 to 5, wherein the AR data storage module is maintained by a cloud-hosted server-side AR data management system.
7. A system configured to enable monitoring of user engagement with a physical substrate bearing printed material, the system comprising:
a communication module that enables communication between the system and a plurality of user devices, wherein each user device comprises:
a camera module configured to capture image data of a capture area;
a display screen;
a microprocessor;
a memory module coupled to the microprocessor, wherein the memory module maintains computer executable code that, when executed via the microprocessor, configures the apparatus to provide:
an image identifier extraction module configured to process image data captured via the camera module to: (i) identifying the presence of a graphic article having predefined attributes printed on the physical substrate; and (ii) extracting an image identifier code from the graphic article;
an Augmented Reality (AR) data access module configured to retrieve an executable AR dataset associated with the extracted image identifier code from an AR data storage module;
an AR presentation module configured to cause display of a presentation animation superimposed on the image capture-based display of the physical substrate on the display screen based on the retrieved AR data, wherein the presentation animation superimposed on the image capture-based display of the physical substrate is displayed at a defined position and orientation relative to the image capture-based display of the physical substrate;
a participation monitoring module configured to: (i) monitoring user interactions associated with the display of the presentation animation; (ii) compiling data representing the user interactions; and (iii) transmitting the data representing the user interactions to a server device such that the data representing the user interactions is associated with an image identifier code, thereby associating the participating interactions with the particular printed material.
wherein the system provides the AR data storage module.
8. The system of claim 7, wherein the presentation animation is an interactive animation variably presented in response to an input by a user of the device.
9. The system of claim 7 or claim 8, wherein the AR data storage module maintains a plurality of AR data sets associated with a given image identifier code, and wherein a particular one of the AR data sets is selected for delivery to the device based on one or more attributes of a user associated with the device.
10. The system of claim 9, wherein the one or more attributes of the user associated with the device are defined in a user profile, wherein the user profile is stored in the memory module of the device and/or in a cloud-hosted data store.
11. The system of claim 10, wherein the one or more attributes of the user are updated in response to monitoring user interaction associated with the display of the presentation animation.
12. The system of claim 11, wherein the updating of the one or more attributes of the user causes a customized selection and/or execution of AR data presented in relation to a given image ID code, which results in a change in the selection and/or execution of AR data upon continuous interaction with a common image ID code.
13. The system of any one of claims 7 to 12, wherein the system comprises a module providing access to an e-commerce platform to enable a user of the user device to purchase goods and/or services based on interaction with the presented AR data.
14. The system of any of claims 7 to 13, wherein the AR data storage module is maintained by a cloud-hosted server-side AR data management system.
15. A method for enabling monitoring of user engagement with a physical substrate bearing printed material, the method comprising:
providing computer executable code defining a plurality of sets of interactive augmented reality animation data, wherein each set of data is executable by an AR presentation module of a user device to cause a presentation animation superimposed on an image capture-based display of a physical substrate to be displayed on a display screen of the user device, wherein the presentation animation superimposed on the image capture-based display of the physical substrate is displayed at a defined position and orientation relative to the image capture-based display of the physical substrate;
associating each AR data set with an identifier code;
printing a given one of the identifier codes onto a plurality of physical substrates, wherein each of the physical substrates is applied with a print medium designed to supplement the AR data set associated with the identifier code;
identifying that a user device executing a prescribed software application is capturing image data of a physical substrate and that a given one of the identifier codes has been extracted from the image data;
causing the user device to display a presentation animation of the AR data set associated with the identifier code such that the animation is presented superimposed on the image capture-based display of the physical substrate, wherein the presentation animation superimposed on the image capture-based display of the physical substrate is displayed at a defined position and orientation relative to the image capture-based display of the physical substrate;
monitoring user interactions associated with the display of the presentation animation, such that data representing the user interactions is associated with the image identifier code, thereby associating the participating interactions with the particular printed material; and
compiling data representing the user interactions.
16. A method for enabling monitoring of user engagement with a physical substrate bearing printed material, the method comprising:
distributing a plurality of physical substrates carrying a common printed content item, wherein the common printed content item is associated in a database with a particular set of AR content;
at the first mobile device, reading a graphical article from a first one of the physical substrates bearing the printed content item, and in response obtaining data from the particular set of AR content from the server, and rendering an AR interactive game based on the data;
at the second mobile device, reading a graphical article from a second one of the physical substrates bearing the printed content item and in response obtaining data from the particular set of AR content from the server and rendering an AR interactive game based on the data;
communicating, by a cloud-hosted game server device, with the first mobile device and the second mobile device, such that the game server receives data representing an interaction with an instance of the AR interactive game at one of the two mobile devices and, in response to the data representing the interaction, causes a change in a state of the AR interactive game executing at the other of the two mobile devices.
17. The method of claim 16, comprising: maintaining a record of interactions with the AR content at the first mobile device and the second mobile device to enable analysis of engagement with the printed content items.
18. A method for enabling a user to participate in a physical substrate bearing printed material, the method comprising:
distributing a plurality of physical substrates carrying a common printed content item, wherein the common printed content item is associated in a database with a particular set of AR content;
at the first mobile device, reading a graphical artifact from a first one of the physical substrates bearing the printed content item and in response obtaining data from the particular set of AR content from the server and rendering an AR interactive animation based on the data;
presenting, via the presentation of the AR interactive animation, an interactive object representing functionality for purchasing goods and/or services;
in response to a predefined interaction with the interactive object representing a function for purchasing goods and/or services, a process is triggered that causes the user to conduct a purchase transaction with respect to the goods and/or services.
19. The method of claim 18, wherein the process of causing the user to conduct purchase transactions with respect to the goods and/or services includes a process of completing a purchase transaction using pre-stored user data, the pre-stored user data including payment information and delivery information.
20. The method of claim 18, wherein the process of causing the user to conduct purchase transactions with respect to the goods and/or services includes a process of redirecting the user to a display presenting a user interface for facilitating completion of purchase transactions.
CN201980089989.9A 2019-11-18 2019-11-18 Techniques configured to enable monitoring of user engagement with physically printed material via an augmented reality delivery system Pending CN113454668A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/AU2019/051266 WO2021097514A1 (en) 2019-11-18 2019-11-18 Technology configured to enable monitoring of user engagement with physical printed materials via augmented reality delivery system

Publications (1)

Publication Number Publication Date
CN113454668A true CN113454668A (en) 2021-09-28

Family

ID=75979962

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201980089989.9A Pending CN113454668A (en) 2019-11-18 2019-11-18 Techniques configured to enable monitoring of user engagement with physically printed material via an augmented reality delivery system

Country Status (7)

Country Link
EP (1) EP4062354A1 (en)
JP (1) JP2023501016A (en)
KR (1) KR20220105119A (en)
CN (1) CN113454668A (en)
AU (1) AU2019474933A1 (en)
SG (1) SG11202105089UA (en)
WO (1) WO2021097514A1 (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101438311A (en) * 2006-05-09 2009-05-20 高斯国际美洲公司 System and method for targeting print advertisements
CN103518215A (en) * 2011-04-01 2014-01-15 英特尔公司 System and method for viewership validation based on cross-device contextual inputs
US20140210857A1 (en) * 2013-01-28 2014-07-31 Tencent Technology (Shenzhen) Company Limited Realization method and device for two-dimensional code augmented reality
CN104040574A (en) * 2011-12-14 2014-09-10 英特尔公司 Systems, methods, and computer program products for capturing natural responses to advertisements
US20140267792A1 (en) * 2013-03-15 2014-09-18 daqri, inc. Contextual local image recognition dataset
US20150348329A1 (en) * 2013-01-04 2015-12-03 Vuezr, Inc. System and method for providing augmented reality on mobile devices

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140249932A1 (en) * 2012-10-31 2014-09-04 Boxstar, LLC Shipping label advertising system and method
US9892560B2 (en) * 2014-09-11 2018-02-13 Nant Holdings Ip, Llc Marker-based augmented reality authoring tools
US10475103B2 (en) * 2016-10-31 2019-11-12 Adobe Inc. Method, medium, and system for product recommendations based on augmented reality viewpoints


Also Published As

Publication number Publication date
JP2023501016A (en) 2023-01-18
EP4062354A1 (en) 2022-09-28
SG11202105089UA (en) 2021-11-29
KR20220105119A (en) 2022-07-26
AU2019474933A1 (en) 2021-07-08
WO2021097514A1 (en) 2021-05-27

Similar Documents

Publication Publication Date Title
US10937067B2 (en) System and method for item inquiry and information presentation via standard communication paths
Billewar et al. The rise of 3D E-Commerce: the online shopping gets real with virtual reality and augmented reality during COVID-19
US9535577B2 (en) Apparatus, method, and computer program product for synchronizing interactive content with multimedia
US11132703B2 (en) Platform for providing augmented reality based advertisements
US9225880B2 (en) Apparatus and method of conducting a transaction in a virtual environment
US8620730B2 (en) Promoting products in a virtual world
US10102534B2 (en) System and method for virtual universe relocation through an advertising offer
TW201642197A (en) Computer-readable media and methods for selecting and providing advertisement to client device
US20100332312A1 (en) System and method for analyzing endorsement networks
JP6401403B2 (en) Determining the appearance of objects in the virtual world based on sponsorship of object appearance
US20150174493A1 (en) Automated content curation and generation of online games
CN107851261A (en) For providing the method and system of relevant advertisements
Das Application of digital marketing for life success in business
US20150242877A1 (en) System for wearable computer device and method of using and providing the same
Asensio World wide data: the future of digital marketing, e-commerce, and big data
CN113454668A (en) Techniques configured to enable monitoring of user engagement with physically printed material via an augmented reality delivery system
Ulker-Demirel Development of Digital Communication Technologies and the New Media
CA3125164A1 (en) Technology configured to enable monitoring of user engagement with physical printed materials via augmented reality delivery system
Yang et al. Augmented, mixed, and virtual reality applications in cause-related marketing (CRM)
KR20150063295A (en) Method of providing advertisement service and apparatuses operating the same
WO2014071307A1 (en) Methods for targeted advertising
Castyana et al. The Role of Social Media to Attract Virtual Basketball Championship’s Participant during Pandemic Era
Chamaria et al. Master Growth Hacking: The Best-Kept Secret of New-Age Indian Start-ups
KR20230122231A (en) Systme for providing metaverse based online and offline global fan meeting service
CN117474600A (en) Interaction method and device based on delivered content, computer equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination