WO2019168780A1 - System and method for providing an augmented reality experience for objects using wireless tags - Google Patents


Info

Publication number
WO2019168780A1
Authority
WO
WIPO (PCT)
Prior art keywords
augmented
augmented reality
computing device
portable computing
physical article
Prior art date
Application number
PCT/US2019/019392
Other languages
English (en)
Other versions
WO2019168780A9 (fr)
Inventor
Christian Delay
Karthik RAMKUNAR
Original Assignee
Thin Film Electronics ASA
Priority date
Filing date
Publication date
Application filed by Thin Film Electronics ASA
Publication of WO2019168780A1
Publication of WO2019168780A9


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00 Commerce
    • G06Q30/06 Buying, selling or leasing transactions
    • G06Q30/0601 Electronic shopping [e-shopping]
    • G06Q30/0641 Shopping interfaces
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W4/00 Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/02 Services making use of location information
    • H04W4/029 Location-based management or tracking services
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W4/00 Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/80 Services using short range communication, e.g. near-field communication [NFC], radio-frequency identification [RFID] or low energy communication

Definitions

  • the present invention generally relates to the field(s) of enhancing user experiences with objects through mobile devices. More specifically, embodiments of the present invention pertain to a computer-implemented method for associating one or more wireless security and/or identification tags with augmented reality metadata perceivable on such mobile computing devices.
  • Augmented reality (AR) is a recently developed technology in which a computing system generates a composite view of objects in a physical, real-world environment that are "augmented" by computer-generated perceptual information across modalities such as visual, auditory and haptic.
  • an image, such as a person's face, is enhanced with additional elements through filters, lenses and other tags which are overlaid on the image.
  • automated recognition is also performed on the object within the mobile device camera's field of view by capturing a large image file and communicating it over a network to a remote server, where the image data is recognized by a specialized/trained system.
  • the results of the recognition are then communicated to the mobile device, where they can be used by the user.
  • drawbacks of this approach include the significant amount of data that must be communicated to the recognition server, which may be prohibitive based on bandwidth and/or data utilization costs. The recognition process also introduces additional latency. Finally, an image recognition-based system must rely on reasonable capture conditions, and thus may not be workable in low light situations.
  • an object may include a barcode or a 2-D barcode, such as a QR code. Since these visual codes are relatively easy to copy, they are not secure. Moreover, since they are visual in nature and rely on a phone camera's capabilities, they may be less accurate in low light conditions. Accordingly, there is a need for improved object recognition systems that can facilitate and enhance augmented reality experiences for users of mobile devices engaging with physical objects, including retail products and other items.
  • Embodiments of the present invention include the ability to associate and render an augmented reality (AR) experience with a wireless tag.
  • the AR experience is based on enhanced data about an object or commercial product (e.g., a bottle of wine, a jar of medicine, an article of clothing) that can be rendered along with other object data to a user's mobile computing device, including a graphical interface of a smartphone.
  • an object or commercial product e.g., a bottle of wine, a jar of medicine, an article of clothing
  • a first aspect of the disclosure concerns a computer-implemented method for creating an augmented reality experience on a portable computing device about a physical article (e.g., an item), comprising associating the physical article with augmented reality metadata including multimedia data elements relating to the physical article, the augmented reality metadata being renderable as part of an augmented experience on the portable computing device, storing the augmented reality metadata on a server computing system that is accessible over a network, reading a first augmented experience wireless tag affixed to the physical article with a reader integrated within the portable computing device to determine a first augmented experience wireless tag identification code, communicating the first augmented experience wireless tag identification code (TID) with the portable computing device over the network to the server computing system as part of a request for augmented experience data, verifying the portable computing device as authorized to receive the first augmented reality metadata for the first augmented experience wireless tag identification code, retrieving the first augmented reality metadata corresponding to the first augmented experience wireless tag identification code with the server computing system after the verifying the portable computing device, communicating the first
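The read/communicate/verify/retrieve flow recited in the first aspect can be sketched as follows. This is a minimal illustrative sketch, not the disclosed implementation: all names (`AR_METADATA`, `AUTHORIZED_DEVICES`, `request_ar_experience`) and the in-memory data stores are assumptions standing in for the server computing system and its databases.

```python
# Hypothetical server-side sketch of the first-aspect method.
AR_METADATA = {
    # tag identification code (TID) -> multimedia data elements
    # renderable as part of the augmented experience
    "0xA1B2C3": {"article": "wine bottle", "video": "vineyard_tour.mp4",
                 "graphics": "vintage_badge.png"},
}
AUTHORIZED_DEVICES = {"device-42"}  # portable computing devices on record

def request_ar_experience(tid, device_id):
    """Verify the portable computing device, then retrieve the augmented
    reality metadata stored for this tag identification code."""
    if device_id not in AUTHORIZED_DEVICES:
        return {"error": "device not authorized"}
    metadata = AR_METADATA.get(tid)
    if metadata is None:
        return {"error": "unknown tag"}
    return {"tid": tid, "metadata": metadata}

# The device reads the tag, then sends only the TID over the network:
response = request_ar_experience("0xA1B2C3", "device-42")
```

Note that the portable device contributes nothing but the TID and its own identity; the server performs the association, verification and retrieval steps.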
  • Another aspect of the disclosure concerns a computer-implemented method of presenting an augmented reality experience on a portable computing device to a user about a physical article
  • presenting a graphical user interface (GUI) in the portable computing device wherein the GUI is adapted with a selectable article augmented reality (AR) option enabling the user to engage with and render augmented reality metadata for the physical article within the GUI, reading a first augmented experience wireless tag affixed to the physical article with a reader integrated within the portable computing device to determine a first augmented experience wireless tag identification code, communicating the first augmented experience wireless tag identification code (TID) with the portable computing device over a network to a server computing system as part of a request for an augmented experience data, wherein the request may include authorization data verifying that the portable computing device is authorized to receive first augmented reality metadata for the first augmented experience wireless tag identification code, receiving the first augmented reality metadata over the network at the portable computing device, the first augmented reality metadata includes multimedia data elements relating to the physical article, and rendering the
  • the first augmented experience wireless tag may be rendered within the GUI in accordance with context and rendering rules associated with the user and the portable computing device, which may be related to a time/place/manner/user associated with the requested experience and/or a particular format for rendering the experience within the GUI, etc. Except for transmission delays, the augmented reality data may be rendered in real time for the user.
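One way to picture the context and rendering rules at render time is the sketch below. The rule shapes (an allowed time window, a region set, an overlay style) are invented for illustration only; the disclosure leaves the rule format open.

```python
# Hedged sketch: evaluate context rules before choosing how AR metadata
# is rendered within the GUI. Rule fields are assumptions.
from datetime import time

def apply_rules(metadata, context_rules, rendering_rules, now, place):
    """Return a renderable view if the context rules pass, else None."""
    if not (context_rules["open"] <= now <= context_rules["close"]):
        return None                      # outside the allowed time window
    if place not in context_rules["regions"]:
        return None                      # outside the allowed place
    return {"elements": metadata, "style": rendering_rules["style"]}

view = apply_rules(
    metadata={"video": "promo.mp4"},
    context_rules={"open": time(9), "close": time(17), "regions": {"US"}},
    rendering_rules={"style": "overlay"},
    now=time(12), place="US",
)
```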
  • Still another aspect of the disclosure concerns a method for enabling a vendor to create an augmented reality experience renderable on a portable computing device about a physical article, comprising the following steps with a computing system: associating a first physical article with first augmented reality metadata including multimedia data elements relating to the first physical article, wherein the first augmented reality metadata is configured to be renderable as part of an augmented experience on the portable computing device for the first physical article, associating a first augmented experience wireless tag with the first physical article, which first augmented experience wireless tag is coded with both a first product identification code and a first vendor identification code, specifying a set of context rules and a set of rendering rules to be considered and used when presenting the first augmented reality metadata for the first physical article on the portable computing device, storing the first augmented reality metadata on a server computing system that is accessible over a network by the portable computing device, in response to confirming a valid read of a first augmented experience wireless tag affixed to the first physical article and a first request
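The vendor-side creation steps above can be reduced to a record-keeping sketch. The catalog layout and the function name `create_ar_experience` are assumptions chosen for the example, not part of the disclosure.

```python
# Illustrative vendor-side sketch: associate a tag (coded with product and
# vendor IDs) with AR metadata plus the context/rendering rules to be used
# when presenting that metadata on a portable computing device.

def create_ar_experience(catalog, tag_id, product_id, vendor_id,
                         metadata, context_rules, rendering_rules):
    catalog[tag_id] = {
        "product_id": product_id,
        "vendor_id": vendor_id,
        "metadata": metadata,             # multimedia data elements
        "context_rules": context_rules,   # e.g. time/place/user constraints
        "rendering_rules": rendering_rules,  # e.g. overlay format in the GUI
    }
    return catalog[tag_id]

catalog = {}  # stands in for the server computing system's storage
entry = create_ar_experience(
    catalog, tag_id="TID-001", product_id="P-777", vendor_id="V-12",
    metadata={"video": "promo.mp4"},
    context_rules={"hours": "9-17", "region": "US"},
    rendering_rules={"style": "overlay"},
)
```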
  • the multimedia data elements for the augmented reality metadata may include at least one of graphics, audio, video and haptics data.
  • the multimedia data elements may include information about a human user of the portable computing device.
  • the augmented experience is rendered by superimposing the multimedia data elements with image data for the physical article.
  • the server computing system may be part of a cloud computing facility and the network can include a cellular network and the Internet.
  • the augmented experience wireless tag may be a flexible electronic tag printed with an electronic ink, adapted to respond to a near-field communications (NFC) interrogation signal presented during a tap event which may result from moving the portable computing device in proximity to the tag.
  • NFC near-field- communications
  • the tag also may be formatted with non-volatile memory data fields that identify at least a manufacturer ID and a product ID.
  • the portable computing device is preferably verified by the server computing device prior to reading the first augmented experience wireless tag. The verification process can also consider a user identification code that is presented and/or the location of the portable device.
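A verification step that also considers a user identification code and the device's location might look like the following sketch. The registry, region set and `verify` signature are illustrative assumptions; the disclosure does not prescribe a verification algorithm.

```python
# Hypothetical multi-factor verification sketch for the server computing
# system: check the device, optionally a user identification code, and
# optionally the reported location of the portable device.

REGISTERED = {("device-42", "user-7")}       # (device, user) pairs on record
ALLOWED_REGIONS = {"store-A", "store-B"}     # assumed geofence

def verify(device_id, user_id=None, location=None):
    if user_id is not None and (device_id, user_id) not in REGISTERED:
        return False                          # unknown device/user pairing
    if location is not None and location not in ALLOWED_REGIONS:
        return False                          # outside permitted locations
    return any(d == device_id for d, _ in REGISTERED)

ok = verify("device-42", user_id="user-7", location="store-A")
```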
  • a benefit of the present method(s) is that the physical article is identified through the first augmented reality metadata at the server computing system using only the first augmented experience wireless tag identification code and without assistance or reference to any other data captured by the portable computing device, including camera image data. This allows for rendering identification information even under scenarios that present challenges for prior art image recognition systems, such as in low light conditions. This further means that in various embodiments, only data output from the reader needs to be communicated over the network to obtain complete identification data for the physical article.
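The bandwidth point can be made concrete: only the 128-bit tag code crosses the network, versus a full camera frame for an image-recognition pipeline. The photo size below is an assumed, illustrative figure.

```python
# Sketch contrasting tag-based identification with image upload.
TAG_CODE_BYTES = 128 // 8          # the full NFC Barcode read from the tag
TYPICAL_PHOTO_BYTES = 2_000_000    # assumed size of an uploaded camera frame

def identify_by_tid(registry, tid):
    """Resolve the physical article from the TID alone -- no camera data,
    so the lookup behaves identically in low-light conditions."""
    return registry.get(tid, "unknown article")

registry = {"TID-001": "wine bottle"}
article = identify_by_tid(registry, "TID-001")
savings = TYPICAL_PHOTO_BYTES // TAG_CODE_BYTES  # relative data-volume factor
```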
  • a vendor in addition to presenting augmented reality data, a vendor can enable one or more features of an application program to benefit/reward a user.
  • FIG. 1 is a diagram of an exemplary cloud-based tag management system enabling a service provider to, among other functions, coordinate tag creation, tag transfers, tag transactions, tag product assignments, tag marketing, etc. for and between manufacturers, merchants and end users in accordance with the teachings of the present disclosure;
  • FIG. 2A is a diagram of an exemplary cloud-based Product Manufacturing/Tag Computing Support system enabling a manufacturer to create and manage tags in connection with product manufacturing operations in accordance with the teachings of the present disclosure;
  • FIG. 2B is a diagram of an exemplary cloud-based Tag Mfr/Service Provider Computing Support system enabling a tag service provider to manage and coordinate tags for manufacturers, merchants and end users in accordance with the teachings of the present disclosure;
  • FIG. 2C is a diagram of an exemplary cloud-based Merchant Product/Tag Computing Support system enabling a merchant to manage tags, products, etc. in connection with product marketing and sales operations in accordance with the teachings of the present disclosure;
  • FIGs. 3A - 3F are diagrams and flowcharts depicting the structure and operation of exemplary tags in accordance with one or more embodiments of the present invention.
  • FIG. 4A is a diagram of exemplary hardware and software employed in a mobile computing device implemented as a Tag User Computing Support system enabled with tag management functions in accordance with one or more embodiments of the present invention;
  • FIG. 4B depicts an exemplary graphical interface of a mobile computing device enabled with tag management functions in accordance with one or more embodiments of the present invention
  • FIG. 5A is a diagram of an exemplary cloud-based tag augmented reality system enabling an enhanced experience for a tagged item within a mobile computing device for an end user in accordance with the teachings of the present disclosure
  • FIG. 6 is a flow chart showing an exemplary method for creating and presenting an augmented reality experience for a tagged item in accordance with embodiments of the present invention.
  • FIGs. 7A - 7D depict an exemplary display on the GUI of a computer or a mobile device presenting an augmented reality experience for a tagged item in accordance with embodiments of the present invention.
  • the term "signal" refers to any known structure, construction, arrangement, technique, method and/or process for physically transferring data or information from one point to another. Unless indicated otherwise from the context of its use herein, the terms "information" and "data" may be used interchangeably, although each term is generally given its art-recognized meaning.
  • the terms "coupled to," "connected to," and "in communication with" may be used interchangeably and indicate both direct and indirect couplings, connections and communications, but each term is also generally given its art-recognized meaning.
  • the terms "known," "fixed," "given," "certain" and "predetermined" generally refer to a value, quantity, parameter, constraint, condition, state, process, procedure, method, practice, or combination thereof that is, in theory, variable, but is typically set in advance and not varied thereafter when in use.
  • wireless tag (or simply “tag”) as used herein preferably refers to near-field communication (NFC), radio frequency (RF), high frequency (HF), very high frequency (VHF), or ultra high frequency (UHF) tags.
  • the mobile or portable device may comprise a smart phone configured to communicate wirelessly with the wireless tags.
  • the tags may be associated with a user account using a customized tag application on the mobile device.
  • the tags are of the NFC type manufactured by Thin Film Electronics ASA (TFEA) in printed integrated circuit (PIC) form (preferably made using TFEA's proprietary printed dopant polysilicon (PDPS) technology) under the tradenames SpeedTap™ and OpenSense™.
  • the wireless tags are manufactured using printed dopant polysilicon (PDPS) technology (see, e.g., U.S. Pat. Nos. 7,314,513 [Attorney Docket No. IDR0302], 7,485,691 [Attorney Docket No. IDR0422], 8,846,507 [Attorney Docket No. IDR0884], 9,045,653 [Attorney Docket No. IDR1102], and 9,359,513 [Attorney Docket No. IDR1942], the relevant portions of which are incorporated herein by reference).
  • A circuit diagram identifying the main components of a preferred example of a tag 300 used in the present embodiments is shown in FIG. 3A. These tags preferably include the following general characteristics:
  • Tag-Talks-First (TTF) Protocol/Mode, meaning the tag preferably transmits its code after it receives enough power from a reader field (FIG. 3E).
  • the tag does not wait for or require any additional commands from a reader before transmitting its code, and for security reasons, preferably does not acknowledge/recognize any commands from the reader.
  • these types of tags preferably comply with the NFC Barcode protocol, a common NFC protocol supported by top-tier NFC controllers from NXP, Broadcom, Samsung, Sony, Toshiba, and others.
  • the tags are preferably passive, 128-bit NFC tags operating at 13.56 MHz and using a Tag-Talks-First (TTF) protocol.
  • These types of NFC tags preferably operate in a read-only mode to transmit 128-bit codes to NFC-enabled devices, such as phones, tablets, PCs, and set-top boxes.
  • the data in the tags is also primarily stored in permanent, unalterable read-only memory but may in some embodiments include a number of reprogrammable dynamic bits to reflect the status of connected or integrated sensors and other information that could change over time. Because these types of tags do not receive information via RF, all data transmissions are unidirectional, from tag 300 to reader 310.
  • the NFC SpeedTap and NFC OpenSense tags also preferably store data following the NFC Barcode data formats (previously known as the Kovio NFC Barcode data formats). These are standardized representations of data so that operating systems and applications can consistently interpret the 128-bit data stream.
  • An example of a memory map 330 preferably used by such tags is shown in FIG. 3D. As seen in FIG. 3D, the tags preferably include dedicated fields for such parameters as a manufacturer ID field 332, a data format specifier field 334, a data payload field 336 and a CRC field 338.
  • the 128-bit code 330 includes an 8-bit (1-byte) Manufacturer ID field 332 consisting of a start bit and a 7-bit ID.
  • a 7-bit manufacturer ID (based on the least significant 7 bits of the manufacturer IDs specified in the ISO/IEC 7816-6 specification) follows the standard logical '1' start bit.
  • An 8-bit (1-byte) data format identifier field 334 then describes how an NFC reader should interpret the contents of the payload field 336.
  • the data format identifier preferably contains two sections: Reserved bits and a Data Type Format.
  • the 3-bit Reserved section is set to '000' for a 128-bit NFC Barcode.
  • a 5-bit Data Type Format allows for 32 possible data types.
  • the data payload field 336 is preferably 96 bits, and may include separate components such as a tag ID 336a, an object/item ID 336b and a vendor ID 336c or some other convenient format for the application in question.
  • the payload 336 can be used for any number of data purposes including for identifying a uniform resource locator (URL) having different formats, an electronic product code (EPC) or any other desired identification/metadata information.
  • the CRC field 338 can be coded in accordance with any number of conventional specifications as needed to support a particular application.
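The memory map described above (8-bit manufacturer ID, 8-bit data format identifier, 96-bit payload, 16-bit CRC) can be sketched as a parser. The 8/8/96/16 split follows the description; the equal 4-byte division of the payload into tag ID 336a, object ID 336b and vendor ID 336c is an assumption (the text says only that the payload "may include separate components"), and no particular CRC specification is assumed.

```python
# Hedged sketch of parsing the 128-bit NFC Barcode memory map 330 (FIG. 3D).

def parse_nfc_barcode(code: bytes):
    assert len(code) == 16, "NFC Barcode is 128 bits (16 bytes)"
    start_bit = code[0] >> 7            # standard logical '1' start bit
    manufacturer_id = code[0] & 0x7F    # least-significant 7 bits (field 332)
    reserved = code[1] >> 5             # '000' for a 128-bit NFC Barcode
    data_type = code[1] & 0x1F          # 5 bits -> 32 possible data types
    payload = code[2:14]                # 96-bit data payload (field 336)
    crc = int.from_bytes(code[14:16], "big")  # field 338, format app-specific
    return {
        "start_bit": start_bit,
        "manufacturer_id": manufacturer_id,
        "reserved": reserved,
        "data_type": data_type,
        "tag_id": payload[0:4].hex(),      # 336a (assumed 4 bytes)
        "object_id": payload[4:8].hex(),   # 336b (assumed 4 bytes)
        "vendor_id": payload[8:12].hex(),  # 336c (assumed 4 bytes)
        "crc": crc,
    }

# Example: start bit set, manufacturer 1, data type 5, ascending payload.
code = bytes([0x81, 0x05]) + bytes(range(12)) + bytes([0xAB, 0xCD])
fields = parse_nfc_barcode(code)
```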
  • the tag identification codes are assigned to products in accordance with the teachings of application serial no. 15/904,178, also assigned to the present applicant, and hereby incorporated by reference. Again, it should be understood that other NFC Barcode data formats can be used in other applications, and as standards for tags evolve, it is expected that other variations will be employed in the future.
  • a “tap” or “tapping event” refers to the transmission of the NFC code by the tag when it is sufficiently close to be read by an NFC controller as may be embodied in a portable computing device (e.g. smartphone).
  • the term “tap” in this instance does not require physical contact or bumping of the tag, but, rather, merely waving or placing the reader in close proximity to the tag.
  • the distance range of detectable taps or tapping events can, of course, be adjusted by altering field strength, reader antenna size and other physical/transmission parameters.
  • the tag continues, at a predetermined interval and standardized protocol, to retransmit the entire code as long as the NFC Barcode is powered up in the reader's field.
  • the transmission intervals are separated by sleep cycles, which timing periods are again predefined according to an operating standard used in the particular application.
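The retransmit-then-sleep behavior can be modeled with a toy counter. The interval values below are illustrative placeholders, not taken from any operating standard.

```python
# Toy simulation of TTF behavior: while powered in the reader's field the tag
# retransmits its full code at fixed intervals separated by sleep cycles.

def ttf_transmissions(powered_ms, tx_ms=5, sleep_ms=95):
    """Count complete code transmissions while the tag remains powered."""
    cycle = tx_ms + sleep_ms
    sent, t = 0, 0
    while t + tx_ms <= powered_ms:   # a transmission must finish in time
        sent += 1
        t += cycle                   # transmit, then sleep until next cycle
    return sent

count = ttf_transmissions(powered_ms=1000)  # 1 s in the reader's field
```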
  • wireless near field communication (NFC) and radio frequency (RF) security and/or identification tags can be used by manufacturers, distributors and other entities to digitally identify, track and manage products and other objects.
  • "owned" in connection with a tag generally means that the tag is associated with a user account by the manufacturer of the tags before receipt by the user (e.g., a product manufacturer, distributor, reseller, packager, end user [consumer], etc.).
  • "give away" refers to tags that are not pre-associated with a user account, and which may be associated with a user account by the user of the mobile device. Give-away tags may be given away at conferences or demonstrations or as samples, or may be sold as a commodity item.
  • "group" when used herein preferably refers to tags manufactured on a common roll or sheet, and/or which have at least some common manufacturing ID 332 (FIG. 3D) data, payload data 336, etc. It will be understood that in some embodiments, tags which have different physical tag IDs 330 may nonetheless be logically associated to create groupings at different logical levels by the support software described herein.
  • FIG. 1 is a diagram of an exemplary cloud-based tag management system 100 enabling a service provider to, among other functions, coordinate tag creation, tag transfers, tag transactions, tag product assignments, tag marketing, etc. for and between manufacturers, merchants and end users in accordance with the teachings of the present disclosure.
  • the tag management system 100 preferably includes a front-end cloud computer system 110 and a back-end cloud computing system 170 connected through a secure connection 114.
  • the system 100 further preferably includes separate computing support systems for the different tag stakeholders, including a Product Manufacturing/Tag Computing Support system 140 (shown in more detail in FIG. 2A) and a Tag Manufacturer/Service Provider Computing Support system 150 (shown in more detail in FIG. 2B).
  • the tag management system further preferably comprises a tag user computing support system for end-users (including consumers), comprising one or more mobile devices 120 (or conventional PCs) executing a mobile tag manager application 125 (or web portal 126) and connected through both a TCP/IP protocol network 105a (preferably the Internet) and a cellular network 105b.
  • tags that can be managed by system 100 include wireless security tags (e.g., continuity sensing tag 135'), wireless identification tags 135, and other known types.
  • the cloud computing systems (110, 170) may provide shared computer processing resources and data to the other devices in the system, and may be implemented using a cloud computing service such as Google Cloud Platform™ or Amazon Web Services™.
  • the computing systems (110, 170) may be implemented using a service model such as software as a service (SaaS). Some or all of the data may be accessed by authorized users, but is protected from access by unauthorized users.
  • the tag manufacturer (service provider) applications may be partially executed using the cloud computer 110.
  • the tag manufacturer applications are accessible from a support system 150, as well as through various client devices (such as the mobile device 120) through either a web browser or a program (e.g., application) interface.
  • the various stakeholders including tag manufacturer, product manufacturers, merchants (distributor, reseller) or end-users do not manage or control the underlying infrastructure in the cloud computer 110 or 170 including any network, servers, operating systems, and/or storage devices.
  • FIG. 1 depicts only those components of system 100 critical to understanding the present teachings. Moreover, other components and software modules may be employed in system 100 consistent with the present teachings.
  • FIG. 2A is a diagram of an exemplary cloud-based tag manufacturing support system enabling a manufacturer to create and manage tags with a back end cloud computing system for product manufacturing operations in accordance with the teachings of the present disclosure.
  • System 170 is a back-end cloud computing system that includes one or more computing servers 172, a tag database 178, a product database 179 and related software modules that support manufacturers in integrating tags with any type of product/object, such as apparel, consumables, household items, pharmaceuticals, or any other commercial article 137 on which a tag 135 or 135' (which can be in the form of a roll, sheet, etc.) can be affixed directly or as part of packaging during a manufacturing process.
  • the product-tag support system further preferably comprises a host computing system 140 (e.g. a PC, smartphone, etc.), typically onsite at the product manufacturer facility, which system further includes a portal application (not shown, but which may take on any number of conventional forms) to permit communications with a cloud system 170, including a manufacturing administrative module 174, a manufacturing interface module 173, and various tag ID management applications in module 177.
  • a manufacturer tag writer/application module 176 controls the application of tags to products/packaging during the manufacture of the articles of interest at a fabrication facility 175.
  • the various software modules of FIG. 2A assist product manufacturers in managing the creation, application and tracking of products including tags.
  • a manufacturing admin module 174 provides visualization and configuration tools, including for enabling users to designate particular tag types/IDs for particular products.
  • the tag IDs are provided by a tag manufacturer (service provider) through an interface module 173, or, in some instances, can be generated directly by a tag ID management module 177.
  • a product manufacturer can maintain separate databases of both tags (M-Tag 178) and products (M-Product 179).
  • the type and form of the data in such databases may be specified in any convenient form most suitable for the manufacturer's particular operations, infrastructure, etc. Since it is conceivable that the same tag or product can be managed and tracked differently by different stakeholders using different data formats and logical identifiers, the nomenclature in FIG. 2A (M-Prod db 179 and M-Tag dB 178) denotes this distinction.
  • the application of specific tag IDs to specific products is controlled and monitored by a module 176 at the product manufacturing facility 175. In this manner, a product manufacturer can maintain an accurate inventory and record of tag/product pairings.
  • This product/tag pairing data 176' can then be shared with other systems as desired, including through an API call or other mechanisms known in the art. While shown as part of back-end cloud computing system 170, it will be appreciated by those skilled in the art that some or all portions of such modules, databases, interfaces, etc. in FIG. 2A can be implemented as part of host computing system 140 as well.
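The pairing-and-sharing behavior of module 176 can be sketched as below. The record layout and the `record_pairing`/`share_pairings` interfaces are assumptions standing in for whatever API the deployed systems expose.

```python
# Illustrative sketch of maintaining and sharing product/tag pairing data.

def record_pairing(log, tag_id, product_id):
    """Log each tag applied to a product during manufacture (module 176)."""
    log.append({"tag_id": tag_id, "product_id": product_id})

def share_pairings(log, consumer):
    """Push the accumulated pairing data 176' to another stakeholder system,
    e.g. via an API call in a real deployment."""
    for rec in log:
        consumer(rec)

pairings, received = [], []
record_pairing(pairings, "TID-001", "SKU-9")
record_pairing(pairings, "TID-002", "SKU-9")
share_pairings(pairings, received.append)
```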
  • FIG. 2B is a diagram of an exemplary cloud-based tag service provider support system enabling a tag service provider to create, manage and coordinate tags for manufacturers, merchants and end users in accordance with the teachings of the present disclosure.
  • a front end cloud-computing system 110 is accessed by a tag manufacturing (and/or tag service provider) host system 150.
  • tags 135 are manufactured in a tag fabrication facility 138 in the form of rolls, sheets, or other conventional forms.
  • the tags are physically coded during manufacture in accordance with any number of tag identification code types and formats (see e.g. FIG. 3, 336a, 336b, 336c).
  • System 110 is a front-end cloud computing system that includes one or more computing servers 112, a tag ID (S-tag) database 158, a tag metadata database 159, a user identification code database 157, and related software modules, including a tag management module 156, that support tag creation and management functions.
  • the type and form of the data in such databases may be specified in any convenient form most suitable for the tag provider's particular operations, infrastructure, etc.
  • Because the tag IDs tracked by system 110 may be the same as or have different physical IDs than those tracked by system 170, the tag IDs are designated with a (potentially) different dB index (i.e., S-tag as opposed to M-tag).
  • the tag manufacturer/provider system further preferably comprises a host computing system 150 (e.g. a PC, smartphone, etc.), typically onsite at the tag manufacturer facility, which system further includes a portal application (not shown, but which may take on any number of conventional forms) to permit communications with a cloud system 110, including a service administrative module 154, a manufacturing interface module 152 which communicates over a secure connection to back-end cloud system 170 and to a merchant support system 160 (FIG. 2C) and a tag management module 156 that comprises various tag ID management applications.
  • a tag engagement module 151 interacts with and coordinates transactions with end-user systems such as seen in FIG. 2D, including through receipt and processing of tap events, user identification information, and related context data from user computing devices.
  • Tag engagement module 151 further generates and provides any necessary responses from system 110 as described further below, including tag AR metadata, enhanced tag secure data, tag ownership transaction details and tag identification codes.
  • the various software modules of FIG. 2B assist the service provider in managing the creation, application and tracking of products including tags. While shown as part of front end cloud computing system 110, it will be appreciated by those skilled in the art that some or all portions of such modules, databases, interfaces, etc. in FIG. 2B can be implemented as part of host computing system 150 as well.
  • FIG. 2C is a diagram of an exemplary cloud-based tag merchant support system enabling a merchant to manage tags, products, etc. in connection with product marketing and sales operations in accordance with the teachings of the present disclosure.
  • a front end cloud-computing system 110 is accessible to a merchant product/tag host system
  • product/tag ID information 176' from one or more manufacturers can be input from a source including a back-end cloud computing system 170 (FIG. 2A).
  • the product is a consumable item (beer) 137, which has an affixed tag (integrated as part of the label).
  • a merchant/vendor can customize additional content for the product, including multi-media data (video, audio, graphics, etc.) 138 which can be presented when the product tag is read as part of an augmented reality (AR) experience discussed further below.
  • System 160 includes a combination of hardware and software components that support merchant (retailer/distributor) product-tag marketing, promotions and sales operations, including one or more server computing machines 164, a host computing system
  • System 160 also supports a merchant website 139, which can be configured with product/marketing/sales webpages in any number of styles known in the art and made accessible to web-enabled browsers (including on smartphones) through conventional uniform resource locators (URLs). Access to the resources and controls of system 160 can also be provided through secure applications executing on smartphones 161' and similar portable computing devices.
  • the merchant system further preferably comprises a host computing system 161 (e.g. a PC, smartphone, etc.), typically onsite at the merchant facility, which system further includes a portal application (not shown, but which may take on any number of conventional forms) to permit communications with a cloud system 110.
  • Merchant support system 160 further includes a number of software modules, including a merchant (retailer/distributor or Rtag) management module 165 that enables and supports tag creation, tag-product association, tag-content association, and related management/marketing functions attendant to the marketing, promotion and sales of products including physical tags.
  • while tag IDs tracked by system 160 may be the same as, or have different physical IDs from, those tracked by systems 110/170, the tag IDs are designated with a (potentially) different db index (i.e., R-tag, as opposed to S-tag and M-tag).
  • System 160 may further include an AR Context Rules module 163 and an AR Device Rendering Logic module 166, which are responsible for identifying, selecting and presenting customized content to end-user devices 120 within a customized application 125 (see FIG. 1) or as part of a customized experience within a browser accessing website 139.
  • AR Context Rules module 163 dictates user, time, place, manner controls and filters, so that, for example, certain content may be presented for a designated product tag ID (i.e. wool sweater) only to selected users meeting certain criteria (i.e. new customers) at particular locations (i.e., designated partner store) at particular times (i.e. in fall months). All of such parameters can be extracted from the end-user's device 121, product tag 135 and other merchant customer information in db 167.
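The user/time/place/manner filtering performed by AR Context Rules module 163 can be sketched as follows. This is a minimal illustration only: the rule fields, tag IDs, and matching logic are assumptions for exposition, not the literal implementation of module 163.

```python
from dataclasses import dataclass

@dataclass
class ContextRule:
    """One context rule for a tagged product (illustrative fields)."""
    tag_id: str
    allowed_user_types: set   # e.g. {"new_customer"}
    allowed_locations: set    # e.g. {"partner_store"}
    allowed_months: set       # e.g. {9, 10, 11} for fall
    content_id: str           # AR content to serve when the rule matches

def select_content(rules, tag_id, user_type, location, month):
    """Return the content IDs of every rule matching the tap context."""
    return [
        r.content_id for r in rules
        if r.tag_id == tag_id
        and user_type in r.allowed_user_types
        and location in r.allowed_locations
        and month in r.allowed_months
    ]

rules = [
    ContextRule("sweater-001", {"new_customer"}, {"partner_store"},
                {9, 10, 11}, "fall-promo"),
    ContextRule("sweater-001", {"returning"}, {"partner_store"},
                set(range(1, 13)), "loyalty-promo"),
]
# A new customer tapping the sweater tag in a partner store in October:
print(select_content(rules, "sweater-001", "new_customer", "partner_store", 10))
```

In this sketch the same tag ID can carry several rules, so frequent purchasers and new customers tapping the same tag receive different content, consistent with the customization described above.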
  • AR Device Rendering Logic module 166 is responsible for providing appropriate metadata in the right format for the particular desired AR experience on a target device. For example, a merchant may specify that a designated graphics overlay with particular dimensions should be made on a particular portion of a target device (i.e., model A smartphone by brand X). Again, any form of software tools and controls for overlaying, supplementing and augmenting existing media files (e.g., a graphical image captured by a phone) can be used for this module. While shown as a standalone system in FIG. 2C, it will be understood by skilled artisans that part or all of system 160 could be implemented by front end cloud computing system 110 and controlled/managed through portal applications with basic devices 161, 161' and the like.
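The per-device overlay specification that module 166 applies can be sketched as a simple lookup keyed by device model. The spec table, device-model keys, and field names below are hypothetical, used only to illustrate how a merchant-specified overlay (dimensions and screen placement) might be selected for a target device.

```python
# Illustrative merchant-specified overlay parameters per target device model.
RENDER_SPECS = {
    "brand-x-model-a": {"overlay": "promo.png", "width": 320,
                        "height": 180, "anchor": "bottom"},
    "default":         {"overlay": "promo.png", "width": 240,
                        "height": 135, "anchor": "bottom"},
}

def rendering_spec(device_model):
    """Return the overlay spec for the device model, falling back to a default."""
    return RENDER_SPECS.get(device_model, RENDER_SPECS["default"])

print(rendering_spec("brand-x-model-a")["width"])  # 320
```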
  • FIG. 4A is a diagram of exemplary hardware and software employed in a mobile computing device 120 enabled with tag management functions in accordance with one or more embodiments of the present invention.
  • the device 120 includes a customized CPU 122 for executing mobile applications, a memory 123 (which may take different forms, including volatile DRAM/SRAM and non-volatile EEPROM), a set of different types of sensors 124 (camera, microphone, touch, gyroscopic to name a few) for capturing different physical stimuli, a Universal Integrated Circuit Card (UICC) or SIM card 126 for communicating over a cellular channel (such as a carrier network 105), Bluetooth/GPS and WiFi communication circuits 127, and various I/O circuits, including display, speakers, etc.
  • a mobile computing device includes one or more Near Field Communication (NFC) support circuits, including an NFC communications IC 121a, an associated Secure Element 121b and an NFC receive/transmit antenna 121c.
  • Device 120 further includes a number of firmware and software components, including an Operating System (OS) 125a (e.g., Android, iOS), a web/network software interface 125b (e.g., Safari, Chrome, etc.) for establishing communication sessions over an IP network channel 105a (e.g. the Internet) and one or more software applications 125c executing on the device and enabling different I/O and computational functions.
  • FIG. 4B depicts an exemplary graphical interface of a mobile computing device enabled with tag management functions in accordance with one or more embodiments of the present invention.
  • These applications generally include an augmented reality (AR) tagged item application 129a, an enhanced tag/app application 129b, a tag transfer application 129c, an assign new tag application 129d, and a provision tag application 129e.
  • NFC tags can be associated by object stakeholders with additional augmented reality (AR) metadata that can be presented when the tag is read by a mobile device user.
  • the wireless tags can be associated with a product, as well as with AR metadata, by a manufacturer of the tag, and then subsequently shipped to the manufacturer or distributor of the product or to another entity that physically affixes the tag to a target object.
  • manufacturers, distributors and resellers can customize the wireless tag management system (e.g., to control the user experience for a given product).
  • a manufacturer or distributor may want a user to see a customized response interface on their mobile device when communicating with a tag on a particular product at a particular time, or under particular conditions, as opposed to a generic response interface.
  • the customized response interface in turn, can be delivered to the user's mobile computing device with augmented reality metadata, to impart an augmented reality experience.
  • images/objects/text must first be identified by image/object/text recognition in response to a user request; these operations are computationally complex, subject to significant latency, and in some instances impractical if not impossible.
  • a user typically has to access the camera application, focus on the object, capture the image and forward it to a remote server for image/text/object recognition to be performed. This can be slow, cumbersome and inconvenient for applications such as shopping in a physical store, where a user may have limited time to check on several items during a visit.
  • an improved augmented reality experience is rendered to a user device based on a simple tap to an NFC enabled tag.
  • the augmented tag process using NFC technology identifies a physical real-world object/image/text with precision, accuracy and rich information, such as type, color and related customized metadata, that can render an augmented reality experience specifically catered to an identified object/user and in accordance with other context rules.
  • the process is faster, more fluid, and more natural: the user simply taps on the tag of an object and receives an augmented reality experience.
  • one additional technical improvement over prior art image/object/text recognition processes is the increased accuracy (potentially greater than 99.99%) of NFC tag tap-based identification of an object.
  • configuring and maintaining databases for image/object/text recognition is significantly simplified compared to the prior art.
  • FIG. 5A is a diagram of an exemplary cloud-based tag augmented reality system 500 enabling an enhanced experience for a tagged item within a mobile computing device for an end user in accordance with the teachings of the present disclosure.
  • the AR experience is enabled by merchant related components in the front-end cloud computing system 110 that are returned in response to requests from user devices 520.
  • one or more objects 533, which may be a road sign with non-English text, a coffee machine (in this instance an item originating from the Starbucks Corporation), a book (in this instance a title published by Wiley), an article of clothing, etc., each include an associated tag 535.
  • Each tag includes a unique ID as explained above.
  • An application 522 on device 520 is activated and used to read the tag 535 using NFC with a tap gesture as further noted above.
  • the application 522 invokes an AR experience for the object, and then preferably communicates the details of the tap event 526a, including the tag ID (which may include object data), along with user identification (UID) and location information, to tag engagement module 151 (FIG. 2B) in front-end cloud computing system 110 over network channel 105a.
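The tap-event details communicated to tag engagement module 151 might be serialized as a small structured payload. The field names and serialization below are illustrative assumptions; the disclosure does not specify a wire format.

```python
import json
import time

def build_tap_event(tag_id, user_id, lat, lon):
    """Assemble a tap-event request sent from the device application to the
    front-end cloud system (field names are hypothetical, for illustration)."""
    return json.dumps({
        "event": "tap",
        "tag_id": tag_id,                      # read from the NFC tag payload
        "uid": user_id,                        # user/app identification
        "location": {"lat": lat, "lon": lon},  # from device sensors (e.g. GPS)
        "timestamp": int(time.time()),
    })

event = build_tap_event("tag-535", "user-42", 47.61, -122.33)
print(json.loads(event)["tag_id"])  # tag-535
```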
  • the tag metadata 526b associated with a customized AR experience is retrieved from db 159 (FIG. 2B) and returned to device 520 where it can be rendered on the user's device in accordance with parameters provided by a merchant computing system 160 (FIG.
  • tag engagement module 151 may service requests from device 120 by contacting merchant product-tag support system 160 directly for purposes of responding to AR requests.
  • the original sign with its non-English text is presented and supplemented with AR metadata 525' - in this case, an English translation text overlay to assist the user in understanding the sign.
  • the format, placement, etc. of the overlaid information can be rendered as noted above in compliance with any specifications or context rules provided by a merchant or vendor.
  • because the tag service provider can consolidate/access AR tag information from multiple vendors, the user device 520 does not require installing or invoking multiple specialized applications for identifying different types of objects. In other words, to recognize beverages, appliances, apparel, etc., the prior art requires separate applications specialized for each.
  • in AR tag embodiments of the present invention, a single AR tag application can render AR experiences for multiple product types, a significant advantage for reducing interface complexity, reducing user confusion, etc.
  • Those skilled in the art will appreciate that other components can be utilized in system 500 in accordance with the present teachings.
  • FIG. 6 is a flow chart showing an exemplary method 600 for creating and presenting an augmented reality experience for a tagged item in accordance with embodiments of the present invention.
  • a user tap on a tag for an article is detected by the device 120 in the manner described above in connection with FIGs. 3A-3F.
  • the tag ID (included in some or all of payload 136) is then extracted and read at step 610.
  • the user ID (which may be any one or more of a username, password, or application registration ID) and location data can also be derived from the application and device sensors (i.e. GPS and similar techniques) at step 615.
  • a formal user AR request is then presented to cloud computing system 110 at step 620, which then determines at step 630 from available tag and uid databases (FIG. 2B, databases 157/158) if the request is valid. When these request parameters are invalid, the process simply terminates at step 625 by returning a reply to the user that there is no available data for the tag in question.
  • if the tag and UID information (which may include a device or application ID) are determined to be valid, the context and rendering rules (FIG. 2C, modules 163/166) for the AR data are then considered and processed at step 635 for applicability to the particular user, device, time, location, etc.
  • Information about the object, and the object's associated AR metadata (FIG. 2C db 167; FIG.
  • the AR metadata may include at least one of graphics, audio, video and/or haptics data, or any other data renderable on the device 120 and which can be overlaid or framed along with other original data for an item, including image data captured by the device itself.
  • the augmented reality experience data is returned and then rendered on device 120 by application 125 as noted above, including by superimposing AR elements onto base data elements of the product or article.
  • the AR experience can be customized on a user-by-user basis as well so that different persons tapping the same tag will be presented with different enhancing metadata. This differentiation can be used to reward frequent purchasers, to solicit new customers, etc.
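The request-handling steps of method 600 can be sketched end-to-end as follows. The database contents, content of the AR metadata, and function names are illustrative stand-ins (the tag and UID databases 157/158 and the context-rule processing of modules 163/166 are stubbed), not the actual cloud implementation.

```python
# Stub stand-ins for the tag and user databases (FIG. 2B, databases 157/158).
TAG_DB = {
    "tag-535": {
        "object": "road sign",
        "ar_content": {"type": "text_overlay", "text": "STOP (translated)"},
    }
}
UID_DB = {"user-42": {"registered": True}}

def handle_ar_request(tag_id, user_id):
    """Steps 620-650 of method 600: validate the request, apply rules,
    and return AR metadata (or a no-data reply, as at step 625)."""
    # Step 630: validate the request against the tag and UID databases.
    if tag_id not in TAG_DB or user_id not in UID_DB:
        return {"status": "no_data"}  # step 625: no available data reply
    # Step 635: context and rendering rules would be applied here (omitted).
    record = TAG_DB[tag_id]
    # Steps 640-650: return object information plus its AR metadata,
    # which the device application then overlays on the base item data.
    return {"status": "ok", "object": record["object"],
            "ar_metadata": record["ar_content"]}

print(handle_ar_request("tag-535", "user-42")["status"])   # ok
print(handle_ar_request("tag-999", "user-42")["status"])   # no_data
```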
  • FIGs. 7A - 7D depict an exemplary display on a GUI 725 of a computer or a mobile device presenting an augmented reality experience for a tagged item 733 in accordance with embodiments of the present invention.
  • a single tag based AR application can replace multiple separate applications typically required in the art.
  • FIG. 7 A corresponds to the use case shown in FIG. 5 A in which an object (or text data from the object) is identified and enhanced.
  • This AR case replaces a conventional text recognition based application that is typically used in these cases.
  • FIG. 7B corresponds to a use case involving an article such as a book (in this instance a title published by Wiley) in which the tag metadata may correspond to a URL for a product page, an advertisement, etc.
  • This AR case replaces a conventional image recognition based application (including QR codes, barcodes, etc.) that is typically used in these cases.
  • FIG. 7C corresponds to a use case involving an article of clothing, in which the tag metadata may correspond to an image of a person (including potentially the user) synthesized and overlaid with the article image, an advertisement, etc.
  • This AR case replaces a conventional image recognition/human target based application that is typically used in these cases.
  • FIG. 7D corresponds to a use case involving a commercial product (in this case a coffee machine originating from the Starbucks Corporation), in which the tag metadata may correspond to an image of the product synthesized and overlaid with other images, an advertisement, a short video demonstration, etc.
  • This AR case again replaces a conventional image recognition application that is typically used in these cases.
  • modules of the present invention can be implemented using any one of many known programming languages suitable for creating applications that can run on large scale computing systems, including servers connected to a network (such as the Internet) as part of a cloud computing system.
  • the details of the specific implementation of the present invention will vary depending on the programming language(s) used to embody the above principles, and are not material to an understanding of the present invention.
  • a portion of the hardware and software will be contained locally to a user's computing system, which can include a portable machine or a computing machine at the user's premises, such as a personal computer, a PDA, digital video recorder, receiver, etc.

Landscapes

  • Business, Economics & Management (AREA)
  • Accounting & Taxation (AREA)
  • Finance (AREA)
  • Development Economics (AREA)
  • Economics (AREA)
  • Marketing (AREA)
  • Strategic Management (AREA)
  • Physics & Mathematics (AREA)
  • General Business, Economics & Management (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Wireless tags are associated with, and used to provide, augmented reality (AR) experiences to users of mobile computing devices. AR can be incorporated into common articles and objects to enhance a user's interaction with, and appreciation of, such items. The tag information can be used to easily identify objects and articles without requiring significant object recognition tools or their associated latencies.
PCT/US2019/019392 2018-02-27 2019-02-25 Système et procédé permettant de fournir une expérience de réalité augmentée à des objets à l'aide d'étiquettes sans fil WO2019168780A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201862635698P 2018-02-27 2018-02-27
US62/635,698 2018-02-27

Publications (2)

Publication Number Publication Date
WO2019168780A1 true WO2019168780A1 (fr) 2019-09-06
WO2019168780A9 WO2019168780A9 (fr) 2019-10-24

Family

ID=67805964

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2019/019392 WO2019168780A1 (fr) 2018-02-27 2019-02-25 Système et procédé permettant de fournir une expérience de réalité augmentée à des objets à l'aide d'étiquettes sans fil

Country Status (1)

Country Link
WO (1) WO2019168780A1 (fr)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20210256561A1 (en) * 2020-02-14 2021-08-19 Walmart Apollo, Llc Systems and methods for presenting augmented reality promotion indicators
WO2021184388A1 (fr) * 2020-03-20 2021-09-23 Oppo广东移动通信有限公司 Procédé et appareil d'affichage d'image, et dispositif électronique portable
CN114071246A (zh) * 2020-07-29 2022-02-18 海能达通信股份有限公司 媒体增强现实标签方法、计算机设备及存储介质
US11886767B2 (en) 2022-06-17 2024-01-30 T-Mobile Usa, Inc. Enable interaction between a user and an agent of a 5G wireless telecommunication network using augmented reality glasses

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020167500A1 (en) * 1998-09-11 2002-11-14 Visible Techknowledgy, Llc Smart electronic label employing electronic ink
US20070240152A1 (en) * 2006-03-24 2007-10-11 Red. Hat, Inc. System and method for sharing software certification and process metadata
US20090115576A1 (en) * 2007-11-02 2009-05-07 Symbol Technologies, Inc. Efficient Variable Format Data Encodation in RFID Tags and Other Media
WO2013166191A2 (fr) * 2012-05-01 2013-11-07 Zambala Llp Système et procédé pour faciliter des transactions d'un produit physique ou d'un service de vie réelle par l'intermédiaire d'un environnement de réalité augmentée
US20140091910A1 (en) * 2003-03-03 2014-04-03 Veroscan, Inc. Interrogator and interrogation system employing the same
US20160064003A1 (en) * 2013-04-03 2016-03-03 Dolby Laboratories Licensing Corporation Methods and Systems for Generating and Rendering Object Based Audio with Conditional Rendering Metadata
US20160269385A1 (en) * 2013-09-23 2016-09-15 Amazon Technologies, Inc. Location service for user authentication
US9460573B1 (en) * 2014-02-27 2016-10-04 Sprint Communications Company, L.P. Autonomous authentication of a reader by a radio frequency identity (RFID) device

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020167500A1 (en) * 1998-09-11 2002-11-14 Visible Techknowledgy, Llc Smart electronic label employing electronic ink
US20140091910A1 (en) * 2003-03-03 2014-04-03 Veroscan, Inc. Interrogator and interrogation system employing the same
US20070240152A1 (en) * 2006-03-24 2007-10-11 Red. Hat, Inc. System and method for sharing software certification and process metadata
US20090115576A1 (en) * 2007-11-02 2009-05-07 Symbol Technologies, Inc. Efficient Variable Format Data Encodation in RFID Tags and Other Media
WO2013166191A2 (fr) * 2012-05-01 2013-11-07 Zambala Llp Système et procédé pour faciliter des transactions d'un produit physique ou d'un service de vie réelle par l'intermédiaire d'un environnement de réalité augmentée
US20160064003A1 (en) * 2013-04-03 2016-03-03 Dolby Laboratories Licensing Corporation Methods and Systems for Generating and Rendering Object Based Audio with Conditional Rendering Metadata
US20160269385A1 (en) * 2013-09-23 2016-09-15 Amazon Technologies, Inc. Location service for user authentication
US9460573B1 (en) * 2014-02-27 2016-10-04 Sprint Communications Company, L.P. Autonomous authentication of a reader by a radio frequency identity (RFID) device

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20210256561A1 (en) * 2020-02-14 2021-08-19 Walmart Apollo, Llc Systems and methods for presenting augmented reality promotion indicators
US11854046B2 (en) * 2020-02-14 2023-12-26 Walmart Apollo, Llc Systems and methods for presenting augmented reality promotion indicators
WO2021184388A1 (fr) * 2020-03-20 2021-09-23 Oppo广东移动通信有限公司 Procédé et appareil d'affichage d'image, et dispositif électronique portable
CN114071246A (zh) * 2020-07-29 2022-02-18 海能达通信股份有限公司 媒体增强现实标签方法、计算机设备及存储介质
CN114071246B (zh) * 2020-07-29 2024-04-16 海能达通信股份有限公司 媒体增强现实标签方法、计算机设备及存储介质
US11886767B2 (en) 2022-06-17 2024-01-30 T-Mobile Usa, Inc. Enable interaction between a user and an agent of a 5G wireless telecommunication network using augmented reality glasses

Also Published As

Publication number Publication date
WO2019168780A9 (fr) 2019-10-24

Similar Documents

Publication Publication Date Title
US10546290B2 (en) Methods, systems, and computer readable media for provisioning and utilizing an aggregated soft card on a mobile device
WO2019168780A1 (fr) Système et procédé permettant de fournir une expérience de réalité augmentée à des objets à l'aide d'étiquettes sans fil
US11089461B2 (en) System and method for varying a function triggered by near field communication
KR101409754B1 (ko) 오프라인 거래 결제 시스템, 이를 위한 방법 및 장치
CN103369049B (zh) 移动终端和服务器交互方法及其系统
US20150081538A1 (en) Systems and methods for providing secure digital identification
US11228874B2 (en) Beverage container augmentation for social media
CN104584041A (zh) 信息提供方法、用于信息提供的移动终端和显示装置
US20180182026A1 (en) Systems and methods for facilitating a transaction relating to newly identified items using augmented reality
US20180218388A1 (en) Retargeting advertising product recommending user device and service providing device, advertising product recommending system including the same, control method thereof, and non-transitory computer readable storage medium having computer program recorded thereon
US20160364719A1 (en) User equipment for reverse nfc payment, nfc payment terminal, reverse nfc payment system comprising the same, control method thereof and non-transitory computer readable storage medium having computer program recorded thereon
US20140037220A1 (en) Image repository systems and methods
CN114175083A (zh) Nfc加强型增强现实信息叠加
CN114096981A (zh) 利用支付卡认证语音交易
WO2019168783A9 (fr) Système et procédé de transfert sécurisé de propriété d'étiquettes sans fil
US20170358003A1 (en) Object recognition based retargeting advertisement product recommending server, control method thereof, and non-transitory computer readable storage medium having computer program recorded thereon
WO2019168782A1 (fr) Système et procédé de gestion de la fonctionnalité d'étiquettes sans fil
KR101380109B1 (ko) 근거리 무선 통신을 이용한 건물 내부 정보 제공 시스템 및 방법
WO2019168785A1 (fr) Système et procédé de codage automatique d'étiquettes sans fil
KR102054230B1 (ko) Nfc 태그를 이용한 비교광고 서비스 제공 방법
KR102249997B1 (ko) 단말과 서비스 제공 장치, 그를 포함하는 상품 정보 제공 시스템, 그 제어 방법 및 컴퓨터 프로그램이 기록된 기록매체
US20220092582A1 (en) Systems and methods for rendering card art representation
KR20140031431A (ko) Nfc를 이용한 상품주문 시스템 및 그 방법
KR102111620B1 (ko) 단말 및 그 제어 방법
KR20120100640A (ko) 식별자를 이용한 결제 방법 및 시스템, 그를 위한 이동 단말기

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19761528

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 21/01/2021)

122 Ep: pct application non-entry in european phase

Ref document number: 19761528

Country of ref document: EP

Kind code of ref document: A1