WO2019221666A1 - System and method for determining an outcome associated with an event - Google Patents

System and method for determining an outcome associated with an event

Info

Publication number
WO2019221666A1
Authority
WO
WIPO (PCT)
Prior art keywords
computing device
event
sensing data
operable
screen
Prior art date
Application number
PCT/SG2018/050548
Other languages
English (en)
French (fr)
Inventor
Aaron D. Madolora
Original Assignee
Voyager Innovations Holdings Pte. Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Voyager Innovations Holdings Pte. Ltd. filed Critical Voyager Innovations Holdings Pte. Ltd.
Priority to KR1020207034531A priority Critical patent/KR102522571B1/ko
Priority to JP2020555838A priority patent/JP7180951B2/ja
Priority to CN201880093352.2A priority patent/CN112106120B/zh
Publication of WO2019221666A1 publication Critical patent/WO2019221666A1/en
Priority to PH12020551897A priority patent/PH12020551897A1/en

Links

Classifications

    • GPHYSICS
    • G07CHECKING-DEVICES
    • G07FCOIN-FREED OR LIKE APPARATUS
    • G07F17/00Coin-freed apparatus for hiring articles; Coin-freed facilities or services
    • G07F17/32Coin-freed apparatus for hiring articles; Coin-freed facilities or services for games, toys, sports, or amusements
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20Input arrangements for video game devices
    • A63F13/21Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/216Input arrangements for video game devices characterised by their sensors, purposes or types using geographical information, e.g. location of the game device or player using GPS
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20Input arrangements for video game devices
    • A63F13/21Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/213Input arrangements for video game devices characterised by their sensors, purposes or types comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/30Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers
    • A63F13/35Details of game servers
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50Controlling the output signals based on the game progress
    • A63F13/52Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • GPHYSICS
    • G07CHECKING-DEVICES
    • G07FCOIN-FREED OR LIKE APPARATUS
    • G07F17/00Coin-freed apparatus for hiring articles; Coin-freed facilities or services
    • G07F17/32Coin-freed apparatus for hiring articles; Coin-freed facilities or services for games, toys, sports, or amusements
    • G07F17/3202Hardware aspects of a gaming system, e.g. components, construction, architecture thereof
    • G07F17/3204Player-machine interfaces
    • G07F17/3211Display means
    • GPHYSICS
    • G07CHECKING-DEVICES
    • G07FCOIN-FREED OR LIKE APPARATUS
    • G07F17/00Coin-freed apparatus for hiring articles; Coin-freed facilities or services
    • G07F17/32Coin-freed apparatus for hiring articles; Coin-freed facilities or services for games, toys, sports, or amusements
    • G07F17/326Game play aspects of gaming systems

Definitions

  • the present invention relates to a system and method for determining an outcome associated with an event.
  • Spectator events such as sporting events, concerts and musicals have become popular. Many people are willing to pay to attend such live events in order to ‘soak in the atmosphere’ and enhance the feeling of excitement.
  • hosts of the live events provide a prize or lottery draw during the live events.
  • each audience member has been provided with a piece of paper (‘ticket’) indicating a randomly generated number upon entry to a live event. The randomly generated number may be indicated on each ticket for the live event. Thereafter, a host of the live event may select a winner manually.
  • live events have been broadcast live through computing devices. Therefore, an audience member who is watching the live event using his/her computing device also desires an opportunity to actively participate in the live event.
  • the present invention seeks to provide a system and method that addresses the aforementioned need at least in part.
  • the invention seeks to provide a user experience in relation to an event to an audience who is at the event site as well as an audience who is watching the event using his/her computing device.
  • an event may include, but not be limited to, a prize draw, raffle and/or lottery, to determine an outcome, for example draw at least one winner.
  • a user may include, but not be limited to, an audience who is at the event site and/or an audience who is watching or listening to the event using a computing device.
  • the technical solution is provided in the form of a system and method for determining an outcome associated with the event.
  • a server is operable to collect first sensing data as location data of a computing device to determine whether a user is located in the event site, and prompt the computing device to display a screen in relation to the event. Thereafter, the server is operable to determine the outcome of the event based on at least one of the first sensing data and second sensing data in relation to the screen.
  • each user is able to participate in the event regardless of their physical attendance at the event. Since the outcome of the event may depend on the first sensing data as the location data of the computing device and/or the second sensing data associated with the screen, each user is able to experience the feeling of excitement during the event, using his/her computing device.
  • a system for determining an outcome associated with an event comprising: a computing device operable to receive an input of an identifier associated with the event and generate first sensing data as location data of the computing device; and a server operable to collect the identifier and the first sensing data, and prompt the computing device to display a screen relating to the event, wherein the computing device is operable to display the screen and generate second sensing data in relation to the screen, and the server is operable to determine the outcome of the event based on at least one of the first sensing data and the second sensing data.
  • the server is operable to use a combination of a random function, the first sensing data and the second sensing data, to determine the outcome.
  • the computing device is operable to capture an image, and display the screen rendered based on the captured image.
  • the computing device is operable to display the screen in a form of at least one of augmented reality (AR) and virtual reality (VR).
  • the screen includes one or more objects which are overlapped with the captured image.
  • the objects are displayed in a form of floating or falling on the captured image.
  • the computing device is operable to drive at least one camera to capture the image, and the captured image is in a form of at least one of static image and dynamic image.
  • the identifier is in a form of at least one of the following: pin numbers, alphanumeric code, QR code, barcode, data matrix and maxicode.
  • the computing device is operable to detect whether the computing device is located at a site of the event to generate the first sensing data.
  • the server is operable to assign a value to a prize of the event based on the first sensing data, before the computing device displays the screen.
  • the computing device is operable to render the screen based on at least one of the captured image, the assigned value and a user’s information.
  • the user’s information includes at least one of the following: behaviour, preference and profile.
  • the computing device is operable to link at least one object among the objects to the prize.
  • the computing device is operable to drive at least one sensor to generate the second sensing data.
  • the second sensing data relates to at least one of the following: an orientation or rotation of the computing device, acceleration or deceleration of the computing device, presence or absence of an object’s contact, touch input and proximity touch input.
  • the computing device is operated as a virtual basket to collect the prize associated with the object.
  • the collected prize is saved in an account of the user.
  • the server is operable to determine at least one active user having an account in a service associated with the server or a partner, and provide the active user with a plurality of options available as to how to redeem the prize.
  • the server and the computing device are arranged in a distributed ledger or blockchain environment for a record of information in relation to the event.
  • each of the server and the computing device is implemented as a node on the distributed ledger or the blockchain.
  • a method for determining an outcome associated with an event, comprising: receiving, by a computing device, an input of an identifier associated with the event; generating, by the computing device, first sensing data as location data of the computing device; collecting, by a server, the identifier and the first sensing data; prompting, by the server, the computing device to display a screen relating to the event; displaying, by the computing device, the screen; generating, by the computing device, second sensing data in relation to the screen; and determining, by the server, the outcome of the event based on at least one of the first sensing data and the second sensing data.
  • the server is operable to use a combination of a random function, the first sensing data and the second sensing data, to determine the outcome.
  • the computing device is operable to capture an image, and display the screen rendered based on the captured image.
  • the computing device is operable to display the screen in a form of at least one of augmented reality (AR) and virtual reality (VR).
  • the screen includes one or more objects which are overlapped with the captured image.
  • the objects are displayed in a form of floating or falling on the captured image.
  • the computing device is operable to drive at least one camera to capture the image, and the captured image is in a form of at least one of static image and dynamic image.
  • the identifier is in a form of at least one of the following: pin numbers, alphanumeric code, QR code, barcode, data matrix and maxicode.
  • the computing device is operable to detect whether the computing device is located at a site of the event to generate the first sensing data.
  • the server is operable to assign a value to a prize of the event based on the first sensing data, before the computing device displays the screen.
  • the computing device is operable to render the screen based on at least one of the captured image, the assigned value and a user’s information.
  • the user’s information includes at least one of the following: behaviour, preference and profile.
  • the computing device is operable to link at least one object among the objects to the prize.
  • the computing device is operable to drive at least one sensor to generate the second sensing data.
  • the second sensing data relates to at least one of the following: an orientation or rotation of the computing device, acceleration or deceleration of the computing device, presence or absence of an object’s contact, touch input and proximity touch input.
  • the computing device is operated as a virtual basket to collect the prize associated with the object.
  • the collected prize is saved in an account of the user.
  • the server is operable to determine at least one active user having an account in a service associated with the server or a partner, and provide the active user with a plurality of options available as to how to redeem the prize.
  • the server and the computing device are arranged in a distributed ledger or blockchain environment for a record of information in relation to the event.
  • each of the server and the computing device is implemented as a node on the distributed ledger or the blockchain.
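  • As an illustration only, the steps summarised above may be pictured as a simple exchange between the computing device and the server. The following Java sketch is a minimal, hypothetical skeleton of that flow; the class and method names (EventServer, EventClient, determineOutcome and so on) are invented for illustration and are not taken from the application.

    // Hypothetical Java skeleton of the described flow (all names are illustrative only).
    import java.util.Optional;

    record LocationData(double latitude, double longitude, long timestamp) { }   // first sensing data
    record ScreenInteraction(double tiltX, double tiltY, boolean touched) { }    // second sensing data

    interface EventServer {
        boolean collect(String identifier, LocationData firstSensingData);       // collect identifier and location data
        void promptScreen(String deviceId);                                      // prompt the device to display the screen
        String determineOutcome(LocationData first, ScreenInteraction second);   // determine the outcome
    }

    class EventClient {
        private final EventServer server;

        EventClient(EventServer server) { this.server = server; }

        Optional<String> participate(String identifier, String deviceId) {
            LocationData first = readLocationSensor();                           // generate first sensing data
            if (!server.collect(identifier, first)) {
                return Optional.empty();                                         // identifier rejected
            }
            server.promptScreen(deviceId);
            ScreenInteraction second = renderScreenAndCaptureInput();            // display screen, generate second sensing data
            return Optional.of(server.determineOutcome(first, second));
        }

        // Placeholders standing in for real sensor and UI code.
        private LocationData readLocationSensor() {
            return new LocationData(0.0, 0.0, System.currentTimeMillis());
        }

        private ScreenInteraction renderScreenAndCaptureInput() {
            return new ScreenInteraction(0.0, 0.0, false);
        }
    }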
  • Other aspects of the invention will become apparent to those of ordinary skill in the art upon review of the following description of specific embodiments of the invention in conjunction with the accompanying drawings.
  • Fig. 1 illustrates a block diagram of a system of the present invention.
  • Fig. 2 illustrates a flowchart of an embodiment of the present invention.
  • Figs. 3 to 7 illustrate examples of user experiences of the present invention.
  • Fig. 8 illustrates a logic and data flow of an embodiment of the present invention.
  • Figs. 9 and 10 illustrate flowcharts of another embodiment of the present invention.
  • Fig. 1 is a block diagram of a system in accordance with an embodiment of the present invention.
  • the system may comprise at least one computing device 100 and at least one server 200.
  • the computing device 100 may belong to a user, for example a participant of an event, and be used to participate in the event.
  • the computing device 100 may include, but not be limited to, smartphone, laptop, tablet computer, desktop computer, television and wearable devices, in particular intelligent wearable devices such as smart watch, smart glasses or mobile virtual reality headset. It may be appreciated that a user may use a plurality of computing devices 100, for example a smartphone and a smart watch, together. In some embodiments, the plurality of computing devices 100 may communicate with each other.
  • the participant may include, but not be limited to, an audience who is located at the event site and/or an audience who is watching or listening to the event using a computing device remotely at another geographical location.
  • the event may include, but not be limited to, a prize draw, raffle and/or lottery, to determine an outcome, for example draw at least one winner.
  • the server 200 may determine the outcome, for example chances of winning of the user for the event.
  • the server 200 may belong to a host of the event, for example a marketing technology service provider. It may be appreciated that another computing device may be used as the server 200. In another embodiment, the computing device 100 and the server 200 may be integrated.
  • the server 200 may be any device determining the outcome of the event, for example the chances of winning and/or the winner of the event.
  • the computing device 100 comprises at least one of a controller 120, memory 130, input unit 140, output unit 150, communication unit 160 and input/output (I/O) interface(s) 170.
  • the controller 120 may be referred to as a processing unit or a firmware.
  • the controller 120 may control the overall operations of the computing device 100.
  • the controller 120 may drive and/or control other components such as the memory 130, input unit 140, output unit 150, communication unit 160 and input/output (I/O) interface(s) 170.
  • the memory 130 may temporarily store input and/or output data, for example captured images.
  • the memory 130 may include, but not be limited to, a Random Access Memory (RAM) 131, cache 132, storage system 133, and a standalone database or clustered database 134 in which a distributed ledger 135 is stored.
  • the computing device 100 may contain the standalone database as well as serve as a node within a distributed ledger system or a blockchain environment.
  • the storage system 133 may refer to the collective of the standalone database or clustered database 134 and distributed ledger 135.
  • the memory 130 may be implemented using any type of suitable storage medium including at least one of a flash memory type, a hard disk type, a multimedia card micro type, a memory card type such as an SD card, and a Read-Only Memory (ROM).
  • the memory 130 may include at least one internal memory of the computing device 100 and/or at least one external memory, for example, web storage.
  • the computing device 100 may maintain and/or update the distributed ledger 135.
  • the computing device 100 may be implemented as one of nodes on the distributed ledger 135.
  • the distributed ledger 135 may include, but not be limited to, a blockchain, hashgraph, or any variant of decentralized, distributed ledger technology.
  • the distributed ledger 135 may be updated periodically or from time to time with modifications to the ledger.
  • the modifications for example, may include, but not be limited to an insertion or an update of a ledger entry.
  • the ledger may be used for a record of information in relation to the event.
  • the information in relation to the event may include at least one of user’s information, event information and outcome information.
  • a plurality of computing devices may participate in the event.
  • the plurality of computing devices may be utilised as a decentralized processor as well as database.
  • Each computing device may be referred to as a “node” of the system.
  • the ledger copies which are maintained and stored on each node enable cross-validation with one another. For example, when a conflict occurs between ledger entries, the computing devices are operable to conduct cross-validation with one another. Hence, the information in relation to the event recorded in the distributed ledger remains secure even when such a conflict occurs.
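  • As a rough illustration of how conflicting ledger entries might be cross-validated, the hypothetical Java sketch below keeps, for a given key, the value held by the majority of node ledgers; a real blockchain or hashgraph deployment would rely on its own consensus protocol, and the names used here (LedgerEntry, crossValidate) are invented.

    // Hypothetical majority-vote cross-validation between ledger copies (illustrative only).
    import java.util.List;
    import java.util.Map;
    import java.util.Objects;
    import java.util.Optional;
    import java.util.stream.Collectors;

    record LedgerEntry(String key, String value) { }

    class CrossValidator {
        /** Returns the value that most node ledgers hold for the given key, resolving simple conflicts. */
        static Optional<String> crossValidate(String key, List<Map<String, LedgerEntry>> nodeLedgers) {
            Map<String, Long> votes = nodeLedgers.stream()
                    .map(ledger -> ledger.get(key))
                    .filter(Objects::nonNull)
                    .collect(Collectors.groupingBy(LedgerEntry::value, Collectors.counting()));
            return votes.entrySet().stream()
                    .max(Map.Entry.comparingByValue())
                    .map(Map.Entry::getKey);
        }
    }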
  • the input unit 140 may receive signal from the user and/or external environment.
  • the input unit 140 may include, but not be limited to, a camera, microphone, user input unit and sensing unit.
  • the camera may capture an image of external environment.
  • the captured image may be used to render an augmented reality (AR) screen.
  • the image may be at least one of static image (also referred to as still image) and dynamic image (also referred to as moving image or video).
  • the camera may generate raw data.
  • the controller 120 may receive the raw data from the camera and process, for example interpret, the raw data to obtain an image.
  • the obtained image may be stored in the memory 130.
  • a plurality of cameras may be provided.
  • the computing device 100 may comprise a front camera and a rear camera.
  • the microphone may receive an audio input.
  • the microphone may generate raw data associated with the audio input.
  • the controller 120 may receive the raw data from the microphone and process the raw data to obtain an audio file.
  • the obtained audio file may be stored in the memory 130. It is to be appreciated that a plurality of microphones may be provided.
  • the user input unit may be implemented to receive a manual input from the user.
  • the user input unit may include, but not be limited to, a keypad, keyboard, touchpad, touch screen, and pointing device.
  • the user input unit may generate raw data associated with the manual input. Thereafter, the controller 120 may receive the raw data from the user input unit and process the raw data to generate input data to control an operation of the computing device 100.
  • the sensing unit may include, but not be limited to, a motion sensor, touch sensor, proximity sensor and location sensor. It is to be appreciated that the sensing unit may be integrated with the user input unit.
  • the motion sensor may be driven to detect at least one of an orientation or rotation of a body of the computing device 100, and an acceleration or deceleration of the body of the computing device 100.
  • the touch sensor may be implemented to detect a position, area, pattern and/or pressure of the touch input.
  • the proximity sensor may be driven to detect at least one of the presence or absence of any object’s contact and a proximity touch.
  • the location sensor may be driven to detect a location of the computing device 100.
  • the location sensor may include, but not be limited to, a GPS (Global Positioning System) module to acquire location data of the computing device 100.
  • the location sensor may include a beacon reader module to receive signals from a beacon and acquire location data. It may be appreciated that the location sensor may be used with or integrated with the communication unit 160.
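  • Purely as an illustration of the sensing options above, the hypothetical sketch below places the GPS module and the beacon reader behind a common interface so that either can supply the location data; the names are invented and no particular platform API is implied.

    // Hypothetical abstraction over the GPS module and the beacon reader (illustrative only).
    interface LocationSource {
        LocationFix currentFix();
    }

    record LocationFix(double latitude, double longitude, String source, long timestamp) { }

    class GpsSource implements LocationSource {
        public LocationFix currentFix() {
            // A real device would query its GPS hardware here; fixed coordinates stand in for that.
            return new LocationFix(1.3521, 103.8198, "GPS", System.currentTimeMillis());
        }
    }

    class BeaconSource implements LocationSource {
        private final double beaconLat, beaconLon;

        BeaconSource(double beaconLat, double beaconLon) {
            this.beaconLat = beaconLat;
            this.beaconLon = beaconLon;
        }

        public LocationFix currentFix() {
            // Receiving a beacon signal implies proximity to the beacon's known position.
            return new LocationFix(beaconLat, beaconLon, "BEACON", System.currentTimeMillis());
        }
    }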
  • the output unit 150 may provide a visual output and/or audio output.
  • the output unit 150 may include, but not be limited to, a display and audio output unit.
  • the display is operable to display information processed by the controller 120.
  • the display may include at least one of a liquid crystal display (LCD), thin film transistor-liquid crystal display (TFT-LCD), organic light emitting diode (OLED) display, active-matrix organic light emitting diode (AMOLED) display and flexible display. It may be appreciated that a plurality of displays may be provided.
  • the computing device 100 may comprise a front display and a rear display.
  • the touch sensor and the display may be integrated.
  • the display may be used as the input unit 140 as well as the output unit 150.
  • the audio output unit is operable to output audio data stored in the memory 130 or received from the communication unit 160.
  • the audio output unit may include, but not be limited to, a speaker and receiver. It may be appreciated that a plurality of audio output units may be provided.
  • the communication unit 160 may be referred to as a wireless communication unit.
  • the communication unit 160 may include, but not be limited to a mobile communication module, a wireless Internet module and a short range communication module. It may be appreciated that at least two of these modules, for example the mobile communication module and the wireless Internet module, can be integrated.
  • the mobile communication module may communicate with at least one of a base station, an external device and the server 200 over a mobile communication network.
  • the mobile communication module may transmit and/or receive a radio signal, for example a voice call signal, video call signal, text and/or multimedia message signal, using a channel access method, for example Code-division multiple access (CDMA) or Time-division multiple access (TDMA).
  • the wireless Internet module may support wireless Internet access to an external device and a server. Examples of such wireless internet access may include, but not be limited to, wireless LAN (WLAN) (for example, Wi-Fi), wireless broadband (Wi-bro), worldwide interoperability for microwave access (Wi-max) and high speed downlink packet access (HSDPA).
  • the short range communication module may support a short range communication.
  • Examples of such short range communication may include, but not be limited to, Bluetooth, Radio Frequency Identification (RFID), Ultra-wideband (UWB) and ZigBee.
  • the I/O interface 170 may be implemented to interface the computing device 100 with at least one external device.
  • the I/O interface 170 may include, but not be limited to, an external charger port, memory card port and earphone port. It may be appreciated that the external charger port and the earphone port may be integrated.
  • the server 200 comprises at least one of a controller 220, memory 230 and communication unit 260.
  • the controller 220 may be referred to as a processing unit and control the overall operations of the server 200.
  • the controller 220 may drive and/or control other components such as the memory 230 and the communication unit 260.
  • the memory 230 may temporarily store input and/or output data.
  • the memory 230 may include, but not be limited to, a Random Access Memory (RAM) 231, cache 232, storage system 233, and a standalone database or clustered database 234 in which a distributed ledger 235 is stored.
  • the memory 230 may include at least one internal memory of the server 200 and/or at least one external memory, for example, web storage.
  • the server 200 may maintain and/or update the distributed ledger 235.
  • the server 200 may also be implemented as a node among a plurality of nodes on the distributed ledger 235.
  • the distributed ledger 235 may be updated periodically or from time to time with modifications to the ledger.
  • the ledger may be used for a record of information in relation to the event.
  • the communication unit 260 may be used to communicate with other computing devices, for example the computing device 100.
  • the communication unit 260 of the server 200 may communicate with the computing device 100 and transmit/receive information.
  • Fig. 2 illustrates a flowchart of an embodiment of the present invention.
  • an announcement may be made with instructions for users, for example participants, to input an event-specific identifier, for example a code, on their computing devices.
  • a single identifier may be assigned to an event, so all the users who participate in the same event may use the same identifier. It is to be appreciated that the same identifier may be shared with other users who are not physically at the event but viewing or remotely participating in the event.
  • a plurality of identifiers may be assigned to an event. Therefore, a group of users who are physically at the event may use a first identifier, while another group of users who are not physically at the event may use a second identifier.
  • the identifier may be in the form of at least one of the following: pin numbers, alphanumeric code, QR code, barcode, data matrix and maxicode.
  • the computing device 100 may receive an input of the identifier (S310).
  • the input unit 140 of the computing device 100 may be used to receive the input for the identifier.
  • the input unit 140 may be a user input unit that is used to receive a manual input for the identifier from the user.
  • the input unit 140 may be a camera or scanner such as QR code scanner or barcode scanner that is used to scan the identifier.
  • the input unit 140 may be a microphone that is used to receive an audio input for the identifier.
  • the input unit 140 may generate raw data in relation to the input of the identifier, and transmit the raw data to the controller 120.
  • the controller 120 may detect the identifier and transmit the identifier to the server 200 via the communication unit 160, for example the wireless Internet module, mobile communication module and/or short range communication module.
  • the controller 220 of the server 200 may then validate and/or authenticate the identifier.
  • the controller 220 may register the computing device 100 or the user of the computing device 100 as a participant of the event.
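  • A minimal, hypothetical sketch of this validation and registration step: the server checks the submitted code against the identifiers issued for the event and, if the code is known, records the device as a participant. The class and method names are invented for illustration.

    // Hypothetical identifier validation and participant registration (illustrative only).
    import java.util.Map;
    import java.util.Set;
    import java.util.concurrent.ConcurrentHashMap;

    class RegistrationService {
        private final Set<String> validIdentifiers;                                  // codes issued for the event
        private final Map<String, String> participants = new ConcurrentHashMap<>();  // deviceId -> identifier

        RegistrationService(Set<String> validIdentifiers) {
            this.validIdentifiers = Set.copyOf(validIdentifiers);
        }

        /** Validates the submitted identifier and, if it is known, registers the device as a participant. */
        boolean register(String deviceId, String identifier) {
            if (!validIdentifiers.contains(identifier)) {
                return false;                                                        // unknown or expired code
            }
            participants.put(deviceId, identifier);
            return true;
        }
    }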
  • the computing device 100 may generate first sensing data as location data of the computing device 100 (S320).
  • the controller 120 is operable to drive the input unit 140, for example the location sensor, to generate the location data of the computing device 100.
  • the location sensor may include the GPS module or the beacon reader module to acquire the location data of the computing device 100.
  • the controller 120 is operable to detect whether the computing device 100 is located at a site of the event, based on the location data.
  • the location sensor may be used or integrated with the communication unit 160 and/or other sensing units of the input unit 140.
  • the input unit 140 may generate the first sensing data to check the location of the computing device 100. It may be appreciated that this first sensing data may include, but not be limited to, location, image, timestamps and social media data.
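  • As an illustration of the on-site check, the hypothetical sketch below compares the device’s coordinates with the venue’s coordinates using the haversine distance; the 500 m radius is an arbitrary example value and is not specified in the application.

    // Hypothetical on-site check using the haversine distance (illustrative only).
    class GeoFence {
        private static final double EARTH_RADIUS_M = 6_371_000.0;
        private static final double ON_SITE_RADIUS_M = 500.0;   // example threshold, not from the application

        /** Great-circle distance in metres between two latitude/longitude points. */
        static double distanceMetres(double lat1, double lon1, double lat2, double lon2) {
            double dLat = Math.toRadians(lat2 - lat1);
            double dLon = Math.toRadians(lon2 - lon1);
            double a = Math.sin(dLat / 2) * Math.sin(dLat / 2)
                     + Math.cos(Math.toRadians(lat1)) * Math.cos(Math.toRadians(lat2))
                     * Math.sin(dLon / 2) * Math.sin(dLon / 2);
            return 2 * EARTH_RADIUS_M * Math.asin(Math.sqrt(a));
        }

        /** True if the device is within the example radius of the event site. */
        static boolean isOnSite(double deviceLat, double deviceLon, double siteLat, double siteLon) {
            return distanceMetres(deviceLat, deviceLon, siteLat, siteLon) <= ON_SITE_RADIUS_M;
        }
    }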
  • the server 200 is arranged in data or signal communication to retrieve/collect the identifier and the first sensing data (S330).
  • the communication unit 260 of the server 200 may receive the identifier of the event and the location data of the computing device 100.
  • the server 200 may prompt the computing device 100 to display a user interface, such as a screen relating to the event (S340).
  • the controller 220 of the server 200 may register the computing device 100 and/or the user as a participant.
  • the controller 220 is operable to assign a value to at least one prize of the event based on the location data of the computing device 100, before prompting the computing device 100 to display the screen.
  • the prize may include, but not be limited to, services, physical/digital goods, gift cards, giveaways, coupons and credits.
  • the prize may be held in a digital escrow account held by the server 200, for example a marketing technology service provider’s server, until the day of the event.
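  • As a hypothetical sketch of this step, the escrow below simply holds the prize pool until the event and assigns each prize a value that is boosted when the first sensing data places the device at the event site; the boost factor is an invented example and not a figure from the application.

    // Hypothetical escrow and location-weighted prize valuation (illustrative only).
    import java.util.HashMap;
    import java.util.List;
    import java.util.Map;

    record Prize(String name, int baseValue) { }

    class PrizeEscrow {
        private final List<Prize> heldPrizes;   // prizes held until the day of the event
        private boolean released = false;

        PrizeEscrow(List<Prize> heldPrizes) { this.heldPrizes = List.copyOf(heldPrizes); }

        /** Releases the prizes and assigns each one a value, boosted for on-site participants. */
        Map<Prize, Integer> releaseAndValue(boolean deviceOnSite) {
            released = true;
            int multiplier = deviceOnSite ? 2 : 1;   // example weighting only
            Map<Prize, Integer> values = new HashMap<>();
            for (Prize p : heldPrizes) {
                values.put(p, p.baseValue() * multiplier);
            }
            return values;
        }
    }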
  • the computing device 100 then displays the screen and generates second sensing data in relation to the screen (S350).
  • the controller 120 of the computing device 100 is operable to drive a camera to capture at least one image. It is to be appreciated that a plurality of cameras may be implemented in the computing device 100.
  • the controller 120 may drive at least one camera based on the default setting, user’s input and/or external environment.
  • the captured image may be in the form of static image (also referred to as still image) or dynamic image (also referred to as moving image, live image or video).
  • the controller 120 is operable to render the screen in relation to the event, based on at least one of the captured image and the assigned value.
  • the screen may include one or more objects which are overlapped with the captured image.
  • the controller 120 is operable to link at least one object among the objects to the prize.
  • the objects are displayed in a form of floating or falling on the captured image, on the output unit 150, for example a display, of the computing device 100. In this manner, the display is able to display the screen in a form of augmented reality (AR).
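  • The floating/falling behaviour can be pictured, purely schematically, as a per-frame update of object positions that are then composited over the latest camera frame; the sketch below is a hypothetical simplification and does not reflect any particular AR framework.

    // Schematic update loop for objects "falling" over the captured camera image (illustrative only).
    import java.util.List;

    class FallingObject {
        double x, y;            // position in screen coordinates
        final double speed;     // fall speed in pixels per frame
        final String prizeId;   // prize linked to this object, or null if none

        FallingObject(double x, double y, double speed, String prizeId) {
            this.x = x;
            this.y = y;
            this.speed = speed;
            this.prizeId = prizeId;
        }
    }

    class ArOverlay {
        /** Advances every object by one frame; objects wrap to the top when they leave the screen. */
        static void step(List<FallingObject> objects, double screenHeight) {
            for (FallingObject o : objects) {
                o.y += o.speed;
                if (o.y > screenHeight) {
                    o.y = 0;   // keep the screen populated
                }
            }
            // The rendered frame would then composite these objects over the latest camera image.
        }
    }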
  • the display of the computing device 100 is able to display the screen in a form of virtual reality (VR).
  • the controller 120 may capture the image and process the image to render a screen using a virtual reality technology. Thereafter, the display of the mobile virtual reality headset is able to display the rendered screen including the objects in the form of virtual reality (VR). Thereafter, the computing device 100 generates the second sensing data in relation to the screen.
  • the controller 120 is operable to drive at least one sensing unit to generate the second sensing data.
  • the sensing unit may include, but not be limited to, a motion sensor, touch sensor and proximity sensor.
  • the second sensing data relates to at least one of the following: an orientation or rotation of the computing device, acceleration or deceleration of the computing device, presence or absence of an object’s contact, touch input and proximity touch input.
  • the user of the computing device 100 may move or tilt a body of the computing device 100 to collect the prize while he/she is watching the screen displayed on the output unit 150.
  • the sensing unit for example the motion sensor, may be used to generate the second sensing data in accordance with the movement or tilt of the body.
  • the second sensing data may be transmitted to the communication unit 260 of the server 200.
  • the server 200 may determine an outcome of the event based on at least one of the first sensing data and the second sensing data (S360). It may be appreciated that the server 200 is operable to use a combination of a random function, the first sensing data and the second sensing data, to determine the outcome, for example the prize disbursement and/or winner.
  • Each prize may be held in a digital or virtual escrow held by the server 200, for example a marketing technology service provider’s server, and released at the pre-determined time.
  • the use of the random function (also referred to as “randomizer” or “digital disbursement randomizer”) may ensure fair and unbiased distribution of the prize pool to all known and active users, i.e. participants, of the “Digital Drop”.
  • the known and active participant may be defined by the service provider, for example the marketing technology service provider, as a registered user whose location is known, who has an active login session for the duration of the event, and who has provided a valid and non-expired identifier.
  • the server 200 may use an algorithm that combines the first sensing data, such as geospatial location, and the active session login timestamps to generate variable minimum and maximum values. The minimum and maximum values are then input into a Java-based random function of the server 200.
  • Two (2) layers of randomization may be used: a physical layer, for example at the sensor data level, and a software layer, which is comparable to thermal and atmospheric noise random number generation.
  • Environmental sensor data may be collected as multi-value arrays [a,b] where ‘a’ and ‘b’ represent two (2) values of the same or similar type. For example, ‘a’ is the user’s last location and ‘b’ is the current location; or ‘a’ is the current atmospheric temperature and ‘b’ is the device’s current internal CPU temperature. These values may be sent to the software layer and used to generate a pseudo random number X.
  • other sensor data, such as the user’s physical movements recorded by the computing device’s 100 on-board sensors, including but not limited to accelerometers, gyroscopes and magnetometers, may also be collected as multi-value arrays [c,d] where ‘c’ and ‘d’ represent two (2) values of the same or similar type. These values may be sent to the software layer and used to generate a pseudo random number Y.
  • a resulting pseudo random number Z generated by the software layer is a value that is compared against the available inventory of prizes as a key-value pair.
  • the software layer and the resultant pseudo random number Z are comparable to thermal and atmospheric noise in that both result in high levels of entropy and/or randomness due to the multiple variables impacting temperature and radio frequencies as energy travels through different environments. This is applicable when using the users’ behaviour, for example movements, coupled with the users’ surroundings.
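  • Taken together, the passages above describe an environment-derived number X and a movement-derived number Y being combined into a number Z that is matched against the prize inventory as a key-value pair. The Java sketch below is one hypothetical way to realise that description; the seeding and matching rules are assumptions and are not quoted from the application.

    // Hypothetical two-layer randomization: sensor arrays -> X and Y -> Z -> prize lookup (illustrative only).
    import java.util.Map;
    import java.util.Optional;
    import java.util.Random;

    class DigitalDropRandomizer {
        /** Software layer: turns a two-value sensor array [a, b] into a pseudo random number. */
        static long pseudoRandomFrom(double a, double b) {
            long seed = Double.doubleToLongBits(a) ^ Long.rotateLeft(Double.doubleToLongBits(b), 17);
            return new Random(seed).nextLong();
        }

        /** Combines environmental array [a,b] and movement array [c,d], then matches Z against the inventory. */
        static Optional<String> draw(double[] environment, double[] movement,
                                     Map<Long, String> prizeInventory, long bucketCount) {
            long x = pseudoRandomFrom(environment[0], environment[1]);   // e.g. last location vs current location
            long y = pseudoRandomFrom(movement[0], movement[1]);         // e.g. two accelerometer samples
            long z = Math.floorMod(x ^ y, bucketCount);                  // resultant pseudo random number Z
            return Optional.ofNullable(prizeInventory.get(z));           // key-value match against available prizes
        }
    }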
  • the controller 220 of the server 200 may determine the minimum value and the maximum value.
  • the minimum value (“MIN”) associated with the second sensing data and the maximum value (“MAX”) associated with the first sensing data and the identifier may be defined as follows:
  • MIN: integer value assigned to sensor movement data inputs
  • MAX: integer value assigned to GPS and session login ID inputs
  • the outcome, for example the chances of winning, may be determined using a Java-based random function as follows:
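  • The text reproduced here does not include the body of that function; a minimal sketch consistent with the MIN/MAX description above, assuming java.util.Random, could be:

    // Minimal sketch of a bounded Java random draw between MIN and MAX (assumed, not quoted from the application).
    import java.util.Random;

    class OutcomeDraw {
        private static final Random RANDOM = new Random();

        /** Returns an integer in [min, max]; a prize whose assigned integer matches this value is won. */
        static int draw(int min, int max) {
            return RANDOM.nextInt(max - min + 1) + min;
        }
    }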
  • Each prize may be assigned an integer value between the minimum value and the maximum value before the screen is displayed on the display of the computing device 100. While the screen is displayed, the second sensing data, for example the user’s input and/or movement, that generates the matching integer may determine the outcome.
  • the resultant pseudo random number Z generated by the software layer may be a secondary value in the key-value pair which links the prize to the user and forms the basis of the “match”. For example, since the maximum value relates to the location data of the computing device 100, a user who is remotely located may have a lower value compared to another user who is located at the event site. Therefore, the prizes available for the user who is remotely located would be of lesser value. However, a user who is physically present at the event site may be eligible to win larger prizes and giveaways than a user who is not physically present.
  • a user who is remotely located may have a higher value compared to another user who is located at the event site.
  • the computing device 100 generates the second sensing data in relation to the screen.
  • the user of the computing device 100 may move or tilt a body of the computing device 100 to collect the prize while he/she is watching the screen displayed on the output unit 150, for example the display.
  • the motion sensor may be used to generate the second sensing data in relation to the movement or tilt of the body.
  • the display may display a virtual basket to collect the prize associated with the object and/or interact with the prize as it appears or disappears in the screen.
  • the prize may be collected using the second sensing data in relation to the movement or tilt of the body.
  • the collected prize may be saved in an account of the user.
  • the account may include, but not be limited to, mobile/data load credits, digital currency, or customer loyalty rewards wallets. The user may then redeem the prize.
  • the user may touch the screen to collect the prize.
  • the user input unit may be used to generate the second sensing data in relation to the touch input.
  • the user may make a sound to collect the prize.
  • the microphone may receive the audio input and be used to generate the second sensing data in relation to the audio input.
  • a plurality of input units 140 may be used to collect the prize.
  • the sensing unit, for example the motion sensor, and the user input unit may be used to generate the second sensing data in relation to the combination of the motion input and touch input.
  • a plurality of computing devices, for example a smartphone and a smart watch, may be used together to collect the prize.
  • a user input unit of the smartphone and a motion sensor of the smart watch may be used to generate the second sensing data in relation to the combination of the touch input for the smartphone and the motion input for the smart watch.
  • the event can also occur in the non-physical realm, such as during television programmes or internet broadcasts, to encourage active participation.
  • Figs. 3 to 7 illustrate examples of user experiences of the present invention.
  • the computing device 100 may be used to participate in the event.
  • the user may download a software application to participate in the event, for example a “digital drop service”.
  • this service may be available as a program or as an embedded or integrated overlay to existing platforms.
  • the service provider of the event, for example a host of the event, may sign a partnership agreement or licence to provide this service via backend APIs or Software Development Kits (SDKs).
  • the user may request participation in the event by selecting an icon 101, for example a digital drop icon.
  • the display of the computing device 100 may display a screen to enter the identifier associated with event.
  • the screen may include an input window 102 to receive an input of the identifier.
  • the identifier may be an event specific code.
  • the identifier may include, but not be limited to, a pin number, alphanumeric code, QR code, barcode, data matrix and maxicode.
  • the user may enter a four-digit pin number in the input window 102.
  • the user may scan an identifier in the form of QR code or barcode using a camera of the computing device 100.
  • the user may input voice signal in relation to an identifier using a microphone of the computing device 100.
  • the communication unit 160 of the computing device 100 may submit the identifier to the communication unit 260 of the server 200.
  • the controller 220 of the server 200 may analyse, verify and/or authenticate the identifier. Thereafter, the controller 220 may accept the identifier. Although not shown, it may be appreciated that the controller 220 may register the computing device 100 or the user as a participant of the event.
  • the display of the computing device 100 may display a screen to inform that the identifier is accepted, as shown in Fig. 3(c).
  • the screen may include a button 104 to share the identifier with other users. Although not shown, if the user selects the button 104, the identifier may be transmitted to other users via the communication unit 160.
  • the screen may include a button 105 to start participating in the event, for example the “Digital Drop” service.
  • the display of the computing device 100 may display a screen in relation to the event.
  • the screen may comprise one or more objects.
  • the background of the screen may be image(s) captured by a camera of the computing device 100.
  • the camera may capture the image periodically, non-periodically or in real time.
  • the captured image may be at least one of a static image and a dynamic image, for example a video.
  • the screen may be displayed in the form of augmented reality (AR).
  • the controller 120 of the computing device 100 may link the objects with the prize, before displaying the screen.
  • the controller 220 of the server 200 may link the objects with the prize, and provide such information to the computing device 100, so that the display of the computing device 100 can display accordingly.
  • a plurality of groups of objects may be displayed in the screen.
  • the object A 108 and object B 109 may be linked to different type or amount of prizes.
  • the object A 108 may be linked to a coupon, while the object B 109 may be linked to a giveaway. It is also to be appreciated that only object B 109 may be linked to the prize.
  • the outcome of the event may be determined using a combination of the random function, the first sensing data as the location data of the computing device 100 and the second sensing data associated with the screen.
  • the second sensing data may be generated by the computing device 100.
  • the objects 108, 109 may be overlapped with the captured image of the external environment.
  • the objects 108, 109 may be displayed in the form of augmented reality (AR).
  • the screen may also include a virtual basket 106 to collect the objects 108, 109 and/or interact with the objects 108, 109 as they appear or disappear in the screen.
  • the user may move or tilt the body of the computing device 100 to collect the prize in the virtual basket 106, while he/she is watching the screen.
  • the user may move the body of the computing device 100 to the right side to collect more objects associated with the prizes.
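  • As an illustrative sketch only: the tilt reported by the motion sensor can be mapped to a horizontal movement of the virtual basket, and an object counts as collected when it overlaps the basket near the bottom of the screen. The thresholds and sensitivity factor below are invented example values.

    // Hypothetical tilt-to-collect logic for the virtual basket (illustrative thresholds only).
    import java.util.Iterator;
    import java.util.List;

    class VirtualBasket {
        record Drop(double x, double y, String prizeId) { }   // a falling object at its current position

        private static final double HALF_WIDTH = 80.0;        // example basket half-width in pixels
        private static final double CATCH_TOLERANCE = 20.0;   // example vertical tolerance in pixels

        double x;            // basket centre in screen coordinates
        int collected = 0;   // tally shown at the lower part of the screen

        /** Moves the basket according to the device tilt reported by the motion sensor. */
        void applyTilt(double tiltX) {
            x += tiltX * 10.0;   // example sensitivity factor
        }

        /** Collects any object whose position overlaps the basket near the bottom of the screen. */
        void collect(List<Drop> drops, double basketY) {
            for (Iterator<Drop> it = drops.iterator(); it.hasNext(); ) {
                Drop d = it.next();
                if (Math.abs(d.y() - basketY) < CATCH_TOLERANCE && Math.abs(d.x() - x) < HALF_WIDTH) {
                    collected++;   // update the on-screen count
                    it.remove();
                }
            }
        }
    }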
  • the information of the prizes that the user has collected may be displayed.
  • the number of prizes and/or objects that the user has collected during the event is displayed at a lower part 107 of the screen.
  • a value and/or amount of the prize that the user has collected during the event can be displayed.
  • the user may continuously move the body to collect more objects associated with the prizes.
  • the number of prizes and/or objects that the user has collected is also updated accordingly.
  • the event may be terminated after a predetermined time. In another embodiment, the event may be terminated once the user or all the users collect all the prizes and/or objects. As shown in Figs. 5(a) and 5(b), the number of prizes and/or objects that the user has collected is displayed.
  • the user may redeem the prizes.
  • the user may redeem at least one prize among the prizes after selecting a button, for example a “redeem button” 110.
  • the controller 220 of the server 200 may assist in the redemption. It may be appreciated that the controller 220 is operable to provide the user with at least one option available as to how to redeem the prize, based on a user’s profile.
  • the option may include how and/or where to redeem the prize.
  • the option may further include, but not be limited to, the ability to share content, winnings, and prize eligibility and restrictions.
  • the controller 220 of the server 200 may analyse the user’s profile which relates to the user’s account status with the system’s services or partner’s services.
  • the controller 220 may recognise at least one active user having an account in the system’s service or the partner’s services, and provide the active user with a plurality of options available as to how to redeem the prize.
  • the controller 120 of the computing device 100 further utilises a user’s information to render the screen.
  • the user’s information may include at least one of the following: behaviour, preference and profile.
  • the user’s information may be collected from a web, for example social network service, or online communities. In this regard, each user is able to enjoy different user experiences according to the user’s behaviours, preferences or profile.
  • the user can check the prizes that the user has collected. After the event is terminated, the number of prizes that the user has collected is displayed, as shown in Fig. 6(a). The user can select a button, for example an “open gift” button 111. Thereafter, as shown in Fig. 6(b), information and/or icons associated with the prizes that the user has collected are displayed. The user may then redeem and/or use at least one prize among the prizes. For example, if a prize is a coupon, the user can download an electronic coupon once the user selects an icon associated with the prize.
  • a user who is remotely located may have lower chances of winning, compared to another user who is located at the event site. Therefore, the prizes available for the user who is remotely located may be of lesser value. However, a user who is physically present at the event site may have higher chances of winning than a user who is not physically present.
  • the objects, for example object B 109, associated with the prizes may be displayed in a large size.
  • the objects, for example object B 109, associated with the prizes may be displayed in a small size. Therefore, it may be easier for a user who is located at the event site to collect the prize in his/her virtual basket 106, compared to another user who is remotely located.
  • the user experience in relation to the event may be provided.
  • the user experience may allow the user to interact with backend metadata to perform a task and/or update the user’s profile.
  • the user may download a software application and execute the same to participate in the event.
  • the software application may perform various features, such as at least one of image recognition, text recognition, QR code and/or barcode reader and manual user input recognition.
  • the manual user input may be combined with the screen which is in the form of the augmented reality (AR).
  • the computing device 100 or the server 200 may access and retrieve vital information about the objects which are displayed on the screen.
  • the objects may be implemented as an interaction trigger for a wide range of programmable events.
  • the computing device 100 or the server 200 may comprise various layers, for example three (3) basic layers. These layers may include a presentation layer (also referred to as a “user interface layer”), an intelligence service layer and a metadata database layer (also referred to as a “metadata layer”).
  • the intelligence service layer may include, but not be limited to, a lightweight Service Oriented Architecture (SOA) framework. It is to be appreciated that the intelligence service layer can be embedded in the software application, depending on chosen implementation.
  • the intelligence service layer may be an interface layer and/or communications hub that consolidates application, data source, business logic and service connections. Additionally, the intelligence service layer may serve as the core processing and aggregation engine that is able to combine data from the various endpoints and sources.
  • the intelligence service layer is not limited to software architecture as it can also pertain to a specific hardware solution that eliminates the need to maintain such communication interfaces separately within the various layers.
  • the intelligence service layer may simplify future upgrades and additions by abstracting all the core processing logic and business rules in a single layer.
  • the intelligence service layer may be centralized and be implemented as a parent-child API model.
  • the intelligence service layer may be decentralized, and distributed ledger and/or mesh network technology may be used to allow computing devices to operate as nodes capable of knowing the state and values of other nodes in the same event without having to communicate with a central server.
  • the intelligence service layer is a technology-agnostic layer. Therefore, the intelligence service layer is able to connect to existing or future technologies simply through the use of a service connector.
  • the service connector may include, but not be limited to, MuleSoft, AWS, WSO2, SDKs and APIs.
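  • A hypothetical sketch of such a connector: the intelligence service layer depends only on a small interface, while a concrete connector wraps whatever backend (an API gateway, an SDK and so on) is actually used. The names below are invented for illustration.

    // Hypothetical technology-agnostic service connector abstraction (illustrative only).
    import java.util.Map;

    interface ServiceConnector {
        /** Sends a named operation with a payload to the connected backend and returns its raw response. */
        String call(String operation, Map<String, String> payload);
    }

    class HttpApiConnector implements ServiceConnector {
        private final String baseUrl;

        HttpApiConnector(String baseUrl) { this.baseUrl = baseUrl; }

        public String call(String operation, Map<String, String> payload) {
            // A real connector would issue an HTTP request to baseUrl + "/" + operation with the payload.
            return "{\"status\":\"ok\",\"operation\":\"" + operation + "\"}";
        }
    }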
  • the metadata database of the metadata database layer may refer to at least one database that links to data of user (for example, consumer), enterprise and service provider.
  • the metadata database may include, but not be limited to, relational database, big data database and NoSQL database.
  • the metadata database is able to use any combination of these types in standalone, clustered, or distributed configurations, including distributed ledger technologies.
  • the intelligence service layer may control the presentation layer to provide augmented reality (AR) experience to the users.
  • the web and/or mobile user interface may serve as the presentation layer to display responses from the intelligence service layer which interrogates the metadata database using a combination of sensing data.
  • the presentation layer may be implemented on any web-enabled device.
  • Fig. 8 illustrates a logic and data flow of an embodiment of the present invention.
  • an event 401 may be an arbitrary catalyst for triggering the event code algorithm.
  • An escrow 406 may hold at least one account where prizes and/or assets for distribution are stored before and/or during the event.
  • a governance 403 may store a set of logic, rules and guidelines that provide a framework for users, i.e. participants. This may include, but not be limited to, prize allocations, participant eligibility, redemption criteria and code verification.
  • a code 402 may be an identifier that is used for the users to participate in the event.
  • An onsite service 404 may be a service that analyses the input code against sensing data designed to facilitate location-based bias in favour of increased thresholds.
  • the technology may be implemented to be an effective marketing and/or advertising tool to motivate users to travel and explore their surroundings.
  • Each code may have a designated location marker, for example GPS tag.
  • the system can determine the effectiveness of the tool in driving and directing the users to events of interest.
  • the users who are near the event location may be rewarded for their physical presence with larger prizes. For example, for those users, one (1) coin may be equivalent to twenty (20) points.
  • An offsite service 405 may be a service that analyses the input code against the sensing data designed to introduce location-based bias in favour of restrictive thresholds.
  • the users who are not within close proximity of the event location may be encouraged to make the trek and join others at the event location if they wish to win the larger prizes. For example, for those users, one (1) coin may be equivalent to one (1) point.
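  • Using the example figures above (one coin worth twenty points on site versus one point off site), the location-based bias could be applied, hypothetically, as a simple conversion:

    // Hypothetical coin-to-point conversion applying the on-site/off-site bias (example figures from the text).
    class PointConversion {
        static final int ONSITE_POINTS_PER_COIN = 20;    // users near the event location
        static final int OFFSITE_POINTS_PER_COIN = 1;    // users participating remotely

        /** Converts collected coins into points, favouring physically present participants. */
        static int toPoints(int coins, boolean onSite) {
            return coins * (onSite ? ONSITE_POINTS_PER_COIN : OFFSITE_POINTS_PER_COIN);
        }
    }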
  • a normal function 407 may relate to the system’s primary function initiated by the user and sensing data inputs resulting in features providing optimal user experience.
  • a modified function 408 may relate to the system’s primary function modified by the user and sensing data inputs relating to event/user specific environmental/conditional variables.
  • a collection 409 may relate to the act or set of actions facilitating the collection of prizes from the augmented reality (AR) graphical interface by the users’ choice.
  • the system may use a cache and/or remote metadata database.
  • a redemption 410 may relate to the act or set of actions facilitating the redemption, transfer or linking of collected prizes into various backend systems and metadata database.
  • Figs. 9 and 10 illustrate flowcharts of another embodiment of the present invention.
  • a host of the event may create an event.
  • the event may include, but not be limited to, a prize draw, raffle and/or lottery, to determine an outcome, for example draw at least one winner.
  • the host of the event may assign one or more prizes for the event and generate an identifier, for example an event specific code. Thereafter, the host may share the code with users who wish to participate in the event.
  • a user may include, but not be limited to, an individual or a group of individuals.
  • the user may share the code with other users. It may be appreciated that the user may download a software application on a computing device 100 of the user, to participate in the event using the code.
  • the user may input the code on the screen.
  • a server 200 may validate the code and allow the user to participate in the event.
  • the computing device 100 may display a screen in relation to the event. The user may collect the prize using his/her computing device 100.
  • the server 200 may then assist in redemption of the prize. It may be appreciated that the server 200 is operable to provide the user with at least one option available as to how to redeem the prize, based on a user’s profile. The option may include how and/or where to redeem the prize.
  • the system may comprise at least one of an enterprise event management module, disbursement platform and disbursement channels.
  • the enterprise event management module may be used by the host of the event.
  • the host may assign one or more prizes for the event, and generate an identifier associated with the event, using the enterprise event management module. Thereafter, the host may share the code with users who wish to participate in the event.
  • the enterprise event management module may be an interface used by a host of the event, for example a game master or an event manager, for creating and managing events and/or games and the related prizes. It is also where they can review key metrics from previous events.
  • the user may participate in the event and collect the prizes through interactions with the screen in relation to the event.
  • the user may use at least one computing device 100 to collect the prizes.
  • the disbursement platform and the disbursement channels may assist in the collection of the prizes.
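The conversion sketch referenced in the list above is a minimal illustration of the location-based bias applied to the coin-to-point conversion. Only the example rates of twenty (20) points and one (1) point per coin come from the description; the proximity threshold, the haversine distance calculation and all function names are assumptions made purely for illustration.

```python
from math import radians, sin, cos, asin, sqrt

# Assumed values for illustration; the disclosure specifies only the example rates.
NEAR_RADIUS_M = 200          # hypothetical "close proximity" threshold in metres
POINTS_PER_COIN_NEAR = 20    # example from the description: 1 coin = 20 points near the event
POINTS_PER_COIN_FAR = 1      # example from the description: 1 coin = 1 point elsewhere

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two latitude/longitude points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371000 * asin(sqrt(a))

def points_for_coins(coins, user_latlon, event_latlon):
    """Convert collected coins to points, biased in favour of users near the event location."""
    distance_m = haversine_m(*user_latlon, *event_latlon)
    rate = POINTS_PER_COIN_NEAR if distance_m <= NEAR_RADIUS_M else POINTS_PER_COIN_FAR
    return coins * rate

# Example: 5 coins collected about 50 m from the event versus several kilometres away.
print(points_for_coins(5, (14.5995, 120.9842), (14.5999, 120.9840)))  # near -> 100 points
print(points_for_coins(5, (14.6400, 121.0300), (14.5999, 120.9840)))  # far  -> 5 points
```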
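The dispatch sketch referenced in the list shows one way the sensing data might switch the system between the normal function 407 and the modified function 408. The specific conditional variables (ambient light, GPS accuracy), the thresholds and the function bodies are hypothetical; the disclosure states only that environmental or conditional variables modify the primary function.

```python
def normal_function(user_input, sensing_data):
    """Normal function 407: the primary AR experience under optimal conditions."""
    return f"render full AR scene for input {user_input!r}"

def modified_function(user_input, sensing_data):
    """Modified function 408: the primary function adapted to the current conditions."""
    return f"render simplified overlay for input {user_input!r}"

def run_primary_function(user_input, sensing_data):
    """Choose between the normal and modified functions based on conditional variables."""
    # Hypothetical environmental/conditional variables derived from the sensing data.
    low_light = sensing_data.get("ambient_lux", 1000) < 50
    poor_gps = sensing_data.get("gps_accuracy_m", 5) > 30
    if low_light or poor_gps:
        return modified_function(user_input, sensing_data)
    return normal_function(user_input, sensing_data)

# Example: bright outdoor conditions keep the normal function; a dim, low-accuracy reading does not.
print(run_primary_function("tap", {"ambient_lux": 800, "gps_accuracy_m": 4}))
print(run_primary_function("tap", {"ambient_lux": 10, "gps_accuracy_m": 60}))
```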
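The event-creation sketch referenced in the list is a minimal, in-memory stand-in for the flow in which a host creates an event, assigns prizes and generates an event-specific code, and in which the server 200 validates that code when a user joins. The code format, function names and storage are assumptions for illustration, not the claimed implementation.

```python
import secrets

EVENTS = {}  # stand-in for the metadata database, keyed by event-specific code

def create_event(host_id, prizes):
    """Host creates an event, assigns prizes and receives a shareable event-specific code."""
    code = secrets.token_hex(4).upper()  # e.g. '9F2C4A1B'; the real code format is not specified
    EVENTS[code] = {"host": host_id, "prizes": list(prizes), "participants": set()}
    return code

def join_event(code, user_id):
    """Server-side validation of the code a user enters on the screen of computing device 100."""
    event = EVENTS.get(code)
    if event is None:
        return False, "invalid or unknown code"
    event["participants"].add(user_id)
    return True, f"joined; {len(event['prizes'])} prize(s) in this event"

# Example: the host shares the code, one user joins with it and another mistypes it.
code = create_event("host-01", ["smartphone", "voucher"])
print(join_event(code, "user-42"))         # (True, 'joined; 2 prize(s) in this event')
print(join_event("WRONGCODE", "user-43"))  # (False, 'invalid or unknown code')
```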
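The redemption sketch referenced in the list is one hedged way the server 200 might derive at least one redemption option from the user's profile. The profile fields, prize fields and option labels are all hypothetical; the disclosure requires only that at least one option as to how and/or where to redeem is offered, based on the profile.

```python
def redemption_options(user_profile, prize):
    """Return the redemption options the server 200 could offer for a collected prize."""
    options = []
    if user_profile.get("wallet_linked"):
        options.append("transfer the prize value to the linked e-wallet")
    if user_profile.get("city") and user_profile["city"] == prize.get("pickup_city"):
        options.append("collect the prize in person at the event venue")
    # Fallback so that at least one option is always available.
    options.append("convert the prize into a voucher code delivered in the app")
    return options

# Example: a user with a linked wallet who is registered in the same city as the pickup venue.
profile = {"wallet_linked": True, "city": "Manila"}
prize = {"name": "smartphone", "pickup_city": "Manila"}
print(redemption_options(profile, prize))
```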

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Environmental & Geological Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • User Interface Of Digital Computer (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)
  • Debugging And Monitoring (AREA)
PCT/SG2018/050548 2018-05-16 2018-10-30 System and method for determining an outcome associated with an event WO2019221666A1 (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
KR1020207034531A KR102522571B1 (ko) 2018-05-16 2018-10-30 System and method for determining an outcome associated with an event
JP2020555838A JP7180951B2 (ja) 2018-05-16 2018-10-30 System and method for determining an outcome associated with an event
CN201880093352.2A CN112106120B (zh) 2018-05-16 2018-10-30 System and method for determining an outcome associated with an event
PH12020551897A PH12020551897A1 (en) 2018-05-16 2020-11-05 System and method for determining an outcome associated with an event

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
SG10201804153SA SG10201804153SA (en) 2018-05-16 2018-05-16 System and method for determining an outcome associated with an event
SG10201804153S 2018-05-16

Publications (1)

Publication Number Publication Date
WO2019221666A1 true WO2019221666A1 (en) 2019-11-21

Family

ID=64605014

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/SG2018/050548 WO2019221666A1 (en) 2018-05-16 2018-10-30 System and method for determining an outcome associated with an event

Country Status (6)

Country Link
JP (1) JP7180951B2 (ko)
KR (1) KR102522571B1 (ko)
CN (1) CN112106120B (ko)
PH (1) PH12020551897A1 (ko)
SG (1) SG10201804153SA (ko)
WO (1) WO2019221666A1 (ko)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2022059707A (ja) * 2020-10-02 2022-04-14 株式会社レシカ Digital content ownership certification and digital ticket issuance system using a blockchain

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120157210A1 (en) * 2010-12-15 2012-06-21 At&T Intellectual Property I Lp Geogame for mobile device
US20130006737A1 (en) * 2011-06-28 2013-01-03 Clandestine Networks, Inc. Location aware mobile game reward system for encouraging real-world activity
US20150228148A1 (en) * 2012-09-11 2015-08-13 Gtech Corporation Method, system and ticket for facilitating lottery related activities via mobile devices
WO2017019530A1 (en) * 2015-07-24 2017-02-02 Silver Curve Games, Inc. Augmented reality rhythm game
US9782668B1 (en) * 2012-07-31 2017-10-10 Niantic, Inc. Placement of virtual elements in a virtual world associated with a location-based parallel reality game
US20180089927A1 (en) * 2016-09-28 2018-03-29 Scientific Games International, Inc. Lottery Game System and Method with Augmented Reality Component

Family Cites Families (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
AU2003900809A0 (en) * 2003-02-24 2003-03-13 Aristocrat Technologies Australia Pty Ltd Gaming machine transitions
CN101299645A (zh) * 2005-07-22 2008-11-05 袋鼠传媒股份有限公司 System and method for enhancing the experience of spectators attending a live sporting event
KR101164813B1 (ko) * 2009-11-13 2012-07-12 삼성전자주식회사 Display apparatus, terminal and image display method
JP5545155B2 (ja) * 2010-09-30 2014-07-09 大日本印刷株式会社 Server device and content information providing program
JP5940785B2 (ja) * 2011-09-14 2016-06-29 株式会社バンダイナムコエンターテインメント Program and game device
KR20130119695A (ko) * 2012-04-24 2013-11-01 (주)네오위즈게임즈 Cloud-service-based game providing method and game providing server executing the same
US9724597B2 (en) * 2012-06-04 2017-08-08 Sony Interactive Entertainment Inc. Multi-image interactive gaming device
CN102800000A (zh) * 2012-07-09 2012-11-28 深圳赛美无限科技有限公司 Method and system for online lucky-draw promotion based on region and probability control
KR101793189B1 (ko) * 2012-08-27 2017-11-02 앤키, 인크. Integration of a robotic system with one or more mobile computing devices
US8719086B1 (en) * 2013-06-21 2014-05-06 Mitesh Gala Interactive electronic game systems, methods, and devices
JP6304590B2 (ja) * 2014-03-13 2018-04-04 株式会社コナミデジタルエンタテインメント Game system, management device and program
JP5894226B2 (ja) * 2014-06-19 2016-03-23 グランドデザイン株式会社 Sales promotion system
US9965754B2 (en) * 2015-06-08 2018-05-08 Google Llc Point of sale terminal geolocation
CN105894255A (zh) * 2016-06-08 2016-08-24 北京奇虎科技有限公司 Information processing method, apparatus and device

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120157210A1 (en) * 2010-12-15 2012-06-21 At&T Intellectual Property I Lp Geogame for mobile device
US20130006737A1 (en) * 2011-06-28 2013-01-03 Clandestine Networks, Inc. Location aware mobile game reward system for encouraging real-world activity
US9782668B1 (en) * 2012-07-31 2017-10-10 Niantic, Inc. Placement of virtual elements in a virtual world associated with a location-based parallel reality game
US20150228148A1 (en) * 2012-09-11 2015-08-13 Gtech Corporation Method, system and ticket for facilitating lottery related activities via mobile devices
WO2017019530A1 (en) * 2015-07-24 2017-02-02 Silver Curve Games, Inc. Augmented reality rhythm game
US20180089927A1 (en) * 2016-09-28 2018-03-29 Scientific Games International, Inc. Lottery Game System and Method with Augmented Reality Component

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2022059707A (ja) * 2020-10-02 2022-04-14 株式会社レシカ Digital content ownership certification and digital ticket issuance system using a blockchain
JP7407094B2 (ja) 2020-10-02 2023-12-28 株式会社レシカ Digital content ownership certification and digital ticket issuance system using a blockchain

Also Published As

Publication number Publication date
KR20210010992A (ko) 2021-01-29
JP7180951B2 (ja) 2022-11-30
PH12020551897A1 (en) 2021-05-31
KR102522571B1 (ko) 2023-04-17
CN112106120B (zh) 2022-05-27
SG10201804153SA (en) 2018-11-29
CN112106120A (zh) 2020-12-18
JP2021523431A (ja) 2021-09-02

Similar Documents

Publication Publication Date Title
US11290550B2 (en) Method and device for allocating augmented reality-based virtual objects
US9430783B1 (en) Prioritization of messages within gallery
US10380646B2 (en) Platform for providing customizable user brand experiences
CN112118460B (zh) 资源处理方法、装置、终端及服务器
US20150324400A1 (en) Interest Collection and Tracking System and Method of Use
CN110585726A (zh) 用户召回方法、装置、服务器及计算机可读存储介质
CN105009024B (zh) 节省电池和数据使用
US9271137B2 (en) Orchestrating user devices to form images at venue events
WO2016065131A1 (en) Prioritization of messages
CN104756510A (zh) 通信终端、通信方法、程序以及通信系统
US11501323B1 (en) Augmented reality store and services orientation gamification
JP3216098U (ja) インタラクティブ環境における広告の提供システム
WO2023109037A1 (zh) 基于直播间的互动方法及电子设备
US10373431B2 (en) System and method for advertising distribution through mobile social gaming
US20210150421A1 (en) Dynamic Management of Virtual Queues
US11238476B2 (en) Blockchain-based platform for monetizing social media following
US20190355016A1 (en) System and method for advertising distribution through mobile social gaming
KR102522571B1 (ko) 이벤트와 관련된 결과를 결정하는 시스템 및 방법
US9427661B1 (en) Social networking game with integrated social graph
KR102575553B1 (ko) 그룹간 경쟁을 이용한 증강현실 기반 보물 찾기를 제공하기 위한 서비스 제공 장치와 그 시스템 및 방법, 그리고 컴퓨터 프로그램이 기록된 비휘발성 기록매체
WO2021245863A1 (ja) 情報処理システムおよびプログラム
FR3084498A1 (fr) Systemes et procedes pour une interaction amelioree dans une application de realite augmentee
CN113011926B (zh) 信息推送方法和装置、信息接收方法和装置
JP7385855B1 (ja) プログラム、方法、および情報処理装置
JP7411130B1 (ja) プログラム、方法、情報処理装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18919123

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2020555838

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 20207034531

Country of ref document: KR

Kind code of ref document: A

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 25.03.2021)

122 Ep: pct application non-entry in european phase

Ref document number: 18919123

Country of ref document: EP

Kind code of ref document: A1