WO2019170835A1 - Advertising in augmented reality - Google Patents

Advertising in augmented reality

Info

Publication number
WO2019170835A1
WO2019170835A1 (PCT/EP2019/055782)
Authority
WO
WIPO (PCT)
Prior art keywords
software
augmented reality
real world
advertising
camera
Prior art date
Application number
PCT/EP2019/055782
Other languages
French (fr)
Inventor
Niklas BAKOS
Niclas KJELLGREN
Original Assignee
Adverty Ab
Priority date
Filing date
Publication date
Application filed by Adverty Ab filed Critical Adverty Ab
Publication of WO2019170835A1 publication Critical patent/WO2019170835A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00Commerce
    • G06Q30/02Marketing; Price estimation or determination; Fundraising
    • G06Q30/0241Advertisements

Definitions

  • This invention relates to computer-generated augmented reality, in particular a method for advertising in augmented reality.
  • AR: augmented reality
  • virtual objects are superimposed on the reality.
  • the AR environment is a mix between the real environment and virtual objects.
  • AR can be used in for example education, games or user interfaces, or to create works of art.
  • a method for advertising in a computer generated augmented reality environment displayed on a touch screen, the touch screen being comprised in a portable device comprising a computer and a video camera, where said video camera is able to capture real time video of the real world, said computer being able to receive image data from the camera and to provide image data to the touch display, where the method is carried out in a computer system comprising tracking software, rendering software, and advertising software,
  • the tracking software using the real time video captured by the camera to determine a model of the real world and the position and orientation of the portable device in the model
  • the rendering software being able to cause the portable device to render at least one virtual object in the augmented reality environment on the touch display, where the augmented reality environment comprises at least a part of the real time video captured by the camera, the method comprising the steps of: a) the advertising software receiving information about a user interacting with a virtual object which is a virtual ad in the augmented reality environment by touching the touch screen, where instructions for rendering the virtual object have been provided by the advertising software to the rendering software, and the rendering software has rendered the virtual ad in the augmented reality environment on the touch screen of the portable device, b) the advertising software causing the device to discontinue to display the augmented reality environment, and instead to display advertising content not displaying the real time video of the real world, where the video camera of the portable device continues to capture video of the real world without the real world being shown on the display,
  • the rendering software automatically causing the device to render the augmented reality comprising real time video of the real world after the advertising content has been displayed.
  • the method provides for a novel way to present advertising to a user of an AR environment.
  • the method provides automatic switching back to the AR environment after advertising content has been provided to the user.
  • the AR environment may be rendered to the user with minimum time lag after the advertising has been shown.
  • the advertising software may provide a signal that causes the device to start rendering the augmented reality in step c).
  • the advertising content may be associated with a duration and step c) may then be carried out at the end of the duration.
  • the tracking software may continue to create the model of the real world during step b), without the real world being shown on the display.
  • the tracking software may continue to create the model of the real world and the rendering software may continue to determine the position and orientation of at least one virtual object in relation to the position of the camera in the model of the real world, without the real world or the virtual object being shown on the display.
  • Step a) may be preceded by the advertising software receiving information about a placeholder where the virtual ad is to be rendered.
  • a system for advertising in a computer generated augmented reality environment displayed on a touch screen, the touch screen being comprised in a portable device comprising a computer and a video camera able to capture real time images of the real world, said computer being able to receive real time video data from the camera and to provide image data to the touch display, the system being configured to use the images captured by the camera to determine a model of the real world and the position and orientation of the portable device in the model, the system further being configured to determine the position and orientation of a virtual object in the model of the real world and to cause the portable device to render at least one virtual object in the augmented reality environment on the touch display, where the augmented reality environment comprises at least a part of the real time video captured by the camera, the computer further being configured to detect when a user interacts with the virtual object by touching the touch screen, the computer being configured to then discontinue to display the augmented reality environment while the video camera of the portable device continues to capture video of the real world without the real world being shown on the display, and instead displaying advertising content not displaying the real world, and further configured to begin to render the augmented reality comprising video from the camera when the advertising content has been displayed.
  • a system for advertising in a computer generated augmented reality graphical environment displayed on a touch screen, the touch screen being comprised in a portable device comprising a computer and a video camera able to capture real time images of the real world, said computer being able to receive image data from the camera and to provide image data to the touch display, the computer being configured to use the images captured by the camera to determine a model of the real world and the position and orientation of the portable device in the model, the computer further being configured to determine the position and orientation of a virtual object in the model of the real world and to cause the portable device to render at least one virtual object in the augmented reality graphical environment on the touch display, where the augmented reality graphical environment comprises at least a part of the image captured by the camera, the computer further being configured to detect when a user interacts with the virtual object by touching the touch screen where the virtual object is displayed, the computer being configured to then discontinue to display the augmented reality graphical environment, and instead displaying advertising content not displaying the real world, and further configured to begin to render the augmented reality comprising video from the camera when the advertising content has been displayed.
  • advertising software arranged to provide, to rendering software, instructions for rendering a virtual object being a virtual ad rendered as a virtual object in an augmented reality graphical environment displayed on a display on a portable device comprising a computer and a video camera and a touch screen, the video camera able to capture real time video images of the real world, the advertising software further arranged to detect a touch on the touch screen, and upon detecting such touch, causing the device to discontinue to display the augmented reality graphical environment, and instead displaying advertising content not displaying the video of the real world captured by the camera, and arranged to automatically, when the advertising content has been displayed, cause the rendering software to display the augmented reality comprising the video from the camera.
  • the advertising software may be arranged to receive information about a placeholder, and arranged to then provide to the rendering software, the instructions for rendering a virtual ad, causing the virtual ad to be rendered.
  • Fig. 1a shows a user and a device showing an augmented reality environment.
  • Fig. 1b shows a user and a device showing advertisement content.
  • Fig. 2 is a schematic drawing of a system.
  • Figs. 3-4 schematically show memory units.
  • Fig. 5 is a flow chart showing a method.
  • Fig. 6 schematically shows the association between datasets.
  • an augmented reality environment (AR environment) 13 is characterized by virtual objects 16, 23 being superimposed on an image 9 showing the reality 22.
  • Fig. 1a shows how a part of the reality 22 is shown as a digital image 9 on a display 3 of portable device 1.
  • the relative positions of virtual objects 16, 23 in relation to real objects should remain constant regardless of the movements and rotation of the camera 2.
  • the virtual objects 16, 23 are preferably rendered as 3-dimensional objects on the screen 3 such that the view of the virtual objects 16, 23 changes when the user/camera 2 moves in relation to reality 22.
  • the reality 22 or real world 22 refers to the reality surrounding the user at the time of rendering of the augmented reality.
  • the system 100 comprises portable device 1 which comprises video camera 2, display 3, which preferably is a touch display, a computer that comprises processor 4, memory 5 and optional communication device 6.
  • the portable device 1 may be a smartphone such as an iPhone or an Android phone, or a tablet computer, such as an iPad.
  • the camera 2 is preferably directed in a direction opposite to the display 3, such that a user has his or her view in generally the same direction as the camera 2 when the display 3 is directed towards the user. Most smartphones and tablet computers have such camera/display arrangements.
  • the camera 2 is capable of capturing digital video in real time.
  • the display 3 can be any suitable type of digital display, such as an LCD or an OLED display.
  • Device 1 is generally able to play videos and sound, show web pages and other im ages on display 3 and is preferably able to execute software that supports a wide range of file formats for this purpose.
  • the memory 5 of device 1 stores tracking software 10, rendering software 11 and advertising software 12.
  • Memory 5 also stores data 15 for rendering at least one virtual ad 16 and data 17 for rendering advertising content 20.
  • the advertising content data 17 is used to render content 20 that is not used in the augmented reality 13 and is described further below.
  • Virtual ad data 15 and advertising content data 17 are typically created and stored in memory 5 in advance, well before rendering by device 1.
  • Memory 5 may also store an operating system for the device 1.
  • Tracking software 10 is able to receive image data from the camera 2, or another camera, for example an IR camera or a TOF camera, and analyse the image data. Tracking software 10 is able to create a model of at least a part of the real world 22 and determine the posi tion and orientation of the camera 2 in the model.
  • the model of the real world 22 may be very limited.
  • the model of the real world 22 may comprise only the position of the part of the floor, or a part of a wall. Thus, it may not be necessary for the tracking software 10 to build a wireframe model of any part of the real world, although this may be an option in some embodiments.
  • the position and orientation of the camera 2 in the model is determined by methods known in the art, and may include one or more of the following: a) recognizing and determining the position of a visible marker object (world marker) which is stationary relative to the real world (marker based tracking), b) feature-based tracking, c) matching the image with a predefined model of the real world (model based tracking), d) making a 3D image of the surrounding world using a camera, for example an IR camera, to build a 3D infrared image map (such as the TrueDepth camera of the iPhone X), or a TOF camera, e) using at least one sensor attached to the device 1, such as an attitude sensor, a position sensor or an accelerometer.
  • the tracking software is able to receive image data from camera 2 and use the image data for tracking.
  • Machine learning or artificial intelligence may be used by the tracking software 10.
  • the model is 3-dimensional such that it can detect at least one real object and determine the position and orientation of the camera 2 in relation to the real object. For example, if the user is standing on the floor in front of a wall, the model may comprise the wall and the floor.
  • Rendering software 11 is able to render an AR environment 13 to a user on display 3. Rendering of the AR environment 13 is carried out in real time or near real time. The updating frequency is preferably several times per second.
  • the rendering software 11 renders the AR environment 13 on the display 3.
  • the AR environment 13 comprises at least a part of the image 9 of the real world 22 captured by the digital camera 2.
  • Rendering software 11 is capable of rendering at least one virtual ad 16 on the display 3. In Fig. 1a, most of display 3 shows the real world, whereas a small portion of the display is occupied by virtual objects 16, 23.
  • Memory 5 stores data 14, 15 describing virtual objects 16, 23 that can be rendered on the display 3 by rendering software 11.
  • Virtual objects 16, 23 can for example be wire-frame objects with defined textures and colours and other properties (such as opacity, light diffraction, etc.) stored as digital information 14, 15.
  • Rendering software 11 is able to use data 15 for rendering a virtual ad 16 on the display 3, using methods known in the art.
  • the rendering software 11 is able to use the model of the real world built by the tracking software 10 to place a virtual object 16,23 in the AR environment 13.
  • the virtual object 16, 23 is placed in a coordinate system defined as distances XYZ from a point in the model of the real world, which may be the position of the camera.
  • XYZ are distances in three dimensions from the point in the model of the real world.
  • Object 16,23 is also placed with an orientation in relation to the coordinate system. The orientation may be defined by rotation around angles in the X, Y and Z directions.
  • Rendering software 11 and tracking software 10 should be seen as functional units which can be comprised in the same computer software product.
  • Many device products 1 (such as iPhone and certain Android phones) come with preinstalled parts of tracking software 10 and rendering software 11.
  • App developers may develop AR applications for use with these products with the use of ARKit (for Apple products) or ARCore (for Android products).
  • the software package Unity is also useful for development of applications for these packages.
  • the rendering software 11 may be able to render other virtual objects 23, for example characters in games, or other objects, for example a virtual building, a virtual car or a tree, etc., provided by third party application 19, which may be the main reason for the user experiencing the augmented reality 13 in the first place.
  • third party application 19 may be an augmented reality game or simulation that utilizes tracking software 10 and rendering software 11 to provide an augmented reality experience to a user.
  • Data 14 for rendering virtual objects 23 may be a part of third-party application 19, but this is not necessary.
  • data 15 for rendering virtual ads and data 14 for rendering non-ad virtual objects may generally be similar, have the same or similar formats and may be handled in an identical manner by rendering software 11.
  • the third-party application 19 may be executable separately from the operating system of the device 1, and it may be possible to update the third-party application 19 separately from the operating system of the device 1.
  • the virtual ad 16 may populate a predefined placeholder 24.
  • a placeholder is a site in the model of the real world, characterized at least by having a position in the model of the real world.
  • the placeholder may also comprise an orientation in relation to the real world. The orientation is preferably defined by rotation around the X, Y and Z directions.
  • Such a placeholder 24 may for example be associated with a non-ad virtual object 23, such that the virtual ad 16 is shown together with, next to, or as a part of the non-ad virtual object 23.
  • the placeholder 24 may also have an identity so that it can be handled by rendering software 11 and advertising software 12.
  • the placeholder may define a time to be elapsed before displaying the first virtual ad, for example a time elapsed for experiencing the AR environment, for example such that the virtual ad 16 should be shown after the user has experienced the AR environment for a predetermined minimum time, for example 1 minute.
  • Placeholders may alternatively be triggered by user-triggered events, such as a user completing a game stage or the user interacting with a virtual object 23.
  • a placeholder 24 may also be associated with a certain type of feature in the model of the real world, for example a floor or a wall detected by tracking software 10.
  • a placeholder 24 may comprise instructions to generate a virtual ad 16 on a wall that has been identified by tracking software 10 to have a certain minimum size and be within a certain minimum range of the user.
  • Placeholder 24 is typically a part of third-party application 19, such as a game but may also be a part of advertising software 12.
  • Advertising software 12 or other software, such as third-party application 19, may recognize the placeholder 24 during rendering of the AR environment 13.
  • the third-party application 19 may ask advertising software 12 for data 15 for virtual ad 16.
  • the advertising software 12 then provides data 15 for rendering the virtual ad 16 to rendering software 11 so that the virtual ad 16 is rendered on the display 3.
  • Virtual objects 16 (as opposed to virtual objects 23) are virtual advertisements (ads).
  • the virtual ad 16 is a 3-dimensional virtual object.
  • the virtual ad 16 may also be a 2-dimensional object.
  • the virtual ad 16 changes in the user's field of view when the user/camera 2 comes closer or views the virtual ad 16 from a different angle, so as to follow the real world depicted in the image 9.
  • the virtual ad 16 may for example be placed on a wall or on a floor in the AR environment 13. In Fig. 1a the virtual ad 16 has the shape of a floor stand standing on the floor.
  • the user may encounter virtual ad 16.
  • the user is able to interact with the virtual advertisements 16 for example by touching the screen 3 at the site where the virtual ad 16 is rendered, causing the device 1 to display ad content 20.
  • device 1 has software for handling the touch screen 3 and detecting touch.
  • the virtual ad 16 may have any suitable shape such as prismoid, spherical, rectangular, box-shaped.
  • the virtual ad 16 may show a trademark, a picture, a logo or a product name that is commercially associated with the ad content 20.
  • Advertising software 12 may be able to select data 15, 17 for virtual ad 16 and ad content 20 from a plurality of such data sets.
  • data 15 for virtual ad 16 and data 17 for ad content 20 are associated such that there is at least one dataset 17 for ad content 20 associated with data 15 for virtual ad 16.
  • when a user interacts with virtual ad 16, certain ad content 20 will be displayed.
  • there may be a one-to-one mapping such that each virtual ad 16 is associated with one ad content 20.
  • there may also be more than one advertising content dataset 20 associated with each virtual ad 16, in which case the advertising software 12 selects one data set 17 for ad content 20 from a group of data sets for ad content 20.
  • for example, when the logotype of a car maker is shown in virtual ad 16, there may be more than one video clip ad content 20 associated with the virtual ad 16, and the ad software 12 selects one of the video clips for display to the user.
  • Selection of virtual ad 16 and ad content 20 may be based on user data or context data, for example.
  • Fig. 6 shows how one dataset 15 for one virtual ad 16 is associated with a plurality of datasets 17a, 17b, 17c for ad content 20a, 20b, 20c.
  • when the user interacts with virtual ad 16, the advertising software selects one of datasets 17a, 17b and 17c and causes one of video clips 20a, 20b and 20c to be shown on the display 3.
  • Advertising content 20 may be for example a video ad, a still image ad or a looped animation or a webpage shown on the display 3. Advertising content 20 may also comprise sound.
  • the advertising content 20 may enable the user to complete a transaction, for example to make a purchase, to register an account or an email, or to download software, for example from the App Store.
  • a video ad may for example provide the opportunity to interact and be directed to a web page for completion of the transaction.
  • Data 17 for rendering advertising content 20 may be associated with a duration, for example a duration in seconds, which determines how long the advertising content should be displayed to the user.
  • the duration is preferably predetermined.
  • when data 17 is data for playing an advertising video, the duration may be the playing time in seconds.
  • the duration may also be showing a still image or a looped animation for a certain predetermined time.
  • data 14, 15 for virtual objects 16, 23 and data 17 for ad content 20 may be stored on memory 21 of server 7, with which the portable device 1 may be able to communicate via network 8, such as a wide area network, such as the internet, with the use of a communications device 6 of the device 1.
  • virtual ad data 15 and advertising content data 17 may be stored on server 7 and provided from server 7 to device 1.
  • Advertising software 12 may be configured to connect to server 7 and download or stream data 15 for virtual ads 16 and data 17 for advertising content 20 to device 1, for example at startup or with the use of any other useful schedule.
  • data 15 for virtual ads 16 are downloaded at startup whereas data 17 for ads 20 is streamed from server 7.
  • Communication between communication device 6 of device 1 and server 7 may be carried out with any suitable technology including wire bound and wireless technologies, and may comprise one or more of the following: Wi-Fi, Bluetooth and mobile data networks such as 3G, 4G, 5G and LTE networks.
  • Fig. 5 describes a method for displaying advertisements in augmented reality.
  • the advertising software 12 provides instructions 15 for rendering a virtual ad 16 in the AR environment 13 to the rendering software 11.
  • third party application 19 may provide a placeholder 24 for a virtual ad 16 to the advertising software 12.
  • the placeholder 24 may be defined in the third-party software 19 by the third party software developer.
  • in step 101 the rendering software 11 renders the virtual ad 16 in the augmented reality environment 13 on the screen 3 of the portable device 1.
  • the virtual ad 16 is thus rendered to the user.
  • the user is able to see the virtual ad 16, for example an electronic billboard, in the augmented reality 13.
  • An example is shown in Fig. la, where the virtual ad 16 promotes a fishing game.
  • the user is also able to see virtual object 23 which is rendered as a part of the AR environment 13.
  • the virtual ad 16 may provide a call to action to the user, for example an icon that indicates to the user that the user can interact with the virtual ad 16.
  • in step 102 the system receives input from the user.
  • the user interacts with the virtual ad 16, for example by touching the screen 3 where the virtual object 16 is displayed, when the screen 3 is a touch screen, for example by tapping, pressing or double clicking.
  • the user "touches" the virtual object.
  • interaction can be done in various manners.
  • device 1 may have an eye tracking device that detects where the user is directing his gaze, and interaction is detected if the user looks at the virtual object 16 longer than a threshold time.
  • the touch screen may provide a virtual joystick or similar on the screen, where the joystick is used to direct a cursor towards the virtual object 16.
  • touching a touch display 3 is the preferred manner of interaction.
  • the interaction causes the device 1 to cease, in step 103, to display the AR environment 13, and instead to display advertising content 20, not displaying any image 9 of the real world 22.
  • the display 3 displays at least a part of the real world 22 indicated with 9.
  • the advertising software 12 "takes over" the display 3 to show ad content 20.
  • no part of the real world 22 as captured by camera 2 is shown on the display 3.
  • the advertising content 20 is rendered without the aid of camera 2.
  • An example is shown in Fig. 1b where a video ad for the fishing game is shown on display 3.
  • the advertising content 20 may fill the entire display 3.
  • the advertising content 20 behaves like a "layer" visually "on top" of the AR environment 13.
  • the advertising content 20 may gradually visually expand out from the virtual ad 16 to fill the entire display 3.
  • the advertising content 20 is displayed for the predetermined duration, if such a duration is associated with the advertising content 20, for example the duration of a video.
  • the device 1, in step 105, automatically begins to render the augmented reality 13.
  • the display 3 now again shows at least some part of the real world 22 as image 9.
  • the advertising software 12 "hands back" the display 3 to the rendering software 11.
  • the advertising software 12 may, in step 104, send a signal to the rendering software 11 or other software of device 1, causing the device 1 to begin showing the augmented reality 13 on the display 3.
  • the signal may be triggered by reaching the end of the duration of the advertising content 20 or may be triggered by a predetermined duration stored separately from the advertising content 20. There may thus be a timer checking how long the advertising content 20 has been displayed. This may be useful when the advertising content 20 is a still image, or a looped animation.
  • the advertising content 20 may provide the user with an opportunity to carry out a transaction. If the user decides to carry out a transaction, the rendering of the augmented reality 13 may automatically begin once the user has completed the transaction, such as for example having completed a registration or download, or the user having closed the advertising content 20. Thus, the signal of step 104 may be triggered by such an event.
  • the advertising content 20 may be associated with an exit icon that enables the user to close the advertisement content 20 or the transaction application and return to the augmented reality 13.
  • the camera 2 may shut down during step 103, because it is not needed to render the real world 9.
  • the user interaction in step 102 may then cause the tracking software 10 and rendering software 11 to temporarily rest. This saves computing power.
  • the video camera 2 of the portable device 1 continues to capture video of the real world during step 103, while the captured video is not rendered on display 3.
  • video data captured may be stored in memory 5 and/or processed by processor 4 but not shown on display 3 during step 103.
  • the video thus captured during step 103 is used by the tracking software 10 to continue to create the model of the real world, without the real world being shown on the display 3.
  • the tracking software 10 continues to carry out the necessary computations to create a model of the real world, but the real world is not rendered on the display 3.
  • tracking software 10 uses information from a second camera (for example a TOF (time-of-flight) camera), in which case camera 2 can optionally be shut down during step 103.
  • this has the advantage that the device can resume rendering the augmented reality 13 at step 105, for example when the signal is received in step 104. For example, if the camera 2 has moved to a different location during display of the ad content 20, an updated augmented reality must be displayed to the user when the ad content 20 has finished playing, with minimum time lag. Thus, the transition from step 103 to step 105 should be carried out with minimum time lag, made possible by the model being up-to-date at all times.
  • the rendering software 11 continues during step 103 to determine the position and orientation of at least one virtual object 16, 23 in the model of the real world, without the real world 22 or the virtual object 16, 23 being shown on the display 3.
  • the rendering software 11 uses the model created by tracking software 10 during step 103.
  • one or more of camera 2, tracking software 10 or rendering software 11 continues to work "in the background". It is to be noted that it may be necessary for camera 2 to provide video data to tracking software 10 for tracking software 10 to work, and it may be necessary for tracking software 10 to create the model of the real world for rendering software 11 to work. In a preferred embodiment all of the camera 2, tracking software 10, and even the rendering software 11 are working during step 103.
  • the rendering software 11 can then provide an up-to-date image 9 that can simply be displayed once the ad 20 is no longer shown.
  • the signal of step 104 from advertising software 12 may be received by any of camera 2, tracking software 10 and rendering software 11 that has been idle during step 103 and needs to be started again to be able to render the augmented reality in step 105.
  • the signal from advertising software 12 may also be provided to general display control software of device 1, for example a part of the operating system of device 1, in particular if one or more of camera 2, tracking software 10 or rendering software 11 is active during step 103.
  • Advertising software 12 may generally be able to carry out the following: detect when a virtual ad 16 is to be rendered, for example by receiving information about a placeholder from third party application 19; provide instructions (for example data 15) in step 100 to rendering software 11 to render a virtual ad 16; receive user input, for example from the touch screen in step 102 (for example via the operating system of device 1); and cause the device 1 to stop, in step 103, rendering the augmented reality 13. Advertising software 12 then handles the showing of ad content 20. Advertising software 12 may keep track of the duration of the ad content 20, for example by using a timer. Advertising software may then "hand back" the display 3 to the rendering software 11. (A conceptual sketch of this flow is given after this list.)
  • device 1 may have dedicated display control software that handles the display of images on display 3.
  • the dedicated display control software may be a part of the operating system of the device 1
  • the signal from advertising software 12 that causes the augmented reality to be shown on display 3 may be received by such device control software.
  • Part, or all, of the advertising software 12 may be delivered or installed on the device 1 together with third party software 19.
  • the advertising software 12 may be a plug-in in relation to third party software 19. For example, launch or start-up of advertising software 12 may be caused by a user starting or launching the third-party application 19.
  • the number of ad views may be tracked by system 100.
  • the server 7 may comprise view registration software 18.
  • the advertising software 12 may send a signal to an ad server 7 each time advertising content 20 is displayed to a user.
  • the advertising software 12 may also send a signal to an ad server 7 each time a virtual ad 16 is displayed to a user.
  • the signal may be triggered by, for example, the interaction in step 102, or completion of display of advertisement content 20 at step 104 or 105.
  • the signal may also be triggered by the advertisement being displayed for a minimum duration.
  • the view registration software 18 of server 7 may be able to register views from a plurality of portable devices 1, 1'. Thus, a plurality of portable devices 1 may be able to communicate with server 7.
  • Server 7 may register each time advertising content 20 has been displayed to a user, counting each such display as one view. The view count may be used for keeping track of how many times an ad 20 has been displayed.
  • a separate software unit may provide a software editor, for creating placeholders for virtual ads 16 in a third-party software 19.
  • the editor may enable a user to specify the conditions for showing virtual ads 16 in third party software 19, for example a game.
  • the user/game designer may be able to specify that a virtual ad 16 should be positioned on the floor, three (real-world) meters in front of the user/player after the game has been played for 30 seconds.
  • the editor may provide a limit for the number of placeholders, in order to avoid fraud. Few advertisers are interested in having too many virtual ads in the same scene.
  • The system 100 may be implemented in any suitable computing environment. Above it is described how various software components such as the advertising software 12 are stored in memory 5 of device 1. In other embodiments one or more of the software components of Fig. 3 are stored and executed in a distributed computing environment, such as for example a server 7. Individual software components may also be stored and executed at several hardware locations, such as using a distributed web services architecture, or using virtualization. While the invention has been described with reference to specific exemplary embodiments, the description is in general only intended to illustrate the inventive concept and should not be taken as limiting the scope of the invention. The invention is generally defined by the claims.
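
For readers who want a concrete picture of the method of Fig. 5 as summarized in the bullets above, the following is a minimal, hedged sketch of the step 100-105 flow: on a touch, the advertising software pauses the AR display, shows the ad content 20 for its predetermined duration while the tracking software keeps the model of the real world current, then hands the display back to the rendering software and reports a view to server 7. It is written in Python purely for illustration; the function and parameter names (run_ad_interruption, AdContent, report_view, and so on) are assumptions made for this sketch and do not appear in the patent or in any particular AR SDK.

import time
from dataclasses import dataclass

@dataclass
class AdContent:
    content_id: str
    duration_seconds: float          # predetermined duration of the ad content 20

def run_ad_interruption(ad, pause_ar, show_ad, update_world_model, resume_ar, report_view):
    """Steps 102-105: interrupt the AR environment with ad content, then resume."""
    pause_ar()                       # step 103: stop displaying the AR environment 13
    show_ad(ad)                      # show ad content 20; no real-world video on display 3
    report_view(ad.content_id)       # signal to view registration software 18 on server 7
    deadline = time.monotonic() + ad.duration_seconds
    while time.monotonic() < deadline:
        update_world_model()         # camera keeps capturing; the world model stays current
        time.sleep(1 / 30)           # illustrative update rate
    resume_ar()                      # step 105: resume the AR with minimal lag, since the model never went stale

# Example with trivial stand-ins for the real software components:
run_ad_interruption(
    AdContent("fishing-game-clip", 3.0),
    pause_ar=lambda: print("AR paused"),
    show_ad=lambda ad: print("showing", ad.content_id),
    update_world_model=lambda: None,
    resume_ar=lambda: print("AR resumed"),
    report_view=lambda cid: print("view counted for", cid),
)

The callable arguments stand in for the tracking software 10, rendering software 11 and view registration software 18; in a real implementation these would be the platform's own AR and display components rather than lambdas.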

Landscapes

  • Business, Economics & Management (AREA)
  • Engineering & Computer Science (AREA)
  • Accounting & Taxation (AREA)
  • Development Economics (AREA)
  • Strategic Management (AREA)
  • Finance (AREA)
  • Game Theory and Decision Science (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Economics (AREA)
  • Marketing (AREA)
  • Physics & Mathematics (AREA)
  • General Business, Economics & Management (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Processing Or Creating Images (AREA)

Abstract

There is provided a method for advertising in a computer generated augmented reality graphical environment displayed on a touch screen, the touch screen being comprised in a portable device comprising a computer and a video camera able to capture images of the real world, said computer being able to receive image data from the camera and to provide image data to the touch display, the computer comprising tracking software, rendering software, and advertising software, the tracking software using the images captured by the camera to determine a model of the real world and the position and orientation of the portable device in the model, the rendering software being able to determine the position and orientation of a virtual object in the model of the real world and to cause the portable device to render at least one virtual object in the augmented reality graphical environment on the touch display, where the augmented reality graphical environment comprises at least a part of the image captured by the camera and rendered on the touch display, the method comprising the steps of: a) the advertising software unit providing instructions for rendering a virtual object being a virtual ad in the augmented reality graphical environment, to the rendering software, b) the rendering software rendering the virtual ad in the augmented reality graphical environment on the touch screen of the portable device, c) a user interacting with the virtual object, interacting by touching the touch screen, d) the advertising software causing the device to discontinue to display the augmented reality graphical environment, and instead displaying advertising content 20 not displaying the real world, and e) the rendering software automatically beginning to render the augmented reality when the advertising content 20 has been displayed.

Description

Advertising in augmented reality
Field of the invention
This invention relates to computer-generated augmented reality, in particular a method for advertising in augmented reality.
Background
In an augmented reality (AR) environment virtual objects are superimposed on the reality. Thus, the AR environment is a mix between the real environment and virtual objects. AR can be used in for example education, games or user interfaces, or to create works of art. There is a need for improved systems, methods and software for providing advertising in augmented reality.
Summary of invention
In a first aspect of the invention, there is provided a method for advertising in a computer generated augmented reality environment displayed on a touch screen, the touch screen being comprised in a portable device comprising a computer and a video camera, where said video camera is able to capture real time video of the real world, said computer being able to receive image data from the camera and to provide image data to the touch display, where the method is carried out in a computer system comprising tracking software, rendering software, and advertising software,
the tracking software using the real time video captured by the camera to determine a model of the real world and the position and orientation of the portable device in the model,
the rendering software being able to cause the portable device to render at least one virtual object in the augmented reality environment on the touch display, where the augmented reality environment comprises at least a part of the real time video captured by the camera, the method comprising the steps of: a) the advertising software receiving information about a user interacting with a virtual object which is a virtual ad in the augmented reality environment by touching the touch screen, where instructions for rendering the virtual object have been provided by the advertising software to the rendering software, and the rendering software has rendered the virtual ad in the augmented reality environment on the touch screen of the portable device, b) the advertising software causing the device to discontinue to display the augmented reality environment, and instead to display advertising content not displaying the real time video of the real world, where the video camera of the portable device continues to capture video of the real world without the real world being shown on the display,
c) the rendering software automatically causing the device to render the augmented reality comprising real time video of the real world after the advertising content has been displayed. The method provides for a novel way to present advertising to a user of an AR environment. The method provides automatic switching back to the AR environment after advertising content has been provided to the user. The AR environment may be rendered to the user with minimum time lag after the advertising has been shown. The advertising software may provide a signal that causes the device to start rendering the augmented reality in step c). The advertising content may be associated with a duration and step c) may then be carried out at the end of the duration.
The tracking software may continue to create the model of the real world during step b), without the real world being shown on the display. The tracking software may continue to create the model of the real world and the rendering software may continue to determine the position and orientation of at least one virtual object in relation to the position of the camera in the model of the real world, without the real world or the virtual object being shown on the display. These embodiments have the advantage that the AR environment may be rendered very quickly to the user after the advertisement has been shown.
There may be a plurality of portable devices connected to a server and the number of times the advertising content has been displayed to users may be registered, and each time step a) or b) occurs is counted as one view.
Step a) may be preceded by the advertising software receiving information about a placeholder where the virtual ad is to be rendered.
In a second aspect of the invention there is provided a system for advertising in a computer generated augmented reality environment displayed on a touch screen, the touch screen being comprised in a portable device comprising a computer and a video camera able to capture real time images of the real world, said computer being able to receive real time video data from the camera and to provide image data to the touch display, the system being configured to use the images captured by the camera to determine a model of the real world and the position and orientation of the portable device in the model, the system further being configured to determine the position and orientation of a virtual object in the model of the real world and to cause the portable device to render at least one virtual object in the augmented reality environment on the touch display, where the augmented reality environment comprises at least a part of the real time video captured by the camera, the computer further being configured to detect when a user interacts with the virtual object by touching the touch screen, the computer being configured to then discontinue to display the augmented reality environment while the video camera of the portable device continues to capture video of the real world without the real world being shown on the display, and instead displaying advertising content not displaying the real world, and further configured to begin to render the augmented reality comprising video from the camera, when the advertising content has been displayed. In a second aspect there is provided a system for advertising in a computer generated augmented reality graphical environment displayed on a touch screen, the touch screen being comprised in a portable device comprising a computer and a video camera able to capture real time images of the real world, said computer being able to receive image data from the camera and to provide image data to the touch display, the computer being configured to use the images captured by the camera to determine a model of the real world and the position and orientation of the portable device in the model, the computer further being configured to determine the position and orientation of a virtual object in the model of the real world and to cause the portable device to render at least one virtual object in the augmented reality graphical environment on the touch display, where the augmented reality graphical environment comprises at least a part of the image captured by the camera, the computer further being configured to detect when a user interacts with the virtual object by touching the touch screen where the virtual object is displayed, the computer being configured to then discontinue to display the augmented reality graphical environment, and instead displaying advertising content not displaying the real world, and further configured to begin to render the augmented reality comprising video from the camera, when the advertising content has been displayed.
In a third aspect of the invention there is provided advertising software arranged to provide, to rendering software, instructions for rendering a virtual object being a virtual ad rendered as a virtual object in an augmented reality graphical environment displayed on a display on a portable device comprising a computer and a video camera and a touch screen, the video camera able to capture real time video images of the real world, the advertising software further arranged to detect a touch on the touch screen, and upon detecting such touch, causing the device to discontinue to display the augmented reality graphical environment, and instead displaying advertising content not displaying the video of the real world captured by the camera, and arranged to automatically, when the advertising content has been displayed, cause the rendering software to display the augmented reality comprising the video from the camera. The advertising software may be arranged to receive information about a placeholder, and arranged to then provide to the rendering software, the instructions for rendering a virtual ad, causing the virtual ad to be rendered.
Brief description of drawings
The accompanying drawings form a part of the specification and schematically illustrate preferred embodiments of the invention and serve to illustrate the principles of the invention.
Fig. 1a shows a user and a device showing an augmented reality environment.
Fig. 1b shows a user and a device showing advertisement content.
Fig. 2 is a schematic drawing of a system.
Figs. 3-4 schematically show memory units.
Fig. 5 is a flow chart showing a method.
Fig. 6 schematically shows the association between datasets.
Detailed description
With reference to Fig. 1, an augmented reality environment (AR environment) 13 is characterized by virtual objects 16, 23 being superimposed on an image 9 showing the reality 22. Fig. 1a shows how a part of the reality 22 is shown as a digital image 9 on a display 3 of portable device 1. In the augmented reality environment 13, the relative positions of virtual objects 16, 23 in relation to real objects should remain constant regardless of the movements and rotation of the camera 2. The virtual objects 16, 23 are preferably rendered as 3-dimensional objects on the screen 3 such that the view of the virtual objects 16, 23 changes when the user/camera 2 moves in relation to reality 22. The reality 22 or real world 22 refers to the reality surrounding the user at the time of rendering of the augmented reality. With reference to Fig. 2 the system 100 comprises portable device 1 which comprises video camera 2, display 3, which preferably is a touch display, a computer that comprises processor 4, memory 5 and optional communication device 6. The portable device 1 may be a smartphone such as an iPhone or an Android phone, or a tablet computer, such as an iPad. The camera 2 is preferably directed in a direction opposite to the display 3, such that a user has his or her view in generally the same direction as the camera 2 when the display 3 is directed towards the user. Most smartphones and tablet computers have such camera/display arrangements. The camera 2 is capable of capturing digital video in real time. The display 3 can be any suitable type of digital display, such as an LCD or an OLED display. Device 1 is generally able to play videos and sound, show web pages and other images on display 3 and is preferably able to execute software that supports a wide range of file formats for this purpose.
With reference to Fig. 3 the memory 5 of device 1 stores tracking software 10, rendering software 11 and advertising software 12. Memory 5 also stores data 15 for rendering at least one virtual ad 16 and data 17 for rendering advertising content 20. The advertising content data 17 is used to render content 20 that is not used in the augmented reality 13 and is described further below. There may, of course, be data 15 for several different virtual ads 16 and data 17 for many different advertisements 20. Virtual ad data 15 and advertising content data 17 are typically created and stored in memory 5 in advance, well before rendering by device 1. Memory 5 may also store an operating system for the device 1.
Tracking software 10 is able to receive image data from the camera 2, or another camera, for example an IR camera or a TOF camera, and analyse the image data. Tracking software 10 is able to create a model of at least a part of the real world 22 and determine the position and orientation of the camera 2 in the model. The model of the real world 22 may be very limited. For example, the model of the real world 22 may comprise only the position of a part of the floor, or a part of a wall. Thus, it may not be necessary for the tracking software 10 to build a wireframe model of any part of the real world, although this may be an option in some embodiments. The position and orientation of the camera 2 in the model is determined by methods known in the art, and may include one or more of the following: a) recognizing and determining the position of a visible marker object (world marker) which is stationary relative to the real world (marker based tracking), b) feature-based tracking, c) matching the image with a predefined model of the real world (model based tracking), d) making a 3D image of the surrounding world using a camera, for example an IR camera, to build a 3D infrared image map (such as the TrueDepth camera of the iPhone X), or a TOF camera, e) using at least one sensor attached to the device 1, such as an attitude sensor, a position sensor or an accelerometer. In a preferred embodiment the tracking software is able to receive image data from camera 2 and use the image data for tracking. Machine learning or artificial intelligence may be used by the tracking software 10. The model is 3-dimensional such that it can detect at least one real object and determine the position and orientation of the camera 2 in relation to the real object. For example, if the user is standing on the floor in front of a wall, the model may comprise the wall and the floor.
Rendering software 11 is able to render an AR environment 13 to a user on display 3. Rendering of the AR environment 13 is carried out in real time or near real time. The updating frequency is preferably several times per second. The rendering software 11 renders the AR environment 13 on the display 3. The AR environment 13 comprises at least a part of the image 9 of the real world 22 captured by the digital camera 2. Rendering software 11 is capable of rendering at least one virtual ad 16 on the display 3. In Fig. 1a, most of display 3 shows the real world, whereas a small portion of the display is occupied by virtual objects 16, 23.
Memory 5 stores data 14, 15 describing virtual objects 16, 23 that can be rendered on the display 3 by rendering software 11. Virtual objects 16, 23 can for example be wire-frame objects with defined textures and colours and other properties (such as opacity, light diffraction, etc.) stored as digital information 14, 15. Rendering software 11 is able to use data 15 for rendering a virtual ad 16 on the display 3, using methods known in the art.
The rendering software 11 is able to use the model of the real world built by the tracking software 10 to place a virtual object 16, 23 in the AR environment 13. For example, the virtual object 16, 23 is placed in a coordinate system defined as distances XYZ from a point in the model of the real world, which may be the position of the camera. XYZ are distances in three dimensions from the point in the model of the real world. Object 16, 23 is also placed with an orientation in relation to the coordinate system. The orientation may be defined by rotation around angles in the X, Y and Z directions.
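
To make the coordinate scheme above more concrete, the following is a minimal sketch assuming a simple tuple-based representation: a virtual object is anchored at XYZ distances from a point in the model of the real world, the camera has its own pose in the same model, and the offset between the two is what a renderer would use to draw the object. The type and function names here (Pose, VirtualObject, offset_from_camera) are illustrative assumptions made for this example, not part of the patent or of any AR toolkit, and real systems would use full rotation matrices or quaternions rather than bare angle triples.

from dataclasses import dataclass

Vec3 = tuple   # three floats: distances or angles in the X, Y and Z directions

@dataclass
class Pose:
    position: Vec3        # distances X, Y, Z from a point in the model of the real world
    rotation_xyz: Vec3    # orientation as rotations around the X, Y and Z axes

@dataclass
class VirtualObject:
    object_id: str        # e.g. a virtual ad 16 or a non-ad virtual object 23
    pose: Pose            # anchored to the world model, not to the camera

def offset_from_camera(obj: VirtualObject, camera: Pose) -> Vec3:
    # Both poses share the world-model coordinate system, so this offset changes
    # as the camera 2 moves while the object stays anchored to the real world,
    # which is what keeps virtual objects visually attached to the scene.
    return tuple(o - c for o, c in zip(obj.pose.position, camera.position))

# Example: a virtual ad three metres in front of the world-model origin, camera at eye height.
ad = VirtualObject("floor-stand-ad", Pose((0.0, 0.0, 3.0), (0.0, 0.0, 0.0)))
camera = Pose((0.0, 1.6, 0.0), (0.0, 0.0, 0.0))
print(offset_from_camera(ad, camera))   # (0.0, -1.6, 3.0)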
Rendering software 11 and tracking software 10 should be seen as functional units which can be comprised in the same computer software product. Many device products 1 (such as iPhone and certain Android phones) come with preinstalled parts of tracking software 10 and rendering software 11. App developers may develop AR applications for use with these products with the use of ARKit (for Apple products) or ARCore (for Android products). The software package Unity is also useful for development of applications for these platforms.
Apart from virtual ad 16 the rendering software 11 may be able to render other virtual objects 23, for example characters in games, or other objects, for example a virtual building, a virtual car or a tree, etc., provided by third party application 19, which may be the main reason for the user experiencing the augmented reality 13 in the first place. For example, third party application 19 may be an augmented reality game or simulation that utilizes tracking software 10 and rendering software 11 to provide an augmented reality experience to a user. Thus, there may be data 14 for rendering non-ad virtual objects 23. Data 14 for rendering virtual objects 23 may be a part of third-party application 19, but this is not necessary. It is to be noted that data 15 for rendering virtual ads and data 14 for rendering non-ad virtual objects may generally be similar, have the same or similar formats and may be handled in an identical manner by rendering software 11. The third-party application 19 may be executable separately from the operating system of the device 1, and it may be possible to update the third-party application 19 separately from the operating system of the device 1.
The virtual ad 16 may populate a predefined placeholder 24. A placeholder is a site in the model of the real world, characterized at least by having a position in the model of the real world. The placeholder may also comprise an orientation in relation to the real world. The orientation is preferably defined by rotation around the X, Y and Z directions. Such a placeholder 24 may for example be associated with a non-ad virtual object 23, such that the virtual ad 16 is shown together with, next to, or as a part of the non-ad virtual object 23. The placeholder 24 may also have an identity so that it can be handled by rendering software 11 and advertising software 12. The placeholder may define a time to be elapsed before displaying the first virtual ad, for example a time elapsed for experiencing the AR environment, for example such that the virtual ad 16 should be shown after the user has experienced the AR environment for a predetermined minimum time, for example 1 minute. Placeholders may alternatively be triggered by user-triggered events, such as a user completing a game stage or the user interacting with a virtual object 23. A placeholder 24 may also be associated with a certain type of feature in the model of the real world, for example a floor or a wall detected by tracking software 10. As an example, a placeholder 24 may comprise instructions to generate a virtual ad 16 on a wall that has been identified by tracking software 10 to have a certain minimum size and be within a certain minimum range of the user. Placeholder 24 is typically a part of third-party application 19, such as a game, but may also be a part of advertising software 12. Advertising software 12 or other software, such as third-party application 19, may recognize the placeholder 24 during rendering of the AR environment 13. The third-party application 19 may ask advertising software 12 for data 15 for virtual ad 16. The advertising software 12 then provides data 15 for rendering the virtual ad 16 to rendering software 11 so that the virtual ad 16 is rendered on the display 3.
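
As an illustration of what a placeholder 24 might look like as a data structure, here is a hedged sketch covering the properties mentioned above: an identity, a position and optional orientation in the model of the real world, and conditions such as a minimum elapsed time, a triggering event, or a required surface type with a minimum size and maximum range. The field names and the simple readiness check are assumptions made for this example only.

from dataclasses import dataclass
from typing import Optional, Set, Tuple

@dataclass
class SurfaceRequirement:
    surface_type: str        # e.g. "wall" or "floor", as detected by tracking software 10
    min_size_m2: float       # minimum size of the detected surface
    max_distance_m: float    # maximum range from the user

@dataclass
class Placeholder:
    placeholder_id: str                                          # identity, so software 11 and 12 can refer to it
    position_xyz: Tuple[float, float, float]                     # position in the model of the real world
    rotation_xyz: Tuple[float, float, float] = (0.0, 0.0, 0.0)   # optional orientation
    min_elapsed_seconds: float = 0.0                             # e.g. show the first ad after 60 s in the AR environment
    trigger_event: Optional[str] = None                          # e.g. "stage_completed"
    surface: Optional[SurfaceRequirement] = None                 # tie the ad to a detected wall or floor

def placeholder_ready(p: Placeholder, elapsed_seconds: float, events: Set[str]) -> bool:
    # Has this placeholder's condition for showing a virtual ad 16 been met?
    if elapsed_seconds < p.min_elapsed_seconds:
        return False
    if p.trigger_event is not None and p.trigger_event not in events:
        return False
    return True

# Example: an ad placed on the floor three metres ahead, shown after 30 seconds of play.
ph = Placeholder("floor-ad-1", (0.0, 0.0, 3.0), min_elapsed_seconds=30.0)
print(placeholder_ready(ph, elapsed_seconds=45.0, events=set()))   # True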
Virtual advertisements and ad content
Virtual objects 16 (as opposed to virtual objects 23) are virtual advertisements (ads). Preferably the virtual ad 16 is a 3-dimensional virtual object. The virtual ad 16 may also be a 2-dimensional object. Preferably the virtual ad 16 changes in the user's field of view when the user/camera 2 comes closer or views the virtual ad 16 from a different angle, so as to follow the real world depicted in the image 9. The virtual ad 16 may for example be placed on a wall or on a floor in the AR environment 13. In Fig. 1a the virtual ad 16 has the shape of a floor stand standing on the floor. Thus, as the user experiences the augmented reality 13, for example a game or a simulation, which may be a third-party software 19, he or she may encounter virtual ad 16. The user is able to interact with the virtual advertisements 16 for example by touching the screen 3 at the site where the virtual ad 16 is rendered, causing the device 1 to display ad content 20. Generally, device 1 has software for handling the touch screen 3 and detecting touch.
The virtual ad 16 may have any suitable shape, such as prismoid, spherical, rectangular or box-shaped. The virtual ad 16 may show a trademark, a picture, a logo or a product name that is commercially associated with the ad content 20. Advertising software 12 may be able to select data 15, 17 for virtual ad 16 and ad content 20 from a plurality of such data sets. Generally, data 15 for virtual ad 16 and data 17 for ad content 20 are associated such that there is at least one dataset 17 for ad content 20 associated with data 15 for virtual ad 16. Thus, when a user interacts with virtual ad 16, certain ad content 20 will be displayed. There may be a one-to-one mapping such that each virtual ad 16 is associated with one ad content 20. Of course, there may be more than one advertising content dataset 20 associated with each virtual ad 16, such that advertising software 12 selects one data set 17 for ad content 20 from a group of data sets for ad content 20. For example, when the logotype of a car maker is shown in virtual ad 16 there may be more than one video clip ad content 20 associated with the virtual ad 16, and the ad software 12 selects one of the video clips for display to the user. Selection of virtual ad 16 and ad content 20 may be based on user data or context data, for example. Fig. 6 shows how one dataset 15 for one virtual ad 16 is associated with a plurality of datasets 17a, 17b, 17c for ad content 20a, 20b, 20c. When the user interacts with virtual ad 16, the advertising software selects one of datasets 17a, 17b and 17c, causing one of video clips 20a, 20b and 20c to be shown on the display 3.
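The association of one virtual ad dataset 15 with several ad content datasets 17a, 17b, 17c, and the selection of one of them on interaction, might be expressed as in the following sketch. The data classes, the random selection and the example identifiers and URLs are assumptions; as noted above, the selection could instead be based on user or context data.

```kotlin
// Illustrative association between virtual ad data (15) and ad content data (17a, 17b, 17c).
data class VirtualAdData(val adId: String, val meshUrl: String)         // simplified stand-in for data 15
data class AdContentData(val contentId: String, val videoUrl: String)   // simplified stand-in for data 17

class AdCatalog(private val contentByAd: Map<String, List<AdContentData>>) {
    /** Picks one ad content dataset for the ad the user interacted with; here a plain random choice. */
    fun selectContentFor(adId: String): AdContentData? = contentByAd[adId]?.randomOrNull()
}

fun main() {
    val catalog = AdCatalog(
        mapOf("carmaker_logo" to listOf(
            AdContentData("clip_a", "https://ads.example.com/a.mp4"),
            AdContentData("clip_b", "https://ads.example.com/b.mp4"),
            AdContentData("clip_c", "https://ads.example.com/c.mp4"))))
    println(catalog.selectContentFor("carmaker_logo"))   // one of clip_a, clip_b, clip_c
}
```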
Advertising content 20 may be for example a video ad, a still image ad, a looped animation or a webpage shown on the display 3. Advertising content 20 may also comprise sound. The advertising content 20 may enable the user to complete a transaction, for example to make a purchase, to register an account or an email, or to download software, for example from App Store. A video ad may for example provide the opportunity to interact and be directed to a web page for completion of the transaction.
Data 17 for rendering advertising content 20 may be associated with a duration, for example a duration in seconds, which determines how long the advertising content should be displayed to the user. The duration is preferably predetermined. When data 17 is data for playing an advertising video, the duration may be the playing time in seconds. The duration may also be the time for which a still image or a looped animation is shown.
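One way of carrying such a predetermined duration alongside the ad content data 17 is sketched below; the content types shown match the examples given above (video, still image, looped animation), but the type and field names are assumptions.

```kotlin
// Illustrative ad content model carrying an explicit, predetermined display duration.
sealed class AdContent {
    abstract val durationSeconds: Int
    data class VideoAd(val url: String, override val durationSeconds: Int) : AdContent()           // duration = playing time
    data class StillImageAd(val imageUrl: String, override val durationSeconds: Int) : AdContent() // fixed display time
    data class LoopedAnimationAd(val animationUrl: String, override val durationSeconds: Int) : AdContent()
}

/** How long the advertising content (20) should be shown before the display is handed back. */
fun displayTimeMillis(content: AdContent): Long = content.durationSeconds * 1000L
```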
Server

With reference to Figs. 2 and 4, data 14, 15 for virtual objects 16, 23 and data 17 for ad content 20 may be stored in memory 21 of server 7, with which the portable device 1 may be able to communicate via network 8, such as a wide area network, such as the internet, with the use of a communications device 6 of the device 1. In particular, virtual ad data 15 and advertising content data 17 may be stored on server 7 and provided from server 7 to device 1. Advertising software 12 may be configured to connect to server 7 and download or stream data 15 for virtual ads 16 and data 17 for advertising content 20 to device 1, for example at startup or according to any other useful schedule. In a preferred embodiment data 15 for virtual ads 16 is downloaded at startup whereas data 17 for ads 20 is streamed from server 7. Communication between communication device 6 of device 1 and server 7 may be carried out with any suitable technology, including wire-bound and wireless technologies, and may comprise one or more of the following: Wi-Fi, Bluetooth and mobile data networks such as 3G, 4G, 5G and LTE networks.
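A minimal sketch of the preferred schedule, in which data 15 for virtual ads is downloaded at startup while data 17 for ad content is streamed from server 7 on demand, is given below. The HTTP client usage is standard; the endpoint paths are hypothetical and only illustrate the idea.

```kotlin
import java.net.URI
import java.net.http.HttpClient
import java.net.http.HttpRequest
import java.net.http.HttpResponse

// Illustrative client for ad server (7); the endpoint paths are hypothetical.
class AdServerClient(
    private val baseUrl: String,
    private val http: HttpClient = HttpClient.newHttpClient()
) {
    /** Downloaded once at startup: data (15) for rendering virtual ads (16). */
    fun downloadVirtualAdData(): ByteArray {
        val request = HttpRequest.newBuilder(URI.create("$baseUrl/virtual-ads")).GET().build()
        return http.send(request, HttpResponse.BodyHandlers.ofByteArray()).body()
    }

    /** Resolved when needed: a URL from which ad content (17) can be streamed for playback. */
    fun adContentStreamUrl(contentId: String): String = "$baseUrl/ad-content/$contentId/stream"
}
```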
Method
Fig. 5 describes a method for displaying advertisements in augmented reality. In step 100 the advertising software 12 provides instructions 15 for rendering a virtual ad 16 in the AR environment 13 to the rendering software 11. For example, third party application 19 may provide a placeholder 24 for a virtual ad 16 to the advertising software 12. The placeholder 24 may be defined in the third-party software 19 by the third party software developer.
In step 101 the rendering software 11 renders the virtual ad 16 in the augmented reality environment 13 on the screen 3 of the portable device 1. The virtual ad 16 is thus visible to the user in the augmented reality 13, for example as an electronic billboard. An example is shown in Fig. 1a, where the virtual ad 16 promotes a fishing game. In the example shown in Fig. 1a, the user is also able to see virtual object 23, which is rendered as a part of the AR environment 13. The virtual ad 16 may provide a call to action to the user, for example an icon that indicates to the user that the user can interact with the virtual ad 16.
In step 102 the system 1 receives input from the user. In a preferred embodiment, in step 102 the user interacts with the virtual ad 16, for example by touching the screen 3 where the virtual object 16 is displayed, when the screen 3 is a touch screen, for example by tapping, pressing or double clicking. Thus, the user "touches" the virtual object. However, interaction can be done in various manners. For example, device 1 may have an eye tracking device that detects where the user is directing his gaze, and interaction is detected if the user looks at the virtual object 16 longer than a threshold time. A third alternative is that the user holds the device so that the camera is directed such that the virtual ad 16 fills a large part of the screen for longer than a minimum time. Also, the touch screen may provide a virtual joystick or similar on the screen, where the joystick is used to direct a cursor towards the virtual object 16. However, touching a touch display 3 is the preferred manner of interaction.
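The interaction alternatives described for step 102 (touch, gaze dwell, holding the ad so that it fills the screen, or a cursor steered by a virtual joystick) could be unified behind a single check, as in the sketch below. The threshold values (two seconds of gaze, 50% screen coverage held for 1.5 seconds) are assumptions chosen only for illustration.

```kotlin
// Illustrative unification of the interaction modes described for step 102.
sealed class UserInput {
    data class Touch(val screenX: Float, val screenY: Float) : UserInput()
    data class GazeDwell(val objectId: String, val dwellMillis: Long) : UserInput()
    data class AdFillsScreen(val objectId: String, val coverage: Float, val heldMillis: Long) : UserInput()
    data class CursorSelect(val objectId: String) : UserInput()   // virtual joystick moved a cursor onto the ad
}

fun interactsWithAd(input: UserInput, adId: String, touchHitsAd: (Float, Float) -> Boolean): Boolean =
    when (input) {
        is UserInput.Touch -> touchHitsAd(input.screenX, input.screenY)   // touch where the virtual ad (16) is rendered
        is UserInput.GazeDwell -> input.objectId == adId && input.dwellMillis >= 2000
        is UserInput.AdFillsScreen -> input.objectId == adId && input.coverage >= 0.5f && input.heldMillis >= 1500
        is UserInput.CursorSelect -> input.objectId == adId
    }
```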
The interaction causes the device 1 to cease, in step 103, to display the AR environment 13, and instead to display advertising content 20 not displaying any image 9 of the real world 22. In the initial state shown in Fig. 1a, the display 3 displays at least a part of the real world 22, indicated with 9. However, in step 103, the advertising software 12 "takes over" the display 3 to show ad content 20. In step 103 no part of the real world 22 as captured by camera 2 is shown on the display 3. The advertising content 20 is rendered without the aid of camera 2. An example is shown in Fig. 1b, where a video ad for the fishing game is shown on display 3. The advertising content 20 may fill the entire display 3. Thereby, the advertising content 20 behaves like a "layer" visually "on top" of the AR environment 13. In order to enhance this experience, the advertising content 20 may gradually visually expand out from the virtual ad 16 to fill the entire display 3.
The advertising content 20 is displayed for the predetermined duration, if such a duration is associated with the advertising content 20, for example the duration of a video. When the duration of the advertising content 20 is over, the device 1, in step 105, automatically begins to render the augmented reality 13. Thus, the display 3 now again shows at least some part of the real world 22 as image 9. Thus, the advertising software 12 "hands back" the display 3 to the rendering software 11. For example, the advertising software 12 may, in step 104, send a signal to the rendering software 11 or other software of device 1, causing the device 1 to begin showing the augmented reality 13 on the display 3. The signal may be triggered by reaching the end of the duration of the advertising content 20 or may be triggered by a predetermined duration stored separately from the advertising content 20. There may thus be a timer checking how long the advertising content 20 has been displayed. This may be useful when the advertising content 20 is a still image or a looped animation.
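Steps 103 to 105 can thus be read as a takeover/hand-back cycle driven by a timer. The sketch below is one possible arrangement, assuming simple callback functions for pausing and resuming the AR rendering; the class name and the use of a java.util.Timer are illustrative choices, not requirements.

```kotlin
import java.util.Timer
import kotlin.concurrent.schedule

// Illustrative takeover (step 103) and hand-back (steps 104, 105) of the display (3).
class AdDisplayController(
    private val pauseAugmentedReality: () -> Unit,   // stop showing the AR environment (13)
    private val resumeAugmentedReality: () -> Unit,  // show the AR environment again
    private val showAdContent: (String) -> Unit,     // fill the display with advertising content (20)
    private val hideAdContent: () -> Unit
) {
    /** Called when the user interacts with the virtual ad (16) in step 102. */
    fun onAdInteraction(contentId: String, durationMillis: Long) {
        pauseAugmentedReality()                      // step 103: the AR environment is no longer displayed
        showAdContent(contentId)
        Timer().schedule(durationMillis) {           // timer tracking how long the content has been shown
            hideAdContent()
            resumeAugmentedReality()                 // steps 104-105: signal and automatic resumption of AR
        }
    }
}
```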
The advertising content 20 may provide the user with an opportunity to carry out a transaction. If the user decides to carry out a transaction, the rendering of the augmented reality 13 may automatically begin once the user has completed the transaction, such as for example having completed a registration or download, or the user having closed the advertising content 20. Thus, the signal of step 104 may be triggered by such an event. The advertising content 20 may be associated with an exit icon that enables the user to close the advertisement content 20 or the transaction application and return to the augmented reality 13.

The camera 2 may shut down during step 103, because it is not needed to render the real world 9. The user interaction in step 102 may then cause the tracking software 10 and rendering software 11 to temporarily rest. This saves computing power. However, in one embodiment, the video camera 2 of the portable device 1 continues to capture video of the real world during step 103, while the captured video is not rendered on display 3. Hence, video data captured may be stored in memory 5 and/or processed by processor 4 but not shown on display 3 during step 103. In one embodiment the video thus captured during step 103 is used by the tracking software 10 to continue to create the model of the real world, without the real world being shown on the display 3. Thus, the tracking software 10 continues to carry out the necessary computations to create a model of the real world, but the real world is not rendered on the display 3. Alternatively, tracking software 10 uses information from a second camera (for example a TOF (time-of-flight) camera), in which case camera 2 can optionally be shut down during step 103. This has the advantage that the device can resume rendering the augmented reality 13 at step 105, for example when the signal is received in step 104. For example, if the camera 2 has moved to a different location during display of the ad content 20, an updated augmented reality must be displayed to the user when the ad content 20 has finished playing, with minimum time lag. Thus, the transition from step 103 to step 105 should be carried out with minimum time lag, made possible by the model being up-to-date at all times.
Furthermore, in one embodiment the rendering software 11 continues during step 103 to determine the position and orientation of at least one virtual object 16, 23 in the model of the real world, without the real world 22 or the virtual object 16, 23 being shown on the display 3. Thus, the rendering software 11 uses the model created by tracking software 10 during step 103. Thus, in these embodiments one or more of camera 2, tracking software 10 or rendering software 11 continues to work "in the background". It is to be noted that it may be necessary for camera 2 to provide video data to tracking software 10 for tracking software 10 to work, and it may be necessary for tracking software 10 to create the model of the real world for rendering software 11 to work. In a preferred embodiment all of the camera 2, tracking software 10, and even the rendering software 11 are working during step 103. The rendering software 11 can then provide an up-to-date image 9 that can simply be displayed once the ad 20 is no longer shown. The signal of step 104 from advertising software 12 may be received by any of camera 2, tracking software 10 and rendering software 11 that has been idle during step 103 and needs to be started again to be able to render augmented reality in step 105. The signal from advertising software 12 may also be provided to general display control software of device 1, for example a part of the operating system of device 1, in particular if one or more of camera 2, tracking software 10 or rendering software 11 is active during step 103.
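Keeping camera 2, tracking software 10 and rendering software 11 working "in the background" during step 103 might look like the following loop, in which frames continue to be processed but nothing is drawn until the ad content is dismissed. The class and function names are assumptions.

```kotlin
// Illustrative background processing during step 103: the model stays up to date, nothing is drawn.
class BackgroundTracking(
    private val captureFrame: () -> ByteArray?,          // camera (2); may return null if shut down
    private val updateWorldModel: (ByteArray) -> Unit,   // tracking software (10)
    private val updateVirtualObjectPoses: () -> Unit     // rendering software (11), pose updates only
) {
    @Volatile var adBeingShown = false   // true while advertising content (20) occupies the display

    fun processNextFrame(drawToDisplay: (ByteArray) -> Unit) {
        val frame = captureFrame() ?: return
        updateWorldModel(frame)                 // model of the real world is kept current
        updateVirtualObjectPoses()              // positions/orientations of virtual objects (16, 23) kept current
        if (!adBeingShown) drawToDisplay(frame) // during step 103 nothing is drawn, enabling a lag-free resume in step 105
    }
}
```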
Advertising software 12 may generally be able to carry out the following: detect when a virtual ad 16 is to be rendered, for example by receiving information about a placeholder from third party application 19; provide instructions (for example data 15) in step 100 to rendering software 11 to render a virtual ad 16; receive user input, for example from the touch screen in step 102 (for example via the operating system of device 1); and cause the device 1 to stop, in step 103, rendering the augmented reality 13. Advertising software 12 then handles the showing of ad content 20. Advertising software 12 may keep track of the duration of the ad content 20, for example by using a timer. Advertising software may then "hand back" the display 3 to the rendering software 11. This may be done by sending a signal to the rendering software 11 or to other software of device 1, causing the device 1 to resume rendering the augmented reality 13 and hence to create and update the image on the display 3. For example, device 1 may have dedicated display control software that handles display of images on display 3. The dedicated display control software may be a part of the operating system of the device 1. The signal from advertising software 12 that causes the augmented reality to be shown on display 3 may be received by such display control software.

Part, or all, of the advertising software 12 may be delivered or installed on the device 1 together with third party software 19. The advertising software 12 may be a plug-in in relation to third party software 19. For example, launch or start-up of advertising software 12 may be caused by a user starting or launching the third-party application 19. It may be possible to update advertising software 12 independently of the third-party software 19.

The number of ad views may be tracked by the system. For example, the server 7 may comprise view registration software 18. The advertising software 12 may send a signal to an ad server 7 each time advertising content 20 is displayed to a user. The advertising software 12 may also send a signal to an ad server 7 each time a virtual ad 16 is displayed to a user. The signal may be triggered by for example the interaction in step 102, or completion of display of advertisement content 20 at step 104 or 105. The signal may also be triggered by the advertisement being displayed for a minimum duration. The view registration software 18 of server 7 may be able to register views from a plurality of portable devices 1, 1'. Thus, a plurality of portable devices 1 may be able to communicate with server 7. Server 7 may register each time advertising content 20 has been displayed to a user, counting such display as one view. The view count may be used for keeping track of how many times an ad 20 has been displayed.
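View counting of this kind could be implemented as a small report sent by advertising software 12 to the view registration software 18 on server 7 each time advertising content 20 has been shown. In the sketch below the endpoint path "/views" and the JSON payload are assumptions made for illustration.

```kotlin
import java.net.URI
import java.net.http.HttpClient
import java.net.http.HttpRequest
import java.net.http.HttpResponse

// Illustrative view registration call; the endpoint and payload are hypothetical.
class ViewReporter(
    private val serverBaseUrl: String,
    private val http: HttpClient = HttpClient.newHttpClient()
) {
    /** Sent each time advertising content (20) has been displayed; registered as one view on server (7). */
    fun registerView(adContentId: String, deviceId: String) {
        val body = """{"adContentId":"$adContentId","deviceId":"$deviceId"}"""
        val request = HttpRequest.newBuilder(URI.create("$serverBaseUrl/views"))
            .header("Content-Type", "application/json")
            .POST(HttpRequest.BodyPublishers.ofString(body))
            .build()
        http.send(request, HttpResponse.BodyHandlers.discarding())
    }
}
```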
A separate software unit may provide a software editor for creating placeholders for virtual ads 16 in a third-party software 19. The editor may enable a user to specify the conditions for showing virtual ads 16 in third party software 19, for example a game. As an example, the user/game designer may be able to specify that a virtual ad 16 should be positioned on the floor, three (real-world) meters in front of the user/player after the game has been played for 30 seconds. The editor may provide a limit on the number of placeholders, in order to avoid fraud. Few advertisers are interested in having too many virtual ads in the same scene.
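An editor of this kind could, for example, expose an API along the lines of the sketch below, which records placeholder definitions and refuses to create more than a fixed number per scene. The limit of three placeholders and the condition fields are assumptions used only to illustrate the 30-second, three-metre example above.

```kotlin
// Illustrative placeholder editor with a per-scene limit on the number of placeholders.
class PlaceholderEditor(private val maxPlaceholdersPerScene: Int = 3) {   // the limit value is an assumption
    data class EditorPlaceholder(
        val sceneId: String,
        val surface: String,          // e.g. "floor" or "wall"
        val distanceMeters: Float,    // e.g. three real-world metres in front of the player
        val showAfterSeconds: Int     // e.g. after the game has been played for 30 seconds
    )

    private val placeholders = mutableListOf<EditorPlaceholder>()

    fun addPlaceholder(p: EditorPlaceholder): Boolean {
        val inScene = placeholders.count { it.sceneId == p.sceneId }
        if (inScene >= maxPlaceholdersPerScene) return false   // refuse: too many virtual ads in one scene
        placeholders += p
        return true
    }
}
```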
System 1 may be implemented in any suitable computing environment. Above it is described how various software components, such as advertising software 12, are stored in memory 5 of device 1. In other embodiments one or more of the software components of Fig. 3 are stored and executed in a distributed computing environment, for example on a server 7. Individual software components may also be stored and executed at several hardware locations, such as using a distributed web services architecture, or using virtualization.

While the invention has been described with reference to specific exemplary embodiments, the description is in general only intended to illustrate the inventive concept and should not be taken as limiting the scope of the invention. The invention is generally defined by the claims.

Claims

1. A method for advertising in a computer generated augmented reality environment displayed on a touch screen, the touch screen being comprised in a portable device comprising a computer and a video camera, where said video camera is able to capture real time video of the real world, said computer being able to receive image data from the camera and to provide image data to the touch display, where the method is carried out in a computer system comprising tracking software, rendering software, and advertising software,
the tracking software using the real time video captured by the camera to determine a model of the real world and the position and orientation of the portable device in the model,
the rendering software being able to cause the portable device to render at least one virtual object in the augmented reality environment on the touch display, where the augmented reality environment comprises at least a part of the real time video captured by the camera, the method comprising the steps of: a) the advertising software receiving information about a user interacting with a virtual object which is a virtual ad in the augmented reality environment by touching the touch screen, where instructions for rendering the virtual object have been provided by the advertising software to the rendering software, and the rendering software has rendered the virtual ad in the augmented reality environment on the touch screen of the portable device, b) the advertising software causing the device to discontinue to display the augmented reality environment, and instead to display advertising content not displaying the real time video of the real world, where the video camera of the portable device continues to capture video of the real world without the real world being shown on the display, c) the rendering software automatically causing the device to render the augmented reality comprising real time video of the real world after the advertising content has been displayed.
2. The method of claim 1 where the advertising software provides a signal that causes the device to start rendering the augmented reality in step c).
3. The method of claim 1 or 2 where the advertising content is associated with a duration and step c) is carried out at the end of the duration.
4. The method of any one of claims 1 to 3 where, during step b), the tracking software continues to create or update the model of the real world, without the real world being shown on the display.
5. The method of claim 4 where, during step b), the tracking software continues to create or update the model of the real world and the rendering software determines the position and orientation of at least one virtual object in relation to the position of the camera in the model of the real world, without the real world or the virtual object being shown on the display.
6. The method of any one of claims 1 to 5 where there is a plurality of portable devices connected to a server and the number of times the advertising content has been displayed to users is registered, and wherein each time step a) or b) occurs is counted as one view.
7. The method of any one of claims 1 to 6 where step a) is preceded by the advertising software receiving information about a placeholder where the virtual ad is to be rendered.
8. A system for advertising in a computer generated augmented reality environment displayed on a touch screen, the touch screen being comprised in a portable device comprising a computer and a video camera able to capture real time images of the real world, said computer being able to receive real time video data from the camera and to provide image data to the touch display, the system being configured to use the images captured by the camera to determine a model of the real world and the position and orientation of the portable device in the model, the system further being configured to determine the position and orientation of a virtual object in the model of the real world and to cause the portable device to render at least one virtual object in the augmented reality environment on the touch display, where the augmented reality environment comprises at least a part of the real time video captured by the camera, the computer further being configured to detect when a user interacts with the virtual object by touching the touch screen, the computer being configured to then discontinue to display the augmented reality environment while the video camera of the portable device continues to capture video of the real world without the real world being shown on the display, and instead displaying advertising content not displaying the real world, and further configured to begin to render the augmented reality comprising video from the camera, when the advertising content has been displayed.
PCT/EP2019/055782 2018-03-09 2019-03-07 Advertising in augmented reality WO2019170835A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
SE1850260 2018-03-09
SE1850260-9 2018-03-09

Publications (1)

Publication Number Publication Date
WO2019170835A1 true WO2019170835A1 (en) 2019-09-12

Family

ID=65766997

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2019/055782 WO2019170835A1 (en) 2018-03-09 2019-03-07 Advertising in augmented reality

Country Status (1)

Country Link
WO (1) WO2019170835A1 (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150235432A1 (en) * 2009-04-01 2015-08-20 Microsoft Technology Licensing, Llc Augmented reality computing with inertial sensors
US20130124326A1 (en) * 2011-11-15 2013-05-16 Yahoo! Inc. Providing advertisements in an augmented reality environment
US20140025481A1 (en) * 2012-07-20 2014-01-23 Lg Cns Co., Ltd. Benefit promotion advertising in an augmented reality environment
US20170039771A1 (en) * 2015-08-05 2017-02-09 Globalive Xmg Jv Inc. System and method of markerless injection of ads in ar

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111368102A (en) * 2020-03-05 2020-07-03 极度智慧展览(上海)有限公司 Man-machine interactive exhibition area vehicle simulation experience device
CN111368102B (en) * 2020-03-05 2022-10-25 上海极度智慧展览股份有限公司 Man-machine interactive exhibition area vehicle simulation experience device
CN111899349A (en) * 2020-07-31 2020-11-06 北京市商汤科技开发有限公司 Model presentation method and device, electronic equipment and computer storage medium
CN111899349B (en) * 2020-07-31 2023-08-04 北京市商汤科技开发有限公司 Model presentation method and device, electronic equipment and computer storage medium

Similar Documents

Publication Publication Date Title
US10055894B2 (en) Markerless superimposition of content in augmented reality systems
US20210082194A1 (en) System and method to integrate content in real time into a dynamic real-time 3-dimensional scene
KR101574099B1 (en) Augmented reality representations across multiple devices
KR101894021B1 (en) Method and device for providing content and recordimg medium thereof
US20140267599A1 (en) User interaction with a holographic poster via a secondary mobile device
CN104461318B (en) Reading method based on augmented reality and system
US20190132521A1 (en) Method of displaying wide-angle image, image display system, and information processing apparatus
US20140267598A1 (en) Apparatus and method for holographic poster display
US20110184805A1 (en) System and method for precision placement of in-game dynamic advertising in computer games
CN113905251A (en) Virtual object control method and device, electronic equipment and readable storage medium
WO2013120851A1 (en) Method for sharing emotions through the creation of three-dimensional avatars and their interaction through a cloud-based platform
CN112330819B (en) Interaction method and device based on virtual article and storage medium
CN106464773B (en) Augmented reality device and method
CN112927349B (en) Three-dimensional virtual special effect generation method and device, computer equipment and storage medium
US20200160602A1 (en) Virtual content display opportunity in mixed reality
CN105183477A (en) System and method for acquiring virtual item information of application program
JP2021521547A (en) Advertising interaction methods and devices, electronic devices and storage media
WO2019170835A1 (en) Advertising in augmented reality
CN114358822A (en) Advertisement display method, device, medium and equipment
CN111693063A (en) Navigation interaction display method and device, electronic equipment and storage medium
WO2014189840A1 (en) Apparatus and method for holographic poster display
CN112684893A (en) Information display method and device, electronic equipment and storage medium
CN116688526A (en) Virtual character interaction method and device, terminal equipment and storage medium
JP2018185738A (en) Information processor and advertisement control program
US20170083952A1 (en) System and method of markerless injection of 3d ads in ar and user interaction

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19711040

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 21/12/2020)

122 Ep: pct application non-entry in european phase

Ref document number: 19711040

Country of ref document: EP

Kind code of ref document: A1