WO2014148936A1 - Method for rendering advertisements on touchscreen devices - Google Patents
Method for rendering advertisements on touchscreen devices
- Publication number
- WO2014148936A1 (PCT/RU2013/000217)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- advertisement
- visualizer
- user
- layer
- rendering
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q30/00—Commerce
- G06Q30/02—Marketing; Price estimation or determination; Fundraising
- G06Q30/0241—Advertisements
- G06Q30/0251—Targeted advertisements
- G06Q30/0267—Wireless devices
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q30/00—Commerce
- G06Q30/02—Marketing; Price estimation or determination; Fundraising
- G06Q30/0241—Advertisements
- G06Q30/0277—Online advertisement
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q30/00—Commerce
- G06Q30/02—Marketing; Price estimation or determination; Fundraising
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2200/00—Indexing scheme for image data processing or generation, in general
- G06T2200/24—Indexing scheme for image data processing or generation, in general involving graphical user interfaces [GUIs]
Landscapes
- Engineering & Computer Science (AREA)
- Business, Economics & Management (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- Accounting & Taxation (AREA)
- Development Economics (AREA)
- Finance (AREA)
- General Engineering & Computer Science (AREA)
- Strategic Management (AREA)
- General Physics & Mathematics (AREA)
- Entrepreneurship & Innovation (AREA)
- General Business, Economics & Management (AREA)
- Marketing (AREA)
- Economics (AREA)
- Game Theory and Decision Science (AREA)
- Human Computer Interaction (AREA)
- User Interface Of Digital Computer (AREA)
- Computer Networks & Wireless Communication (AREA)
- Digital Computer Display Output (AREA)
Abstract
A method for rendering advertisements on touchscreen devices having small screens is provided. Components of the existing GUI (i.e., controls) are used for rendering advertisement (or other) data. Existing mobile device GUI elements have visible surfaces filled with a background color (or pattern). These surfaces are "painted" with advertisement data, similar to a wall painted with graffiti. The advertisement data can be static or it can be generated on-the-fly. The GUI controls can be used independently of each other. Thus, the advertising data can be selectively displayed only on some of the controls without distracting the user from the main tasks.
Description
METHOD FOR RENDERING ADVERTISEMENTS ON TOUCHSCREEN DEVICES
The present invention is related to touchscreen devices, and more particularly, to rendering advertisements on touchscreen devices using elements of existing/preloaded graphic user interfaces (GUIs).
The existing market for advertising on touchscreen devices (i.e., mobile devices, smartphones, desktops, netbooks, laptops, TV sets, game consoles, etc.) has largely been formed by applying the advertising technologies and principles used on PCs and notebooks to mobile platforms. However, the effectiveness of advertising on mobile devices is significantly lower than on PCs.
Mobile device users typically run special thin applications instead of heavy-footprint conventional browsers that take up a lot of system resources. Thus, mobile users do not use conventional searches nearly as often as PC users. Accordingly, the most popular means of monetization on mobile devices are advertisement banners and applications integrated into web resources.
The banners are small ads that take up portions of a mobile device screen, which is typically quite small. Because the banners themselves are very small, they contain only very basic advertisement data (e.g., a short text). Such basic ads mostly irritate users and do not attract attention or spark interest.
The most commonly used method of mobile advertising is bundling ads with the free applications that carry them, i.e., ads that are essentially forced upon users who download a free application or a game. These ads do generate some small profits due to the growth of the mobile device market, but in general they are also ineffective.
The conventional mobile device advertising banners take up a large portion of an already small screen and irritate a user. Furthermore, the ad banners often appear within an application or a game. The user is focused on the application or the game and pays very little attention to the ads. So these advertisements often distract the user and, in essence, act as anti-advertisements.
All of the conventional mobile advertisement systems, such as Google AdMob, employ the above-described forced ads. In other words, the advertisement systems are "Blind Networks" that render ads without much concern for their audience. In some implementations, these networks collect and analyze user personal and/or geolocation data and act as "Targeted Networks", which has also proven to be ineffective on mobile platforms.
Accordingly, a method for rendering ads on mobile devices, which takes into consideration the small screen size, is desired.
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and, together with the description, serve to explain the principles of the invention.
In the drawings:
FIG. 1 illustrates an example of a "graffito banner," in accordance with the exemplary embodiment;
FIG. 2 illustrates a "graffito banner" with an active header, in accordance with the exemplary embodiment;
FIG. 3 illustrates system architecture, in accordance with the exemplary embodiment;
FIG. 4 illustrates a method implemented by the ad visualizer, in accordance with the exemplary embodiment;
FIG. 5 illustrates a flow chart of a method for rendering ads, in accordance with the exemplary embodiment;
FIG. 6 illustrates examples of ads displayed on a mobile device;
FIG. 7 is a block diagram of an exemplary mobile device that can be used in the invention;
FIG. 8 is a block diagram of an exemplary implementation of the mobile device;
FIGs. 9-10 illustrate operation of the Ad Activation Engine.
FIG. 11 represents an alternative method for ad-layer positioning.
Reference will now be made in detail to the preferred embodiments of the present invention, examples of which are illustrated in the accompanying drawings.
According to the exemplary embodiment, a method and system for displaying advertisements (or, more generally, promoted items, or promoted information, or information generated in response to user goals) over existing mobile device GUI elements are provided. The ads used on the small screen do not require any additional "shelf space," because they are displayed within the existing GUI controls.
According to an exemplary embodiment, the existing GUI components (i.e., controls) are used for rendering advertisement (or other) data. All mobile device GUI elements have visible surfaces filled with a background color (or pattern). These surfaces are "painted" with advertisement data, similar to a wall painted with graffiti. Accordingly, herein such ad banners are referred to as "graffito" banners. The "graffito" ads can be static images or they can be video clips or animated pictures.
The GUI controls can be used independent of each other. Thus, advertising data can be displayed selectively only on some of the controls without distracting the user from the main tasks. In one embodiment, a separate active banner header is implemented for providing a user with a feedback capability. In other words, the user can react to particular ads using a dedicated control(s). The controls can initiate different scenarios, which include but are not limited to: showing a full-screen banner with a detailed description of the product(s) and/or service(s), sending a coupon/promotional code to the user, opening a page in the browser, activating an application, activating a chat with the merchant/service representative, sending a message (email, SMS, Facebook message or similar, etc.) to the user.
An example of the "graffito banners" is shown in FIG. 1. A "graffito banner" is shown on the left. According to the exemplary embodiment, some ad banners can have active control elements. A user can click on the active control in order to show his reaction to the ad, or to receive more information. FIG. 2 illustrates a graffito banner with a header. Note that the header is not the ad itself, but is a tool (channel) for feedback collection. It does not need to deliver any information, except the "press here to react to ad" functionality. Note that the header is optional. The header is integrated into the banner and allows a user to click on it to express his reaction to the ad banner.
FIG. 3 illustrates system architecture, in accordance with the exemplary embodiment. An ad visualizer 330 communicates with a mobile device application 360. The ad visualizer 330 receives ads from an ad server 310 via an ad library 320 and renders the ads in the application form 340. According to the exemplary embodiment, the ad library 320 serves as an interface between the ad server 310 and a "graffito" technology support system 300.
The ad library 320 provides metadata acquired from the mobile device screen to the ad server 310 and selects appropriate ads based on the metadata. The metadata can include, for example, geopositioning/geolocation data from the user's computer or mobile device, gender data, specific application details (e.g., application type, application name, vendor), device type, OS, service provider, etc. Then, the selected ads are processed through the visualizer 330 and "graffitied" (i.e., displayed) within the selected GUI controls 365. Note that the ad library 320 also supports conventional ad banners displayed on the mobile device 360.
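For illustration only, the following Kotlin sketch shows one way the metadata-driven ad request described above could look on an Android client. The AdMetadata class, the query-parameter names, and the "/select" endpoint are assumptions introduced here; they are not defined by the present description and do not correspond to any existing SDK.

```kotlin
// Illustrative sketch only: collect metadata and ask an ad server for a creative.
// AdMetadata, the parameter names and the "/select" endpoint are hypothetical.
import java.net.HttpURLConnection
import java.net.URL
import java.net.URLEncoder

data class AdMetadata(
    val appName: String,
    val deviceType: String,
    val os: String,
    val latitude: Double? = null,
    val longitude: Double? = null
)

// Builds a query string from whatever metadata is available and requests an ad;
// returns the raw response body (e.g., JSON describing the creative) or null.
fun fetchAd(serverBase: String, meta: AdMetadata): String? {
    val fields = mutableListOf("app" to meta.appName, "device" to meta.deviceType, "os" to meta.os)
    meta.latitude?.let { fields.add("lat" to it.toString()) }
    meta.longitude?.let { fields.add("lon" to it.toString()) }
    val query = fields.joinToString("&") { (key, value) ->
        "${URLEncoder.encode(key, "UTF-8")}=${URLEncoder.encode(value, "UTF-8")}"
    }

    val connection = URL("$serverBase/select?$query").openConnection() as HttpURLConnection
    return try {
        if (connection.responseCode == 200)
            connection.inputStream.bufferedReader().readText()
        else null
    } finally {
        connection.disconnect()
    }
}
```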
FIG. 4 illustrates a method implemented by the ad visualizer, in accordance with the exemplary embodiment. The ad visualizer 330 controls the transparency of the GUI elements (buttons) 420 and provides visibility of the ads that are loaded into an ad layer 430. For example, a panel above the ad layer that has a solid "red color" background becomes transparent, the background of a button (but not its caption) becomes transparent, the background of a checkbox can also become transparent, etc. The ad layer 430 underlies all GUI elements 420. The ad visualizer 330 loads the ads into the ad layer 430. Then, the ad visualizer 330 makes the form background (fill) 410 transparent so the ads located on the ad layer 430 become visible.
Note that the ad layer 430 is normally transparent. The layer 430 changes its transparency when ad rendering is activated. At this point, the background of the selected GUI elements becomes transparent. The ad visualizer 330 supports static and dynamic ads. According to the exemplary embodiment, the ad banner header 435 is optional. The header 435 is not controlled by the ad visualizer 330. Instead, it is integrated into the ads. The ad banner header 435 allows users to react to the ad (or give feedback). According to the exemplary embodiment, an "inverse color" effect is applied to non-background areas (e.g., button labels) in order to make them visible against the background of the ad banners.
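A minimal Android/Kotlin sketch of this mechanism is given below, assuming a FrameLayout whose bottom child is an ImageView acting as the ad layer 430 and whose upper child is the normal form layout. The GraffitoVisualizer name and the exact wiring are assumptions for illustration, not the claimed implementation.

```kotlin
// Illustrative sketch: an ad layer drawn beneath the form, revealed by making
// the form fill and selected controls transparent, then restored afterwards.
import android.graphics.Color
import android.graphics.drawable.Drawable
import android.view.View
import android.widget.ImageView

class GraffitoVisualizer(
    private val adLayer: ImageView,          // bottom child of the FrameLayout (layer 430)
    private val form: View,                  // form background/fill (410)
    private val targetControls: List<View>   // GUI elements (420) selected for ads
) {
    private val savedBackgrounds = HashMap<View, Drawable?>()

    fun showAd(creative: Drawable) {
        adLayer.setImageDrawable(creative)
        adLayer.visibility = View.VISIBLE
        // Make the form fill and the selected controls see-through so the ad
        // layer underneath becomes visible; captions and labels stay opaque.
        savedBackgrounds[form] = form.background
        form.setBackgroundColor(Color.TRANSPARENT)
        for (control in targetControls) {
            savedBackgrounds[control] = control.background
            control.setBackgroundColor(Color.TRANSPARENT)
        }
    }

    fun hideAd() {
        // Restore the original backgrounds and hide the ad layer again.
        for ((view, background) in savedBackgrounds) view.background = background
        savedBackgrounds.clear()
        adLayer.visibility = View.INVISIBLE
    }
}
```

Because only backgrounds are altered, the controls retain their normal click behavior, consistent with the statement elsewhere in this description that the functionality of the controls is not affected.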
The ad visualizer 330 controls the transparency of the GUI elements based on their type (button, panel, checkbox, editbox, combobox, etc.). Users can notify the system of their interest in a GUI element through various input channels, for example: an infrared camera, an infrared sensor or a video camera, where a Kinect-type sensor (integrated or external) replaces the physical touch screen and recognizes the user's choice; a microphone, e.g., a Siri-like model (a background voice-recognition module, integrated or external), where the user indicates a GUI element by a voice command; a kinesthetic detector, where the user uses a joystick, a mouse or wearable sensors (special gloves); or an eye-movement detector, where a device such as Google Glass or a special camera recognizes, from the position of the eyes, the GUI element the user is looking at and the command (e.g., a double blink) to activate it. The automatic transparency change option must be turned on by the GUI developers. The brightness/transparency level of the ad layer, as well as the transparency of the controls, can be varied over time (they are managed by the ad visualizer and can be set to various levels, such as 20%, 50%, 80%, etc., with or without permission to click through to the active elements, such as buttons or icons, on the layer below), which allows additional visual effects for ad presentation.
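As one possible illustration of varying the ad layer's transparency over time, the following fragment fades the layer in to a chosen level and back out using the standard Android view animation API; the function name and timing values are arbitrary examples, not part of the described system.

```kotlin
// Illustrative sketch: pulse the ad layer's alpha, e.g., up to 80% and back.
import android.view.View

fun pulseAdLayer(adLayer: View, peakAlpha: Float = 0.8f, durationMs: Long = 600L) {
    adLayer.alpha = 0f
    adLayer.animate()
        .alpha(peakAlpha)                 // e.g., 80% visible over the controls
        .setDuration(durationMs)
        .withEndAction {
            adLayer.animate().alpha(0f).setDuration(durationMs).start()
        }
        .start()
}
```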
FIG. 5 illustrates a flow chart of a method for rendering ads, in accordance with the exemplary embodiment. An ad is activated in step 510 and sent from the ad server to the ad library in step 520. Then, the ad is displayed by the ad visualizer inside the ad layer in step 530, and the ad layer is made visible. In step 540, the ad visualizer changes the transparency of the GUI controls in order to make the ads visible. Subsequently, the ads are rendered to a user in step 550. Once the ad is deactivated in step 560, the ad visualizer makes the ad layer transparent in step 570. The activation and deactivation of the ad is the responsibility of the visualizer; it can act based on a time delay, an external request from the ad platform, an application request, and so on. Subsequently, the background of the GUI controls is restored.
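The steps of FIG. 5 could be tied together roughly as follows, reusing the hypothetical GraffitoVisualizer sketched earlier and a simple time delay for deactivation; this is an illustrative sketch under those assumptions, not the claimed method.

```kotlin
// Illustrative sketch: draw the delivered ad (steps 530-550), then deactivate
// and restore the controls after a delay (steps 560-570).
import android.graphics.drawable.Drawable
import android.os.Handler
import android.os.Looper

fun runAdCycle(visualizer: GraffitoVisualizer, creative: Drawable, displayMs: Long = 10_000L) {
    visualizer.showAd(creative)                                   // steps 530-550
    Handler(Looper.getMainLooper()).postDelayed(
        { visualizer.hideAd() },                                  // steps 560-570
        displayMs
    )
}
```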
Those skilled in the art will appreciate that the proposed method better attracts mobile device users' attention to the ads. The method allows the use of relatively large ad banners on small screens. The ad banners do not take up any screen space when not being displayed. According to the exemplary embodiment, the existing GUI elements (e.g., panels, checkboxes, buttons, etc.) are used. Each of the GUI elements has some areas filled with a background color. The background color is advantageously replaced by the ad banner content.
Thus, the user sees the ad banner through a regular GUI control, while the functionality of the control is not affected. For example, the "Open" button will continue performing an action "Open", check- and combo-buttons will perform the usual functions, etc.
FIG. 6 illustrates examples of ads displayed on a mobile device. The left example illustrates an ad displayed without a banner. The middle example illustrates a traditional banner that takes up a portion of the screen. The example on the right illustrates a "graffito" banner that is displayed inside an active GUI control. This banner does not take up screen space, because it becomes invisible after deactivation.
FIG. 11 represents an alternative method for ad-layer positioning: above all the elements on the screen. In this case, the ad visualizer sets up transparency for the ad layer (rather than for the form controls). The ad layer passes all the events (touches, swipes, gestures, etc.) to the elements behind it; here, the ad layer acts as a passive element. The "on top" position of the ad layer allows the information to be drawn more flexibly: there are no potential restrictions due to the controls behind it, and each part of the form can be used for the ads. The ad visualizer can also dynamically recognize "free areas" in the form and/or on the screen, and shift an ad to them in real time while the user performs his usual activities. The ad visualizer analyzes the surface behind the ad layer, recognizes areas with a background or monotone color and of sufficient size that are suitable for ads, and commands the ad layer to show the ads in those spaces. Different transparency levels, as well as the brightness of the ad layer, allow delivery of different visual effects.
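For illustration, a top-positioned ad layer and a crude free-area test could be sketched as below: the overlay never consumes touch events, so gestures fall through to the controls behind it. The class name, the sampling step and the color tolerance are assumptions, not the description's implementation.

```kotlin
// Illustrative sketch of the "ad layer on top" variant of FIG. 11.
import android.content.Context
import android.graphics.Bitmap
import android.graphics.Color
import android.graphics.Rect
import android.view.MotionEvent
import android.widget.ImageView
import kotlin.math.abs

class PassThroughAdLayer(context: Context) : ImageView(context) {
    // Returning false means the touch is not consumed, so the framework
    // forwards it to the sibling views underneath this overlay.
    override fun onTouchEvent(event: MotionEvent): Boolean = false
}

// Rough "free area" test: sparsely sample pixels inside the rectangle and
// check that they stay close to the first sampled color.
fun isNearlyMonotone(formBitmap: Bitmap, area: Rect, tolerance: Int = 16): Boolean {
    val reference = formBitmap.getPixel(area.left, area.top)
    var x = area.left
    while (x < area.right) {
        var y = area.top
        while (y < area.bottom) {
            val pixel = formBitmap.getPixel(x, y)
            if (abs(Color.red(pixel) - Color.red(reference)) > tolerance ||
                abs(Color.green(pixel) - Color.green(reference)) > tolerance ||
                abs(Color.blue(pixel) - Color.blue(reference)) > tolerance
            ) return false
            y += 8   // sparse sampling keeps the scan cheap
        }
        x += 8
    }
    return true
}
```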
The Ad Visualizer (see FIG. 9) is part of an Ad Activation Engine, which is distributed as a precompiled library available to application developers. The other components of the Ad Activation Engine are the Event Catcher and the Customer Feedback Processing component.
The Event Catcher is registered in the OS event-processing stack as a receiver for several kinds of inbound events and recognizes the events that deliver ad data. The Ad Visualizer is responsible for ad presentation in the applications. The Customer Feedback Processing component is responsible for processing "click on ad header" events, such as opening a link, sending the event to an Ad server, sending data (e.g., coordinates or a calendar event) to a standard application, or activating a dialog window.
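A hedged sketch of the feedback dispatch is shown below: when the user taps the ad header, the click is routed to one of the reaction scenarios mentioned above. The FeedbackAction type and reportToAdServer() are invented for illustration, and starting an activity from a non-activity Context may additionally require FLAG_ACTIVITY_NEW_TASK.

```kotlin
// Illustrative sketch of Customer Feedback Processing for "click on ad header".
import android.content.Context
import android.content.Intent
import android.net.Uri

sealed class FeedbackAction {
    data class OpenLink(val url: String) : FeedbackAction()
    data class SendEvent(val adId: String, val event: String) : FeedbackAction()
}

fun onAdHeaderClicked(context: Context, action: FeedbackAction) {
    when (action) {
        is FeedbackAction.OpenLink ->
            // Scenario: open the advertiser's page in the browser.
            context.startActivity(Intent(Intent.ACTION_VIEW, Uri.parse(action.url)))
        is FeedbackAction.SendEvent ->
            // Scenario: report the user's reaction back to the Ad server
            // (the network call itself is elided here).
            reportToAdServer(action.adId, action.event)
    }
}

// Placeholder for the ad-server callback; a real implementation would POST
// the event asynchronously.
fun reportToAdServer(adId: String, event: String) {
    println("reporting $event for ad $adId")
}
```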
The operation of the Ad Activation Engine is shown in FIG. 10. After starting (step 1002), the Event Catcher receives the advertisement from a network source, such as an Ad server (step 1004). In
step 1006, the Ad Visualizer renders the ad to the user. In step 1008, the Feedback processing component generates a response based on input from the user, and sends it to the Ad Server. In step 1010, the process ends, or returns to step 1002.
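The loop of FIG. 10 can be summarized, again only as an illustrative sketch, with three callbacks standing in for the Event Catcher, the Ad Visualizer and the Customer Feedback Processing component:

```kotlin
// Illustrative sketch of the FIG. 10 cycle: receive (1004), render (1006),
// collect feedback (1008), then repeat or stop (1010).
fun adActivationLoop(
    receiveAd: () -> String?,             // Event Catcher: null means no further ads
    renderAd: (String) -> Unit,           // Ad Visualizer
    collectFeedback: (String) -> Unit     // Customer Feedback Processing
) {
    while (true) {
        val ad = receiveAd() ?: break     // step 1004 / exit at step 1010
        renderAd(ad)                      // step 1006
        collectFeedback(ad)               // step 1008
    }
}
```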
FIG. 7 is a block diagram of an exemplary mobile device 59 on which the invention can be implemented. The mobile device 59 can be, for example, a personal digital assistant, a cellular telephone, a network appliance, a camera, a smart phone, an enhanced general packet radio service (EGPRS) mobile phone, a network base station, a media player, a navigation device, an email device, a game console, or a combination of any two or more of these data processing devices or other data processing devices.
In some implementations, the mobile device 59 includes a touch-sensitive display 73. The touch-sensitive display 73 can implement liquid crystal display (LCD) technology, light emitting polymer display (LPD) technology, or some other display technology. The touch-sensitive display 73 can be sensitive to haptic and/or tactile contact with a user.
In some implementations, the touch-sensitive display 73 can comprise a multi-touch-sensitive display 73. A multi-touch-sensitive display 73 can, for example, process multiple simultaneous touch points, including processing data related to the pressure, degree and/or position of each touch point. Such processing facilitates gestures and interactions with multiple fingers, chording, and other interactions. Other touch-sensitive display technologies can also be used, e.g., a display in which contact is made using a stylus or other pointing device.
In some implementations, the mobile device 59 can display one or more graphical user interfaces on the touch-sensitive display 73 for providing the user access to various system objects and for conveying information to the user. In some implementations, the graphical user interface can include one or more display objects 74, 76. In the example shown, the display objects 74, 76, are graphic representations of system objects. Some examples of system objects include device functions, applications, windows, files, alerts, events, or other identifiable system objects.
In some implementations, the mobile device 59 can implement multiple device functionalities, such as a telephony device, as indicated by a phone object 91; an e-mail device, as indicated by the e-mail object 92; a network data communication device, as indicated by the Web object 93; a Wi-Fi base station device (not shown); and a media processing device, as indicated by the media player object 94. In some implementations, particular display objects 74, e.g., the phone object 91, the e-mail object 92, the Web object 93, and the media player object 94, can be displayed in a menu bar 95. In some implementations, device functionalities can be accessed from a top-level graphical user interface, such as the graphical user interface illustrated in the figure. Touching one of the objects 91, 92, 93 or 94 can, for example, invoke corresponding functionality.
In some implementations, the mobile device 59 can implement network distribution functionality. For example, the functionality can enable the user to take the mobile device 59 and its associated network while traveling. In particular, the mobile device 59 can extend Internet access (e.g., Wi-Fi) to other wireless devices in the vicinity. For example, mobile device 59 can be configured as a base station for one or more devices. As such, mobile device 59 can grant or deny network access to other wireless devices.
In some implementations, upon invocation of device functionality, the graphical user interface of the mobile device 59 changes, or is augmented or replaced with another user interface or user interface
elements, to facilitate user access to particular functions associated with the corresponding device functionality. For example, in response to a user touching the phone object 91, the graphical user interface of the touch-sensitive display 73 may present display objects related to various phone functions; likewise, touching of the email object 92 may cause the graphical user interface to present display objects related to various e-mail functions; touching the Web object 93 may cause the graphical user interface to present display objects related to various Web-surfing functions; and touching the media player object 94 may cause the graphical user interface to present display objects related to various media processing functions.
In some implementations, the top-level graphical user interface environment or state can be restored by pressing a button 96 located near the bottom of the mobile device 59. In some implementations, each corresponding device functionality may have corresponding "home" display objects displayed on the touch-sensitive display 73, and the graphical user interface environment can be restored by pressing the "home" display object.
In some implementations, the top-level graphical user interface can include additional display objects 76, such as a short messaging service (SMS) object, a calendar object, a photos object, a camera object, a calculator object, a stocks object, a weather object, a maps object, a notes object, a clock object, an address book object, a settings object, and an app store object 97. Touching the SMS display object can, for example, invoke an SMS messaging environment and supporting functionality; likewise, each selection of a display object can invoke a corresponding object environment and functionality.
Additional and/or different display objects can also be displayed in the graphical user interface. For example, if the device 59 is functioning as a base station for other devices, one or more "connection" objects may appear in the graphical user interface to indicate the connection. In some implementations, the display objects 76 can be configured by a user, e.g., a user may specify which display objects 76 are displayed, and/or may download additional applications or other software that provides other functionalities and corresponding display objects.
In some implementations, the mobile device 59 can include one or more input/output (I/O) devices and/or sensor devices. For example, a speaker 60 and a microphone 62 can be included to facilitate voice-enabled functionalities, such as phone and voice mail functions. In some implementations, an up/down button 84 for volume control of the speaker 60 and the microphone 62 can be included. The mobile device 59 can also include an on/off button 82 for a ring indicator of incoming phone calls. In some implementations, a loud speaker 64 can be included to facilitate hands-free voice functionalities, such as speaker phone functions. An audio jack 66 can also be included for use of headphones and/or a microphone.
In some implementations, a proximity sensor 68 can be included to facilitate the detection of the user positioning the mobile device 59 proximate to the user's ear and, in response, to disengage the touch-sensitive display 73 to prevent accidental function invocations. In some implementations, the touch-sensitive display 73 can be turned off to conserve additional power when the mobile device 59 is proximate to the user's ear.
Other sensors can also be used. For example, in some implementations, an ambient light sensor 70 can be utilized to facilitate adjusting the brightness of the touch-sensitive display 73. In some implementations, an accelerometer 72 can be utilized to detect movement of the mobile device 59, as indicated by the directional arrows. Accordingly, display objects and/or media can be presented according to a detected orientation, e.g., portrait or landscape. In some implementations, the mobile
device 59 may include circuitry and sensors for supporting a location determining capability, such as that provided by the global positioning system (GPS) or other positioning systems (e.g., systems using Wi-Fi access points, television signals, cellular grids, Uniform Resource Locators (URLs)). In some implementations, a positioning system (e.g., a GPS receiver) can be integrated into the mobile device 59 or provided as a separate device that can be coupled to the mobile device 59 through an interface (e.g., port device 90) to provide access to location-based services.
The mobile device 59 can also include a camera lens and sensor 80. In some implementations, the camera lens and sensor 80 can be located on the back surface of the mobile device 59. The camera can capture still images and/or video.
The mobile device 59 can also include one or more wireless communication subsystems, such as an 802.11b/g communication device 86, and/or a BLUETOOTH communication device 88. Other communication protocols can also be supported, including other 802.x communication protocols (e.g., WiMax, Wi-Fi, 3G, LTE), code division multiple access (CDMA), global system for mobile communications (GSM), Enhanced Data GSM Environment (EDGE), etc.
In some implementations, the port device 90, e.g., a Universal Serial Bus (USB) port, or a docking port, or some other wired port connection, is included. The port device 90 can, for example, be utilized to establish a wired connection to other computing devices, such as other communication devices 59, network access devices, a personal computer, a printer, or other processing devices capable of receiving and/or transmitting data. In some implementations, the port device 90 allows the mobile device 59 to synchronize with a host device using one or more protocols, such as, for example, the TCP/IP, HTTP, UDP and any other known protocol. In some implementations, a TCP/IP over USB protocol can be used.
FIG. 8 is a block diagram 2200 of an example implementation of the mobile device 59. The mobile device 59 can include a memory interface 2202, one or more data processors, image processors and/or central processing units 2204, and a peripherals interface 2206. The memory interface 2202, the one or more processors 2204 and/or the peripherals interface 2206 can be separate components or can be integrated in one or more integrated circuits. The various components in the mobile device 59 can be coupled by one or more communication buses or signal lines.
Sensors, devices and subsystems can be coupled to the peripherals interface 2206 to facilitate multiple functionalities. For example, a motion sensor 2210, a light sensor 2212, and a proximity sensor 2214 can be coupled to the peripherals interface 2206 to facilitate the orientation, lighting and proximity functions described above. Other sensors 2216 can also be connected to the peripherals interface 2206, such as a positioning system (e.g., GPS receiver), a temperature sensor, a biometric sensor, or other sensing device, to facilitate related functionalities.
A camera subsystem 2220 and an optical sensor 2222, e.g., a charge-coupled device (CCD) or a complementary metal-oxide semiconductor (CMOS) optical sensor, can be utilized to facilitate camera functions, such as recording photographs and video clips.
Communication functions can be facilitated through one or more wireless communication subsystems 2224, which can include radio frequency receivers and transmitters and/or optical (e.g., infrared) receivers and transmitters. The specific design and implementation of the communication subsystem 2224 can depend on the communication network(s) over which the mobile device 59 is intended to operate. For example, a mobile device 59 may include communication subsystems 2224 designed to operate over a GSM network, a GPRS network, an EDGE network, a Wi-Fi or WiMax
network, and a BLUETOOTH network. In particular, the wireless communication subsystems 2224 may include hosting protocols such that the device 59 may be configured as a base station for other wireless devices.
An audio subsystem 2226 can be coupled to a speaker 2228 and a microphone 2230 to facilitate voice-enabled functions, such as voice recognition, voice replication, digital recording, and telephony functions.
The I/O subsystem 2240 can include a touch screen controller 2242 and/or other input controller(s) 2244. The touch-screen controller 2242 can be coupled to a touch screen 2246. The touch screen 2246 and touch screen controller 2242 can, for example, detect contact and movement or break thereof using any of multiple touch sensitivity technologies, including but not limited to capacitive, resistive, infrared, and surface acoustic wave technologies, as well as other proximity sensor arrays or other elements for determining one or more points of contact with the touch screen 2246.
The other input controller(s) 2244 can be coupled to other input/control devices 2248, such as one or more buttons, rocker switches, thumb-wheel, infrared port, USB port, and/or a pointer device such as a stylus. The one or more buttons (not shown) can include an up/down button for volume control of the speaker 2228 and/or the microphone 2230.
In one implementation, a pressing of the button for a first duration may disengage a lock of the touch screen 2246; and a pressing of the button for a second duration that is longer than the first duration may turn power to the mobile device 59 on or off. The user may be able to customize a functionality of one or more of the buttons. The touch screen 2246 can, for example, also be used to implement virtual or soft buttons and/or a keyboard.
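By way of illustration only, the following minimal Java sketch shows one way the two press durations described above could be distinguished; the class name, the millisecond threshold, and the toggle behavior are illustrative assumptions, not part of the described device.

```java
// Illustrative handling of the two button-press durations described above (hypothetical values).
public class ButtonPressExample {

    static final long LOCK_TOGGLE_MAX_MS = 800;   // short press: change the touch-screen lock state
    // anything longer is treated as the power on/off gesture

    static boolean screenLocked = true;
    static boolean powerOn = true;

    static void onButtonReleased(long pressedAtMs, long releasedAtMs) {
        long duration = releasedAtMs - pressedAtMs;
        if (duration <= LOCK_TOGGLE_MAX_MS) {
            screenLocked = !screenLocked;          // first, shorter duration
            System.out.println("Screen lock " + (screenLocked ? "engaged" : "disengaged"));
        } else {
            powerOn = !powerOn;                    // second, longer duration
            System.out.println("Power " + (powerOn ? "on" : "off"));
        }
    }

    public static void main(String[] args) {
        onButtonReleased(0, 300);     // short press
        onButtonReleased(0, 2000);    // long press
    }
}
```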
In some implementations, the mobile device 59 can present recorded audio and/or video files, such as MP3, AAC, and MPEG files. In some implementations, the mobile device 59 can include the functionality of an MP3 player. The mobile device 59 may, therefore, include a 32-pin connector that is compatible with the MP3 player. Other input/output and control devices can also be used.
The memory interface 2202 can be coupled to memory 2250. The memory 2250 can include high-speed random access memory and/or non-volatile memory, such as one or more magnetic disk storage devices, one or more optical storage devices, and/or flash memory (e.g., NAND, NOR). The memory 2250 can store an operating system 2252, such as Darwin, RTXC, LINUX, UNIX, OS X, ANDROID, IOS, WINDOWS, or an embedded operating system such as VxWorks. The operating system 2252 may include instructions for handling basic system services and for performing hardware-dependent tasks. In some implementations, the operating system 2252 can be a kernel (e.g., UNIX kernel).
The memory 2250 may also store communication instructions 2254 to facilitate communicating with one or more additional devices, one or more computers and/or one or more servers. The memory 2250 may include graphical user interface instructions 2256 to facilitate graphic user interface processing including presentation, navigation, and selection within an application store; sensor processing instructions 2258 to facilitate sensor-related processing and functions; phone instructions 2260 to facilitate phone-related processes and functions; electronic messaging instructions 2262 to facilitate electronic-messaging related processes and functions; web browsing instructions 2264 to facilitate web browsing-related processes and functions; media processing instructions 2266 to facilitate media processing-related processes and functions; GPS/Navigation instructions 2268 to facilitate GPS and navigation-related processes and functions; camera instructions 2270 to facilitate camera-related
processes and functions; and/or other software instructions 2272 to facilitate other processes and functions.
Each of the above identified instructions and applications can correspond to a set of instructions for performing one or more functions described above. These instructions need not be implemented as separate software programs, procedures or modules. The memory 2250 can include additional instructions or fewer instructions. Furthermore, various functions of the mobile device 59 may be implemented in hardware and/or in software, including in one or more signal processing and/or application specific integrated circuits.
Those skilled in the art will appreciate that the proposed system and method allow for effective advertising directed to mobile device users.
Having thus described a preferred embodiment, it should be apparent to those skilled in the art that certain advantages of the described method and apparatus have been achieved.
It should also be appreciated that various modifications, adaptations and alternative embodiments thereof may be made within the scope and spirit of the present invention. The invention is further defined by the following claims.
Claims
1. A method for rendering advertisements on touchscreen devices, the method comprising:
selecting an advertisement to be rendered to a user;
loading the selected advertisement from an ad server into an ad library;
processing the advertisement by an ad visualizer and loading the advertisement into a transparent full-screen ad layer under the touchscreen device GUI controls;
changing transparency of the ad layer to "visible" by the ad visualizer; changing transparency of the GUI controls to make the advertisement visible;
rendering the advertisement to the user through or behind the GUI controls;
deactivating the advertisement;
making the ad layer transparent by the ad visualizer; and
restoring a background transparency of the GUI controls.
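By way of illustration only (not part of the claims), the following minimal Java sketch models the sequence recited in claim 1: an ad layer under the GUI controls is made visible and the control backgrounds are made transparent so the advertisement shows through, after which both are restored. All type and method names (AdLayer, GuiControls, and so on) are hypothetical and do not correspond to any particular platform API.

```java
// Illustrative model of the claim 1 rendering sequence (hypothetical types).
public class Claim1Example {

    // A full-screen layer that can hold advertisement content and be shown or hidden.
    static class AdLayer {
        String content;
        boolean visible = false;          // starts fully transparent
        void load(String ad) { content = ad; }
        void setVisible(boolean v) { visible = v; }
    }

    // The existing GUI controls of the application; their background can be made transparent.
    static class GuiControls {
        float backgroundAlpha = 1.0f;     // 1.0 = opaque, 0.0 = fully transparent
        void setBackgroundAlpha(float a) { backgroundAlpha = a; }
    }

    public static void main(String[] args) {
        AdLayer adLayer = new AdLayer();
        GuiControls controls = new GuiControls();

        String ad = "banner.png";                 // advertisement selected and fetched from the ad server
        adLayer.load(ad);                         // ad visualizer loads the ad into the transparent ad layer
        adLayer.setVisible(true);                 // ad layer transparency changed to "visible"
        controls.setBackgroundAlpha(0.0f);        // GUI control backgrounds made transparent so the ad shows through
        System.out.println("Rendering " + adLayer.content + " through the GUI controls");

        // Deactivation: hide the ad layer and restore the original control background.
        adLayer.setVisible(false);
        controls.setBackgroundAlpha(1.0f);
    }
}
```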
2. The method of claim 1, wherein the advertisement is rendered as a banner.
3. The method of claim 2, wherein the banner has a clickable header.
4. The method of claim 1, wherein GUI controls are any of:
a button;
a checkbox;
a panel;
a combobox; and
an editbox.
5. A method for rendering advertisements on touchscreen devices, the method comprising:
selecting an advertisement to be rendered to a user;
loading the selected advertisement from an ad server into an ad library;
processing the advertisement by an ad visualizer and loading the advertisement into a transparent full-screen ad layer on top of the touchscreen device GUI controls;
translating gestures of the user, including touches and swipes, through the ad layer to the GUI controls behind it;
changing transparency of the ad layer to "visible" by the ad visualizer;
rendering the advertisement to the user above the GUI controls; deactivating the advertisement; and
restoring the background transparency of the ad layer.
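As a rough illustration of the gesture pass-through recited in claim 5, the Java sketch below shows a top ad layer that never consumes touches, so every gesture reaches the GUI controls drawn beneath it. The simple touch-dispatch model and all names are assumptions made for this sketch, not a real touch framework.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical model of a top ad layer that lets touches pass through to the GUI behind it.
public class Claim5Example {

    interface TouchTarget { boolean onTouch(int x, int y); }

    // An ordinary GUI control that consumes touches inside its bounds.
    static class Button implements TouchTarget {
        final int left, top, right, bottom;
        Button(int l, int t, int r, int b) { left = l; top = t; right = r; bottom = b; }
        public boolean onTouch(int x, int y) {
            boolean hit = x >= left && x <= right && y >= top && y <= bottom;
            if (hit) System.out.println("Button handled touch at " + x + "," + y);
            return hit;
        }
    }

    // A full-screen ad layer drawn above the controls; it never consumes touches,
    // so every gesture is translated through it to the controls behind.
    static class AdOverlay implements TouchTarget {
        boolean visible = false;
        public boolean onTouch(int x, int y) { return false; }
    }

    public static void main(String[] args) {
        AdOverlay overlay = new AdOverlay();
        overlay.visible = true;                       // ad rendered above the GUI controls
        List<TouchTarget> zOrder = new ArrayList<>(); // topmost first
        zOrder.add(overlay);
        zOrder.add(new Button(0, 0, 100, 50));

        // Dispatch a touch from the topmost layer down until someone consumes it.
        int x = 40, y = 20;
        for (TouchTarget t : zOrder) {
            if (t.onTouch(x, y)) break;
        }
    }
}
```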
6. The method of claim 5, wherein the advertisement is rendered as a banner.
7. The method of claim 6, wherein the banner has a clickable header.
8. A system for rendering advertisements on touchscreen devices, the system comprising: a hardware ad server for storing ads;
an ad library for selecting the ads based on user-related metadata; an ad visualizer connected to the ad library;
at least one user touchscreen device running a mobile application connected to the ad visualizer, wherein the ad library selects ads from the ad server and provides the ads to the ad visualizer; and wherein the ad visualizer processes the ads and renders them to a user through existing GUI controls of the touchscreen device application.
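For illustration only, the sketch below models the data flow of the system of claim 8: the ad library selects an ad from the ad server based on user-related metadata and hands it to the ad visualizer for rendering. The classes, the metadata key, and the selection rule are hypothetical simplifications.

```java
import java.util.List;
import java.util.Map;

// Hypothetical sketch of the claimed system: ad server -> ad library -> ad visualizer -> device GUI.
public class Claim8Example {

    static class AdServer {
        List<String> store() { return List.of("sports_banner", "travel_banner"); }
    }

    // Selects an ad from the server based on user-related metadata.
    static class AdLibrary {
        String select(AdServer server, Map<String, String> userMetadata) {
            List<String> ads = server.store();
            return "sports".equals(userMetadata.get("interest")) ? ads.get(0) : ads.get(1);
        }
    }

    // Processes the selected ad and renders it through the application's existing GUI controls.
    static class AdVisualizer {
        void render(String ad) { System.out.println("Rendering " + ad + " through existing GUI controls"); }
    }

    public static void main(String[] args) {
        String ad = new AdLibrary().select(new AdServer(), Map.of("interest", "sports"));
        new AdVisualizer().render(ad);
    }
}
```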
9. The system of claim 8, wherein the ad visualizer changes transparency of the GUI controls in order to display advertisement content.
10. The system of claim 8, wherein the ad visualizer loads advertisement content into a transparent ad layer underlying the GUI controls.
11. The system of claim 10, wherein the ad visualizer makes the ad layer visible in order to render the ads to the user.
12. A method for rendering advertisements on touch-screen devices, the method comprising:
selecting an advertisement for rendering to a user;
loading the selected advertisement from an ad server into an ad library;
processing the advertisement by an ad visualizer and loading the advertisement into a transparent full-screen ad layer that is controlled by a touch-screen device GUI;
using the ad visualizer to change a transparency of the ad layer so that the advertisement becomes visible through the GUI controls;
rendering the advertisement to the user through or behind the GUI controls;
deactivating the advertisement;
making the ad layer transparent by the ad visualizer; and
restoring background transparency of GUI controls.
13. The method of claim 12, wherein the ad layer shows multiple advertisements at one time.
14. The method of claim 12, wherein the ad visualizer dynamically changes size, color, position and transparency of an advertisement.
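Purely as an illustration of the dynamic adjustment recited in claim 14, the short Java sketch below updates an advertisement's position and transparency over a few frames; the property set, step sizes, and class names are assumptions made for this sketch.

```java
// Hypothetical sketch of claim 14: the ad visualizer adjusting an advertisement's
// size, color, position and transparency at run time.
public class Claim14Example {

    static class Advertisement {
        int width = 320, height = 50;     // size in pixels
        int x = 0, y = 0;                 // position on screen
        String color = "#FFFFFF";         // background color
        float alpha = 1.0f;               // 1.0 = opaque, 0.0 = fully transparent
    }

    static class AdVisualizer {
        void animateTick(Advertisement ad) {
            ad.y += 2;                                    // slide the banner down
            ad.alpha = Math.max(0.3f, ad.alpha - 0.05f);  // gradually fade it
        }
    }

    public static void main(String[] args) {
        Advertisement ad = new Advertisement();
        AdVisualizer visualizer = new AdVisualizer();
        for (int frame = 0; frame < 5; frame++) {
            visualizer.animateTick(ad);
            System.out.println("frame " + frame + ": y=" + ad.y + " alpha=" + ad.alpha);
        }
    }
}
```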
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EA201500956A EA201500956A1 (en) | 2013-03-18 | 2013-03-18 | METHOD OF DISPLAYING ADVERTISING ON DEVICES WITH A TOUCH DISPLAY |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
RU2013111922 | 2013-03-18 | ||
RU2013111922A RU2617544C2 (en) | 2013-03-18 | 2013-03-18 | Method and system of displaying advertising on devices with touch display |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2014148936A1 true WO2014148936A1 (en) | 2014-09-25 |
Family
ID=51580471
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/RU2013/000217 WO2014148936A1 (en) | 2013-03-18 | 2013-03-18 | Method for rendering advertisments on touchscreen devices |
Country Status (3)
Country | Link |
---|---|
EA (1) | EA201500956A1 (en) |
RU (1) | RU2617544C2 (en) |
WO (1) | WO2014148936A1 (en) |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2005258015A (en) * | 2004-03-11 | 2005-09-22 | Mitsubishi Electric Corp | Multi-dimensional display device and method |
RU2410259C2 (en) * | 2006-03-22 | 2011-01-27 | Фольксваген Аг | Interactive control device and method of operating interactive control device |
GB2464094A (en) * | 2008-09-30 | 2010-04-07 | Rapid Mobile Media Ltd | Method and apparatus for displaying content at a mobile device |
KR101682218B1 (en) * | 2010-06-07 | 2016-12-02 | 엘지전자 주식회사 | Mobile terminal and control method thereof |
KR101737839B1 (en) * | 2010-09-16 | 2017-05-19 | 엘지전자 주식회사 | Transparent display device and method for detecting an object |
2013
- 2013-03-18 WO PCT/RU2013/000217 patent/WO2014148936A1/en active Application Filing
- 2013-03-18 RU RU2013111922A patent/RU2617544C2/en active IP Right Revival
- 2013-03-18 EA EA201500956A patent/EA201500956A1/en unknown
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6121960A (en) * | 1996-08-28 | 2000-09-19 | Via, Inc. | Touch screen systems and methods |
US20090144628A1 (en) * | 2001-06-30 | 2009-06-04 | Cokinetic Systems Corp. | Internet interface & integration language system and method |
US7979877B2 (en) * | 2003-12-23 | 2011-07-12 | Intellocity Usa Inc. | Advertising methods for advertising time slots and embedded objects |
US20120271718A1 (en) * | 2010-11-05 | 2012-10-25 | Chung Hee Sung | Method and system for providing background advertisement of virtual key input device |
Also Published As
Publication number | Publication date |
---|---|
EA201500956A1 (en) | 2016-01-29 |
RU2013111922A (en) | 2014-09-27 |
RU2617544C2 (en) | 2017-04-25 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10948949B2 (en) | Electronic apparatus having a hole area within screen and control method thereof | |
US10127089B2 (en) | Dynamic deep links to targets | |
US9921713B2 (en) | Transitional data sets | |
KR102330829B1 (en) | Method and apparatus for providing augmented reality function in electornic device | |
KR102311221B1 (en) | operating method and electronic device for object | |
US9990105B2 (en) | Accessible contextual controls within a graphical user interface | |
US20170235435A1 (en) | Electronic device and method of application data display therefor | |
CN108463799B (en) | Flexible display of electronic device and operation method thereof | |
KR102324083B1 (en) | Method for providing screen magnifying and electronic device thereof | |
US10599336B2 (en) | Method of displaying content and electronic device adapted to the same | |
KR102528389B1 (en) | Electronic device and notification processing method of electronic device | |
US20090235189A1 (en) | Native support for manipulation of data content by an application | |
US10254883B2 (en) | Electronic device for sensing pressure of input and method for operating the electronic device | |
EP3097470B1 (en) | Electronic device and user interface display method for the same | |
KR102042211B1 (en) | Apparatas and method for changing display an object of bending state in an electronic device | |
WO2020006669A1 (en) | Icon switching method, method for displaying gui, and electronic device | |
EP3340155A1 (en) | Electronic device and method for displaying web page using the same | |
US20150346989A1 (en) | User interface for application and device | |
KR20160147432A (en) | Device For Controlling Respectively Multiple Areas of Display and Method thereof | |
KR20170052003A (en) | Electronic device having multiple displays and method for controlling thereof | |
CN112347048A (en) | Electronic device and data sharing method thereof | |
US20140095315A1 (en) | Mobile device with reveal of dynamic content | |
KR102616793B1 (en) | Electronic device and method for providing scrren thereof | |
US9299090B1 (en) | Method for rendering advertisements on mobile devices | |
KR102351317B1 (en) | Method for displaying an electronic document and electronic device |
Legal Events
Code | Title | Description |
---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 13878694; Country of ref document: EP; Kind code of ref document: A1 |
NENP | Non-entry into the national phase | Ref country code: DE |
WWE | Wipo information: entry into national phase | Ref document number: 201500956; Country of ref document: EA |
122 | Ep: pct application non-entry in european phase | Ref document number: 13878694; Country of ref document: EP; Kind code of ref document: A1 |