US20170249071A1 - Self optimizing and reducing user experiences - Google Patents

Self optimizing and reducing user experiences

Info

Publication number
US20170249071A1
US20170249071A1 (U.S. application Ser. No. 15/593,537)
Authority
US
United States
Prior art keywords
user
screen
user interface
information regarding
elements
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US15/593,537
Inventor
John Tapley
Krystal Rose Higgins
Eric J. Farraro
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
eBay Inc
Original Assignee
eBay Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by eBay Inc filed Critical eBay Inc
Priority to US15/593,537
Assigned to EBAY INC. reassignment EBAY INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: FARRARO, ERIC J., HIGGINS, Krystal Rose, TAPLEY, JOHN
Publication of US20170249071A1
Current legal status: Abandoned

Classifications

    • G06F 3/0488 — Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/0481 — Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/04847 — Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • G06F 8/38 — Creation or generation of source code for implementing user interfaces
    • G06Q 30/0201 — Market modelling; Market analysis; Collecting market data
    • G06Q 30/0241 — Advertisements
    • G06Q 30/06 — Buying, selling or leasing transactions
    • G06Q 30/08 — Auctions

Definitions

  • User experience is a broad term covering many aspects of experiences of users with computing products or services accessed through the computing products (such as web sites).
  • the user experience includes not only the user interface, but also the graphics and physical interaction. For the most part, such user experience is somewhat static in nature.
  • the layout of a website is generally the same for most or all users who access it, until such time as the web designer alters the layout. To the extent that different users are served different layouts, they are generally distributed based on preset criteria (such as hardware specifications of the user device).
  • FIG. 1 is a network diagram depicting a client-server system, within which one example embodiment may be deployed.
  • FIG. 2 is a diagram illustrating a progression, in accordance with an example embodiment, of dynamic alteration of a user interface.
  • FIG. 3 is a diagram illustrating a progression, in accordance with another example embodiment, of dynamic alteration of a user interface.
  • FIG. 4 is a diagram illustrating a progression, in accordance with another example embodiment, of dynamic alteration of a user interface.
  • FIG. 5 is a diagram illustrating a progression, in accordance with another example embodiment, of dynamic alteration of a user interface.
  • FIG. 6 is an interaction diagram illustrating a method, in accordance with an example embodiment, of dynamically altering a user interface.
  • FIG. 7 is a flow diagram illustrating a method, in accordance with an example embodiment, of dynamically altering a user interface.
  • FIG. 8 is a flow diagram illustrating a method, in accordance with another example embodiment, of dynamically altering a user interface.
  • FIG. 9 shows a diagrammatic representation of a machine in the example form of a computer system within which a set of instructions for causing the machine to perform any one or more of the methodologies discussed herein may be executed.
  • various aspects of a user experience are dynamically optimized in order to provide a customized and efficient experience for the user.
  • Elements within a user interface such as button size, advertising sizing, font, color, placement, the presence of certain interface objects, etc., can all be dynamically altered based on usage information as well as other factors (e.g., demographic information, information from user profiles, etc.).
  • a search bar displayed on a web site may change in size and location on the screen based on how often the user utilized the search bar. In the extreme, all elements but the search bar could be removed for users who primarily use the web site for searches.
  • the usage information is gathered from touchscreen usage on a user device. Key presses indicating where a user has pressed a touchscreen with his or her finger (or stylus) may be tracked to determine the frequency of presses on various elements of the user interface and areas of the touchscreen. This information may then be used to dynamically modify the user interface, by atrophying out lesser-used elements and areas and introducing new elements.
  • the dynamic adjustment may be performed by the local user device, or may be performed at the server-level by, for example, a web server. In some embodiments, a combination of local user devices and one or more servers may be used to perform the dynamic adjustment.
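The press-frequency tracking described above can be sketched as follows. This is a minimal illustration of the idea, not the patented implementation; all names (`InteractionTracker`, `record_press`, the element identifiers) are assumed for the example.

```python
from collections import Counter

class InteractionTracker:
    """Counts touchscreen presses per user-interface element."""

    def __init__(self):
        self.press_counts = Counter()

    def record_press(self, element_id):
        """Record one press on the element under the touch point."""
        self.press_counts[element_id] += 1

    def frequencies(self):
        """Return each element's share of all recorded presses."""
        total = sum(self.press_counts.values())
        if total == 0:
            return {}
        return {e: n / total for e, n in self.press_counts.items()}

tracker = InteractionTracker()
for element in ["search_bar", "search_bar", "buying", "search_bar"]:
    tracker.record_press(element)
print(tracker.frequencies()["search_bar"])  # 0.75
```

Either the local device or a server could maintain such counts; only the reporting path differs.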
  • FIG. 1 is a network diagram depicting a client-server system 100 , within which one example embodiment may be deployed.
  • a networked system 102, in the example forms of a network-based marketplace or publication system, provides server-side functionality, via a network 104 (e.g., the Internet or Wide Area Network (WAN)) to one or more clients.
  • FIG. 1 illustrates, for example, a web client 106 (e.g., a browser), and a programmatic client 108 executing on respective client machines 110 and 112 .
  • An Application Program Interface (API) server 114 and a web server 116 are coupled to, and provide programmatic and web interfaces respectively to, one or more application servers 118 .
  • the application servers 118 host one or more marketplace applications 120 and payment applications 122 .
  • the application servers 118 are, in turn, shown to be coupled to one or more database servers 124 that facilitate access to one or more databases 126 .
  • the marketplace applications 120 may provide a number of marketplace functions and services to users that access the networked system 102 .
  • the payment applications 122 may likewise provide a number of payment services and functions to users.
  • the payment applications 122 may allow users to accumulate value (e.g., in a commercial currency, such as the U.S. dollar, or a proprietary currency, such as “points”) in accounts, and then later to redeem the accumulated value for products (e.g., goods or services) that are made available via the marketplace applications 120 . While the marketplace and payment applications 120 and 122 are shown in FIG. 1 to both form part of the networked system 102 , it will be appreciated that, in alternative embodiments, the payment applications 122 may form part of a payment service that is separate and distinct from the networked system 102 .
  • While the system 100 shown in FIG. 1 employs a client-server architecture, the present disclosure is of course not limited to such an architecture, and could equally well find application in a distributed, or peer-to-peer, architecture system, for example.
  • the various marketplace and payment applications 120 and 122 could also be implemented as standalone software programs, which do not necessarily have networking capabilities.
  • the web client 106 accesses the various marketplace and payment applications 120 and 122 via the web interface supported by the web server 116 .
  • the programmatic client 108 accesses the various services and functions provided by the marketplace and payment applications 120 and 122 via the programmatic interface provided by the API server 114 .
  • the programmatic client 108 may, for example, be a seller application (e.g., the TurboLister application developed by eBay Inc., of San Jose, Calif.) to enable sellers to author and manage listings on the networked system 102 in an off-line manner, and to perform batch-mode communications between the programmatic client 108 and the networked system 102 .
  • FIG. 1 also illustrates a third party application 128 , executing on a third party server machine 130 , as having programmatic access to the networked system 102 via the programmatic interface provided by the API server 114 .
  • the third party application 128 may, utilizing information retrieved from the networked system 102 , support one or more features or functions on a website hosted by the third party.
  • the third party website may, for example, provide one or more promotional, marketplace or payment functions that are supported by the relevant applications of the networked system 102 .
  • FIG. 2 is a diagram illustrating a progression, in accordance with an example embodiment, of dynamic alteration of a user interface.
  • Pictured here is a mobile device 200 a, 200 b, 200 c in three states. It should be noted that while a mobile device is depicted, a similar process could run on any electronic device. Beginning with the mobile device 200 a in the first state, it can be seen that the user interface 202 a has various sections, including a search bar 204, an activity dashboard 206, a merchandise area 208, and an advertisement 210.
  • the user interface 202 a may depict an interface to an online auction web site, although one of ordinary skill in the art will recognize that this disclosure can apply to other types of user interfaces as well.
  • Within the activity dashboard 206 are three activities: watching 212 (for items in the online auction the user has selected as being of interest), buying 214 (for items in the online auction the user has bid on), and selling 216 (for items in the online auction the user is selling).
  • Within the merchandise area 208 may be a number of items 218 , 220 , 222 .
  • the items depicted are in separate categories, labelled as categories A, B, and C.
  • category A may be automobiles
  • category B may be toys
  • category C may be books.
  • the system may display, for example, a single item for each category representing the item in the category that is likeliest to be of interest to the user (based, perhaps, on previous purchases or searches).
  • Turning to the mobile device 200 b, which is in the second state, the user has begun to use the user interface 202 b by selecting various of the sections of the user interface 202 b. Depicted here is use by pressing on a touchscreen of the mobile device 200 b.
  • Various circles and circle groupings 224 a - 224 d represent “presses,” namely areas of the user interface 202 b that the user has selected.
  • the user has selected on the search bar 204 a number of times, on the buying activity 214 , and also selected on category A merchandise items 218 .
  • the system may track these interactions and adjust the user interface to better align with the user's apparent interests and usage preferences.
  • the system here may dynamically adjust the activity dashboard 206 in the user interface 202 c so that only a buying activity 214 is depicted, since the user has expressed little or no interest in the watching activity 212 or selling activity 216 .
  • the system may also remove the advertisement 210 , since the user has expressed little or no interest in that, and expand the merchandise area 208 to compensate, allowing for more items 226 a - 226 f to be displayed.
  • the merchandise area 208 may be reconfigured to only display items 226 a - 226 f from category A.
  • the result is a custom designed interface that has been dynamically and automatically reconfigured to the user's needs.
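The atrophy-and-expand behavior of the progression above might be modeled as pruning elements whose share of presses falls below a threshold and rescaling the survivors to reclaim the freed screen area. The 5% threshold and the relative-area layout model are assumptions made for illustration only.

```python
def atrophy_interface(elements, press_counts, min_share=0.05):
    """Drop elements whose share of presses is below min_share, then
    rescale the remaining elements' areas so they again sum to 1.0.

    elements: dict of element id -> relative screen area (sums to 1.0)
    press_counts: dict of element id -> observed press count
    """
    total = sum(press_counts.get(e, 0) for e in elements)
    if total == 0:
        return dict(elements)  # nothing tracked yet; leave layout unchanged
    kept = {e: area for e, area in elements.items()
            if press_counts.get(e, 0) / total >= min_share}
    if not kept:
        return dict(elements)  # never remove everything
    scale = 1.0 / sum(kept.values())
    return {e: area * scale for e, area in kept.items()}
```

Applied to the FIG. 2 example, an untouched advertisement would be removed and the merchandise area would grow to fill the vacated space.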
  • FIG. 3 is a diagram illustrating a progression, in accordance with another example embodiment, of dynamic alteration of a user interface.
  • the user interface 302 a may include a search bar 304 , an activity dashboard 306 , a merchandise area 308 , and an advertisement 310 .
  • Within the activity dashboard 306 are three activities: watching 312 (for items in the online auction the user has selected as being of interest), buying 314 (for items in the online auction the user has bid on), and selling 316 (for items in the online auction the user is selling).
  • Within the merchandise area 308 may be a number of items 318 , 320 , 322 .
  • the user has selected on the search bar 304 and the advertisement 310 , but not the other items of the user interface 302 a .
  • the system may then dynamically alter the user interface 302 a into the user interface 302 b of mobile device 300 b , which is the second state.
  • the search bar 304 remains, but the activity dashboard 306 and merchandise area 308 have been removed, in favor of advertisements 326 - 330 , which represent advertisements of different types.
  • the “type” of the advertisement may, in fact, be any categorization useful to differentiate advertisements, including differentiations based on the category of the advertised item or service (e.g., automobile advertisement versus a drug advertisement), or based on the format of the advertisement (e.g., animated advertisement versus static advertisement).
  • the user has then subsequently continued to select on the search bar 304 as well as on type B advertisement(s) 328 .
  • the system may then respond by keeping the search bar 304 in the user interface 302 c of the mobile device 300 c in the third state, but replacing all advertisements 326 , 328 , 330 with advertisements 334 - 338 of type B.
  • FIG. 4 is a diagram illustrating a progression, in accordance with another example embodiment, of dynamic alteration of a user interface.
  • the user interface 402 a may include a first advertisement 404 , a search bar 406 , a merchandise area 408 , and a second advertisement 410 .
  • Within the merchandise area 408 may be a number of items 412 - 422 .
  • As shown by the circles and circle groupings 424 a-424 e, the user has selected on the search bar 406 and the items 412-416 in the merchandise area 408, but not the other items 418-422 of the user interface 402 a.
  • the system may then dynamically alter the user interface 402 a into the user interface 402 b of mobile device 400 b , which is the second state.
  • the search bar 406 and merchandise area 408 remain, but the advertisements 404 , 410 have been removed.
  • the merchandise area 408 has been expanded to compensate.
  • the user has then subsequently continued to select on the search bar 406 .
  • the system may then respond by removing all elements but the search bar 406 in the user interface 402 c of the mobile device 400 c in the third state. This has greatly simplified the interface for a user who is now, apparently, only interested in searching using the search bar 406 .
  • FIG. 5 is a diagram illustrating a progression, in accordance with another example embodiment, of dynamic alteration of a user interface.
  • the mobile device 500 a in the first state includes a user interface 502 a having a first advertisement 504 , a search bar 506 , a merchandise area 508 , and a second advertisement 510 .
  • Within the merchandise area 508 may be a number of items 512 - 522 .
  • the user has selected on the search bar 506 and the items 512 - 516 in the merchandise area 508 , but not the other items 518 - 522 of the user interface 502 a . Additionally, the system may then retrieve information about the user, such as the user's age.
  • Here, two different next states for the mobile device 500 b, 500 c are possible, depending upon whether the system determines that the user is a child or a senior.
  • In the mobile device 500 b for the second state, if the user is a child, that information, along with the previous interaction information from the circles and circle groupings 524 a-524 e, leads the system to provide a user interface 502 b designed for a child and also custom designed for this particular child.
  • the child user interface may, for example, include bright colors, simplistic shapes (such as circles 526 a, 526 b), and language more appropriate for a child (such as replacing the word “Merchandise” with “Neat Stuff!” in the merchandise area 508).
  • If the system determines that the user is a senior, that information, along with the previous interaction information from the circles and circle groupings 524 a-524 e, leads the system to provide a user interface 502 c designed for a senior and also custom designed for this particular senior.
  • the senior interface may, for example, include very large font and increased areas for the search bar 506 and the merchandise area 508 .
  • the areas may be added based on user interaction and various assumptions that can be made about a user (e.g., if a user uses a search bar a lot, he or she may wish to be presented with an additional search tool instead of a browsing tool).
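The child/senior adaptation of FIG. 5 could be sketched as selecting a presentation profile from retrieved user information, to be combined with the usage-driven adjustments. The age cutoffs and profile fields below are illustrative assumptions, not values from the specification.

```python
def select_profile(age, press_counts):
    """Pick presentation overrides from demographic data.

    press_counts is accepted because the described system combines
    demographic and interaction information; further per-user tailoring
    from it is omitted in this sketch.
    """
    if age is not None and age < 13:
        # Child: bright palette, simple shapes, child-friendly labels.
        return {"palette": "bright", "shapes": "simple",
                "labels": {"Merchandise": "Neat Stuff!"}}
    if age is not None and age >= 65:
        # Senior: larger font, enlarged high-use areas.
        return {"font_scale": 1.6, "enlarge": ["search_bar", "merchandise"]}
    return {}  # default interface, adjusted only by usage
```

A renderer would merge the returned overrides into the otherwise usage-optimized layout.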
  • areas that have been previously removed could be reintroduced to the user interface to gauge whether the user may now be interested in them. For example, a user's tastes may evolve over time, so that interest in certain categories of merchandise may wane and interest in other categories may grow.
  • merchandise from areas not necessarily known to be of interest based on previous interactions may be randomly introduced into the user interface, in order to allow the user to express interest in these categories. This may be extended not just to areas that were previously removed but new areas that have never been introduced as well.
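The reintroduction of retired or never-shown elements described above could be sketched as an occasional random insertion, letting the user express renewed or new interest. The reintroduction probability is an assumed tuning knob, not a value from the specification.

```python
import random

def compose_interface(active, retired, never_shown,
                      reintroduce_prob=0.1, rng=random):
    """Build the element list for the next render: all active elements,
    plus (with some probability) one retired or never-shown element
    reintroduced so its usage can be re-measured."""
    layout = list(active)
    candidates = list(retired) + list(never_shown)
    if candidates and rng.random() < reintroduce_prob:
        layout.append(rng.choice(candidates))
    return layout
```

Elements that attract presses after reintroduction would be promoted back into the active set by the normal tracking loop.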
  • FIG. 6 is an interaction diagram illustrating a method 600 , in accordance with an example embodiment, of dynamically altering a user interface.
  • a user interface 602 (which may be contained on an electronic device such as a mobile device) interacts with a user interaction tracking module 604 .
  • the user interaction tracking module 604 may be contained on the same user device as the user interface 602 .
  • a user interface modification module 606 may either be located on the same electronic device as the other components 602 , 604 , or may be located on another device, such as on a web server.
  • user interactions with the user interface 602 may be sent to the user interaction tracking module 604 .
  • these interactions are reported to the user interface modification module 606 .
  • This may occur in a number of ways. While the user interface 602 and user interaction tracking module 604 are depicted as separate components, in some embodiments the user interaction tracking module 604 is actually built into the user interface 602, such that the user interactions are automatically reported to the user interface modification module 606. The interactions may be logged and tracked. In some embodiments, a summary report may be generated and sent to the user interface modification module 606 (e.g., the user has interacted with area A 69 times, area B 52 times, area C 0 times, etc.). In other embodiments, the raw data can be reported to the user interface modification module 606.
  • the user interface modification module 606 may dynamically modify the user interface 602 based on the interactions.
  • the dynamically modified user interface may be passed back. The whole process may then be repeated continuously, allowing for an always-up-to-date customized user interface 602 .
  • the dynamic modification may happen periodically, either using fixed periods (e.g., once an hour, once a day), or non-fixed periods, or even randomly.
  • the modification occurs when a certain number of interactions have been recorded (e.g., every 1000 interactions). If the periods are too short, it may be difficult to have tracked enough user interactions to be useful.
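The batched reporting with a count threshold (e.g., every 1000 interactions) might look like the following sketch, in which the tracking module buffers per-element counts and emits a summary report to the modification module once the batch fills. Class and parameter names are illustrative, and the batch size is reduced for the example.

```python
class TrackingReporter:
    """Buffers interactions; emits a summary report every batch_size events."""

    def __init__(self, modification_module, batch_size=1000):
        self.counts = {}
        self.seen = 0
        self.batch_size = batch_size
        self.modification_module = modification_module  # callable sink

    def record(self, element_id):
        self.counts[element_id] = self.counts.get(element_id, 0) + 1
        self.seen += 1
        if self.seen >= self.batch_size:
            # Summary report, e.g. {"area_a": 69, "area_b": 52, ...}
            self.modification_module(dict(self.counts))
            self.counts.clear()
            self.seen = 0

reports = []
reporter = TrackingReporter(reports.append, batch_size=3)
for e in ["area_a", "area_b", "area_a"]:
    reporter.record(e)
print(reports)  # [{'area_a': 2, 'area_b': 1}]
```

A time-based or randomized trigger could replace the count check without changing the rest of the structure.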
  • FIG. 7 is a flow diagram illustrating a method 700 , in accordance with an example embodiment, of dynamically altering a user interface.
  • a user interface is presented to a user.
  • user interaction with one or more elements of the user interface is measured.
  • one or more of the elements of the user interface are dynamically modified based on the measured user interaction.
  • FIG. 8 is a flow diagram illustrating a method 800 , in accordance with another example embodiment, of dynamically altering a user interface.
  • information regarding user interaction with one or more elements of the user interface is received from an electronic device.
  • one or more of the elements of the user interface are dynamically modified based on the measured user interaction.
  • the dynamically modified user interface is returned to the electronic device for display.
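The server-side flow of FIG. 8 — receive interaction information, modify the interface, return it for display — could be sketched as follows. The layout model (an ordered list of element identifiers) and the keep-and-rank policy are assumptions for illustration.

```python
def handle_interaction_report(report, current_layout):
    """Server-side sketch: given per-element interaction counts received
    from an electronic device, drop elements the user never touched and
    order the survivors by use, returning the modified layout for display."""
    used = [e for e in current_layout if report.get(e, 0) > 0]
    used.sort(key=lambda e: report[e], reverse=True)
    return used

layout = ["ad_top", "search_bar", "merchandise", "ad_bottom"]
print(handle_interaction_report({"search_bar": 12, "merchandise": 4}, layout))
# ['search_bar', 'merchandise']
```

In a deployment this handler would sit behind the web or API server of FIG. 1, with the device posting its report and rendering the returned layout.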
  • FIG. 9 shows a diagrammatic representation of a machine in the example form of a computer system within which a set of instructions for causing the machine to perform any one or more of the methodologies discussed herein may be executed.
  • the machine operates as a standalone device or may be connected (e.g., networked) to other machines.
  • the machine may operate in the capacity of a server or a client machine in server-client network environment, or as a peer machine in a peer-to-peer (or distributed) network environment.
  • the machine may be a server computer, a client computer, a personal computer (PC), a tablet PC, a set-top box (STB), a personal digital assistant (PDA), a cellular telephone, a web appliance, a network router, switch or bridge, or any machine capable of executing a set of instructions (sequential or otherwise) that specify actions to be taken by that machine.
  • the example computer system 900 includes a processor 902 (e.g., a central processing unit (CPU), a graphics processing unit (GPU), or both), a main memory 904 and a static memory 906 , which communicate with each other via a bus 908 .
  • the computer system 900 may further include a video display unit 910 (e.g., a liquid crystal display (LCD) or a cathode ray tube (CRT)).
  • the computer system 900 also includes an alphanumeric input device 912 (e.g., a keyboard), a cursor control device 914 (e.g., a mouse), a disk drive unit 916 , a signal generation device 918 (e.g., a speaker), and a network interface device 920 .
  • the disk drive unit 916 includes a machine-readable medium 922 on which is stored one or more sets of instructions 924 (e.g., software) embodying any one or more of the methodologies or functions described herein.
  • the instructions 924 may also reside, completely or at least partially, within the main memory 904 and/or within the processor 902 during execution thereof by the computer system 900 , with the main memory 904 and the processor 902 also constituting machine-readable media.
  • the instructions 924 may further be transmitted or received over a network 926 via the network interface device 920 .
  • While the machine-readable medium 922 is shown in an example embodiment to be a single medium, the term “machine-readable medium” should be taken to include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) that store the one or more sets of instructions 924.
  • the term “machine-readable medium” shall also be taken to include any medium that is capable of storing, encoding or carrying a set of instructions for execution by the machine and that cause the machine to perform any one or more of the methodologies described herein.
  • the term “machine-readable medium” shall accordingly be taken to include, but not be limited to, solid-state memories, optical and magnetic media, and carrier wave signals.

Abstract

In an example embodiment, a method of dynamically optimizing a user interface on an electronic device is provided. A user interface is presented to a user, wherein the user interface includes one or more elements. User interactions with the one or more elements are then measured. The one or more elements of the user interface are then dynamically modified based on the measured user interaction.

Description

    PRIORITY
  • This application is a continuation of and claims the benefit of priority to U.S. patent application Ser. No. 13/681,653, entitled “SELF OPTIMIZING AND REDUCING USER EXPERIENCES,” filed on Nov. 14, 2012, which is hereby incorporated by reference herein in its entirety.
  • BACKGROUND
  • User experience is a broad term covering many aspects of experiences of users with computing products or services accessed through the computing products (such as web sites). The user experience includes not only the user interface, but also the graphics and physical interaction. For the most part, such user experience is somewhat static in nature. The layout of a website is generally the same for most or all users who access it, until such time as the web designer alters the layout. To the extent that different users are served different layouts, they are generally distributed based on preset criteria (such as hardware specifications of the user device).
  • BRIEF DESCRIPTION OF DRAWINGS
  • FIG. 1 is a network diagram depicting a client-server system, within which one example embodiment may be deployed.
  • FIG. 2 is a diagram illustrating a progression, in accordance with an example embodiment, of dynamic alteration of a user interface.
  • FIG. 3 is a diagram illustrating a progression, in accordance with another example embodiment, of dynamic alteration of a user interface.
  • FIG. 4 is a diagram illustrating a progression, in accordance with another example embodiment, of dynamic alteration of a user interface.
  • FIG. 5 is a diagram illustrating a progression, in accordance with another example embodiment, of dynamic alteration of a user interface.
  • FIG. 6 is an interaction diagram illustrating a method, in accordance with an example embodiment, of dynamically altering a user interface.
  • FIG. 7 is a flow diagram illustrating a method, in accordance with an example embodiment, of dynamically altering a user interface.
  • FIG. 8 is a flow diagram illustrating a method, in accordance with another example embodiment, of dynamically altering a user interface.
  • FIG. 9 shows a diagrammatic representation of a machine in the example form of a computer system within which a set of instructions for causing the machine to perform any one or more of the methodologies discussed herein may be executed.
  • DETAILED DESCRIPTION
  • The description that follows includes illustrative systems, methods, techniques, instruction sequences, and computing machine program products that embody illustrative embodiments. In the following description, for purposes of explanation, numerous specific details are set forth in order to provide an understanding of various embodiments of the inventive subject matter. It will be evident, however, to those skilled in the art that embodiments of the inventive subject matter may be practiced without these specific details. In general, well-known instruction instances, protocols, structures, and techniques have not been shown in detail.
  • Although the present embodiments have been described with reference to specific example embodiments, it will be evident that various modifications and changes may be made to these embodiments without departing from the broader spirit and scope of the embodiments. Accordingly, the specification and drawings are to be regarded in an illustrative rather than a restrictive sense.
  • In an example embodiment, various aspects of a user experience are dynamically optimized in order to provide a customized and efficient experience for the user. Elements within a user interface, such as button size, advertising sizing, font, color, placement, the presence of certain interface objects, etc., can all be dynamically altered based on usage information as well as other factors (e.g., demographic information, information from user profiles, etc.). For example, a search bar displayed on a web site may change in size and location on the screen based on how often the user utilizes the search bar. In the extreme, all elements but the search bar could be removed for users who primarily use the web site for searches.
  • In another example embodiment, the usage information is gathered from touchscreen usage on a user device. Key presses indicating where a user has pressed a touchscreen with his or her finger (or stylus) may be tracked to determine the frequency of presses on various elements of the user interface and areas of the touchscreen. This information may then be used to dynamically modify the user interface, by atrophying lesser-used elements and areas and introducing new elements.
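One way to picture the press-frequency tracking described above is the following sketch. All names and the 5% atrophy threshold are illustrative assumptions, not details taken from the disclosure:

```python
from collections import Counter

class PressTracker:
    """Hypothetical sketch: counts presses per user-interface element."""

    def __init__(self):
        self.presses = Counter()

    def record_press(self, element_id):
        # Called each time the user presses an element on the touchscreen.
        self.presses[element_id] += 1

    def atrophied_elements(self, all_elements, min_share=0.05):
        # An element "atrophies" when it receives less than min_share
        # of all recorded presses (Counter returns 0 for unseen elements).
        total = sum(self.presses.values()) or 1
        return [e for e in all_elements
                if self.presses[e] / total < min_share]
```

A user who presses the search bar 90 times and the buying activity 10 times, and nothing else, would see the untouched elements flagged for removal while the two used elements survive.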
  • The dynamic adjustment may be performed by the local user device, or may be performed at the server-level by, for example, a web server. In some embodiments, a combination of local user devices and one or more servers may be used to perform the dynamic adjustment.
  • FIG. 1 is a network diagram depicting a client-server system 100, within which one example embodiment may be deployed. A networked system 102, in the example forms of a network-based marketplace or publication system, provides server-side functionality, via a network 104 (e.g., the Internet or a Wide Area Network (WAN)), to one or more clients. FIG. 1 illustrates, for example, a web client 106 (e.g., a browser), and a programmatic client 108 executing on respective client machines 110 and 112.
  • An Application Program Interface (API) server 114 and a web server 116 are coupled to, and provide programmatic and web interfaces respectively to, one or more application servers 118. The application servers 118 host one or more marketplace applications 120 and payment applications 122. The application servers 118 are, in turn, shown to be coupled to one or more database servers 124 that facilitate access to one or more databases 126.
  • The marketplace applications 120 may provide a number of marketplace functions and services to users that access the networked system 102. The payment applications 122 may likewise provide a number of payment services and functions to users. The payment applications 122 may allow users to accumulate value (e.g., in a commercial currency, such as the U.S. dollar, or a proprietary currency, such as “points”) in accounts, and then later to redeem the accumulated value for products (e.g., goods or services) that are made available via the marketplace applications 120. While the marketplace and payment applications 120 and 122 are shown in FIG. 1 to both form part of the networked system 102, it will be appreciated that, in alternative embodiments, the payment applications 122 may form part of a payment service that is separate and distinct from the networked system 102.
  • Further, while the system 100 shown in FIG. 1 employs a client-server architecture, the present disclosure is of course not limited to such an architecture, and could equally well find application in a distributed, or peer-to-peer, architecture system, for example. The various marketplace and payment applications 120 and 122 could also be implemented as standalone software programs, which do not necessarily have networking capabilities.
  • The web client 106 accesses the various marketplace and payment applications 120 and 122 via the web interface supported by the web server 116. Similarly, the programmatic client 108 accesses the various services and functions provided by the marketplace and payment applications 120 and 122 via the programmatic interface provided by the API server 114. The programmatic client 108 may, for example, be a seller application (e.g., the TurboLister application developed by eBay Inc., of San Jose, Calif.) to enable sellers to author and manage listings on the networked system 102 in an off-line manner, and to perform batch-mode communications between the programmatic client 108 and the networked system 102.
  • FIG. 1 also illustrates a third party application 128, executing on a third party server machine 130, as having programmatic access to the networked system 102 via the programmatic interface provided by the API server 114. For example, the third party application 128 may, utilizing information retrieved from the networked system 102, support one or more features or functions on a website hosted by the third party. The third party website may, for example, provide one or more promotional, marketplace or payment functions that are supported by the relevant applications of the networked system 102.
  • FIG. 2 is a diagram illustrating a progression, in accordance with an example embodiment, of dynamic alteration of a user interface. Pictured here is a mobile device 200 a, 200 b, 200 c in three states. It should be noted that while a mobile device is depicted, a similar process could run on any electronic device. Beginning with the mobile device 200 a in the first state, it can be seen that the user interface 202 a has various sections, including a search bar 204, an activity dashboard 206, a merchandise area 208, and an advertisement 210. For discussion purposes, this may be referred to as a default or beginning layout, although since the methods described herein are dynamically applied, there need not be any state that is strictly known as a default or beginning layout because the layout may simply be continuously adjusted. The user interface 202 a here may depict an interface to an online auction web site, although one of ordinary skill in the art will recognize that this disclosure can apply to other types of user interfaces as well.
  • Within the activity dashboard 206 are three activities: watching 212 (for items in the online auction the user has selected as being of interest), buying 214 (for items in the online auction the user has bid on), and selling 216 (for items in the online auction the user is selling).
  • Within the merchandise area 208 may be a number of items 218, 220, 222. Here the items depicted are in separate categories, labelled as categories A, B, and C. For example, category A may be automobiles, category B may be toys, and category C may be books. The system may display, for example, a single item for each category representing the item in the category that is likeliest to be of interest to the user (based, perhaps, on previous purchases or searches).
  • Turning to mobile device 200 b, which is in the second state, the user has begun to use the user interface 202 b by selecting various of the sections of the user interface 202 b. Depicted here is use by pressing on a touchscreen of the mobile device 200 b. Various circles and circle groupings 224 a-224 d represent “presses,” namely areas of the user interface 202 b that the user has selected. Of course, it is not necessary that the user interface 202 b be operated on a touchscreen display, and other types of user interaction may be measured other than “presses” on a touchscreen, such as clicks using a mouse or other input device.
  • As can be seen by the patterns of circles and circle groupings 224 a-224 d, the user has selected the search bar 204 a number of times, as well as the buying activity 214 and the category A merchandise items 218.
  • The system may track these interactions and adjust the user interface to better align with the user's apparent interests and usage preferences. Specifically, referring to mobile device 200 c, which is the third state, the system here may dynamically adjust the activity dashboard 206 in the user interface 202 c so that only a buying activity 214 is depicted, since the user has expressed little or no interest in the watching activity 212 or selling activity 216. The system may also remove the advertisement 210, since the user has expressed little or no interest in that, and expand the merchandise area 208 to compensate, allowing for more items 226 a-226 f to be displayed. Lastly, given that the user has expressed interest in merchandise from category A (as evidenced by the presses on item 218), the merchandise area 208 may be reconfigured to only display items 226 a-226 f from category A. The result is a custom designed interface that has been dynamically and automatically reconfigured to the user's needs.
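The FIG. 2-style reduction can be sketched as a layout rebuild that drops sections the user never pressed and divides the freed screen space among the survivors. The function name, the press threshold, and the equal-height split are all illustrative assumptions, not the patented method:

```python
def rebuild_layout(sections, press_counts, min_presses=1):
    """Keep only sections the user has actually pressed; share the
    screen evenly among the survivors (a FIG. 2-style reduction)."""
    kept = [s for s in sections if press_counts.get(s, 0) >= min_presses]
    if not kept:
        return {}
    # Each surviving section gets an equal fraction of the screen height.
    return {s: 1.0 / len(kept) for s in kept}
```

For instance, a user who pressed only the search bar and the buying activity would end up with just those two sections, each occupying half the screen.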
  • FIG. 3 is a diagram illustrating a progression, in accordance with another example embodiment, of dynamic alteration of a user interface. Here, depicted are three states of the mobile device 300 a, 300 b, 300 c. The user interface 302 a may include a search bar 304, an activity dashboard 306, a merchandise area 308, and an advertisement 310. Within the activity dashboard 306 are three activities: watching 312 (for items in the online auction the user has selected as being of interest), buying 314 (for items in the online auction the user has bid on), and selling 316 (for items in the online auction the user is selling). Within the merchandise area 308 may be a number of items 318, 320, 322. As shown by circles and circle groupings 324 a-324 e, the user has selected on the search bar 304 and the advertisement 310, but not the other items of the user interface 302 a. The system may then dynamically alter the user interface 302 a into the user interface 302 b of mobile device 300 b, which is the second state. Here, the search bar 304 remains, but the activity dashboard 306 and merchandise area 308 have been removed, in favor of advertisements 326-330, which represent advertisements of different types. The “type” of the advertisement may, in fact, be any categorization useful to differentiate advertisements, including differentiations based on the category of the advertised item or service (e.g., automobile advertisement versus a drug advertisement), or based on the format of the advertisement (e.g., animated advertisement versus static advertisement).
  • As can be seen from the circles and circle groupings 332 a-332 c in the user interface 302 b, the user has then subsequently continued to select on the search bar 304 as well as on type B advertisement(s) 328. The system may then respond by keeping the search bar 304 in the user interface 302 c of the mobile device 300 c in the third state, but replacing all advertisements 326, 328, 330 with advertisements 334-338 of type B.
  • FIG. 4 is a diagram illustrating a progression, in accordance with another example embodiment, of dynamic alteration of a user interface. Here, depicted are three states of the mobile device 400 a, 400 b, 400 c. The user interface 402 a may include a first advertisement 404, a search bar 406, a merchandise area 408, and a second advertisement 410. Within the merchandise area 408 may be a number of items 412-422. As shown by circles and circle groupings 424 a-424 e, the user has selected on the search bar 406 and the items 412-416 in the merchandise area 408, but not the other items 418-422 of the user interface 402 a. The system may then dynamically alter the user interface 402 a into the user interface 402 b of mobile device 400 b, which is the second state. Here, the search bar 406 and merchandise area 408 remain, but the advertisements 404, 410 have been removed. The merchandise area 408 has been expanded to compensate.
  • As can be seen from the circles and circle groupings 426 a-426 b in the user interface 402 b, the user has then subsequently continued to select on the search bar 406. The system may then respond by removing all elements but the search bar 406 in the user interface 402 c of the mobile device 400 c in the third state. This has greatly simplified the interface for a user who is now, apparently, only interested in searching using the search bar 406.
  • In addition to basing its decision on user interactions, such as areas where the user has pressed, the system can also use other information in how to dynamically alter the user interface. Information about the user, such as demographic or user interest information gleaned from a user profile or other information source, can be used in altering the user interface as well. FIG. 5 is a diagram illustrating a progression, in accordance with another example embodiment, of dynamic alteration of a user interface. Here, the mobile device 500 a in the first state includes a user interface 502 a having a first advertisement 504, a search bar 506, a merchandise area 508, and a second advertisement 510. Within the merchandise area 508 may be a number of items 512-522. As shown by circles and circle groupings 524 a-524 e, the user has selected on the search bar 506 and the items 512-516 in the merchandise area 508, but not the other items 518-522 of the user interface 502 a. Additionally, the system may then retrieve information about the user, such as the user's age. Here, two different states for the mobile device 500 b, 500 c are possible next states depending upon whether the system determines that the user is a child or a senior. In the mobile device 500 b for the second state, if the user is a child, that information along with the previous interaction information from the circles and circle groupings 524 a-524 e leads the system to provide a user interface 502 b designed for a child and also custom designed for this particular child. The child user interface may, for example, include bright colors, simplistic shapes (such as circles 526 a, 526 b), and language more appropriate for a child (such as replacing the word “Merchandise” with “Neat Stuff!” in the merchandise area 508).
  • Alternatively, if the system determined that the user was a senior, that information along with the previous interaction information from the circles and circle groupings 524 a-524 e leads the system to provide a user interface 502 c designed for a senior and also custom designed for this particular senior. The senior interface may, for example, include very large font and increased areas for the search bar 506 and the merchandise area 508.
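A minimal selector for the child/senior branching above might look like the following. The age cutoffs are invented for illustration; the disclosure does not specify thresholds:

```python
def select_variant(age):
    """Pick a user-interface variant from the user's age.
    Thresholds (13, 65) are illustrative assumptions only."""
    if age < 13:
        return "child"    # bright colors, simple shapes, playful labels
    if age >= 65:
        return "senior"   # large fonts, enlarged search bar and merchandise area
    return "default"
```

The chosen variant would then be combined with the per-user interaction history to produce the final layout.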
  • The above examples depict areas of the user interface being removed, but it is also possible to add areas of the user interface. In some instances, the areas may be added based on user interaction and various assumptions that can be made about a user (e.g., if a user uses a search bar a lot, he or she may wish to be presented with an additional search tool instead of a browsing tool). In other instances, areas that have been previously removed could be reintroduced to the user interface to gauge whether the user may now be interested in them. For example, a user's tastes may evolve over time, so that interest in certain categories of merchandise may wane and interest in other categories may grow. In order to capture this, merchandise from areas not necessarily known to be of interest based on previous interactions may be randomly introduced into the user interface, in order to allow the user to express interest in these categories. This may be extended not just to areas that were previously removed but also to new areas that have never been introduced.
  • Additionally, certain areas, such as advertisements, tend to lose their effectiveness with overuse. Users, over time, learn to ignore advertisements that appear on their user interface. As such, removing the advertisements only to reintroduce them at a later time allows the user to be “retrained” to notice the advertisements once again. The system may implement these and other strategies in determining how to dynamically alter the user interface.
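The reintroduction strategy described above amounts to a small exploration step: occasionally restore a removed element so the user can express renewed interest. A hedged sketch, where the function name and the 10% default probability are assumptions:

```python
import random

def maybe_reintroduce(visible, removed, probability=0.1, rng=random):
    """With small probability, re-add one previously removed element so the
    user can signal renewed interest (or be "retrained" to notice ads)."""
    if removed and rng.random() < probability:
        return visible + [rng.choice(removed)]
    return visible
```

Passing `rng` explicitly makes the behavior reproducible in tests; in production the module-level `random` default suffices.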
  • FIG. 6 is an interaction diagram illustrating a method 600, in accordance with an example embodiment, of dynamically altering a user interface. In this method 600, a user interface 602 (which may be contained on an electronic device such as a mobile device) interacts with a user interaction tracking module 604. The user interaction tracking module 604 may be contained on the same user device as the user interface 602. A user interface modification module 606 may either be located on the same electronic device as the other components 602, 604, or may be located on another device, such as on a web server.
  • At operation 608, user interactions with the user interface 602 may be sent to the user interaction tracking module 604. Then at operation 610 these interactions are reported to the user interface modification module 606. This may occur in a number of ways. While the user interface 602 and user interaction tracking module 604 are depicted as separate components, in some embodiments the user interaction tracking module 604 is actually built into the user interface 602, such that the user interactions are automatically reported to the user interface modification module 606. The interactions may be logged and tracked. In some embodiments, a summary report may be generated and sent to the user interface modification module 606 (e.g., the user has interacted with area A 69 times, area B 52 times, area C 0 times, etc.). In other embodiments, the raw data can be reported to the user interface modification module 606.
  • At operation 612 the user interface modification module 606 may dynamically modify the user interface 602 based on the interactions. Of course, as described above, other factors, such as user demographic or interest information, could also be used in this dynamic modification. At operation 614, the dynamically modified user interface may be passed back. The whole process may then be repeated continuously, allowing for an always-up-to-date customized user interface 602. It should be noted that the dynamic modification may happen periodically, either using fixed periods (e.g., once an hour, once a day), or non-fixed periods, or even randomly. In other embodiments, the modification occurs when a certain number of interactions have been recorded (e.g., every 1000 interactions). If the periods are too short, it may be difficult to have tracked enough user interactions to be useful.
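One way to realize the batched reporting of operation 610 (e.g., "every 1000 interactions") is a counter that emits a summary only when the batch fills. The class and method names below are hypothetical:

```python
class InteractionReporter:
    """Buffers interactions and emits a summary every `batch` events,
    mirroring the count-based trigger described above."""

    def __init__(self, batch=1000):
        self.batch = batch
        self.counts = {}
        self.total = 0

    def record(self, area):
        # Returns a summary dict (area -> press count) when the batch
        # fills, otherwise None; the summary would be sent to the
        # user interface modification module.
        self.counts[area] = self.counts.get(area, 0) + 1
        self.total += 1
        if self.total % self.batch == 0:
            return dict(self.counts)
        return None
```

A time-based trigger (once an hour, once a day) could be layered on top of the same counts without changing the recording path.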
  • FIG. 7 is a flow diagram illustrating a method 700, in accordance with an example embodiment, of dynamically altering a user interface. At operation 702, a user interface is presented to a user. At operation 704, user interaction with one or more elements of the user interface is measured. At operation 706, one or more of the elements of the user interface are dynamically modified based on the measured user interaction.
  • FIG. 8 is a flow diagram illustrating a method 800, in accordance with another example embodiment, of dynamically altering a user interface. At operation 802, information regarding user interaction with one or more elements of the user interface is received from an electronic device. At operation 804, one or more of the elements of the user interface are dynamically modified based on the measured user interaction. At operation 806, the dynamically modified user interface is returned to the electronic device for display.
  • FIG. 9 shows a diagrammatic representation of a machine in the example form of a computer system within which a set of instructions for causing the machine to perform any one or more of the methodologies discussed herein may be executed. In alternative embodiments, the machine operates as a standalone device or may be connected (e.g., networked) to other machines. In a networked deployment, the machine may operate in the capacity of a server or a client machine in a server-client network environment, or as a peer machine in a peer-to-peer (or distributed) network environment. The machine may be a server computer, a client computer, a personal computer (PC), a tablet PC, a set-top box (STB), a personal digital assistant (PDA), a cellular telephone, a web appliance, a network router, switch or bridge, or any machine capable of executing a set of instructions (sequential or otherwise) that specify actions to be taken by that machine. Further, while only a single machine is illustrated, the term “machine” shall also be taken to include any collection of machines that individually or jointly execute a set (or multiple sets) of instructions to perform any one or more of the methodologies discussed herein.
  • The example computer system 900 includes a processor 902 (e.g., a central processing unit (CPU), a graphics processing unit (GPU), or both), a main memory 904 and a static memory 906, which communicate with each other via a bus 908. The computer system 900 may further include a video display unit 910 (e.g., a liquid crystal display (LCD) or a cathode ray tube (CRT)). The computer system 900 also includes an alphanumeric input device 912 (e.g., a keyboard), a cursor control device 914 (e.g., a mouse), a disk drive unit 916, a signal generation device 918 (e.g., a speaker), and a network interface device 920.
  • The disk drive unit 916 includes a machine-readable medium 922 on which is stored one or more sets of instructions 924 (e.g., software) embodying any one or more of the methodologies or functions described herein. The instructions 924 may also reside, completely or at least partially, within the main memory 904 and/or within the processor 902 during execution thereof by the computer system 900, with the main memory 904 and the processor 902 also constituting machine-readable media. The instructions 924 may further be transmitted or received over a network 926 via the network interface device 920.
  • While the machine-readable medium 922 is shown in an example embodiment to be a single medium, the term “machine-readable medium” should be taken to include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) that store the one or more sets of instructions 924. The term “machine-readable medium” shall also be taken to include any medium that is capable of storing, encoding or carrying a set of instructions for execution by the machine and that cause the machine to perform any one or more of the methodologies described herein. The term “machine-readable medium” shall accordingly be taken to include, but not be limited to, solid-state memories, optical and magnetic media, and carrier wave signals.
  • Although the inventive concepts have been described with reference to specific example embodiments, it will be evident that various modifications and changes may be made to these embodiments without departing from the broader spirit and scope of the inventive concepts. Accordingly, the specification and drawings are to be regarded in an illustrative rather than a restrictive sense.
  • The Abstract of the Disclosure is provided to comply with 37 C.F.R. §1.72(b), requiring an abstract that will allow the reader to quickly ascertain the nature of the technical disclosure. It is submitted with the understanding that it will not be used to interpret or limit the scope or meaning of the claims. In addition, in the foregoing Detailed Description, it can be seen that various features are grouped together in a single embodiment for the purpose of streamlining the disclosure. This method of disclosure is not to be interpreted as reflecting an intention that the claimed embodiments require more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive subject matter lies in less than all features of a single disclosed embodiment. Thus the following claims are hereby incorporated into the Detailed Description, with each claim standing on its own as a separate embodiment.

Claims (20)

1. A user device comprising:
a user interface having one or more elements;
a user interaction tracking module, comprising a processor, to measure user interaction with the one or more elements of the screen over a fixed period, the user interaction including finger presses on a touchscreen on which the user interface is operating, the measuring including identifying patterns over time indicating a history of the finger presses in relation to first and second elements of the screen; and
a user interface modification module to dynamically modify the user interface based both on the measured user interaction and on information about the user entered by the user, the dynamic modification including removing the first element from the screen while presenting a third element, of the same type as the first element but not identical to the first element, in the screen, and modifying a property of the second element, and further to present the dynamically modified user interface upon a future navigation by the user to the particular screen.
2. The user device of claim 1, wherein the modifying of a property of the second element includes resizing the second element.
3. The user device of claim 2, wherein the resizing includes expanding a size of the second element.
4. The user device of claim 1, wherein the information regarding the user is retrieved from a user profile.
5. The user device of claim 1, wherein the information regarding the user is the user's age.
6. The user device of claim 1, wherein the information regarding the user is the user's sex.
7. The user device of claim 1, wherein the information regarding the user is information regarding a location of the user.
8. A method of dynamically optimizing a user interface on an electronic device, the method comprising:
upon navigation by a user to a particular screen of the user interface, presenting the screen to the user, the screen including one or more elements;
measuring user interaction with the one or more elements of the screen over a fixed period, the user interaction including finger presses on a touchscreen on which the user interface is operating, the measuring including identifying patterns over time indicating a history of the finger presses in relation to first and second elements of the screen; and
dynamically modifying the user interface based both on the measured user interaction and on information about the user entered by the user, the dynamic modification including removing the first element from the screen while presenting a third element, of the same type as the first element but not identical to the first element, in the screen, and modifying a property of the second element; and
presenting the dynamically modified user interface upon a future navigation by the user to the particular screen.
9. The method of claim 8, wherein the modifying of a property of the second element includes resizing the second element.
10. The method of claim 9, wherein the resizing includes expanding a size of the second element.
11. The method of claim 8, wherein the information regarding the user is retrieved from a user profile.
12. The method of claim 8, wherein the information regarding the user is the user's age.
13. The method of claim 8, wherein the information regarding the user is the user's sex.
14. The method of claim 8, wherein the information regarding the user is information regarding a location of the user.
15. A non-transitory computer-readable storage medium comprising instructions that, when executed by at least one processor of a machine, cause the machine to perform operations comprising:
upon navigation by a user to a particular screen of the user interface, presenting the screen to the user, the screen including one or more elements;
measuring user interaction with the one or more elements of the screen over a fixed period, the user interaction including finger presses on a touchscreen on which the user interface is operating, the measuring including identifying patterns over time indicating a history of the finger presses in relation to first and second elements of the screen; and
dynamically modifying the user interface based both on the measured user interaction and on information about the user entered by the user, the dynamic modification including removing the first element from the screen while presenting a third element, of the same type as the first element but not identical to the first element, in the screen, and modifying a property of the second element; and
presenting the dynamically modified user interface upon a future navigation by the user to the particular screen.
16. The non-transitory computer-readable storage of claim 15, wherein the modifying of a property of the second element includes resizing the second element.
17. The non-transitory computer-readable storage of claim 16, wherein the resizing includes expanding a size of the second element.
18. The non-transitory computer-readable storage of claim 15, wherein the information regarding the user is retrieved from a user profile.
19. The non-transitory computer-readable storage of claim 15, wherein the information regarding the user is the user's age.
20. The non-transitory computer-readable storage of claim 15, wherein the information regarding the user is the user's sex.
US15/593,537 2012-11-20 2017-05-12 Self optimizing and reducing user experiences Abandoned US20170249071A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US15/593,537 US20170249071A1 (en) 2012-11-20 2017-05-12 Self optimizing and reducing user experiences

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US13/681,653 US9652777B2 (en) 2012-11-20 2012-11-20 Self optimizing and reducing user experiences
US15/593,537 US20170249071A1 (en) 2012-11-20 2017-05-12 Self optimizing and reducing user experiences

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US13/681,653 Continuation US9652777B2 (en) 2012-11-20 2012-11-20 Self optimizing and reducing user experiences

Publications (1)

Publication Number Publication Date
US20170249071A1 true US20170249071A1 (en) 2017-08-31

Family

ID=50729177

Family Applications (2)

Application Number Title Priority Date Filing Date
US13/681,653 Active 2033-07-30 US9652777B2 (en) 2012-11-20 2012-11-20 Self optimizing and reducing user experiences
US15/593,537 Abandoned US20170249071A1 (en) 2012-11-20 2017-05-12 Self optimizing and reducing user experiences

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US13/681,653 Active 2033-07-30 US9652777B2 (en) 2012-11-20 2012-11-20 Self optimizing and reducing user experiences

Country Status (1)

Country Link
US (2) US9652777B2 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10785310B1 (en) * 2015-09-30 2020-09-22 Open Text Corporation Method and system implementing dynamic and/or adaptive user interfaces

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10387934B1 (en) 2014-06-12 2019-08-20 Amazon Technologies, Inc. Method medium and system for category prediction for a changed shopping mission
US9767204B1 (en) * 2014-06-12 2017-09-19 Amazon Technologies, Inc. Category predictions identifying a search frequency
US9767417B1 (en) * 2014-06-12 2017-09-19 Amazon Technologies, Inc. Category predictions for user behavior
US10474670B1 (en) 2014-06-12 2019-11-12 Amazon Technologies, Inc. Category predictions with browse node probabilities
US20160321741A1 (en) * 2015-04-30 2016-11-03 Wal-Mart Stores, Inc. System and method of module-based website data organization
US10489043B2 (en) 2015-12-15 2019-11-26 International Business Machines Corporation Cognitive graphical control element
EP3656495B1 (en) * 2018-11-23 2024-04-17 Ewm Ag Optimizing user interfaces of a welding device

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6959319B1 (en) * 2000-09-11 2005-10-25 International Business Machines Corporation System and method for automatically personalizing web portals and web services based upon usage history
US20080126476A1 (en) * 2004-08-04 2008-05-29 Nicholas Frank C Method and System for the Creating, Managing, and Delivery of Enhanced Feed Formatted Content
US20130086481A1 (en) * 2011-09-29 2013-04-04 Avaya Inc. System and method for adaptive communication user interface

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020084991A1 (en) * 2001-01-04 2002-07-04 Harrison Edward R. Simulating mouse events with touch screen displays
KR20080078291A (en) * 2007-02-23 2008-08-27 엘지전자 주식회사 Method for displaying browser and terminal capable of implementing the same
US9405830B2 (en) * 2007-02-28 2016-08-02 Aol Inc. Personalization techniques using image clouds
US10139812B2 (en) * 2008-09-29 2018-11-27 Fisher-Rosemount Systems, Inc. Dynamic user interface for configuring and managing a process control system
US9720527B2 (en) * 2012-08-06 2017-08-01 Tracfone Wireless, Inc. Evolutionary touch-based graphical user interface for electronic devices

Also Published As

Publication number Publication date
US9652777B2 (en) 2017-05-16
US20140143694A1 (en) 2014-05-22

Similar Documents

Publication Publication Date Title
US20170249071A1 (en) Self optimizing and reducing user experiences
US10572928B2 (en) Method and system for recommending products based on a ranking cocktail
US11354584B2 (en) Systems and methods for trend aware self-correcting entity relationship extraction
US10922327B2 (en) Search guidance
US20200090230A1 (en) Systems and methods for suggesting creative types for online content items to an advertiser
US10409821B2 (en) Search result ranking using machine learning
US20130311340A1 (en) Systems and methods for displaying items
US10290040B1 (en) Discovering cross-category latent features
CN102947849A (en) Interactive ads
WO2020243894A1 (en) Advertisement recommending method and apparatus, and electronic device
US11756088B2 (en) Displaying listings based on listing activity
US9817846B1 (en) Content selection algorithms
CA2693675A1 (en) Contextual advertising based on user configurable preferences
KR101858133B1 (en) Saving and presenting a communication session state
US20150242082A1 (en) Networked client user interface
US9804741B2 (en) Methods and systems for managing N-streams of recommendations
US20160048855A1 (en) Multivariate testing for content discovery systems
US10762149B2 (en) System and method for inducing user activity via enhanced web content
US20240005362A1 (en) Systems and methods for dynamic link redirection
US20240013252A1 (en) Systems and methods for dynamic link redirection
JP7122286B2 (en) Decision device, decision method and decision program
AU2014203798B2 (en) Contextual advertising based on user configurable preferences

Legal Events

Date Code Title Description
AS Assignment

Owner name: EBAY INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:TAPLEY, JOHN;HIGGINS, KRYSTAL ROSE;FARRARO, ERIC J.;SIGNING DATES FROM 20121116 TO 20121119;REEL/FRAME:042352/0241

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION