WO2014008208A2 - Visual ui guide triggered by user actions - Google Patents
Visual UI guide triggered by user actions
- Publication number
- WO2014008208A2 (PCT/US2013/048978)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- application
- user interface
- help
- visual
- given action
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F9/00—Arrangements for program control, e.g. control units
- G06F9/06—Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
- G06F9/44—Arrangements for executing specific programs
- G06F9/451—Execution arrangements for user interfaces
- G06F9/453—Help systems
Definitions
- a user may want or need help discovering features or capabilities associated with an application. For example, a user may need assistance knowing what input is needed or a shortcut to accomplish a task within the application, such as checking off items in a to-do list.
- a current method for helping users to discover features includes help articles that may include text and images.
- a limitation to this approach is that when viewing a help article, a user is not in the context of an application and may not know how actions described in the article may be executed with his content.
- the user may have to manage two contexts at once: the help article user interface pane and the application user interface pane.
- screen space may be limited and, in some cases, the device may be unable to show multiple application panes at once. Additionally, describing or explaining gestures or action sequences via a help article can be difficult.
- Embodiments of the present invention solve the above and other problems by providing a visual guidance user interface to help a user learn a product's capability and inputs needed to achieve a given action.
- a visual help user interface may be launched via a trigger and may overlay a software application with a graphical user interface (GUI).
- the visual help UI may be utilized to demonstrate a feature (e.g., a gesture, functionality, behavior, etc.), to suggest or demonstrate a work flow (e.g., how to create a to-do list), or may teach a gesture (e.g., demonstrate a gesture or correct an unrecognized gesture).
- the visual help UI may be animated to imply interaction and may demonstrate a suggested workflow or input sequence using a user's content in the application GUI.
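- To make the overlay idea concrete, the following is a minimal sketch (in Python; `GhostHandAnimation`, `over_content`, and the keyframe format are hypothetical names, not part of the embodiments) of how a canned gesture demonstration might be re-targeted onto the bounds of the user's own on-screen content rather than shown in a separate help pane:

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class GhostHandAnimation:
    """Keyframes for an animated 'ghost hand' overlaid on the application GUI."""
    feature_name: str
    # (time_s, x, y): x/y are 0..1 fractions in canned animations and become
    # absolute screen coordinates after over_content() re-targets them.
    keyframes: List[Tuple[float, float, float]] = field(default_factory=list)

    def over_content(self, bounds: Tuple[float, float, float, float]) -> "GhostHandAnimation":
        """Re-target the canned gesture onto the bounds of the user's actual
        item, so the demonstration uses the user's own content."""
        left, top, right, bottom = bounds
        mid_y = (top + bottom) / 2
        return GhostHandAnimation(
            self.feature_name,
            [(t, left + fx * (right - left), mid_y) for (t, fx, _fy) in self.keyframes],
        )

# Canned horizontal swipe across a list item, like the mark-complete gesture.
SWIPE_TO_COMPLETE = GhostHandAnimation(
    "swipe to complete", [(0.0, 0.0, 0.5), (0.4, 0.5, 0.5), (0.8, 1.0, 0.5)]
)

if __name__ == "__main__":
    demo = SWIPE_TO_COMPLETE.over_content((16, 120, 304, 168))  # a task item's bounds
    for t, x, y in demo.keyframes:
        print(f"t={t:.1f}s ghost hand at ({x:.0f}, {y:.0f})")
```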
- FIGURE 1 is an illustration of an example help article in current applications;
- FIGURE 2 is an illustration of an example visual help UI displayed on a task list application GUI on a mobile phone demonstrating a gesture input;
- FIGURE 3 is an illustration of a user manually selecting to view a visual help UI through a help or "how-to" article;
- FIGURE 4 is an illustration of a visual help UI demonstrating a work flow suggestion;
- FIGURES 5A-C are illustrations of a visual help UI displayed on a map application on a mobile phone, wherein a compass or gyroscope is utilized to sense orientation of a device;
- FIGURES 6A and 6B are illustrations of a visual help UI displayed on a camera application on a mobile phone, wherein an accelerometer is utilized to sense motion;
- FIGURES 7A and 7B are illustrations of a visual help UI displayed on a notes application on a laptop computer, wherein a microphone is utilized to detect noise;
- FIGURE 8 is an illustration of a visual help UI displayed on an IP phone, wherein a microphone is utilized to detect noise;
- FIGURE 9 is a flow chart of a method for providing a visual guidance user interface to help a user learn a product's capability and inputs needed to achieve a given action;
- FIGURE 10 is a block diagram illustrating example physical components of a computing device with which embodiments of the invention may be practiced;
- FIGURES 11A and 11B are simplified block diagrams of a mobile computing device with which embodiments of the present invention may be practiced; and
- FIGURE 12 is a simplified block diagram of a distributed computing system in which embodiments of the present invention may be practiced.
- embodiments of the present invention are directed to providing a visual guidance user interface to help a user learn a product's capability and inputs needed to achieve a given action.
- Embodiments may be utilized to aid a user in discovering or learning about an application by demonstrating possible workflows or input sequences for a given action within the application.
- Embodiments do not require additional application user interface panes and may be launched via a smart trigger, via interaction with a user's content, and/or via a selection by a user to demonstrate a work flow or input sequence to complete a task.
- a current help tool solution may include a help article 106.
- the help article 106 may be displayed in a user interface pane separate from the application user interface pane 102, which may require the user to manage two contexts at once.
- on a computing device with less screen space (e.g., a mobile phone or a tablet device), multiple application user interface panes may not be opened at the same time. This, in addition to the method of instruction available in the help article 106 being limited to text and images, may make it difficult for a user to discover or understand a product's features.
- Embodiments of the present invention comprise a visual help user interface (UI) displayed over an application's graphical user interface (GUI).
- a visual help UI may be utilized for a feature discovery (e.g., a gesture, functionality, behavior, etc.), and may be triggered manually by a user or automatically.
- a user may select to view a visual help UI for a particular task or feature or, alternatively, a visual help UI may be triggered automatically. For example, a determination may be made that a user is going through extra steps to complete a task that could be accomplished via a shortcut or via a method involving fewer steps.
- a visual help UI may be displayed after a predetermined time period has elapsed without user input.
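- A rough sketch of these two automatic triggers (more steps taken than a known shortcut needs, and a predetermined idle timeout) follows; the class name, step counting, and threshold values are illustrative assumptions, not values from the embodiments:

```python
import time

class AutoHelpTrigger:
    """Decides when to auto-show a visual help UI."""
    def __init__(self, shortcut_steps: int = 1, idle_timeout_s: float = 10.0):
        self.shortcut_steps = shortcut_steps   # steps the known shortcut would need
        self.idle_timeout_s = idle_timeout_s   # predetermined idle period
        self.steps_for_current_task = 0
        self.last_input_at = time.monotonic()

    def on_user_input(self) -> bool:
        """Record one step of the user's current task; True once the user has
        taken more steps than the known shortcut requires."""
        self.last_input_at = time.monotonic()
        self.steps_for_current_task += 1
        return self.steps_for_current_task > self.shortcut_steps

    def idle_timed_out(self) -> bool:
        """True once the predetermined period elapses with no input."""
        return time.monotonic() - self.last_input_at >= self.idle_timeout_s

if __name__ == "__main__":
    trigger = AutoHelpTrigger(shortcut_steps=1, idle_timeout_s=0.2)
    # User opens the edit menu, scrolls, then taps "mark complete": three steps
    # where a single swipe would do, so the shortcut demonstration should fire.
    fired = [trigger.on_user_input() for _ in range(3)]
    print("show shortcut demo:", any(fired))
    time.sleep(0.25)
    print("show idle suggestion:", trigger.idle_timed_out())
```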
- a user may utilize a task list application 202 on a mobile computing device 200.
- the user may want to mark 206 an item 204 as completed.
- the user may go through a menu, such as selecting an edit function 208 to utilize a functionality to mark 206 through an item 204.
- the user may be unaware that the task list application 202 has a feature that allows the user to use his finger to swipe across an item 204 to mark 206 it as complete.
- a visual help UI, such as a ghost hand 210, may be triggered and displayed on the task list application 202 GUI 212 and may demonstrate an input, such as a gesture input, that can be utilized to accomplish a task.
- a gesture input may include an input made without a mechanical device (e.g., a user body movement) or with a mechanical input device (e.g., with a mouse, touchscreen, stylus, etc.), the input originating from a bodily motion that can be received, recognized, and translated into a selection and/or movement of an element or object on a graphical user interface that mimics the bodily motion.
- a visual help UI may be utilized for a work flow suggestion.
- a work flow suggestion may be triggered manually or may be triggered automatically by user action.
- a user may manually select to view a visual help UI through a help or "how-to" article 302 as illustrated in FIGURE 3.
- a help article 302 which may contain instructions on how to complete a task is displayed on a tablet computing device 300.
- a control such as a "show me" button 304 may be provided, which when selected, may launch a visual help UI (i.e., demonstration) showing a sequence of inputs to accomplish a task as detailed in the help article 302.
- the demonstration may include a ghost hand 310 selecting controls in a toolbar 308 of an application user interface pane 306 to accomplish a task, for example, changing the line spacing in a document.
- a visual help UI demonstrating a work flow suggestion may be triggered automatically.
- a user may open an application and create a new document 402 as illustrated in FIGURE 4. If the user hesitates before performing a next action, for example, a predetermined time period elapses before a subsequent action (e.g., clicking an "add title" control 404) is detected, a visual help UI may be provided, such as a ghosted animation 406 selecting a control (e.g., the "add title" control 404).
- a visual help UI, such as a floating hand or an arrow, may be triggered manually or automatically to teach or demonstrate a gesture that can be utilized in an application.
- a visual help UI may provide gesture feedback. For example, a determination may be made that a user is inputting a gesture that is not a recognized gesture but is identified as being close to a recognized gesture.
- a visual help UI may be displayed showing the user the recognized gesture.
- a visual help UI may be displayed showing a horizontal swiping motion through the task item and showing the item being marked as complete.
- a sensitivity level of triggering a visual help UI may be adjustable.
- a visual help UI may be displayed upon a determination of any incorrect gesture, may be displayed automatically upon detecting an input of an unrecognized gesture after a predetermined number of times, may be displayed once, or may be displayed a predetermined number of times.
- a visual help UI may be a feature that may be toggled on or off.
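- The gesture-feedback behavior above might be sketched as follows, under the assumption that gestures are reduced to a single stroke direction and compared by cosine similarity; the `sensitivity` threshold, `max_displays` count, and on/off toggle stand in for the adjustable settings described, and all names and values are hypothetical:

```python
import math

RECOGNIZED = {"swipe_right": (1.0, 0.0), "swipe_left": (-1.0, 0.0)}

def direction(points):
    """Unit direction of a stroke from its first to last sample point."""
    (x0, y0), (x1, y1) = points[0], points[-1]
    dx, dy = x1 - x0, y1 - y0
    norm = math.hypot(dx, dy) or 1.0
    return dx / norm, dy / norm

class GestureCoach:
    def __init__(self, sensitivity=0.8, max_displays=3, enabled=True):
        self.sensitivity = sensitivity    # cosine similarity deemed "close"
        self.max_displays = max_displays  # show help at most this many times
        self.enabled = enabled            # the on/off toggle
        self.displays = 0

    def on_unrecognized_stroke(self, points):
        """Called after the recognizer rejects a stroke; returns the name of a
        close recognized gesture to demonstrate, or None."""
        if not self.enabled or self.displays >= self.max_displays:
            return None
        ux, uy = direction(points)
        best, score = None, self.sensitivity
        for name, (gx, gy) in RECOGNIZED.items():
            cos = ux * gx + uy * gy   # cosine of the angle between directions
            if cos >= score:
                best, score = name, cos
        if best:
            self.displays += 1
        return best

if __name__ == "__main__":
    coach = GestureCoach()
    # A slightly diagonal swipe: rejected by the recognizer, but close enough
    # to swipe_right that the visual help UI demonstrates that gesture.
    print(coach.on_unrecognized_stroke([(10, 100), (120, 80)]))
```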
- Embodiments of the present invention may be applied to various software applications and may be utilized with various input methods.
- while the examples illustrated in the figures show touch-based UIs on mobile 200 and tablet 300 devices, embodiments may be utilized on a vast array of devices including, but not limited to, desktop computer systems, wired and wireless computing systems, mobile computing systems (e.g., mobile telephones, netbooks, tablet or slate type computers, notebook computers, and laptop computers), hand-held devices, IP telephones, gaming devices, cameras, multiprocessor systems, microprocessor-based or programmable consumer electronics, minicomputers, and mainframe computers.
- a visual help UI may include, but is not restricted to, an arrow or focus indicator, a ghosted hand, finger, or stylus, an animated figure or avatar, highlighting, audio, or an indication of a touch or a selection.
- Visuals may be animated to imply interaction or emphasis and may manipulate or show a suggested workflow or input sequence using a user's actual content.
- a visual help UI may be activated by various triggers including, but not limited to, a user action detected by a device or sensor, a selection of a control or command sequence, or an explicit request by a user.
- a trigger activated by a device or sensor may include various types of sensors including, but not limited to, a digitizer, a gyroscope, a compass, an accelerometer, a microphone, a light sensor, a proximity sensor, a near field communications (NFC) sensor, a GPS, etc.
- a digitizer is an electronic component that may be utilized to receive input via sensing a touch of a human finger, stylus, or other input device on a touch screen interface.
- a digitizer, for example, may be utilized to sense when a user makes or repeats an unrecognized gesture on a screen.
- a compass or gyroscope, for example, may be utilized to sense the orientation of a device, for navigation, etc.
- a user may utilize a mobile phone 200 or a GPS device for a map or GPS application (e.g., an augmented reality application).
- a user may be holding the device 200 so that a map is displayed in a portrait orientation 500 as illustrated in FIGURE 5A, which may not be an ideal orientation for utilization of the application.
- a compass or gyroscope may be utilized to sense the orientation of the device 200 and, as illustrated in FIGURE 5B, a visual help UI, such as a rotation arrow 502 and/or text, may be displayed on the map interface to suggest to the user to rotate the device 200.
- the user may turn the device 200 as illustrated in FIGURE 5C, wherein the map may be displayed in a landscape orientation 504, providing a better display for utilization of the application.
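- A minimal sketch of this orientation trigger, assuming the platform reports a roll angle in degrees from the compass/gyroscope; the function names, the 45-degree bands, and the hint payload are all illustrative:

```python
def orientation_from_roll(roll_deg: float) -> str:
    """Classify device orientation from the roll angle."""
    return "landscape" if 45 <= abs(roll_deg) % 180 <= 135 else "portrait"

def rotation_hint(roll_deg: float, preferred: str = "landscape"):
    """Return a visual-help payload when the device is held against the
    application's preferred orientation, else None."""
    current = orientation_from_roll(roll_deg)
    if current != preferred:
        return {"overlay": "rotation_arrow", "text": f"Rotate to {preferred} for a wider map"}
    return None

if __name__ == "__main__":
    print(rotation_hint(2.0))    # held in portrait -> suggest rotating
    print(rotation_hint(90.0))   # already landscape -> None
```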
- An accelerometer may be utilized to detect movement or stability of a device.
- an accelerometer may be utilized to detect that a device is not stable and may be in use in a car or on a bus.
- Embodiments may provide a visual help UI to suggest to the user to use voice input and may show the user where to click to initiate the voice input.
- a user may utilize a mobile phone 200, a camera, or other type of device for taking a picture.
- An accelerometer on the device 200 may be utilized to detect that the device 200 is not being held steadily, resulting in a blurred photograph 602.
- a visual help UI such as a ghosted hand 604 displayed on the GUI, may be provided demonstrating to the user that a stabilization function 606 may be utilized to take a better photograph.
- the user may choose to select the stabilization function 606 as shown in FIGURE 6B, resulting in a better output 608 for the camera application functionality.
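- The stability check might look like the following sketch, which flags the device as unsteady when the spread of recent accelerometer magnitudes exceeds a threshold; the window contents, threshold, and hint payload are illustrative assumptions:

```python
import statistics

def is_unsteady(magnitudes, threshold=0.5):
    """True when the spread of recent accelerometer magnitudes (in m/s^2)
    suggests the device is not being held steadily."""
    return len(magnitudes) >= 2 and statistics.stdev(magnitudes) > threshold

def stabilization_hint(magnitudes):
    """Suggest the stabilization function when shake is detected."""
    if is_unsteady(magnitudes):
        return {"overlay": "ghost_hand", "target": "stabilization_button"}
    return None

if __name__ == "__main__":
    steady = [9.80, 9.81, 9.79, 9.82]    # ~gravity only: held steadily
    shaky = [9.2, 10.9, 8.7, 11.4, 9.0]  # hand shake while framing a shot
    print(stabilization_hint(steady))    # None
    print(stabilization_hint(shaky))     # points at the stabilization function
```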
- a microphone may be utilized to detect audio.
- a microphone 702 on a device, such as a laptop computer 700 running an application such as a notes application 704, may be utilized to detect noise, such as a voice in the environment.
- a detection of a voice may trigger a visual help UI, for example, a mouse pointer 708 displayed selecting a record audio functionality 706.
- a microphone 702 on a device, such as an IP telephone 800, may be utilized to detect noise, such as a user speaking into the microphone during a conference call while the telephone is on mute.
- Embodiments may detect that the user is talking 802 into the telephone 800 while the telephone is on mute and may suggest to the user via a visual help UI, for example, a visual notification 804 that the telephone is on mute, to unmute the telephone.
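- A sketch of this mute reminder, assuming microphone frames arrive as lists of samples and speech is detected by a simple RMS-energy threshold; the threshold, frame counts, and message text are illustrative:

```python
import math

def rms(frame):
    """Root-mean-square energy of one audio frame."""
    return math.sqrt(sum(s * s for s in frame) / len(frame))

def mute_reminder(frames, muted: bool, speech_rms=0.1, min_frames=3):
    """Return a notification when enough voiced frames arrive while the
    telephone is on mute."""
    if not muted:
        return None
    voiced = sum(1 for f in frames if rms(f) > speech_rms)
    return "You are on mute" if voiced >= min_frames else None

if __name__ == "__main__":
    quiet = [[0.01, -0.02, 0.01]] * 5
    speech = [[0.3, -0.4, 0.2]] * 5
    print(mute_reminder(quiet, muted=True))    # None: no one is talking
    print(mute_reminder(speech, muted=True))   # "You are on mute"
    print(mute_reminder(speech, muted=False))  # None: not muted, no reminder
```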
- a light sensor may be utilized to detect if a functionality such as a flash on a camera or a speakerphone on a mobile phone should be used while a user is utilizing the device for a certain application.
- a light sensor on a mobile phone 200 may detect that, while a user is using a camera application, the amount of light may produce an underexposed photograph. This detection may trigger a visual help UI to suggest turning on a flash on the device 200.
- a GPS may be utilized to detect that a user is travelling and trigger a visual help UI.
- a GPS may detect that a user is driving while the user has an application open, for example, a food finder application, on his mobile phone 200.
- a visual help UI may be triggered to suggest using a "local feature" on the application to find nearby restaurants.
- a proximity sensor may be utilized to detect a distance between a device and another object (e.g., the distance between a mobile phone 200 and a user's face). For example, a user may use a front-facing camera on a mobile phone 200 to chat with someone. A proximity sensor may be used to detect if the phone is being held too closely to the user's face. A visual help UI may be triggered to suggest holding the phone 200 further away.
- a near field communications (NFC) sensor may be utilized to detect other objects and may also be utilized to facilitate an exchange of data between NFC-capable devices.
- a user may use a mobile phone 200 to pay for a cup of coffee at a coffee shop.
- An NFC sensor may be used to detect if a user does not hold his phone 200 over a payment sensor long enough for a transaction to complete.
- a visual help UI may be triggered to inform the user to hold his phone over the payment sensor longer or may warn the user that the transaction has not completed.
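- The NFC feedback above might be sketched as a simple duration check; the required field time, parameter names, and messages are illustrative assumptions:

```python
def nfc_tap_feedback(in_field_s: float, required_s: float = 1.5,
                     transaction_done: bool = False):
    """Map an NFC tap's duration and outcome to a visual-help message."""
    if transaction_done:
        return None
    if in_field_s < required_s:
        return "Hold your phone over the payment sensor a little longer"
    return "The transaction did not complete - please try again"

if __name__ == "__main__":
    print(nfc_tap_feedback(0.4))                         # pulled away too soon
    print(nfc_tap_feedback(2.0))                         # held long enough but failed
    print(nfc_tap_feedback(2.0, transaction_done=True))  # success: no help needed
```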
- Referring now to FIGURE 9, a flow chart of a method 900 for providing a visual guidance user interface for helping a user learn a product's capability and inputs needed to achieve a given action is illustrated.
- the method 900 starts at OPERATION 905 and proceeds to OPERATION 910 where an application is opened.
- the application may be one of various types of applications with a graphical user interface.
- at OPERATION 915, an input is received.
- an input may include a user action detected by a device or a sensor on a device, a selection of a button, functionality, or command sequence on a device, or may be a request by a user for help.
- a user action detected by a device may include, but is not limited to, a digitizer sensing a gesture made on the device, a gyroscope or compass sensing an orientation of the device, an accelerometer sensing motion of the device, a microphone sensing noise, a light sensor sensing light, a proximity sensor sensing proximity of a user to the device, a near field communications (NFC) sensor detecting other NFC-capable devices and facilitating an exchange of data between devices, or a GPS sensing a position of the device.
- a gesture input may include an input made without a mechanical device (e.g., a user body movement) or with a mechanical input device (e.g., with a mouse, touchscreen, stylus, etc.), the input originating from a bodily motion that can be received, recognized, and translated into a selection and/or movement of an element or object on a graphical user interface that mimics the bodily motion.
- a selection of a button, functionality, or command sequence on a device may include, for example, a detection of a sequence of commands a user enters to accomplish a task or a hesitation or elapse of a threshold of time after opening an application or after completing a task.
- a request by a user for help may include, but is not limited to, a selection of a "show me" button 304 in a help or "how-to" article 302 as illustrated in FIGURE 3, selecting a "help" functionality command, or a selection of a "turn help tips on" feature in a settings or help menu.
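- As a sketch of this explicit-request path, a "show me" control might replay a scripted input sequence through a playback callback that animates the ghost hand; the script format, step names, and article identifier are hypothetical:

```python
# Scripted demonstrations keyed by help-article identifier (illustrative).
SHOW_ME_SCRIPTS = {
    "change-line-spacing": [
        ("tap", "toolbar.format"),
        ("tap", "menu.line_spacing"),
        ("tap", "option.1_5_lines"),
    ],
}

def on_show_me_clicked(article_id: str, play_step) -> bool:
    """Look up the demonstration for a help article and replay it step by
    step through the supplied playback callback."""
    script = SHOW_ME_SCRIPTS.get(article_id)
    if script is None:
        return False
    for action, target in script:
        play_step(action, target)  # e.g., animate a ghost hand tapping `target`
    return True

if __name__ == "__main__":
    on_show_me_clicked("change-line-spacing",
                       lambda action, target: print(f"ghost hand: {action} {target}"))
```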
- the method 900 proceeds to OPERATION 920, where a determination is made whether the input received in OPERATION 915 meets a criterion for triggering a visual help UI.
- at OPERATION 925, the visual help UI is displayed.
- the visual help UI may be displayed over the application GUI and may demonstrate a feature, workflow, or gesture using the current application being utilized and using the user's content.
- the visual help UI may demonstrate a feature on a device, such as a gesture, functionality, or behavior.
- the visual help UI may suggest a work flow, for example, how to create a to-do list or a suggested next step after a pause or hesitation is detected at OPERATION 915.
- the visual help UI may teach a gesture language. For example, a gesture such as a swipe across a task item in a to-do list may be demonstrated.
- if a digitizer on a device detects that a user is making a gesture that is not recognized, at OPERATION 925 a demonstration of a recognized gesture determined to be similar to the gesture made by the user may be displayed.
- the visual help UI may include, but is not restricted to, an arrow or focus indicator, a ghosted hand, finger, or stylus, an animated figure or avatar, highlighting, audio, or an indication of a touch or a selection.
- Visuals may be animated to imply interaction or emphasis and may manipulate or show a suggested workflow or input sequence using a user's actual content. The method ends at OPERATION 995.
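- Method 900 as a whole might be sketched as the following event loop, where each trigger criterion is a simple predicate checked at OPERATION 920; the event kinds and criteria shown are illustrative stand-ins, not the embodiments' actual determinations:

```python
# Illustrative trigger criteria: each maps an input kind to a predicate.
TRIGGER_CRITERIA = {
    "unrecognized_gesture": lambda p: p.get("repeat_count", 0) >= 2,
    "idle":                 lambda p: p.get("seconds", 0) >= 10,
    "help_request":         lambda p: True,  # explicit requests always qualify
}

def method_900(events):
    """OPERATION 910 is the application opening; each event is the input
    received at OPERATION 915; OPERATION 920 checks the criterion; OPERATION
    925 displays the visual help UI."""
    shown = []
    for kind, payload in events:                        # OPERATION 915
        criterion = TRIGGER_CRITERIA.get(kind)
        if criterion and criterion(payload):            # OPERATION 920
            shown.append(f"visual help UI for {kind}")  # OPERATION 925
    return shown

if __name__ == "__main__":
    print(method_900([
        ("unrecognized_gesture", {"repeat_count": 1}),  # not yet
        ("unrecognized_gesture", {"repeat_count": 2}),  # demonstrate the gesture
        ("idle", {"seconds": 12}),                      # suggest a next step
    ]))
```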
- the embodiments and functionalities described herein may operate via a multitude of computing systems including, without limitation, desktop computer systems, wired and wireless computing systems, mobile computing systems (e.g., mobile telephones, netbooks, tablet or slate type computers, notebook computers, and laptop computers), hand-held devices, IP phones, gaming devices, multiprocessor systems, microprocessor-based or programmable consumer electronics, minicomputers, and mainframe computers.
- in addition, the embodiments and functionalities described herein may operate over distributed systems (e.g., cloud-based computing systems), where application functionality, memory, data storage and retrieval, and various processing functions may be operated remotely from each other over a distributed computing network, such as the Internet or an intranet.
- User interfaces and information of various types may be displayed via on-board computing device displays or via remote display units associated with one or more computing devices. For example, user interfaces and information of various types may be displayed and interacted with on a wall surface onto which they are projected. Interaction with the multitude of computing systems with which embodiments of the invention may be practiced includes keystroke entry, touch screen entry, voice or other audio entry, gesture entry where an associated computing device is equipped with detection (e.g., camera) functionality for capturing and interpreting user gestures for controlling the functionality of the computing device, and the like.
- gesture entry may also include an input made with a mechanical input device (e.g., with a mouse, touchscreen, stylus, etc.), the input originating from a bodily motion that can be received, recognized, and translated into a selection and/or movement of an element or object on a graphical user interface that mimics the bodily motion.
- FIGURES 10 through 12 and the associated descriptions provide a discussion of a variety of operating environments in which embodiments of the invention may be practiced.
- the devices and systems illustrated and discussed with respect to FIGURES 10 through 12 are for purposes of example and illustration and are not limiting of a vast number of computing device configurations that may be utilized for practicing embodiments of the invention, described herein.
- FIGURE 10 is a block diagram illustrating example physical components (i.e., hardware) of a computing device 1000 with which embodiments of the invention may be practiced.
- the computing device components described below may be suitable for the computing devices described above.
- the computing device 1000 may include at least one processing unit 1002 and a system memory 1004.
- the system memory 1004 may comprise, but is not limited to, volatile storage (e.g., random access memory), nonvolatile storage (e.g., read-only memory), flash memory, or any combination of such memories.
- the system memory 1004 may include an operating system 1005 and one or more program modules 1006 suitable for running software applications 1020 such as a visual help UI application 1050.
- the operating system 1005 may be suitable for controlling the operation of the computing device 1000.
- embodiments of the invention may be practiced in conjunction with a graphics library, other operating systems, or any other application program and is not limited to any particular application or system.
- This basic configuration is illustrated in FIGURE 10 by those components within a dashed line 1008.
- the computing device 1000 may have additional features or functionality.
- the computing device 1000 may also include additional data storage devices (removable and/or non-removable) such as, for example, magnetic disks, optical disks, or tape.
- additional storage is illustrated in FIGURE 10 by a removable storage device 1009 and a non-removable storage device 1010.
- a number of program modules and data files may be stored in the system memory 1004.
- the program modules 1006, such as the visual help UI application 1050 may perform processes including, for example, one or more of the stages of the method 900.
- the aforementioned process is an example, and the processing unit 1002 may perform other processes.
- Other program modules that may be used in accordance with embodiments of the present invention may include electronic mail and contacts applications, word processing applications, database applications, slide presentation applications, drawing or computer-aided application programs, etc. Although described herein as being performed by the visual help UI application 1050, embodiments may apply to other applications as well.
- embodiments of the invention may be practiced in an electrical circuit comprising discrete electronic elements, packaged or integrated electronic chips containing logic gates, a circuit utilizing a microprocessor, or on a single chip containing electronic elements or microprocessors.
- embodiments of the invention may be practiced via a system-on-a-chip (SOC) where each or many of the components illustrated in FIGURE 10 may be integrated onto a single integrated circuit.
- SOC system-on-a-chip
- Such an SOC device may include one or more processing units, graphics units, communications units, system virtualization units and various application functionality all of which are integrated (or "burned") onto the chip substrate as a single integrated circuit.
- the functionality described herein with respect to the visual help UI application 1050 may be operated via application-specific logic integrated with other components of the computing device 1000 on the single integrated circuit (chip).
- Embodiments of the invention may also be practiced using other technologies capable of performing logical operations such as, for example, AND, OR, and NOT, including but not limited to mechanical, optical, fluidic, and quantum technologies.
- embodiments of the invention may be practiced within a general purpose computer or in any other circuits or systems.
- the computing device 1000 may also have one or more input device(s) 1012 such as a keyboard, a mouse, a pen, a sound input device, a touch input device, a microphone, a gesture recognition device, etc.
- output device(s) 1014, such as a display, speakers, a printer, etc., may also be included.
- the aforementioned devices are examples and others may be used.
- the computing device 1000 may include one or more communication connections 1016 allowing communications with other computing devices 1018. Examples of suitable communication connections 1016 include, but are not limited to, RF transmitter, receiver, and/or transceiver circuitry; universal serial bus (USB), parallel, or serial ports, and other connections appropriate for use with the applicable computer readable media.
- USB universal serial bus
- Embodiments of the invention may be implemented as a computer process (method), a computing system, or as an article of manufacture, such as a computer program product or computer readable media.
- the computer program product may be a computer storage media readable by a computer system and encoding a computer program of instructions for executing a computer process.
- Computer readable media may include computer storage media and communication media.
- Computer storage media may include volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information, such as computer readable instructions, data structures, program modules, or other data.
- the system memory 1004, the removable storage device 1009, and the non-removable storage device 1010 are all computer storage media examples (i.e., memory storage).
- Computer storage media may include, but is not limited to, RAM, ROM, electrically erasable read-only memory (EEPROM), flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store information and which can be accessed by the computing device 1000. Any such computer storage media may be part of the computing device 1000.
- Communication media may be embodied by computer readable instructions, data structures, program modules, or other data in a modulated data signal, such as a carrier wave or other transport mechanism, and includes any information delivery media.
- modulated data signal may describe a signal that has one or more characteristics set or changed in such a manner as to encode information in the signal.
- communication media may include wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, radio frequency (RF), infrared, and other wireless media.
- RF radio frequency
- FIGURES 11A and 11B illustrate a mobile computing device 1100, for example, a mobile telephone 200, a smart phone, a tablet personal computer 300, a laptop computer 700, and the like, with which embodiments of the invention may be practiced.
- With reference to FIGURE 11A, an exemplary mobile computing device 1100 for implementing the embodiments is illustrated.
- the mobile computing device 1100 is a handheld computer having both input elements and output elements.
- the mobile computing device 1100 typically includes a display 1105 and one or more input buttons 1110 that allow the user to enter information into the mobile computing device 1100.
- the display 1105 of the mobile computing device 1100 may also function as an input device (e.g., a touch screen display).
- an optional side input element 1115 allows further user input.
- the side input element 1115 may be a rotary switch, a button, or any other type of manual input element.
- the mobile computing device 1100 may incorporate more or fewer input elements.
- the display 1105 may not be a touch screen in some embodiments.
- the mobile computing device 1100 is a portable phone system, such as a cellular phone.
- the mobile computing device 1100 may also include an optional keypad 1135.
- Optional keypad 1135 may be a physical keypad or a "soft" keypad generated on the touch screen display.
- the output elements include the display 1105 for showing a graphical user interface (GUI), a visual indicator 1120 (e.g., a light emitting diode), and/or an audio transducer 1125 (e.g., a speaker).
- the mobile computing device 1100 incorporates a vibration transducer for providing the user with tactile feedback.
- the mobile computing device 1100 incorporates input and/or output ports, such as an audio input (e.g., a microphone jack), an audio output (e.g., a headphone jack), and a video output (e.g., an HDMI port) for sending signals to or receiving signals from an external device.
- FIGURE 11B is a block diagram illustrating the architecture of one embodiment of a mobile computing device. That is, the mobile computing device 1100 can incorporate a system (i.e., an architecture) 1102 to implement some embodiments.
- the system 1102 is implemented as a "smart phone" capable of running one or more applications (e.g., browser, e-mail, calendaring, contact managers, messaging clients, games, and media clients/players).
- the system 1102 is integrated as a computing device, such as an integrated personal digital assistant (PDA) and wireless phone.
- One or more application programs 1166 may be loaded into the memory 1162 and run on or in association with the operating system 1164. Examples of the application programs include phone dialer programs, e-mail programs, personal information management (PIM) programs, word processing programs, spreadsheet programs, Internet browser programs, messaging programs, and so forth.
- the system 1102 also includes a non-volatile storage area 1168 within the memory 1162. The non-volatile storage area 1168 may be used to store persistent information that should not be lost if the system 1102 is powered down.
- the application programs 1166 may use and store information in the non-volatile storage area 1168, such as e-mail or other messages used by an e-mail application, and the like.
- a synchronization application (not shown) also resides on the system 1102 and is programmed to interact with a corresponding synchronization application resident on a host computer to keep the information stored in the non-volatile storage area 1168 synchronized with corresponding information stored at the host computer.
- other applications may be loaded into the memory 1162 and run on the mobile computing device 1100, including the visual help UI application 1050 described herein.
- the system 1102 has a power supply 1170, which may be implemented as one or more batteries.
- the power supply 1170 might further include an external power source, such as an AC adapter or a powered docking cradle that supplements or recharges the batteries.
- the system 1102 may also include a radio 1172 that performs the function of transmitting and receiving radio frequency communications.
- the radio 1172 facilitates wireless connectivity between the system 1102 and the "outside world", via a communications carrier or service provider. Transmissions to and from the radio 1172 are conducted under control of the operating system 1164. In other words, communications received by the radio 1172 may be disseminated to the application programs 1166 via the operating system 1164, and vice versa.
- the radio 1172 allows the system 1102 to communicate with other computing devices, such as over a network.
- the radio 1172 is one example of communication media.
- Communication media may typically be embodied by computer readable instructions, data structures, program modules, or other data in a modulated data signal, such as a carrier wave or other transport mechanism, and includes any information delivery media.
- modulated data signal means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal.
- communication media includes wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, RF, infrared and other wireless media.
- This embodiment of the system 1102 provides notifications using the visual indicator 1120 that can be used to provide visual notifications and/or an audio interface 1174 producing audible notifications via the audio transducer 1125.
- the visual indicator 1120 is a light emitting diode (LED) and the audio transducer 1125 is a speaker.
- the LED may be programmed to remain on indefinitely until the user takes action to indicate the powered-on status of the device.
- the audio interface 1174 is used to provide audible signals to and receive audible signals from the user.
- the audio interface 1174 may also be coupled to a microphone 702 to receive audible input, such as to facilitate a telephone conversation.
- the microphone may also serve as an audio sensor to facilitate control of notifications, as will be described below.
- the system 1102 may further include a video interface 1176 that enables an operation of an on-board camera 1130 to record still images, video stream, and the like.
- a mobile computing device 1100 implementing the system 1102 may have additional features or functionality.
- the mobile computing device 1100 may also include additional data storage devices (removable and/or non-removable) such as, magnetic disks, optical disks, or tape.
- additional storage is illustrated in Figure 11B by the non-volatile storage area 1168.
- Computer storage media may include volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information, such as computer readable instructions, data structures, program modules, or other data.
- Data/information generated or captured by the mobile computing device 1100 and stored via the system 1102 may be stored locally on the mobile computing device 1100, as described above, or the data may be stored on any number of storage media that may be accessed by the device via the radio 1172 or via a wired connection between the mobile computing device 1100 and a separate computing device associated with the mobile computing device 1100, for example, a server computer in a distributed computing network, such as the Internet.
- a server computer in a distributed computing network such as the Internet.
- data/information may be accessed via the mobile computing device 1100 via the radio 1172 or via a distributed computing network.
- data/information may be readily transferred between computing devices for storage and use according to well-known data/information transfer and storage means, including electronic mail and collaborative data/information sharing systems.
- FIGURE 12 illustrates one embodiment of the architecture of a system for providing the visual help UI application 1050 to one or more client devices, as described above.
- Content developed, interacted with or edited in association with the visual help UI application 1050 may be stored in different communication channels or other storage types.
- various documents may be stored using a directory service 1222, a web portal 1224, a mailbox service 1226, an instant messaging store 1228, or a social networking site 1230.
- the visual help UI application 1050 may use any of these types of systems or the like for providing a visual guidance user interface for helping a user learn a product's capability and inputs needed to achieve a given action, as described herein.
- a server 1220 may provide the visual help UI application 1050 to clients.
- the server 1220 may be a web server providing the visual help UI application 1050 over the web.
- the server 1220 may provide the visual help UI application 1050 over the web to clients through a network 1215.
- the client computing device 1218 may be implemented as the computing device 1000 and embodied in a personal computer 1218a, a tablet computing device 1218b and/or a mobile computing device 1218c (e.g., a smart phone). Any of these embodiments of the client computing device 1218 may obtain content from the store 1216.
- the types of networks used for communication between the computing devices that make up the present invention include, but are not limited to, an internet, an intranet, wide area networks (WAN), local area networks (LAN), and virtual private networks (VPN).
- the networks include the enterprise network and the network through which the client computing device accesses the enterprise network (i.e., the client network).
- the client network is part of the enterprise network.
- the client network is a separate network accessing the enterprise network through externally available entry points, such as a gateway, a remote access protocol, or a public or private internet address.
Landscapes
- Engineering & Computer Science (AREA)
- Software Systems (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201380035458.4A CN104428749A (zh) | 2012-07-02 | 2013-07-01 | Visual UI guide triggered by user actions |
EP13739550.5A EP2867765A2 (en) | 2012-07-02 | 2013-07-01 | Visual ui guide triggered by user actions |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/539,849 | 2012-07-02 | ||
US13/539,849 US20140006944A1 (en) | 2012-07-02 | 2012-07-02 | Visual UI Guide Triggered by User Actions |
Publications (2)
Publication Number | Publication Date |
---|---|
WO2014008208A2 true WO2014008208A2 (en) | 2014-01-09 |
WO2014008208A3 WO2014008208A3 (en) | 2014-02-27 |
Family
ID=48808514
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/US2013/048978 WO2014008208A2 (en) | 2012-07-02 | 2013-07-01 | Visual ui guide triggered by user actions |
Country Status (4)
Country | Link |
---|---|
US (1) | US20140006944A1 (en) |
EP (1) | EP2867765A2 (en) |
CN (1) | CN104428749A (zh) |
WO (1) | WO2014008208A2 (zh) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9924313B1 (en) | 2017-02-23 | 2018-03-20 | International Business Machines Corporation | Location based generation of pertinent information |
WO2019093716A1 (ko) * | 2017-11-10 | 2019-05-16 | Samsung Electronics Co., Ltd. | Electronic apparatus and control method thereof |
Families Citing this family (33)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102012014910A1 (de) * | 2012-07-27 | 2014-01-30 | Volkswagen Aktiengesellschaft | Operating interface, method for displaying information facilitating operation of an operating interface, and program |
JP6188405B2 (ja) * | 2013-05-01 | 2017-08-30 | Canon Inc. | Display control device, display control method, and program |
RU2688390C1 (ru) * | 2013-12-20 | 2019-05-21 | Philip Morris Products S.A. | Smoking article having a filter containing a capsule |
US9390726B1 (en) | 2013-12-30 | 2016-07-12 | Google Inc. | Supplementing speech commands with gestures |
US9213413B2 (en) | 2013-12-31 | 2015-12-15 | Google Inc. | Device interaction with spatially aware gestures |
KR102155091B1 (ko) * | 2014-01-22 | 2020-09-11 | LG Electronics Inc. | Mobile terminal and control method thereof |
US9715875B2 (en) | 2014-05-30 | 2017-07-25 | Apple Inc. | Reducing the need for manual start/end-pointing and trigger phrases |
US9529605B2 (en) | 2014-08-27 | 2016-12-27 | Microsoft Technology Licensing, Llc | Customizing user interface indicators based on prior interactions |
US10496420B2 (en) * | 2014-12-02 | 2019-12-03 | Cerner Innovation, Inc. | Contextual help within an application |
US20160180352A1 (en) * | 2014-12-17 | 2016-06-23 | Qing Chen | System Detecting and Mitigating Frustration of Software User |
US20160179798A1 (en) * | 2014-12-19 | 2016-06-23 | Blackboard Inc. | Method and system for navigating through a datacenter hierarchy in a mobile computer device |
KR20160101605A (ko) * | 2015-02-17 | 2016-08-25 | Samsung Electronics Co., Ltd. | Gesture input processing method and electronic device supporting the same |
US9830249B2 (en) * | 2015-03-04 | 2017-11-28 | International Business Machines Corporation | Preemptive trouble shooting using dialog manager |
US20160306503A1 (en) * | 2015-04-16 | 2016-10-20 | Vmware, Inc. | Workflow Guidance Widget with State-Indicating Buttons |
US10606466B2 (en) * | 2015-07-13 | 2020-03-31 | Facebook, Inc. | Presenting additional content to an online system user based on user interaction with a scrollable content unit |
US10455049B2 (en) | 2015-07-13 | 2019-10-22 | Facebook, Inc. | Presenting content to an online system user based on content presented by a scrollable content unit |
CN114896015A (zh) * | 2015-09-23 | 2022-08-12 | 尹特根埃克斯有限公司 | System and method for real-time help |
US9910641B2 (en) | 2015-10-14 | 2018-03-06 | Microsoft Technology Licensing, Llc | Generation of application behaviors |
US11093210B2 (en) * | 2015-10-28 | 2021-08-17 | Smule, Inc. | Wireless handheld audio capture device and multi-vocalist method for audiovisual media application |
US10146397B2 (en) | 2015-11-27 | 2018-12-04 | International Business Machines Corporation | User experience steering |
CN113467868B (zh) * | 2016-08-26 | 2023-12-15 | Chengdu Huawei Technologies Co., Ltd. | Method and apparatus for creating device resources |
US9843672B1 (en) | 2016-11-14 | 2017-12-12 | Motorola Mobility Llc | Managing calls |
US9843673B1 (en) * | 2016-11-14 | 2017-12-12 | Motorola Mobility Llc | Managing calls |
US10671602B2 (en) | 2017-05-09 | 2020-06-02 | Microsoft Technology Licensing, Llc | Random factoid generation |
CN114968453A (zh) * | 2017-09-30 | 2022-08-30 | Huawei Technologies Co., Ltd. | Display method, mobile terminal, and graphical user interface |
US10733000B1 (en) * | 2017-11-21 | 2020-08-04 | Juniper Networks, Inc | Systems and methods for providing relevant software documentation to users |
US11093118B2 (en) | 2019-06-05 | 2021-08-17 | International Business Machines Corporation | Generating user interface previews |
US10956015B1 (en) | 2019-09-11 | 2021-03-23 | International Business Machines Corporation | User notification based on visual trigger event |
US11150923B2 (en) * | 2019-09-16 | 2021-10-19 | Samsung Electronics Co., Ltd. | Electronic apparatus and method for providing manual thereof |
US11900320B2 (en) * | 2020-11-09 | 2024-02-13 | Accenture Global Solutions Limited | Utilizing machine learning models for identifying a subject of a query, a context for the subject, and a workflow |
US12061470B2 (en) | 2020-12-04 | 2024-08-13 | UiPath, Inc. | Guided operation by robotic processes |
CN114461063B (zh) * | 2022-01-18 | 2022-09-20 | 深圳时空科技集团有限公司 | Human-computer interaction method based on a vehicle-mounted screen |
US11995457B2 (en) * | 2022-06-03 | 2024-05-28 | Apple Inc. | Digital assistant integration with system interface |
Family Cites Families (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20050138559A1 (en) * | 2003-12-19 | 2005-06-23 | International Business Machines Corporation | Method, system and computer program for providing interactive assistance in a computer application program |
US7346846B2 (en) * | 2004-05-28 | 2008-03-18 | Microsoft Corporation | Strategies for providing just-in-time user assistance |
KR101304461B1 (ko) * | 2006-12-04 | 2013-09-04 | Samsung Electronics Co., Ltd. | Method and apparatus for gesture-based user interface |
US8196042B2 (en) * | 2008-01-21 | 2012-06-05 | Microsoft Corporation | Self-revelation aids for interfaces |
US9053088B2 (en) * | 2008-03-31 | 2015-06-09 | Qualcomm Incorporated | Displaying mnemonic abbreviations for commands |
US8418085B2 (en) * | 2009-05-29 | 2013-04-09 | Microsoft Corporation | Gesture coach |
US10387175B2 (en) * | 2009-10-23 | 2019-08-20 | Autodesk, Inc. | Method and system for providing software application end-users with contextual access to text and video instructional information |
US20110191675A1 (en) * | 2010-02-01 | 2011-08-04 | Nokia Corporation | Sliding input user interface |
US9063757B2 (en) * | 2010-04-06 | 2015-06-23 | Microsoft Technology Licensing, Llc | Interactive application assistance, such as for web applications |
TWI438675B (zh) * | 2010-04-30 | 2014-05-21 | Ibm | Method, apparatus, and computer program product for providing context-aware help content |
US8751419B2 (en) * | 2010-11-19 | 2014-06-10 | Shipjo, Llc | Shipping system and method with taxonomic tariff harmonization |
US8990689B2 (en) * | 2011-02-03 | 2015-03-24 | Sony Corporation | Training for substituting touch gestures for GUI or hardware keys to control audio video play |
US9602453B2 (en) * | 2011-02-10 | 2017-03-21 | International Business Machines Corporation | Smart attachment to electronic messages |
US8630822B2 (en) * | 2011-02-11 | 2014-01-14 | International Business Machines Corporation | Data center design tool |
US9129324B2 (en) * | 2011-10-05 | 2015-09-08 | The Okanjo Company, Llc | Social platform ecommerce system and method of operation |
- 2012
  - 2012-07-02: US US13/539,849 patent/US20140006944A1/en not_active Abandoned
- 2013
  - 2013-07-01: CN CN201380035458.4A patent/CN104428749A/zh active Pending
  - 2013-07-01: WO PCT/US2013/048978 patent/WO2014008208A2/en active Application Filing
  - 2013-07-01: EP EP13739550.5A patent/EP2867765A2/en not_active Withdrawn
Non-Patent Citations (1)
Title |
---|
None |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9924313B1 (en) | 2017-02-23 | 2018-03-20 | International Business Machines Corporation | Location based generation of pertinent information |
WO2019093716A1 (ko) * | 2017-11-10 | 2019-05-16 | Samsung Electronics Co., Ltd. | Electronic apparatus and control method thereof |
KR20190053727A (ko) | 2017-11-10 | 2019-05-20 | Samsung Electronics Co., Ltd. | Electronic apparatus and control method thereof |
US11169774B2 (en) | 2017-11-10 | 2021-11-09 | Samsung Electronics Co., Ltd. | Electronic apparatus and control method thereof |
KR102480728B1 (ko) | 2017-11-10 | 2022-12-23 | Samsung Electronics Co., Ltd. | Electronic apparatus and control method thereof |
Also Published As
Publication number | Publication date |
---|---|
WO2014008208A3 (en) | 2014-02-27 |
EP2867765A2 (en) | 2015-05-06 |
CN104428749A (zh) | 2015-03-18 |
US20140006944A1 (en) | 2014-01-02 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20140006944A1 (en) | Visual UI Guide Triggered by User Actions | |
US10705783B2 (en) | Showing interactions as they occur on a whiteboard | |
US10684769B2 (en) | Inset dynamic content preview pane | |
US10235018B2 (en) | Browsing electronic messages displayed as titles | |
EP2872982B1 (en) | Location-dependent drag and drop ui | |
US20150052465A1 (en) | Feedback for Lasso Selection | |
US9792038B2 (en) | Feedback via an input device and scribble recognition | |
US20140354554A1 (en) | Touch Optimized UI | |
EP2907047B1 (en) | User interface elements for content selection and extended content selection | |
US20140122626A1 (en) | Method and apparatus for message conversation in electronic device | |
KR20150023284A (ko) | Enhanced electronic communication draft management techniques |
US20130339903A1 (en) | UI Differentiation Between Delete and Clear | |
US12056413B2 (en) | Contextual workflow triggering on devices | |
US20140365261A1 (en) | Creating recurring appointments |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 13739550; Country of ref document: EP; Kind code of ref document: A2 |
| WWE | Wipo information: entry into national phase | Ref document number: 2013739550; Country of ref document: EP |
| NENP | Non-entry into the national phase | Ref country code: DE |