US20030210260A1 - Methods and apparatuses for providing message information in graphical user interfaces based on user inputs

Methods and apparatuses for providing message information in graphical user interfaces based on user inputs

Info

Publication number
US20030210260A1
Authority
US
United States
Prior art keywords
user input
input portion
gui
recited
message
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
US10/143,325
Other versions
US7890865B2 (en)
Inventor
Steve Palmer
Valery Tolkov
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Microsoft Technology Licensing LLC
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual
Priority to US10/143,325 (granted as US7890865B2)
Assigned to MICROSOFT CORPORATION (assignment of assignors' interest; see document for details). Assignors: PALMER, STEVE; TOLKOV, VALERY
Publication of US20030210260A1
Priority to US12/980,719 (US20110093782A1)
Application granted
Publication of US7890865B2
Assigned to MICROSOFT TECHNOLOGY LICENSING, LLC (assignment of assignors' interest; see document for details). Assignor: MICROSOFT CORPORATION
Current legal status: Active
Adjusted expiration

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 — Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 — Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 — Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 — Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/0487 — Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0489 — Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, using dedicated keyboard keys or combinations thereof
    • G06F 3/04895 — Guidance during keyboard input operation, e.g. prompting
    • G06F 9/00 — Arrangements for program control, e.g. control units
    • G06F 9/06 — Arrangements for program control using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44 — Arrangements for executing specific programs
    • G06F 9/451 — Execution arrangements for user interfaces
    • G06F 9/453 — Help systems


Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Improved methods and apparatuses are provided for determining when and how to display non-modal messages relating to user input portions of a graphical user interface (GUI). One method includes displaying at least one user input portion within a GUI and determining if the user input portion is in an invalid state by determining that valid user input associated with the user input portion has not been received. The method further includes displaying a non-modal message within the GUI. The non-modal message is visibly graphically associated with the user input portion. The method also includes automatically applying a focus of the GUI on the user input portion. As long as the focus of the GUI remains on the user input portion, the method includes displaying the non-modal message until the user input portion is determined to be in a valid state.

Description

    TECHNICAL FIELD
  • This invention relates to computers and software, and more particularly to methods and apparatuses for providing information in a graphical user interface (GUI) computing environment using non-modal messages. [0001]
  • BACKGROUND
  • Traditional computing devices, and in particular the graphical user interfaces (GUIs) they provide, have relied on the use of message boxes to communicate to the user that an error has occurred or to otherwise inform the user about some matter. Typically, such message boxes are modal in that they require the user to actively dismiss them, for example, by hitting either an “Okay” or “Cancel” button within the message box. Often, the user needs to dismiss the message box prior to taking any corrective action and/or otherwise continuing on with whatever task is at hand. [0002]
  • There are other drawbacks to such traditional message boxes too. By way of example, modal message boxes can be distracting to the user, and/or unintentionally/prematurely dismissed. For example, if the user is busy typing or clicking elsewhere when the box appears, they might accidentally dismiss the modal message box before having a chance to read it. Furthermore, a typical modal message box does not graphically indicate the source of an error and/or problem, even when that source is visible within the GUI, for example, when the user has entered the wrong information in a user input field presented by the GUI. [0003]
  • For these and other reasons, more recent operating systems and applications have introduced the use of non-modal error messages within a GUI. One exemplary type of non-modal message is a pop-up, or “balloon,” error message. Balloon error messages improve the way that error information is presented to the user by replacing the usual modal message box with a pop-up error message that is not modal and thus does not need to be dismissed by the user before the error can be corrected. A typical pop-up error message has the additional advantage of being strategically located to help identify the location within the GUI that is associated with the error. This allows the user to quickly identify where corrections may be needed. [0004]
  • A further exemplary drawback to conventional modal message boxes is that the message box needs to be dismissed by the user before the user is allowed to make any corrections. Similarly, conventional pop-up error message techniques may remove the pop-up error message automatically after having displayed it for a period of time and/or remove the balloon error message from the display when the user begins making applicable corrections. Thus, if the modal message box or pop-up error message includes information that may be beneficial during subsequent input by the user, then the user will need to remember or perhaps write down such information. [0005]
  • While conventional pop-up error messages usually help locate an error within the GUI, one shortcoming is that the user is required to manually place or otherwise associate (e.g., using a cursor, entry point, etc.) the focus of the GUI on the data field being pointed to, if they have not done so previously. One example is the current version of an application named MathCad, available from MathSoft Engineering & Education, Inc. of Cambridge, Mass. This application uses painted graphic messages to indicate mathematical errors or undefined variables inside the mathematical equation displayed by the application. Here, all errors are displayed together at the same time. However, the user is then required to manually place the focus of the GUI appropriately within the equation before any changes to the equation can be made based on the error(s). [0006]
  • Consequently, for the above stated reasons and others it would be advantageous to have improved methods and apparatuses that display non-modal messages relating to user input portions of a GUI at the appropriate time and location, and that remain displayed for an adequate amount of time for the user to act upon the message information. Additionally, there is a need for more user friendly error and/or guidance messages that allow for expedited user entry/re-entry of valid information without requiring manual adjustment of the focus of the GUI. [0007]
  • SUMMARY
  • Improved methods and apparatuses are provided for determining when and/or how to display non-modal messages relating to user input portions of a graphical user interface (GUI). [0008]
  • The above stated needs and others are satisfied, for example, by a method in accordance with certain aspects of the present invention that includes displaying at least one user input portion within a GUI and determining if the user input portion is in an invalid state by determining that valid user input associated with the user input portion has not been received. The method further includes displaying a non-modal message within the GUI. The non-modal message is visibly graphically associated with the user input portion. The method also includes automatically applying a focus of the GUI on the user input portion. As long as the focus of the GUI remains on the user input portion, the method includes displaying the non-modal message until the user input portion is determined to be in a valid state. [0009]
  • In accordance with certain other exemplary aspects of the present invention, a computer-readable medium is provided, which has computer-executable instructions for causing a GUI having at least one user input portion to be displayed, selectively causing a focus of the GUI to be applied on the user input portion if valid user input associated with said user input portion has not been received, and displaying a non-modal message within the GUI that is visibly connected to the user input portion until the user input portion is determined to be in a valid state or the focus of said GUI is removed from the user input portion. [0010]
  • In accordance with still other aspects of the present invention, an apparatus is provided. The apparatus includes logic, memory, at least one user input device and at least one display device. The logic is configured to cause a GUI to be visibly presented via the display device. The GUI includes at least one user input portion. The logic is further configured to determine if the user input portion is in an invalid state by determining that valid user input associated with the user input portion has not been received. The logic will then cause a non-modal message to be presented within the GUI. Here, the non-modal message is visibly associated with the user input portion. The logic applies a user input focus of the GUI on the user input portion. As long as the focus of the GUI is on the user input portion, the logic will continue presenting the non-modal message until the user input portion is determined to be in a valid state. [0011]
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • A more complete understanding of the various exemplary methods and apparatuses of the present invention may be had by reference to the following detailed description when taken in conjunction with the accompanying drawings wherein: [0012]
  • FIG. 1 is a block diagram depicting a computer system/environment suitable for use in accordance with certain exemplary implementations of the present invention. [0013]
  • FIG. 2 depicts illustrative representations of graphical user interfaces (GUIs) having user input portions and information being displayed in non-modal messages, in accordance with certain exemplary implementations of the present invention. [0014]
  • FIG. 3 is a flow diagram depicting a process for displaying information associated with user input portions of a GUI, in accordance with certain exemplary implementations of the present invention. [0015]
  • FIG. 4 is a block diagram depicting a device configured to display information associated with user input portions of a GUI, in accordance with certain further exemplary implementations of the present invention. [0016]
  • DETAILED DESCRIPTION
  • Turning to the drawings, wherein like reference numerals refer to like elements, the invention is illustrated as being implemented in a suitable computing environment. [0017]
  • Although not required, the invention will be described in the general context of computer-executable instructions, such as program modules, being executed by a personal computer. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. [0018]
  • Moreover, those skilled in the art will appreciate that the invention may be practiced with other computer system configurations, including hand-held devices, multi-processor systems, microprocessor based or programmable consumer electronics, network PCs, minicomputers, mainframe computers, and the like. The invention may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules may be located in both local and remote memory storage devices. [0019]
  • [0020] FIG. 1 illustrates an example of a suitable computing environment 120 on which the subsequently described methods and apparatuses may be implemented. Exemplary computing environment 120 is only one example of a suitable computing environment and is not intended to suggest any limitation as to the scope of use or functionality of the improved methods and systems described herein. Neither should computing environment 120 be interpreted as having any dependency or requirement relating to any one or combination of components illustrated in computing environment 120.
  • The improved methods and apparatuses herein are operational with numerous other general purpose or special purpose computing system environments or configurations. Examples of well known computing systems, environments, and/or configurations that may be suitable include, but are not limited to, personal computers, server computers, thin clients, thick clients, handheld or laptop devices, multiprocessor systems, microprocessor-based systems, set top boxes, programmable consumer electronics, network PCs, minicomputers, mainframe computers, distributed computing environments that include any of the above systems or devices, and the like. [0021]
  • [0022] As shown in FIG. 1, computing environment 120 includes a general-purpose computing device in the form of a computer 130. The components of computer 130 may include one or more processors or processing units 132, a system memory 134, and a bus 136 that couples various system components including system memory 134 to processor 132.
  • [0023] Bus 136 represents one or more of any of several types of bus structures, including a memory bus or memory controller, a peripheral bus, an accelerated graphics port, and a processor or local bus using any of a variety of bus architectures. By way of example, and not limitation, such architectures include Industry Standard Architecture (ISA) bus, Micro Channel Architecture (MCA) bus, Enhanced ISA (EISA) bus, Video Electronics Standards Association (VESA) local bus, and Peripheral Component Interconnects (PCI) bus also known as Mezzanine bus.
  • [0024] Computer 130 typically includes a variety of computer readable media. Such media may be any available media that is accessible by computer 130, and it includes both volatile and non-volatile media, removable and non-removable media.
  • [0025] In FIG. 1, system memory 134 includes computer readable media in the form of volatile memory, such as random access memory (RAM) 140, and/or nonvolatile memory, such as read only memory (ROM) 138. A basic input/output system (BIOS) 142, containing the basic routines that help to transfer information between elements within computer 130, such as during start-up, is stored in ROM 138. RAM 140 typically contains data and/or program modules that are immediately accessible to and/or presently being operated on by processor 132.
  • [0026] Computer 130 may further include other removable/non-removable, volatile/non-volatile computer storage media. For example, FIG. 1 illustrates a hard disk drive 144 for reading from and writing to a non-removable, non-volatile magnetic media (not shown and typically called a “hard drive”), a magnetic disk drive 146 for reading from and writing to a removable, non-volatile magnetic disk 148 (e.g., a “floppy disk”), and an optical disk drive 150 for reading from or writing to a removable, non-volatile optical disk 152 such as a CD-ROM/R/RW, DVD-ROM/R/RW/+R/RAM or other optical media. Hard disk drive 144, magnetic disk drive 146 and optical disk drive 150 are each connected to bus 136 by one or more interfaces 154.
  • [0027] The drives and associated computer-readable media provide nonvolatile storage of computer readable instructions, data structures, program modules, and other data for computer 130. Although the exemplary environment described herein employs a hard disk, a removable magnetic disk 148 and a removable optical disk 152, it should be appreciated by those skilled in the art that other types of computer readable media which can store data that is accessible by a computer, such as magnetic cassettes, flash memory cards, digital video disks, random access memories (RAMs), read only memories (ROM), and the like, may also be used in the exemplary operating environment.
  • [0028] A number of program modules may be stored on the hard disk, magnetic disk 148, optical disk 152, ROM 138, or RAM 140, including, e.g., an operating system 158, one or more application programs 160, other program modules 162, and program data 164.
  • [0029] The improved methods and systems described herein may be implemented within operating system 158, one or more application programs 160, other program modules 162, and/or program data 164.
  • [0030] A user may provide commands and information into computer 130 through input devices such as keyboard 166 and pointing device 168 (such as a “mouse”). Other input devices (not shown) may include a microphone, joystick, game pad, satellite dish, serial port, scanner, camera, etc. These and other input devices are connected to the processing unit 132 through a user input interface 170 that is coupled to bus 136, but may be connected by other interface and bus structures, such as a parallel port, game port, or a universal serial bus (USB).
  • [0031] A monitor 172 or other type of display device is also connected to bus 136 via an interface, such as a video adapter 174. In addition to monitor 172, personal computers typically include other peripheral output devices (not shown), such as speakers and printers, which may be connected through output peripheral interface 175.
  • [0032] Computer 130 may operate in a networked environment using logical connections to one or more remote computers, such as a remote computer 182. Remote computer 182 may include many or all of the elements and features described herein relative to computer 130.
  • [0033] Logical connections shown in FIG. 1 are a local area network (LAN) 177 and a general wide area network (WAN) 179. Such networking environments are commonplace in offices, enterprise-wide computer networks, intranets, and the Internet.
  • [0034] When used in a LAN networking environment, computer 130 is connected to LAN 177 via network interface or adapter 186. When used in a WAN networking environment, the computer typically includes a modem 178 or other means for establishing communications over WAN 179. Modem 178, which may be internal or external, may be connected to system bus 136 via the user input interface 170 or other appropriate mechanism.
  • [0035] Depicted in FIG. 1 is a specific implementation of a WAN via the Internet. Here, computer 130 employs modem 178 to establish communications with at least one remote computer 182 via the Internet 180.
  • [0036] In a networked environment, program modules depicted relative to computer 130, or portions thereof, may be stored in a remote memory storage device. Thus, e.g., as depicted in FIG. 1, remote application programs 189 may reside on a memory device of remote computer 182. It will be appreciated that the network connections shown and described are exemplary and other means of establishing a communications link between the computers may be used.
  • Reference is now made to FIG. 2, which depicts illustrative representations of graphical user interfaces (GUIs) having user input portions and information being displayed in non-modal messages, in accordance with certain exemplary implementations of the present invention. [0037]
  • [0038] By way of example, a GUI 202 is represented as being displayed by a display device (dashed-line box) 200. Within GUI 202 is a plurality of user input portions 204 a-n and 208. In certain implementations, user input portions 204 a-n and/or 208 take the form of a data entry field suitable for the user to type or otherwise enter alphanumeric character strings and the like. Thus, for example, user input portion 208 is illustrated as being a user input field designed to allow the user to enter a number relating to a month of the year. A prompt 206 is shown as soliciting such user input.
  • [0039] In other implementations, for example, user input portions 204 a-n may include user selectable/activated buttons, knobs, sliders, or other like graphically displayed user input mechanisms. It should be recognized, therefore, that the rectangular shaped dashed-line boxes defining user input portions in FIG. 2 are merely representative shapes and that the user input portions may take on any applicable shape, pattern, color, etc. Moreover, certain GUIs may only have a single user input portion, while other implementations have many user input portions.
  • [0040] As illustrated, user input portion 208 includes a user input (data) of “15”. In accordance with certain aspects of the present invention, this user input value has been determined to represent an invalid entry since prompt 206 is requesting that the user enter a numerical identifier for a month of the year and only integers between 1 and 12 are valid entries. Accordingly, at a determined validation moment associated with GUI 202 and/or user input portion 208, the user input data (or lack thereof) is analyzed to determine if it is valid or invalid. If the user input data is determined to be valid, then the process associated with and/or supported by user input portion 208 is allowed to continue in some manner. If the user input data is determined to be invalid, then a non-modal message 210 is generated and displayed. In the example in FIG. 2, non-modal message 210 takes the shape of a balloon message having a tip pointing to or otherwise directing the user to user input portion 208, which currently contains invalid user input data (i.e., the number “15”). Included in this exemplary visible graphical non-modal message 210 is message information that reads “Please enter an integer between 1 and 12”.
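  • As an editorial illustration (not part of the patent's disclosure), the validation just described for the month field might be sketched as follows in TypeScript; the names validateMonthField and ValidationResult are hypothetical:

```typescript
// Hypothetical validation rule for the "month" input portion of FIG. 2:
// the entry is valid only if it is an integer from 1 to 12.
interface ValidationResult {
  valid: boolean;
  message?: string; // text to show in the non-modal balloon when invalid
}

function validateMonthField(rawValue: string): ValidationResult {
  const trimmed = rawValue.trim();
  // Treat missing input as invalid for a mandatory field.
  if (trimmed === "") {
    return { valid: false, message: "Please enter an integer between 1 and 12" };
  }
  const n = Number(trimmed);
  if (!Number.isInteger(n) || n < 1 || n > 12) {
    return { valid: false, message: "Please enter an integer between 1 and 12" };
  }
  return { valid: true };
}
```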
  • [0041] In accordance with certain aspects of the present invention, when non-modal message 210 is displayed, the “focus” of GUI 202 is placed, moved, or otherwise applied to user input field 208. Thus, for example, in certain implementations user input portion 208 or the current data therein may be highlighted or changed in some visible manner to help the user to identify the user input portion associated with non-modal message 210. In certain exemplary implementations, a cursor or like visible item can be placed in user input portion 208 and the logic supporting GUI 202 operatively configured to receive new/revised user input data.
  • [0042] To better serve the user during this non-modal message guided user input process, non-modal message 210 is maintained/displayed until a valid user input has been provided and/or the focus of GUI 202 is moved/removed from user input portion 208. Unlike conventional non-modal messages, the message information remains visible while the user provides new input(s) and until the user provides valid user inputs. Non-modal message 210 is no longer displayed once the user has provided valid user input. If the user decides to redirect the focus of GUI 202, then non-modal message 210 will stop being displayed. However, if the user has failed to provide the requisite valid user inputs, then non-modal message 210 will be displayed again. The user can change or move the focus of GUI 202 by selectively moving and/or activating a pointing device such as a mouse, touch-pad, trackball, or the like, and/or striking one or more input keys on a keyboard or other like mechanism. For example, in certain implementations, the user may hit a “tab” key to selectively move the focus of GUI 202 to another user input portion.
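  • The behavior described above (display the balloon, move the GUI focus to the field, keep the balloon while focus remains and the data is invalid, and dismiss it on valid input or when focus leaves) could be sketched against the browser DOM as follows; this is a minimal illustration that reuses the hypothetical validateMonthField helper above and is not the patent's implementation:

```typescript
// Show a non-modal balloon tied to a field, apply focus to the field, and keep
// the balloon until the value becomes valid or focus leaves the field.
function showNonModalMessage(field: HTMLInputElement, text: string): void {
  let balloon = document.getElementById("balloon-" + field.id);
  if (balloon === null) {
    balloon = document.createElement("div");
    balloon.id = "balloon-" + field.id;
    balloon.className = "balloon-message"; // styled elsewhere to point at the field
    field.insertAdjacentElement("afterend", balloon);
  }
  const box = balloon; // non-null reference for the closures below
  box.textContent = text;
  box.style.display = "block";

  field.focus();  // automatically apply the GUI focus to the input portion
  field.select(); // highlight the current (invalid) data

  const onInput = () => {
    if (validateMonthField(field.value).valid) {
      box.style.display = "none"; // valid input: dismiss the message
      field.removeEventListener("input", onInput);
    }
  };
  field.addEventListener("input", onInput); // re-check as the user types
  field.addEventListener(
    "blur",
    () => {
      box.style.display = "none"; // focus moved away: stop displaying the message
    },
    { once: true },
  );
}
```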
  • [0043] In still other implementations, the focus of GUI 202 can be automatically moved to another portion within GUI 202. For example, the focus of GUI 202 may change after the passage of a certain amount of time. Non-modal message 210 may also time out in some manner as may be needed.
  • [0044] FIG. 2 also includes a portion of an exemplary GUI 220 further illustrating certain features associated with certain implementations of a non-modal message 224 that is displayed in reference to user input field 222. Here, at an applicable validation moment, it was determined that the user failed to provide requisite valid user input to user input field 222. Hence, non-modal message 224 has been displayed. In this example, non-modal message 224 includes a graphical icon 226, an identifier 228 and message information 230. Graphical icon 226 in this example visibly identifies that an error has occurred. Identifier 228 provides a title or summary, for example, of the error (here, e.g., “Field is mandatory.”). Message information 230 in this example further elaborates on the error by stating that “You must enter a value for Open Build.” Although not visually illustrated by the screen shot of GUI 220, the focus of GUI 220 is on user input field 222.
  • [0045] Attention is now drawn to FIG. 3, which is a flow diagram depicting a process 300 for displaying information associated with user input portions of a GUI, in accordance with certain exemplary implementations of the present invention.
  • [0046] In step 302, at least one user input portion is displayed within a GUI. In step 304, user input associated with the user input portion is received. In step 306, a determination is made that an input validation moment associated with the GUI and/or user input portion has been reached. For example, an input validation moment may be reached after the passage of a period of time with or without user inputs received in step 304. An input validation moment may be associated with the user selecting a particular GUI mechanism, such as, for example, a form “complete” button, an “enter” button, a “submit” button, a “send” button, etc. Note that process 300 may move from step 302 directly to step 306 without step 304, for example, if the user does not provide user input for the user input portion.
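  • One way to wire up such “input validation moments” (again, an illustrative sketch rather than the patented design) is to run the check when a submit-style control is activated, when the field loses focus with changed data, or after a period of idle time:

```typescript
// Register several "validation moment" triggers for a field. The trigger set
// and the idle timeout are illustrative choices, not prescribed by the patent.
function registerValidationMoments(
  field: HTMLInputElement,
  form: HTMLFormElement,
  validate: () => void,
  idleMs = 5000,
): void {
  form.addEventListener("submit", (event) => {
    event.preventDefault(); // run the validation moment before any submission
    validate();
  });
  field.addEventListener("change", validate); // field lost focus with new data

  let timer: number | undefined;
  field.addEventListener("input", () => {
    if (timer !== undefined) window.clearTimeout(timer);
    timer = window.setTimeout(validate, idleMs); // passage of time without further input
  });
}
```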
  • [0047] In step 308, an inquiry is made to determine if a user input was required for the user input portion. If the answer to the inquiry is “No”, then the user input or lack thereof was not required and hence is inherently valid. As such, process 300 may return to step 302, for example, to display or process other user input portions or features of the GUI. If the answer to the inquiry in step 308 is “Yes”, then user input is required for the user input portion being analyzed and process 300 continues with step 310.
  • [0048] In step 310, an inquiry is made to determine if the current user input associated with the user input portion is valid. If the received user input is determined to be valid (i.e., the answer to inquiry 310 is “Yes”), then process 300 continues with step 312, wherein if a non-modal message is displayed it is dismissed or closed and process 300 is allowed to return to step 302, for example. If the received user input is determined to be invalid (i.e., the answer to inquiry 310 is “No”), then process 300 continues with step 314. The user input may be invalid if, for example, it is missing (not entered/received yet) and/or it fails to meet certain validation criteria associated with the user input portion.
  • [0049] An inquiry is made in step 314 to determine if any non-modal messages are currently open. If the answer is “Yes”, then process 300 continues with step 316, wherein the open non-modal message is closed. From step 316, process 300 proceeds to step 318. If the answer to the inquiry in step 314 is “No”, then process 300 continues to step 318.
  • [0050] In step 318, a non-modal message is displayed with regard to the user input portion, which according to the analysis of process 300 currently contains invalid user input and/or is missing valid user input. As part of step 318, the focus of the GUI can be moved or otherwise applied to the user input portion. Following step 318, process 300 continues with step 320, wherein new user input is received. The non-modal message displayed in step 318 is continually displayed until it is subsequently removed in either step 312 or step 316, and/or the focus of the GUI is moved/removed from the user input portion. Following step 320, process 300 returns to step 310 to determine if the newly received user input is valid or invalid.
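  • Steps 308 through 320 can be read as a small decision loop. A sketch of that loop, using the hypothetical showNonModalMessage helper introduced above plus an assumed InputPortion description and closeNonModalMessage helper, might look like this:

```typescript
// Assumed description of a user input portion and its validation criteria.
interface InputPortion {
  field: HTMLInputElement;
  required: boolean;
  isValid(value: string): boolean; // step 310: validation criteria
  invalidMessage: string;          // text for the non-modal message in step 318
}

function closeNonModalMessage(field: HTMLInputElement): void {
  const balloon = document.getElementById("balloon-" + field.id);
  if (balloon !== null) balloon.style.display = "none";
}

// Sketch of the decision flow of FIG. 3 for one input portion.
function runValidationMoment(portion: InputPortion): void {
  // Step 308: if no input was required, the portion is inherently valid.
  if (!portion.required) {
    return;
  }
  // Step 310: check the current input against the validation criteria.
  if (portion.isValid(portion.field.value)) {
    closeNonModalMessage(portion.field); // step 312: dismiss any open message
    return;
  }
  // Steps 314-316: close any non-modal message that is already open.
  closeNonModalMessage(portion.field);
  // Step 318: display the non-modal message and apply focus to the portion.
  showNonModalMessage(portion.field, portion.invalidMessage);
  // Step 320: new input is handled by the "input" listener installed in
  // showNonModalMessage, which in effect returns the flow to step 310.
}
```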
  • [0051] Reference is now made to FIG. 4, which is a block diagram depicting a device 400 configured to interact with a user and display information associated with user input portions of a GUI, in accordance with certain further exemplary implementations of the present invention.
  • [0052] Device 400 is representative of any device that receives or is otherwise programmable to display/operate according to GUI data 402. GUI data 402 may therefore be provided on computer-readable media, transmitted over a network, etc. Here, as illustratively represented, GUI data 402 includes GUI logic data 404, valid user input information 406, guidance message information 408, and error message information 410. GUI data 402 includes, for example, the computer-implementable instructions associated with the GUI and/or the process(es) supported by the GUI. Valid user input information 406 includes, for example, information suitable to help GUI data 402 or other associated logic make decisions as in steps 306-310 (FIG. 3) regarding the validity/invalidity of user inputs or lack thereof with respect to the user input portion being analyzed. Thus, for example, valid user input information 406 may define valid (or invalid) types of user inputs. In certain implementations, therefore, valid user input information 406 includes one or more validation/invalidation parameters that can be compared to the current user input to make such decisions.
  • [0053] Guidance message information 408 includes information that is displayed in the non-modal message in step 318. Here, for example, guidance message information 408 may help guide the user to enter valid user input. For example, guidance information may be displayed when the user has not yet provided any user input. Error message information 410 includes information that is displayed in the non-modal message in step 318, when a particular error is detected. For example, error information may be displayed when the user has provided invalid user input. In certain implementations, guidance message information 408 and error message information 410 are combined.
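  • The patent leaves the concrete encoding of GUI data 402 open; one possible TypeScript layout for the validation parameters (406) together with separate guidance (408) and error (410) message texts, offered purely as an assumption, is:

```typescript
// Illustrative shape for per-portion GUI data: validation parameters plus
// guidance and error message information. Field names are assumptions.
interface GuiInputPortionData {
  validation: {
    required: boolean;
    pattern?: RegExp; // e.g. /^\d{1,2}$/ for a numeric month field
    min?: number;
    max?: number;
  };
  guidanceMessage: string; // shown when no input has been provided yet
  errorMessage: string;    // shown when the provided input is invalid
}

// Choose between guidance and error information for the non-modal message.
function messageFor(data: GuiInputPortionData, rawValue: string): string {
  return rawValue.trim() === "" ? data.guidanceMessage : data.errorMessage;
}
```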
[0054] As illustrated, GUI data 402 is provided to a memory 412 and processed accordingly by a processor 414, and a corresponding display is presented through a display device 416.
[0055] One advantage of GUI data 402 is that a web page or other like markup language file can be downloaded to (client) device 400 over a network. The GUI that is presented can then be checked/processed locally to determine whether valid input has been received, without requiring additional processing such as, for example, sending the user inputs to a server device for validation.
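As a rough illustration of such local checking, the TypeScript sketch below assumes that a downloaded page carries its validation parameters in hypothetical data-pattern and data-message attributes and validates input entirely in the browser, using the standard DOM constraint-validation calls setCustomValidity and reportValidity as an approximate stand-in for a non-modal balloon message; this is an assumption-based analogue, not the mechanism described or claimed by the patent.

```typescript
// Wire up every input element whose markup carries a (hypothetical)
// data-pattern attribute, validating locally with no server round trip.
function wireLocalValidation(root: Document = document): void {
  root.querySelectorAll<HTMLInputElement>("input[data-pattern]").forEach((field) => {
    const pattern = new RegExp(field.dataset.pattern ?? ".*");
    const message = field.dataset.message ?? "Please enter a valid value.";
    field.addEventListener("input", () => {
      if (pattern.test(field.value)) {
        field.setCustomValidity("");      // valid: clear any pending message
      } else {
        field.setCustomValidity(message); // invalid: record the message locally
        field.reportValidity();           // browser shows a non-modal hint near the field
      }
    });
  });
}
```

Because the validation parameters travel with the markup itself, the validity decision never requires sending the user inputs back to a server.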
[0056] Although some preferred implementations of the various methods and apparatuses of the present invention have been illustrated in the accompanying Drawings and described in the foregoing Detailed Description, it will be understood that the invention is not limited to the exemplary embodiments disclosed, but is capable of numerous rearrangements, modifications and substitutions without departing from the spirit of the invention as set forth and defined by the following claims.

Claims (30)

1. A method comprising:
displaying at least one user input portion within a graphical user interface (GUI);
determining if said user input portion is in an invalid state by determining that valid user input associated with said user input portion has not been received;
displaying a non-modal message within said GUI, said non-modal message being visibly graphically associated with said user input portion;
automatically applying a focus of said GUI on said user input portion; and
as long as said focus of said GUI remains on said user input portion, continuing to display said non-modal message until said user input portion is determined to be in a valid state.
2. The method as recited in claim 1, wherein determining that valid user input associated with said user input portion has not been received further includes:
receiving user input associated with said user input portion; and
comparing said received user input with validation information associated with said user input portion.
3. The method as recited in claim 2, wherein said validation information includes at least one validation parameter.
4. The method as recited in claim 1, wherein said user input portion includes at least one user input field.
5. The method as recited in claim 1, wherein said focus of said GUI places a cursor in said user input portion.
6. The method as recited in claim 1, wherein said focus of said GUI can be moved from said user input portion through focus moving user input.
7. The method as recited in claim 1, wherein said non-modal message includes non-modal information regarding at least one invalidity error associated with said user input portion.
8. The method as recited in claim 7, wherein said non-modal information includes at least one form of identifying information selected from a group of different forms of identifying information comprising a graphical icon, an identifier, and message information.
9. The method as recited in claim 8, wherein said message information includes validation information associated with said user input portion.
10. The method as recited in claim 1, wherein said non-modal message includes a balloon message.
11. A computer-readable medium having computer-executable instructions for performing acts comprising:
causing a graphical user interface (GUI) to be displayed, said GUI having at least one user input portion visibly graphically provided therein;
selectively causing a focus of said GUI to be applied on said user input portion if valid user input associated with said user input portion has not been received; and
displaying a non-modal message within said GUI that is visibly graphically connected to said user input portion until said user input portion is determined to be in a valid state, as long as said focus of said GUI remains on said user input portion.
12. The computer-readable medium as recited in claim 11, wherein selectively causing said focus of said GUI to be applied on said user input portion if valid user input associated with said user input portion has not been received further includes:
receiving user input associated with said user input portion; and
comparing said received user input with validation information associated with said user input portion.
13. The computer-readable medium as recited in claim 12, wherein said validation information includes at least one validation parameter.
14. The computer-readable medium as recited in claim 11, wherein said user input portion includes at least one user input field.
15. The computer-readable medium as recited in claim 11, wherein said focus of said GUI places a cursor in said user input portion.
16. The computer-readable medium as recited in claim 11, wherein said focus of said GUI can be moved from said user input portion through focus moving user input.
17. The computer-readable medium as recited in claim 11, wherein said non-modal message includes non-modal information regarding at least one invalidity error associated with said user input portion.
18. The computer-readable medium as recited in claim 17, wherein said non-modal information includes at least one form of identifying information selected from a group of different forms of identifying information comprising a graphical icon, an identifier, and message information.
19. The computer-readable medium as recited in claim 18, wherein said message information includes validation information associated with said user input portion.
20. The computer-readable medium as recited in claim 11, wherein said non-modal message includes a balloon message.
21. An apparatus comprising:
memory;
a display device;
a user input device; and
logic operatively coupled to said memory, said display device and said user input device, said logic being configured to:
cause a graphical user interface (GUI) to be visibly presented via said display device, said GUI including at least one user input portion,
determine if said user input portion is in an invalid state by determining that valid user input associated with said user input portion has not been received,
cause a non-modal message to be presented within said GUI, said non-modal message being visibly associated with said user input portion,
automatically apply a user input focus of said GUI on said user input portion, and
while said focus of said GUI remains on said user input portion, continue presenting said non-modal message until said user input portion is determined to be in a valid state.
22. The apparatus as recited in claim 21, wherein said logic is further configured to:
compare user inputs received from said user input device with validation information within said memory, said validation information being associated with said user input portion.
23. The apparatus as recited in claim 22, wherein said validation information includes at least one validation parameter.
24. The apparatus as recited in claim 21, wherein said user input portion includes at least one user input field.
25. The apparatus as recited in claim 21, wherein said user input focus of said GUI causes said logic to visibly associate at least one graphical user input indicator with said user input portion.
26. The apparatus as recited in claim 21, wherein said logic is configured to allow user input focus of said GUI to be removed from said user input portion based on user inputs from said user input device.
27. The apparatus as recited in claim 21, wherein said non-modal message includes non-modal information regarding at least one invalidity error associated with said user input portion.
28. The apparatus as recited in claim 27, wherein said non-modal information includes at least one form of identifying information selected from a group of different forms of identifying information comprising a graphical icon, an identifier, and message information.
29. The apparatus as recited in claim 28, wherein said message information includes validation information associated with said user input portion.
30. The apparatus as recited in claim 21, wherein said non-modal message includes a balloon message.
US10/143,325 2002-05-09 2002-05-09 Methods and apparatuses for providing message information in graphical user interfaces based on user inputs Active 2029-09-21 US7890865B2 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US10/143,325 US7890865B2 (en) 2002-05-09 2002-05-09 Methods and apparatuses for providing message information in graphical user interfaces based on user inputs
US12/980,719 US20110093782A1 (en) 2002-05-09 2010-12-29 Methods and Apparatuses For Providing Message Information In Graphical User Interfaces Based On User Inputs

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US10/143,325 US7890865B2 (en) 2002-05-09 2002-05-09 Methods and apparatuses for providing message information in graphical user interfaces based on user inputs

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US12/980,719 Continuation US20110093782A1 (en) 2002-05-09 2010-12-29 Methods and Apparatuses For Providing Message Information In Graphical User Interfaces Based On User Inputs

Publications (2)

Publication Number Publication Date
US20030210260A1 true US20030210260A1 (en) 2003-11-13
US7890865B2 US7890865B2 (en) 2011-02-15

Family

ID=29400102

Family Applications (2)

Application Number Title Priority Date Filing Date
US10/143,325 Active 2029-09-21 US7890865B2 (en) 2002-05-09 2002-05-09 Methods and apparatuses for providing message information in graphical user interfaces based on user inputs
US12/980,719 Abandoned US20110093782A1 (en) 2002-05-09 2010-12-29 Methods and Apparatuses For Providing Message Information In Graphical User Interfaces Based On User Inputs

Family Applications After (1)

Application Number Title Priority Date Filing Date
US12/980,719 Abandoned US20110093782A1 (en) 2002-05-09 2010-12-29 Methods and Apparatuses For Providing Message Information In Graphical User Interfaces Based On User Inputs

Country Status (1)

Country Link
US (2) US7890865B2 (en)

Cited By (38)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040006744A1 (en) * 2002-06-27 2004-01-08 Microsoft Corporation System and method for validating an XML document and reporting schema violations
US20040019875A1 (en) * 2002-04-29 2004-01-29 Welch Keith C. Masked edit control for use in a graphical programming environment
US20050088410A1 (en) * 2003-10-23 2005-04-28 Apple Computer, Inc. Dynamically changing cursor for user interface
US20060026531A1 (en) * 2004-07-29 2006-02-02 Sony Coporation State-based computer help utility
US20060230321A1 (en) * 2005-03-29 2006-10-12 Microsoft Corporation User interface panel for hung applications
US20060230324A1 (en) * 2005-04-06 2006-10-12 Microsoft Corporation Visual indication for hung applications
US20060239248A1 (en) * 2005-04-26 2006-10-26 Cisco Technology, Inc. System and method for displaying sticky notes on phone
US20060288297A1 (en) * 1999-08-12 2006-12-21 Robert Haitani System, method and technique for enabling users to interact and edit address fields of messaging applications
US20060288298A1 (en) * 1999-08-12 2006-12-21 Robert Haitani System, method and technique for enabling users to interact with address fields of messaging applications
US20070032267A1 (en) * 2005-08-08 2007-02-08 Robert Haitani Contact-centric user-interface features for computing devices
WO2007019538A2 (en) * 2005-08-08 2007-02-15 Palm, Inc. User interface for a computing device
EP1785856A2 (en) * 2005-10-24 2007-05-16 Sap Ag Batch processing for wizards
US20070157116A1 (en) * 2005-12-30 2007-07-05 Johnson Clare C System and method for visual messaging
US20070174778A1 (en) * 2005-12-30 2007-07-26 Johnson Clare C System and method for combining multiple software panes
US20100010740A1 (en) * 2005-12-02 2010-01-14 Palm, Inc. Permission module on mobile computing device
US7653883B2 (en) 2004-07-30 2010-01-26 Apple Inc. Proximity detector in handheld device
US20100030653A1 (en) * 2008-07-29 2010-02-04 W.W. Grainger, Inc. System and method for detecting a possible error in a customer provided product order quantity
US20100138704A1 (en) * 2005-12-30 2010-06-03 Sap Ag User interface messaging system and method permitting deferral of message resolution
US20100205530A1 (en) * 2009-02-09 2010-08-12 Emma Noya Butin Device, system, and method for providing interactive guidance with execution of operations
US20100205529A1 (en) * 2009-02-09 2010-08-12 Emma Noya Butin Device, system, and method for creating interactive guidance with execution of operations
US20110047488A1 (en) * 2009-08-24 2011-02-24 Emma Butin Display-independent recognition of graphical user interface control
US20110047462A1 (en) * 2009-08-24 2011-02-24 Emma Butin Display-independent computerized guidance
US20110047514A1 (en) * 2009-08-24 2011-02-24 Emma Butin Recording display-independent computerized guidance
US8005194B2 (en) 2005-12-21 2011-08-23 Hewlett-Packard Development Company, L.P. Technique for handling incoming reply messages
WO2012033677A2 (en) * 2010-09-08 2012-03-15 Microsoft Corporation Notification bar user interface control
KR101164194B1 (en) 2008-12-11 2012-07-10 한국전자통신연구원 method for static allocating stack based on multi thread
US8239784B2 (en) 2004-07-30 2012-08-07 Apple Inc. Mode-based graphical user interfaces for touch sensitive input devices
US8381135B2 (en) 2004-07-30 2013-02-19 Apple Inc. Proximity detector in handheld device
US20140006882A1 (en) * 2006-03-13 2014-01-02 Fujitsu Limited Screen generation program, screen generation apparatus, and screen generation method
US8677286B2 (en) 2003-05-01 2014-03-18 Hewlett-Packard Development Company, L.P. Dynamic sizing user interface method and system for data display
US9239673B2 (en) 1998-01-26 2016-01-19 Apple Inc. Gesturing with a multipoint sensing device
US9239677B2 (en) 2004-05-06 2016-01-19 Apple Inc. Operation of a computer with touch screen interface
US9292111B2 (en) 1998-01-26 2016-03-22 Apple Inc. Gesturing with a multipoint sensing device
US9710435B2 (en) 2010-10-29 2017-07-18 P. Karl Halton Object-field-based mathematics system
US20190377588A1 (en) * 2018-06-06 2019-12-12 Oracle International Corporation Smart context aware support engine for applications
US20220397987A1 (en) * 2019-11-27 2022-12-15 Nippon Telegraph And Telephone Corporation Input display system, auxiliary information display method and program
US20220397986A1 (en) * 2019-11-27 2022-12-15 Nippon Telegraph And Telephone Corporation Input display system, auxiliary information display method and program
US12067354B2 (en) * 2016-08-04 2024-08-20 HAB Innovations, Inc. Simplifying complex input strings

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7890865B2 (en) * 2002-05-09 2011-02-15 Microsoft Corporation Methods and apparatuses for providing message information in graphical user interfaces based on user inputs
US10218582B2 (en) * 2013-09-26 2019-02-26 Apple Inc. Notifications with input-based completion
US10114519B2 (en) 2016-05-03 2018-10-30 Microsoft Technology Licensing, Llc Contextual content presentation based on microenvironment interactions
KR20180083131A (en) 2017-01-12 2018-07-20 에이치피프린팅코리아 주식회사 Display apparatus and method for controlling the display apparatus thereof
US20210406828A1 (en) * 2020-06-24 2021-12-30 Mitchell International, Inc. Vehicle repair estimating tool with near-real-time compliance

Citations (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5008810A (en) * 1988-09-29 1991-04-16 Process Modeling Investment Corp. System for displaying different subsets of screen views, entering different amount of information, and determining correctness of input dependent upon current user input
US5121475A (en) * 1988-04-08 1992-06-09 International Business Machines Inc. Methods of dynamically generating user messages utilizing error log data with a computer system
US5546521A (en) * 1991-10-15 1996-08-13 International Business Machines Corporation Dynamic presentation of contextual help and status information
US5557731A (en) * 1993-12-28 1996-09-17 International Business Machines Corporation Method and system for detecting undefined objects in an application
US5754176A (en) * 1995-10-02 1998-05-19 Ast Research, Inc. Pop-up help system for a computer graphical user interface
US5995101A (en) * 1997-10-29 1999-11-30 Adobe Systems Incorporated Multi-level tool tip
US20020091993A1 (en) * 2000-09-29 2002-07-11 International Business Machines Corporation Contextual help information
US20020175955A1 (en) * 1996-05-10 2002-11-28 Arno Gourdol Graphical user interface having contextual menus
US20030084115A1 (en) * 2001-09-26 2003-05-01 Wood Timothy E. Facilitating contextual help in a browser environment
US6609106B1 (en) * 1999-05-07 2003-08-19 Steven C. Robertson System and method for providing electronic multi-merchant gift registry services over a distributed network
US6662340B2 (en) * 2000-04-28 2003-12-09 America Online, Incorporated Client-side form filler that populates form fields based on analyzing visible field labels and visible display format hints without previous examination or mapping of the form
US20030229608A1 (en) * 2002-06-06 2003-12-11 Microsoft Corporation Providing contextually sensitive tools and help content in computer-generated documents
US20040006480A1 (en) * 2002-07-05 2004-01-08 Patrick Ehlen System and method of handling problematic input during context-sensitive help for multi-modal dialog systems
US6763496B1 (en) * 1999-03-31 2004-07-13 Microsoft Corporation Method for promoting contextual information to display pages containing hyperlinks
US20050149395A1 (en) * 2003-10-29 2005-07-07 Kontera Technologies, Inc. System and method for real-time web page context analysis for the real-time insertion of textual markup objects and dynamic content
US20060026534A1 (en) * 2000-06-21 2006-02-02 Microsoft Corporation Providing information to computer users

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5740035A (en) 1991-07-23 1998-04-14 Control Data Corporation Self-administered survey systems, methods and devices
US5425102A (en) 1994-06-09 1995-06-13 Datasonix Corporation Computer security apparatus with password hints
US5895455A (en) 1995-08-11 1999-04-20 Wachovia Corporation Document image display system and method
JPH09153099A (en) 1995-09-29 1997-06-10 Toshiba Corp Method and system for transferring information, and method and device for information input
US5793952A (en) 1996-05-17 1998-08-11 Sun Microsystems, Inc. Method and apparatus for providing a secure remote password graphic interface
AU3214697A (en) 1996-06-03 1998-01-05 Electronic Data Systems Corporation Automated password reset
US5736984A (en) 1996-07-03 1998-04-07 Sun Microsystems, Inc. Method and system for embedded feedback message and graphical processing element
US6100885A (en) 1996-07-06 2000-08-08 International Business Machines Corporation Supporting modification of properties via a computer system's user interface
US6337702B1 (en) 1996-10-23 2002-01-08 International Business Machines Corporation Method and system for graphically indicating a valid input within a graphical user interface
US5956709A (en) 1997-07-28 1999-09-21 Xue; Yansheng Dynamic data assembling on internet client side
GB2341952B (en) 1998-09-24 2003-05-14 Ibm Multi-layer entry fields
US6100855A (en) * 1999-02-26 2000-08-08 Marconi Aerospace Defence Systems, Inc. Ground plane for GPS patch antenna
US7890865B2 (en) * 2002-05-09 2011-02-15 Microsoft Corporation Methods and apparatuses for providing message information in graphical user interfaces based on user inputs

Patent Citations (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5121475A (en) * 1988-04-08 1992-06-09 International Business Machines Inc. Methods of dynamically generating user messages utilizing error log data with a computer system
US5008810A (en) * 1988-09-29 1991-04-16 Process Modeling Investment Corp. System for displaying different subsets of screen views, entering different amount of information, and determining correctness of input dependent upon current user input
US5546521A (en) * 1991-10-15 1996-08-13 International Business Machines Corporation Dynamic presentation of contextual help and status information
US5557731A (en) * 1993-12-28 1996-09-17 International Business Machines Corporation Method and system for detecting undefined objects in an application
US5754176A (en) * 1995-10-02 1998-05-19 Ast Research, Inc. Pop-up help system for a computer graphical user interface
US6493006B1 (en) * 1996-05-10 2002-12-10 Apple Computer, Inc. Graphical user interface having contextual menus
US20020175955A1 (en) * 1996-05-10 2002-11-28 Arno Gourdol Graphical user interface having contextual menus
US5995101A (en) * 1997-10-29 1999-11-30 Adobe Systems Incorporated Multi-level tool tip
US6763496B1 (en) * 1999-03-31 2004-07-13 Microsoft Corporation Method for promoting contextual information to display pages containing hyperlinks
US6609106B1 (en) * 1999-05-07 2003-08-19 Steven C. Robertson System and method for providing electronic multi-merchant gift registry services over a distributed network
US6662340B2 (en) * 2000-04-28 2003-12-09 America Online, Incorporated Client-side form filler that populates form fields based on analyzing visible field labels and visible display format hints without previous examination or mapping of the form
US20060026534A1 (en) * 2000-06-21 2006-02-02 Microsoft Corporation Providing information to computer users
US20020091993A1 (en) * 2000-09-29 2002-07-11 International Business Machines Corporation Contextual help information
US20030084115A1 (en) * 2001-09-26 2003-05-01 Wood Timothy E. Facilitating contextual help in a browser environment
US20030229608A1 (en) * 2002-06-06 2003-12-11 Microsoft Corporation Providing contextually sensitive tools and help content in computer-generated documents
US20040006480A1 (en) * 2002-07-05 2004-01-08 Patrick Ehlen System and method of handling problematic input during context-sensitive help for multi-modal dialog systems
US20050149395A1 (en) * 2003-10-29 2005-07-07 Kontera Technologies, Inc. System and method for real-time web page context analysis for the real-time insertion of textual markup objects and dynamic content

Cited By (76)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9292111B2 (en) 1998-01-26 2016-03-22 Apple Inc. Gesturing with a multipoint sensing device
US9239673B2 (en) 1998-01-26 2016-01-19 Apple Inc. Gesturing with a multipoint sensing device
US20060288297A1 (en) * 1999-08-12 2006-12-21 Robert Haitani System, method and technique for enabling users to interact and edit address fields of messaging applications
US20060288298A1 (en) * 1999-08-12 2006-12-21 Robert Haitani System, method and technique for enabling users to interact with address fields of messaging applications
US9606668B2 (en) 2002-02-07 2017-03-28 Apple Inc. Mode-based graphical user interfaces for touch sensitive input devices
US20040019875A1 (en) * 2002-04-29 2004-01-29 Welch Keith C. Masked edit control for use in a graphical programming environment
US20040006744A1 (en) * 2002-06-27 2004-01-08 Microsoft Corporation System and method for validating an XML document and reporting schema violations
US7373595B2 (en) * 2002-06-27 2008-05-13 Microsoft Corporation System and method for validating an XML document and reporting schema violations
US8677286B2 (en) 2003-05-01 2014-03-18 Hewlett-Packard Development Company, L.P. Dynamic sizing user interface method and system for data display
US8230366B2 (en) * 2003-10-23 2012-07-24 Apple Inc. Dynamically changing cursor for user interface
US20050088410A1 (en) * 2003-10-23 2005-04-28 Apple Computer, Inc. Dynamically changing cursor for user interface
US9239677B2 (en) 2004-05-06 2016-01-19 Apple Inc. Operation of a computer with touch screen interface
EP1782167A4 (en) * 2004-07-29 2012-10-03 Sony Electronics Inc State-based computer help utility
WO2006019721A3 (en) * 2004-07-29 2006-09-08 Sony Electronics Inc State-based computer help utility
WO2006019721A2 (en) 2004-07-29 2006-02-23 Sony Electronics Inc. State-based computer help utility
EP1782167A2 (en) * 2004-07-29 2007-05-09 Sony Electronics, Inc. State-based computer help utility
US20060026531A1 (en) * 2004-07-29 2006-02-02 Sony Coporation State-based computer help utility
US8239784B2 (en) 2004-07-30 2012-08-07 Apple Inc. Mode-based graphical user interfaces for touch sensitive input devices
US9348458B2 (en) 2004-07-30 2016-05-24 Apple Inc. Gestures for touch sensitive input devices
US10042418B2 (en) 2004-07-30 2018-08-07 Apple Inc. Proximity detector in handheld device
US11036282B2 (en) 2004-07-30 2021-06-15 Apple Inc. Proximity detector in handheld device
US8381135B2 (en) 2004-07-30 2013-02-19 Apple Inc. Proximity detector in handheld device
US8479122B2 (en) 2004-07-30 2013-07-02 Apple Inc. Gestures for touch sensitive input devices
US8612856B2 (en) 2004-07-30 2013-12-17 Apple Inc. Proximity detector in handheld device
US7653883B2 (en) 2004-07-30 2010-01-26 Apple Inc. Proximity detector in handheld device
US7549087B2 (en) 2005-03-29 2009-06-16 Microsoft Corporation User interface panel for hung applications
US20060230321A1 (en) * 2005-03-29 2006-10-12 Microsoft Corporation User interface panel for hung applications
US7613957B2 (en) * 2005-04-06 2009-11-03 Microsoft Corporation Visual indication for hung applications
US20060230324A1 (en) * 2005-04-06 2006-10-12 Microsoft Corporation Visual indication for hung applications
US20060239248A1 (en) * 2005-04-26 2006-10-26 Cisco Technology, Inc. System and method for displaying sticky notes on phone
US7698644B2 (en) * 2005-04-26 2010-04-13 Cisco Technology, Inc. System and method for displaying sticky notes on a phone
WO2007019538A3 (en) * 2005-08-08 2007-09-07 Palm Inc User interface for a computing device
US8583175B2 (en) 2005-08-08 2013-11-12 Palm, Inc. Contact-centric user-interface for computing devices
US20070032267A1 (en) * 2005-08-08 2007-02-08 Robert Haitani Contact-centric user-interface features for computing devices
US7680513B2 (en) 2005-08-08 2010-03-16 Palm, Inc. Contact-centric user-interface features for computing devices
WO2007019538A2 (en) * 2005-08-08 2007-02-15 Palm, Inc. User interface for a computing device
US20100124915A1 (en) * 2005-08-08 2010-05-20 Robert Haitani Contact-centric user-interface features for computing devices
US8280437B2 (en) 2005-08-08 2012-10-02 Hewlett-Packard Development Company, L.P. Contact-centric user-interface features for computing devices
US8078993B2 (en) 2005-08-08 2011-12-13 Hewlett-Packard Development Company, L.P. Operating multiple views on a computing device in connection with a wireless communication session
US8099129B2 (en) 2005-08-08 2012-01-17 Hewlett-Packard Development Company, L.P. Contact-centric user-interface features for computing devices
US20070049335A1 (en) * 2005-08-08 2007-03-01 Robert Haitani Operating multiple views on a computing device in connection with a wireless communication session
EP1785856A2 (en) * 2005-10-24 2007-05-16 Sap Ag Batch processing for wizards
EP1785856A3 (en) * 2005-10-24 2007-08-08 Sap Ag Batch processing for wizards
US20100010740A1 (en) * 2005-12-02 2010-01-14 Palm, Inc. Permission module on mobile computing device
US20100035596A1 (en) * 2005-12-02 2010-02-11 Palm, Inc. Handheld navigation unit with telephone call
US8005194B2 (en) 2005-12-21 2011-08-23 Hewlett-Packard Development Company, L.P. Technique for handling incoming reply messages
US7917817B2 (en) 2005-12-30 2011-03-29 Sap Ag User interface messaging system and method permitting deferral of message resolution
US20100138704A1 (en) * 2005-12-30 2010-06-03 Sap Ag User interface messaging system and method permitting deferral of message resolution
US20070157116A1 (en) * 2005-12-30 2007-07-05 Johnson Clare C System and method for visual messaging
US9298476B2 (en) 2005-12-30 2016-03-29 Sap Se System and method for combining multiple software panes
US20070174778A1 (en) * 2005-12-30 2007-07-26 Johnson Clare C System and method for combining multiple software panes
US20140006882A1 (en) * 2006-03-13 2014-01-02 Fujitsu Limited Screen generation program, screen generation apparatus, and screen generation method
US10304118B2 (en) 2008-07-29 2019-05-28 W.W. Grainger, Inc. System and method for detecting a possible error in a customer provided product order quantity
US20100030653A1 (en) * 2008-07-29 2010-02-04 W.W. Grainger, Inc. System and method for detecting a possible error in a customer provided product order quantity
US8429018B2 (en) * 2008-07-29 2013-04-23 W.W. Grainger, Inc. System and method for detecting a possible error in a customer provided product order quantity
KR101164194B1 (en) 2008-12-11 2012-07-10 한국전자통신연구원 method for static allocating stack based on multi thread
US9569231B2 (en) * 2009-02-09 2017-02-14 Kryon Systems Ltd. Device, system, and method for providing interactive guidance with execution of operations
US20100205529A1 (en) * 2009-02-09 2010-08-12 Emma Noya Butin Device, system, and method for creating interactive guidance with execution of operations
US20100205530A1 (en) * 2009-02-09 2010-08-12 Emma Noya Butin Device, system, and method for providing interactive guidance with execution of operations
US8918739B2 (en) 2009-08-24 2014-12-23 Kryon Systems Ltd. Display-independent recognition of graphical user interface control
US9405558B2 (en) 2009-08-24 2016-08-02 Kryon Systems Ltd. Display-independent computerized guidance
US9703462B2 (en) 2009-08-24 2017-07-11 Kryon Systems Ltd. Display-independent recognition of graphical user interface control
US20110047514A1 (en) * 2009-08-24 2011-02-24 Emma Butin Recording display-independent computerized guidance
US20110047462A1 (en) * 2009-08-24 2011-02-24 Emma Butin Display-independent computerized guidance
US9098313B2 (en) * 2009-08-24 2015-08-04 Kryon Systems Ltd. Recording display-independent computerized guidance
US20110047488A1 (en) * 2009-08-24 2011-02-24 Emma Butin Display-independent recognition of graphical user interface control
WO2012033677A3 (en) * 2010-09-08 2012-06-14 Microsoft Corporation Notification bar user interface control
WO2012033677A2 (en) * 2010-09-08 2012-03-15 Microsoft Corporation Notification bar user interface control
US9710435B2 (en) 2010-10-29 2017-07-18 P. Karl Halton Object-field-based mathematics system
US12067354B2 (en) * 2016-08-04 2024-08-20 HAB Innovations, Inc. Simplifying complex input strings
US20190377588A1 (en) * 2018-06-06 2019-12-12 Oracle International Corporation Smart context aware support engine for applications
US11068286B2 (en) * 2018-06-06 2021-07-20 Oracle International Corporation Smart context aware support engine for applications
US20220397987A1 (en) * 2019-11-27 2022-12-15 Nippon Telegraph And Telephone Corporation Input display system, auxiliary information display method and program
US20220397986A1 (en) * 2019-11-27 2022-12-15 Nippon Telegraph And Telephone Corporation Input display system, auxiliary information display method and program
US11789581B2 (en) * 2019-11-27 2023-10-17 Nippon Telegraph And Telephone Corporation Input display system, auxiliary information display method and program
US11822762B2 (en) * 2019-11-27 2023-11-21 Nippon Telegraph And Telephone Corporation Input display system, auxiliary information display method and program

Also Published As

Publication number Publication date
US20110093782A1 (en) 2011-04-21
US7890865B2 (en) 2011-02-15

Similar Documents

Publication Publication Date Title
US7890865B2 (en) Methods and apparatuses for providing message information in graphical user interfaces based on user inputs
US11847410B2 (en) Viewing file modifications
US9805005B1 (en) Access-control-discontinuous hyperlink handling system and methods
US6928619B2 (en) Method and apparatus for managing input focus and z-order
US4862390A (en) Method and apparatus for selection of one from a plurality of entries listed on a computer display
US7712049B2 (en) Two-dimensional radial user interface for computer software applications
US7143350B2 (en) Method and system for character sequence checking according to a selected language
Scarr et al. Dips and ceilings: understanding and supporting transitions to expertise in user interfaces
US20040243415A1 (en) Architecture for a speech input method editor for handheld portable devices
US20080155464A1 (en) Method and system for providing a scroll-bar pop-up with quick find for rapid access of sorted list data
US20060282818A1 (en) Interactive formula builder
US20070179775A1 (en) Method and system for translating a software application into an alternative language
JP2005520228A (en) System and method for providing prominent image elements in a graphical user interface display
US20070277118A1 (en) Providing suggestion lists for phonetic input
US20040239638A1 (en) System and method for displaying, completing and executing keyboard key combinations
AU2020200228A1 (en) Document changes
US20070292031A1 (en) Collecting and utilizing user correction feedback to improve handwriting recognition
AU2023201508B2 (en) Document changes
JP2715971B2 (en) Information input device
US6636241B1 (en) Method, system and program product for enhancing a computer user's comprehension of visually presented data
Salminen Design of localization web environments
WO2019239400A1 (en) Expression editor for mathematical statement forms
Morgado et al. Understanding Visual Basic for Applications (VBA)
Schicht et al. Flying start with SAP R/3
Alexander Easy Microsoft Excel 2010

Legal Events

Date Code Title Description
AS Assignment

Owner name: MICROSOFT CORPORATION, WASHINGTON

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:PALMER, STEVE;TOLKOV, VALERY;REEL/FRAME:012903/0578

Effective date: 20020508

STCF Information on status: patent grant

Free format text: PATENTED CASE

FPAY Fee payment

Year of fee payment: 4

AS Assignment

Owner name: MICROSOFT TECHNOLOGY LICENSING, LLC, WASHINGTON

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:MICROSOFT CORPORATION;REEL/FRAME:034541/0477

Effective date: 20141014

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552)

Year of fee payment: 8

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 12TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1553); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 12