WO2012079779A1 - User interface - Google Patents

User interface

Info

Publication number
WO2012079779A1
Authority
WO
WIPO (PCT)
Prior art keywords
dialog window
user
new dialog
window
operating system
Prior art date
Application number
PCT/EP2011/054273
Other languages
French (fr)
Inventor
Liviu-Emanuel Lazarescu
Original Assignee
Liviu-Emanuel Lazarescu
Priority date
Filing date
Publication date
Application filed by Liviu-Emanuel Lazarescu
Publication of WO2012079779A1

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 8/00: Arrangements for software engineering
    • G06F 8/30: Creation or generation of source code
    • G06F 8/38: Creation or generation of source code for implementing user interfaces

Definitions

  • the notification window 410 is dismissed and the second dialog window 404, with enabled buttons 416, is then displayed together with the first dialog window 402 to the user as shown in the second state (i.e. the diagram on the right of arrow 408) of Figure 4.
  • Figure 5 shows again in different transition states, the processing of a first dialog window 502 and a second dialog window 504 with respect to each other in different time intervals on a display screen 506 and the transit of time being represented by the arrow 508.
  • a fourth embodiment 500 of the UI processing software in Figure 5 functions largely in the same manner, except with one difference.
  • workings of the fourth embodiment 500 which are similar to those of the third embodiment 400 described in Figure 4 are therefore not explained again.
  • the fourth embodiment 500 in the first state (i.e. the diagram on the left of arrow 508) only generates an alert 510 and waits for a predetermined time period. Consequently, when the time period has expired (i.e. a corresponding signal is received), the second dialog window 504, with enabled buttons 512, is thereafter displayed alongside the first dialog window 502, as clearly shown in the second state (i.e. the diagram on the right of arrow 508) of Figure 5.
  • buttons on the keypad which are mapped to the respective enabled/activated buttons 210b, 310, 416, 512 contained in the second dialog windows 206, 306, 404, 504 consequently only become active after the time period has expired.
  • the UI processing software or operating system ignores any signals received as a result of pressing the related buttons on the keypad or the related buttons are simply inactivated briefly (i.e. signal generation is disabled) for this purpose.
  • Figure 6 shows a fifth embodiment 600 of the UI processing software, where the processing of a first dialog window 602 and a second dialog window 604 with respect to each other on a display screen 606 is depicted, in different transition states according to the elapse of time as represented by the arrow 608.
  • in the first state (i.e. the diagram on the left of arrow 608), a user response dialog window 610 is shown to the user.
  • the user response dialog window 610 requires the user to provide either a password or a specific input key combination (a minimal password-gate sketch appears after this list).
  • the password or input keys combination may either be predetermined according to a factory configuration or subsequently defined by the user through a control settings applet.
  • the user response dialog window 610 disappears and the second dialog window 604 (together with the buttons 612 which may be pressed immediately) is then displayed to the user as shown in the second state (i.e. the diagram on the right of arrow 608).
  • Figure 7 shows a sixth embodiment 700 of the UI processing software, where the processing in different transition states of a first dialog window 702 and a second dialog window 704 with respect to one another on a display screen 706 is illustrated.
  • the elapse of time is indicated by the arrow 708.
  • in the first state (i.e. the diagram on the left of arrow 708), an unlock dialog window 710 is then displayed on a designated area of the display screen 706.
  • the unlock dialog window 710 may comprise a slider 712 and an associated gesture-cue indicator 714.
  • the display of the gesture-cue indicator 714 may also optionally be omitted according to another alternative embodiment.
  • in order to bring up the display of the second dialog window 704, the user needs to actuate the slider 712 by providing a gesture, i.e. by sliding the slider 712 towards the extreme right side (a minimal slider-unlock sketch appears after this list).
  • the gesture-cue indicator 714 provides an indication of how the slider 712 is activated should the user not be knowledgeable or well-informed on the use of such a feature.
  • the actuation of the slider 712 may be performed by using fingers or a stylus if the digital device is touch-screen based such as the Tablet PC 104 or other input devices (e.g. a mouse) if the digital device is a conventional machine such as the IBM ® -compatible desktop personal computer (PC) 112.
  • other types of response mechanisms may be used in place of the slider 712, such as a circular pad or a diagonal slider (all not shown).
  • the type of gesture to be provided by the user will differ accordingly, for example a sweeping gesture or a circular gesture.
  • This is correspondingly then reflected in the gesture-cue indicator 714 shown in the unlock dialog window 710.
  • the gesture-cue indicator 714 may also be located outside the unlock dialog window 710, in close proximity to it.
  • the unlock dialog window 710 disappears and the second dialog window 704, with response buttons 716, is then displayed (refer to the diagram on the right of arrow 708).
  • the sixth embodiment 700 is also applicable for use in the embodiments 200, 300 as described in Figures 2 and 3 respectively, where the unlock dialog window 710 then replaces the predetermined time period.
  • Figure 8 shows seventh and eighth embodiments 800a, 800b of the UI processing software, where the second dialog window 802 appears with respect to the first dialog window 804 without any time delay on the display screen 806 upon detection of an interrupt event.
  • Either one of the embodiments 800a, 800b may be adopted depending on the specific implementation of the UI processing software.
  • in these embodiments, sliders 808a, 808b corresponding to similar key functions (e.g. "OK" and "Cancel") are used. More specifically, the seventh embodiment 800a uses two sliders 808a, each in place of a conventional button, whereas the eighth embodiment 800b utilizes only one slider 808b to represent the typical buttons. Particular instructions on how to activate the sliders 808a, 808b may be provided through a message box 810 located in proximity to the sliders 808a, 808b (e.g. at the top as shown). An example of the instructions contained in the message box 810 in this case may be: "Please drag the left slider to indicate an affirmative answer or the right slider to indicate a negative answer".
  • the message box 810 may not be provided as the user is able to intuitively figure out how to operate the sliders 808a, 808b.
  • graphics, animations or other optional means as known to the skilled person may also be used to instruct the user on how to operate the sliders 808a, 808b.
  • Figure 9 shows a ninth embodiment 900, which is more applicable for touch-screen- based digital devices such as the smartphone 110, Tablet PC 104 or some laptops 102.
  • the ninth embodiment 900 is considered a variation of the embodiments in Figure 8 where the second dialog window 902 also appears with respect to the first dialog window 904 without any time delay, on the display screen 906 upon detection of an interrupt event.
  • the difference of the ninth embodiment over the embodiments in Figure 8 is that the sliders 808a, 808b of Figure 8 are now replaced by rectangular response areas 908a, 908b as shown, in which sliding/sweeping gestures are to be provided by the user, either using a stylus or his fingers (a minimal swipe-area sketch appears after this list).
  • a message box 910 which displays the type of interrupt event may also optionally be displayed (e.g. "An incoming call from Norton").
  • the response areas 908a, 908b may be located proximal to each other.
  • the response areas may be rectangular as shown or of a different shape (e.g. circular).
  • the gesture required for each of the rectangular response areas 908a, 908b is pre-programmed to be distinctly different so that the user is unlikely to mistakenly trigger any one of them (e.g. a right sliding gesture is required to trigger the "OK" command while a left sliding gesture is required for the "Cancel” command).
  • the type of gesture required may also be indicated through an animation or image which is preferably displayed within the response areas 908a, 908b. However, such images may also alternatively be provided in proximity to the response areas 908a, 908b.
  • buttons may initially be inactivated for a certain time interval, after which they become active. More importantly, it is to be appreciated that this embodiment is particularly suitable for implementation in a digital device installed with an operating system with relatively complex functionalities, such as those in the Tablet PC 104, smartphone 110, laptop 102, iMac ® 108 or IBM ® -compatible PC 112.
  • the balloon message that appears in the notification area contains a text message and buttons of a typical dialog window such as any one of those described in the foregoing embodiments.
  • the buttons are initially disabled or not displayed until the expiry of a predetermined time period, after which they are activated or made visible to the user. Therefore, instead of displaying a dialog window with a text message and inactivated buttons (such as the embodiment in Figure 2) or without buttons (such as the embodiment in Figure 3) for a predefined time period, in this particular embodiment the text and buttons are displayed in a notification interface such as a balloon message that appears in the notification area of the operating system.
  • the predefined time period may range in this case from 0 seconds to 30 seconds.
  • buttons contained in the balloon message are accessible via a computer mouse or other like input methods (e.g. haptic-based) and not through keyboards typically used by personal computers or laptops.
  • an alert may also optionally be generated.
  • the alert may be audible-based, vibration-based, visual-based (e.g. flashing lights from buttons of the digital device) or a combination.
  • This variation of the tenth embodiment is more appropriately used on digital devices such as the Tablet PC 104, smartphone 110, laptop 102, iMac ® 108 and the IBM ® -compatible PC 112.
  • a second dialog window corresponding to an interrupt event is generated and immediately displayed, overlaying the other prior generated windows on the desktop of the operating system.
  • the second dialog window is generated such that it is not selected or not active (i.e. lacks a current window focus).
  • this specific embodiment may not protect against clicks made using a computer mouse (e.g. clicks made in the area where the buttons of the no-current-focus window are will thus inevitably lead to pressing of the buttons) but might protect against unwanted/unintended commands given by the user through the keyboard/keypad. This embodiment is considered to be more applicable to digital devices with keyboards/keypads, including those with both keyboards/keypads and a touch-screen, such as the Nokia® N97 mobile phone.
  • the second dialog window appears, together with activated buttons, without any time delay.
  • a confirmation window appears which queries the user to confirm if he really wishes to proceed with the indicated action.
  • if the user confirms, the specific command is then promptly executed. Otherwise, the indicated action is cancelled and the system returns to a previous state.
  • the confirmation window can optionally be positioned on a different area of the display screen of the digital device, away from where the second dialog window is displayed. This then prevents the user from performing a quick succession of double-clicks, or clicks in two spots close together, which might otherwise potentially act on both the buttons of the second dialog window and the confirmation window (a minimal sketch of this confirmation flow appears after this list).
  • a predetermined time period may be defined so that there is a visible time lag from when the buttons in the second dialog window are clicked till the display of the confirmation window. Furthermore during this time period, the second dialog window may either briefly disappear or continue to be displayed, but with inactivated buttons.
  • This particular embodiment is considered more suitable to be used by digital devices running substantially complex operating systems (e.g. the IBM ® -compatible PC 112, iMac ® 108 or laptop 102), touch-screen based devices or, phones with keypads.
  • in digital devices with keypads, buttons of the dialog windows correspond to specific keys; while the buttons are inactive in the dialog window, pressing the corresponding keys has no effect.
  • the first dialog window may also be the main window of a program (e.g. Adobe Acrobat ® ) or the desktop user-interface of an operating system installed on any one of the digital devices and the second dialog window is typically a window which comprises buttons.
  • FIG 10 is a flow diagram outlining the steps of a scheme 1000 in accordance with which the first embodiment 200 in Figure 2 is operated.
  • the UI processing software monitors to detect new interrupt events generated in the background, the detection being performed via a specific daemon as preconfigured in the operating system of the associated digital device.
  • upon detection of a new event, the scheme 1000 then inactivates the buttons 210a of the second dialog window 206 in a step 1004, the window being generated by a program corresponding to the new event.
  • the buttons 210a are to be inactivated for a predetermined time period, which is either defined according to a factory configuration or adjustable by the user.
  • the second dialog window 206 is displayed to the user, but with inactivated buttons 210a, in a step 1006. Consequent to the expiry of the predetermined time period (i.e. receipt of a corresponding signal), the buttons 210b are then reactivated in a step 1008, so that the user may now appropriately respond to the event through the second dialog window 206.
  • another flow diagram in Figure 11 outlines the steps of a scheme 1100, which is adopted by the second embodiment 300 in Figure 3.
  • the UI processing software similarly monitors for and detects new interrupt events generated in the background, the detection being performed via a specific daemon as preconfigured in the operating system of the associated digital device.
  • the scheme 1100 does not show the buttons 310 of the second dialog window 306 (which is generated corresponding to the new event) for a predetermined time period in a step 1104.
  • the predetermined time period is either defined according to a factory configuration or adjustable by the user.
  • the second dialog window 306 is then displayed to the user.
  • the buttons 310 are then shown and displayed to the user in a final step 1108.
  • in Figure 12, a flow diagram outlines the steps of a scheme 1200, in accordance with which the third embodiment 400 in Figure 4 is operated.
  • the UI processing software monitors and detects new interrupt events generated in the background. The detection is preferably performed via a specific daemon as preconfigured in the operating system installed on the digital device.
  • the scheme 1200 delays the display of the second dialog window 404 in a step 1204.
  • an alert 412 is optionally generated in the digital device.
  • the alert 412 may be audible-based, visual-based, vibration-based or a combination.
  • a notification window 410 is then displayed to the user for alerting him that a new event has been received.
  • the notification window 410 is displayed for a predetermined time period according to a step 1210, which either is preconfigured in the factory or is adjustable by the user.
  • in a step 1212, the notification window 410 is dismissed and the second dialog window 404 (complete with response buttons 416) is then displayed to the user when an associated signal is received. The signal is generated either when the predetermined time period expires or when the user dismisses the notification window 410 by pressing the close-icon 414.
  • in Figure 13, the flow diagram outlines the steps of a scheme 1300 adopted by the fourth embodiment 500 in Figure 5.
  • the UI processing software monitors and detects new interrupt events generated in the background, the detection being performed through a specific daemon as preconfigured in the operating system of the digital device operated by the user.
  • the scheme 1300 delays the display of the second dialog window 504 in a step 1304.
  • an alert 510 is generated in the digital device.
  • the alert 510 may be audible-based, visual-based, vibration-based or a combination.
  • the scheme 1300 then waits for a predetermined time period in a step 1308.
  • the predetermined time period may be adjustable by the user or may follow a factory pre-configuration.
  • the second dialog window 504, along with the buttons 512, is then displayed to the user when the time period expires (i.e. a signal is received).
  • yet another embodiment is shown in Figure 14, where the steps of a scheme 1400 are outlined in the flow diagram.
  • the scheme 1400 is adopted by the fifth embodiment 600 in Figure 6.
  • the UI processing software monitors and detects new interrupt events generated in the background, performing the detection using a specific daemon as preconfigured in the operating system of the digital device used by the user.
  • the scheme 1400 delays the display of the second dialog window 604 in a step 1404.
  • a user response dialog window 610 is displayed, which requests and waits for a response from the user in a step 1406.
  • the response may be in the form of providing a password or supplying an input keys combination to the operating system.
  • until a correct response is provided, the user response dialog window 610 remains displayed.
  • the user response dialog window 610 is dismissed and the second dialog window 604 is subsequently displayed to the user when a signal is received by the operating system.
  • the signal in this case corresponds to the receipt and successful authentication of a response provided by the user.
  • in a step 1502, the UI processing software constantly monitors the background to detect new interrupt events. The detection is carried out using a specific daemon as preconfigured in the operating system of the digital device used by the user.
  • display of the second dialog window 704 is delayed in a step 1504.
  • the type of response required from the user is the same as described afore in Figure 7 and is not repeated here.
  • the unlock dialog window 710 disappears and the second dialog window 704 is consequently displayed when a signal is received by the operating system in a step 1508.
  • the signal is generated when the user provides a correct response.
  • Figure 16 shows the steps of another scheme 1600, outlined in the illustrated flow diagram, which is adopted by the embodiments in Figures 8 and 9.
  • the UI processing software constantly monitors the background in order to detect new interrupt events. The detection is carried out using a specific daemon as preconfigured in the operating system of a digital device in use.
  • the second dialog window 802, 902 is displayed in a step 1604.
  • the second dialog window contains the sliders 808a, 808b of Figure 8 or the response areas 908a, 908b of Figure 9.
  • the scheme 1600 then waits for the user to submit a response.
  • the response may consist of dragging the sliders 808a, 808b to a desired opposite end or activating the response areas 908a, 908b using an appropriate gesture, such as a sliding gesture or a sweeping gesture.
  • an appropriate gesture such as a sliding gesture or a sweeping gesture.
  • the associated commands mapped to the respective sliders 808a, 808b or response areas 908a, 908b are then executed in a final step 1608.
  • Figure 17 shows the steps of a scheme 1700 as illustrated, in which in a step 1702, the UI processing software monitors the background to detect the occurrence of new interrupt events. The detection is performed using a specific daemon as preconfigured in the operating system of a digital device in use.
  • the second dialog window corresponding to the detected event is displayed in a step 1704.
  • the second dialog window may contain conventional buttons, the sliders 808a, 808b of Figure 8 or the response areas 908a, 908b of Figure 9.
  • a confirmation window is then displayed to the user in a step 1706.
  • in the confirmation window, the user is queried to confirm whether he wishes to proceed with the indicated action as submitted.
  • the scheme 1700 thus waits for any response to be submitted through the confirmation window in a step 1708.
  • each of the previously presented embodiments and their variations may be realized as computer readable code (i.e. programming instructions) on a computer readable storage medium.
  • the computer readable storage medium is any data storage device that can store data which can thereafter be read by a computer system, including both transfer and non-transfer devices. Examples of the computer readable storage medium include read-only memory, random-access memory, CD-ROMs, Flash memory cards, DVDs, Blu-ray Discs, magnetic tapes, optical data storage devices, and carrier waves.
  • the computer readable storage medium can also be distributed over network-coupled computer systems so that the computer readable code is stored and executed in a distributed fashion.
  • Figure 18 shows a schematic of an embodiment of an apparatus 1800 for performing the process of the invention.
  • the apparatus 1800 comprises a number of functional modules which may be implemented in hardware, software, or as a combination of the two.
  • the apparatus 1800 shown in Figure 18 is part of a multi-functional device such as the mobile phone 106, with a processor 1802 being controlled by a global controller (not shown) of the device.
  • the apparatus 1800 of the present embodiment comprises an interrupt event detector 1804, which may be coupled to components/sensors 1806 of the device, such as the signal receiver of the mobile phone 106, and monitors the detection of an interrupt event. The monitoring may be performed continually or at predetermined intervals.
  • the apparatus 1800 also comprises an interface generator 1808.
  • the interface generator 1808 is coupled to a storage device 1810 such as a ROM which stores predetermined interface layouts and instructions to enable the layouts to be generated by the interface generator 1808, and the ROM may also store additional information which specifies the event and device operations to which the interface layouts correspond.
  • the ROM stores an interface for an interrupt event, such as "incoming call", together with non-interrupt events, such as the interface for generation of a text message.
  • a signal is generated by the interrupt event detector 1804 which is transferred to the processor 1802.
  • the processor 1802 interprets the received signal and transfers a command to the interface generator 1808 to extract instructions from the ROM for generating a new interface which corresponds to the detected event (a minimal object-structure sketch of these modules appears after this list).
  • the interface generator 1808 generates the appropriate new interface and sends a signal to the processor 1802 so that it can be provided to a display unit (not shown) and presented to a user, either separately, or together with a previously generated interface.
  • the processor 1802 controls the provision of the interface to the display unit in one or more ways as described above. For example, the generated interface may be modified so that various components of the interface are disabled for a predetermined time, or are not made visible until a predetermined event such as a password input occurs.
  • the processor may be coupled to a condition monitor, which determines whether the condition under which the generated interface can be displayed has occurred. If the condition has occurred, a signal is provided to the processor 1802, and the processor 1802, on receipt of the signal, controls the display of the generated interface.
  • the condition monitor may be implemented as part of the processor 1802, which may be associated with its own in-built clock, for example, in order to determine whether a predetermined time has elapsed.
  • the condition under which the generated interface can be displayed can be set by the user, through an input means 1812, or can be programmed according to a factory setting of the device.
  • the present invention can also be used in digital devices which use operating systems or software programs adapted to/specific to/which support 3D gestures as an input method. Furthermore, it can also be used in all digital devices which currently have touch-screen technology or will have this technology in the future, and not just the ones explicitly referred to in this application.
  • the technology can also be adapted to be used with digital devices which will use new methods of input that will be developed in the future.
  • the embodiments of the technology which will be used in such cases will be relatively similar to the ones described above, being adapted to the respective input method.
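The password-gated behaviour of the fifth embodiment 600 (Figure 6), referenced above, might be sketched as follows. Python with the standard tkinter toolkit is assumed purely for illustration (the application is not tied to any particular language or toolkit), and the EXPECTED_PASSWORD value, window text and printed messages are placeholders rather than anything prescribed by the application:

```python
import tkinter as tk

EXPECTED_PASSWORD = "1234"  # illustrative; may be factory-set or user-defined in practice

def show_second_dialog(root):
    dialog = tk.Toplevel(root)
    dialog.title("Incoming call")
    tk.Label(dialog, text="Accept the incoming call?").pack(padx=20, pady=10)
    tk.Button(dialog, text="OK",
              command=lambda: print("Call accepted")).pack(side=tk.LEFT, padx=10, pady=10)
    tk.Button(dialog, text="Cancel",
              command=dialog.destroy).pack(side=tk.RIGHT, padx=10, pady=10)

def show_response_window(root):
    """User response window: the second dialog stays hidden until authentication succeeds."""
    gate = tk.Toplevel(root)
    gate.title("New event")
    tk.Label(gate, text="Enter the password to view the new dialog:").pack(padx=20, pady=5)
    entry = tk.Entry(gate, show="*")
    entry.pack(padx=20, pady=5)

    def authenticate():
        if entry.get() == EXPECTED_PASSWORD:  # the "signal": successful authentication
            gate.destroy()
            show_second_dialog(root)
        else:
            entry.delete(0, tk.END)           # wrong password: clear and retry

    tk.Button(gate, text="Unlock", command=authenticate).pack(pady=10)

if __name__ == "__main__":
    root = tk.Tk()
    root.title("First dialog window")
    tk.Label(root, text="Document being edited...").pack(padx=30, pady=30)
    root.after(1000, lambda: show_response_window(root))
    root.mainloop()
```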
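The slider-based unlock of the sixth embodiment 700 (Figure 7) could be sketched, under the same illustrative Python/tkinter assumption, as a horizontal slider that must be dragged to its extreme right before the second dialog window is shown; the slider range and captions are placeholders:

```python
import tkinter as tk

def show_second_dialog(root):
    dialog = tk.Toplevel(root)
    dialog.title("Incoming call")
    tk.Label(dialog, text="Accept the incoming call?").pack(padx=20, pady=10)
    tk.Button(dialog, text="OK",
              command=lambda: print("Call accepted")).pack(side=tk.LEFT, padx=10, pady=10)
    tk.Button(dialog, text="Cancel",
              command=dialog.destroy).pack(side=tk.RIGHT, padx=10, pady=10)

def show_unlock_window(root):
    """Unlock window with a slider that must be dragged fully to the right."""
    unlock = tk.Toplevel(root)
    unlock.title("Unlock")
    tk.Label(unlock, text="Slide to the right to view the new dialog ->").pack(padx=20, pady=5)

    slider = tk.Scale(unlock, from_=0, to=100, orient=tk.HORIZONTAL,
                      length=200, showvalue=False)
    slider.pack(padx=20, pady=10)

    def on_release(event):
        if slider.get() >= 100:  # slider reached the extreme right: show the dialog
            unlock.destroy()
            show_second_dialog(root)

    slider.bind("<ButtonRelease-1>", on_release)

if __name__ == "__main__":
    root = tk.Tk()
    root.title("First dialog window")
    tk.Label(root, text="Document being edited...").pack(padx=30, pady=30)
    root.after(1000, lambda: show_unlock_window(root))
    root.mainloop()
```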
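The gesture response areas of the ninth embodiment 900 (Figure 9) might be sketched as two rectangular canvas areas, one accepting a right slide mapped to "OK" and the other a left slide mapped to "Cancel". Python/tkinter, the SWIPE_THRESHOLD pixel value and the window text are assumptions made only for the sketch:

```python
import tkinter as tk

SWIPE_THRESHOLD = 60  # minimum horizontal travel, in pixels, to count as a gesture

def make_response_area(dialog, caption, direction, command):
    """One rectangular response area; direction is +1 (right slide) or -1 (left slide)."""
    area = tk.Canvas(dialog, width=220, height=60, bg="lightgrey")
    area.create_text(110, 30, text=caption)
    area.pack(padx=20, pady=5)
    start = {"x": None}

    def on_press(event):
        start["x"] = event.x

    def on_release(event):
        if start["x"] is None:
            return
        if (event.x - start["x"]) * direction > SWIPE_THRESHOLD:
            command()  # only the pre-programmed gesture direction triggers the command

    area.bind("<ButtonPress-1>", on_press)
    area.bind("<ButtonRelease-1>", on_release)

def show_gesture_dialog(root):
    dialog = tk.Toplevel(root)
    dialog.title("New event")
    # Optional message describing the interrupt event (cf. message box 910).
    tk.Label(dialog, text="An incoming call from Norton").pack(padx=20, pady=5)
    make_response_area(dialog, "Slide right to accept ->", +1,
                       lambda: (print("OK command executed"), dialog.destroy()))
    make_response_area(dialog, "<- Slide left to reject", -1,
                       lambda: (print("Cancel command executed"), dialog.destroy()))

if __name__ == "__main__":
    root = tk.Tk()
    root.title("First dialog window")
    tk.Label(root, text="Document being edited...").pack(padx=30, pady=30)
    root.after(1000, lambda: show_gesture_dialog(root))
    root.mainloop()
```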
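The confirmation-window flow described above (a button press in the second dialog window followed, after a visible lag, by a confirmation window placed away from the dialog) might be sketched as follows, again assuming Python/tkinter, an illustrative CONFIRM_DELAY_MS value, an arbitrary screen position and placeholder window text:

```python
import tkinter as tk

CONFIRM_DELAY_MS = 1000  # illustrative lag between the button press and the confirmation window

def show_second_dialog(root):
    dialog = tk.Toplevel(root)
    dialog.title("Incoming call")
    tk.Label(dialog, text="Reject the incoming call?").pack(padx=20, pady=10)

    def request_confirmation(action_name, action):
        dialog.withdraw()  # optionally hide the dialog briefly while waiting
        root.after(CONFIRM_DELAY_MS,
                   lambda: show_confirmation(root, dialog, action_name, action))

    tk.Button(dialog, text="OK",
              command=lambda: request_confirmation(
                  "OK", lambda: print("Call rejected"))).pack(side=tk.LEFT, padx=10, pady=10)
    tk.Button(dialog, text="Cancel",
              command=lambda: request_confirmation(
                  "Cancel", lambda: print("Returning to call"))).pack(side=tk.RIGHT, padx=10, pady=10)

def show_confirmation(root, dialog, action_name, action):
    confirm = tk.Toplevel(root)
    confirm.title("Confirm")
    # Positioned away from the second dialog so a quick double-click cannot hit both windows.
    confirm.geometry("+50+400")
    tk.Label(confirm, text=f'Really proceed with "{action_name}"?').pack(padx=20, pady=10)

    def proceed():
        confirm.destroy()
        dialog.destroy()
        action()            # the indicated action is executed only after confirmation

    def cancel():
        confirm.destroy()
        dialog.deiconify()  # return to the previous state

    tk.Button(confirm, text="Yes", command=proceed).pack(side=tk.LEFT, padx=10, pady=10)
    tk.Button(confirm, text="No", command=cancel).pack(side=tk.RIGHT, padx=10, pady=10)

if __name__ == "__main__":
    root = tk.Tk()
    root.title("First dialog window")
    tk.Label(root, text="Document being edited...").pack(padx=30, pady=30)
    root.after(1000, lambda: show_second_dialog(root))
    root.mainloop()
```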
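Finally, the module structure of the apparatus 1800 (interrupt event detector 1804, interface generator 1808 with its ROM of layouts, processor 1802 and condition monitor) can be outlined as plain Python classes. The class names, the fake "incoming_call" event and the two-second delay are illustrative stand-ins, not the actual hardware or firmware design:

```python
import time

class InterruptEventDetector:
    """Monitors device components/sensors and reports interrupt events (cf. detector 1804)."""
    def poll(self):
        # A real device would query the signal receiver, sensors, etc.; here one event is faked.
        return "incoming_call"

class InterfaceGenerator:
    """Builds a new interface layout for a given event (cf. generator 1808 and storage 1810)."""
    LAYOUTS = {"incoming_call": "dialog: Accept call? [OK] [Cancel]"}
    def generate(self, event):
        return self.LAYOUTS[event]

class ConditionMonitor:
    """Signals when the condition for displaying the interface has occurred (here: a time delay)."""
    def __init__(self, delay_seconds):
        self.delay_seconds = delay_seconds
    def wait_for_condition(self):
        time.sleep(self.delay_seconds)
        return True

class Processor:
    """Coordinates the modules and controls when the generated interface reaches the display."""
    def __init__(self, detector, generator, monitor):
        self.detector, self.generator, self.monitor = detector, generator, monitor
    def run_once(self):
        event = self.detector.poll()
        interface = self.generator.generate(event)
        print("Interface generated but held back:", interface)
        if self.monitor.wait_for_condition():  # the signal that display may proceed
            print("Displaying:", interface)

if __name__ == "__main__":
    Processor(InterruptEventDetector(), InterfaceGenerator(), ConditionMonitor(2)).run_once()
```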

Landscapes

  • Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application describes a dialog windows processing method for detecting an interrupt event on a digital device, which leads to the generation of a new dialog window with respect to a prior generated user interface. The method delays the display of the new dialog window or disables the buttons of the new dialog window until a subsequent signal is received, among other techniques. The signal may correspond to the expiry of a time period or a response action initiated by the user of the digital device, such as providing a password. Additionally, an alert may optionally be provided for highlighting to a user the detection of the interrupt event.

Description

User interface
Field of Invention
This invention relates generally to user interfaces. Particularly, but not exclusively, it pertains to a method and a system for processing dialog windows displayed on multi-functional digital devices.
Background
Recently, the use of multi-functional digital devices has proliferated and they are increasingly being utilized by people in almost every aspect of their lives. Digital devices these days can typically be found in the living room or kitchen (e.g. WebTV or touch-screen refrigerators), in vehicles (e.g. in-car navigation units) or simply as personal communication devices (e.g. mobile phones and laptops). All these devices have a commonality in that they implement, in one way or another, a user-interface (UI) for enabling user interaction.
One problem with most existing implementations of user interfaces is that they suffer from overlapping alert notifications, in the form of dialog windows, from different software, thereby leading users to accidentally trigger unintended actions. Take, for example, a scenario where a user is currently working on a document and at one moment hits the "carriage return" key to begin a new paragraph. Near simultaneously, a dialog window from another program running in the background appears. Thus, instead of beginning a new paragraph, the user unintentionally hits the "OK" button of the dialog window and unwittingly triggers a command in that program.
In another scenario, in the midst of typing out an SMS message on a mobile phone, a user presses the "Delete" key to erase a mistyped character or exit from a menu and, while the "Delete" key is being depressed, receives an incoming call. As a result, instead of erasing the character, the user inevitably rejects and misses the call, causing inconvenience to both the user and the caller. A plethora of examples of UIs suffering from similar problems as described in the above scenarios are well known in the literature. Therefore, in view of the foregoing problems, an improved method and system for processing and displaying the dialog windows generated by different programs, thereby presenting a better UI experience to users of digital devices, would be useful and advantageous.
Summary
According to a first aspect of the present invention, there is provided a method for processing at least one new dialog window with respect to a prior generated user-interface displayed in an operating system of a digital device, the operating system being used by a user, the method comprising detecting an interrupt event which leads to generation of the at least one new dialog window and delaying or hiding the display of at least a portion of the at least one new dialog window in response to detection of the interrupt event until a signal is received.
According to a second aspect of the present invention, there is provided a method for processing at least one new dialog window with respect to a prior generated user-interface displayed in an operating system of a digital device, the operating system being used by a user, the method comprising detecting an event which leads to generation and display of the at least one new dialog window, and disabling at least a portion of the at least one new dialog window in response to detection of the event until a signal is received.
According to a third aspect of the present invention, there is provided a computer program being configured to perform a method according to any one of claims 1 to 37.
According to a fourth aspect of the present invention, there is provided an apparatus for processing at least one new dialog window with respect to a prior generated user-interface displayed in an operating system of a digital device, the apparatus comprising an event detector for detecting an interrupt event, an interface generator for generating at least one new dialog window in response to detection of the interrupt event and a processor for delaying or hiding the display of at least a portion of the at least one new dialog window until a signal is received.
According to a fifth aspect of the present invention, there is provided an apparatus for processing at least one new dialog window with respect to a prior generated user-interface displayed in an operating system of a digital device, the apparatus comprising an event detector for detecting an event, an interface generator for generating at least one new dialog window in response to detection of the event and a processor for disabling at least a portion of the at least one new dialog window until a signal is received.
According to a sixth aspect of the present invention, there is provided a method for processing at least one new dialog window with respect to a prior generated user-interface displayed in an operating system of a digital device, according to claim 22, claim 25, claim 34 or claim 35.
According to a seventh aspect of the present invention, there is provided an apparatus according to claim 39 or claim 40. These and other aspects of the invention will be apparent from and elucidated with reference to the embodiments described hereinafter.
Brief Description of the Drawings
Embodiments of the invention are disclosed hereinafter with reference to the drawings, in which:
Figure 1 illustrates a schematic diagram of a network scenario where user-interface (UI) processing software in accordance with embodiments of the invention is deployed in a plurality of digital devices;
Figures 2 to 9 are different embodiments showing how the UI processing software as deployed in Figure 1 functions;
Figures 10 to 17 are flow diagrams illustrating the corresponding schemes adopted by the different embodiments of the UI processing software depicted in Figures 2 to 9; and Figure 18 is a schematic block diagram of an apparatus according to an embodiment of the invention.
Detailed Description
Figure 1 illustrates a schematic diagram 100 of an exemplary network scenario where user-interface (UI) processing software in accordance with embodiments of the invention, is installed and deployed in a plurality of digital devices. The UI processing software may be known as "Natural Input Technology (NIT)". The digital devices are preferably personal communication devices such as a laptop 102, a Tablet PC 104, a mobile phone 106, an iMac® 108, a smartphone 110 and an
IBM®-compatible desktop personal computer (PC) 112. In addition, these digital devices are installed with an operating system (OS) such as Microsoft Windows®, Apple Mac® OS X, UNIX®, GNU/Linux®, BSD systems (FreeBSD, NetBSD or OpenBSD), Google Android®, Symbian® or Google Chrome® OS, which includes various software components and/or drivers for controlling and managing real-time system tasks (e.g. memory management, storage device control, power management and the like) and facilitating intercommunications between various hardware and software components of the digital devices. Not limited to the above, the UI processing software may however, also be installed in any modern, multi-functional digital devices that are pre-configured to have a user interface, such as a WebTV, a Global Positioning System (GPS) navigation system, a touch screen refrigerator or an in-car entertainment system (all not shown). Such multi-functional digital devices are typically installed with an embedded real-time OS (e.g. RT-LINUX®, MicroC/OS-II, QNX, TRON, Windows CE or Vx Works) which generally contains most of the important functionalities found in conventional desktop-based operating systems.
The digital devices are shown in Figure 1 to be interconnected through a
communication network 114 via communication links 116, but the devices may also be connected directly to each other. The communication network 114 may be a local area network (LAN), a personal area network (PAN), a cellular-based network or the Internet. Accordingly, the type of communication link 116 used by the respective digital devices for connecting to the network 114 or to each other depends on the configuration of the network 114 and the type of digital device (i.e. mobile or deskbound). In one embodiment, the communication link 116 is preferably established wirelessly, using a communication protocol such as
Bluetooth®, wireless universal-serial-bus (WUSB), Wi-Fi (Wireless Fidelity), WiMax (Worldwide Interoperability for Microwave Access), Cellular technologies (e.g. GSM, UMTS, HSPA+ or LTE Advanced) or any equivalents known to persons skilled in the art. Alternatively, the communication link 116 may also be established via conventional wired means (e.g. twisted pair wire, coaxial cable or optical fibre cable).
Moreover, the UI processing software may also be installed and used in standalone digital devices (i.e. not connected to any other digital devices or to the
communication network 114). In addition, the technology of the UI processing software may be included as a specific component in any of the afore-described operating systems, possibly as, but not limited to, a special type of customized dialog window using the technology already included as part of the operating system, or may be included in the software program that uses the technology.
Figure 2 shows a first embodiment 200 of the UI processing software of the invention. As illustrated, Figure 2 depicts a display screen 202 of any one of the afore-described digital devices, in which a first dialog window 204 and a second dialog window 206 are processed with respect to each other in different transition states according to the elapse of time. The transit of time is represented by the arrow 208. It is to be appreciated that the dialog window may also equivalently be termed a "dialog box". In a first state (i.e. the diagram on the left of arrow 208), the first dialog window 204 belonging to first software (or program) is initially generated and operated by a user (not shown) of the digital device, which is preloaded with and runs a preferred operating system from the list described above. The operating system is generally pre-configured and installed with software agents known as daemons, which typically run as background processes and are not under the direct control of the user. As known to skilled persons, daemons are event-driven/event-based. They serve the function of monitoring and responding to both hardware and software requests/interrupt events, such as configuring hardware, running scheduled tasks or simply polling for a change in the status of software variables corresponding to the triggering of events when certain pre-determined conditions are fulfilled.
Subsequent to the generation of the first dialog window 204, the second dialog window 206 is generated by second software different from that responsible for the generation of the first dialog window 204. For example, the first software may be a word processing program (e.g. Microsoft Word®) and the second software may be a phone-call receiving program. The first dialog window 204 and second dialog window 206 may, however, be generated by the same software. The second dialog window 206 may either be modal or modeless. Note that this characteristic applies also to the subsequent second dialog windows to be described in the following embodiments. Generally, the generation of the second dialog window 206 is a result of a specific daemon detecting a new interrupt event, such as an incoming phone call, being received in the digital device (e.g. the smartphone 110). When that happens, the request is directed to the relevant software, such as the phone-call receiving program, which then responds accordingly by alerting the user to the new event by generating the second dialog window 206 which may query, for example whether the user wishes to accept the incoming call while he is simultaneously working on a document in the word processing program.
In the first embodiment 200, when the second dialog window 206 is initially generated and displayed, the buttons 210a (e.g. "OK" and "Cancel") of the second dialog window 206 are initially inactivated for a predetermined time period, as shown by the dotted boxes representing the buttons 210a in Figure 2. The predetermined time period may be defined according to a factory configuration of the digital device or through a control settings applet. The predetermined time period may, for example, be defined in the range of one millisecond to thirty seconds. Furthermore, together with the generation of the second dialog window 206, an alert (not shown) may also optionally be generated, particularly if the digital device is the mobile phone 106 or smartphone 110. The alert may be audible-based (e.g. the ringing sound of the device or a simple chime), vibration-based or a combination.
Alternatively, the predetermined time period is adjustable by the user, for example being made available and accessible through a control settings applet, in the form of a slider bar where the selectable timings are clearly indicated. The inactivated buttons 210a are then switched to activated buttons 210b, shown in the second state (i.e. the diagram on the right of arrow 208), after a signal is received. In this case, the signal corresponds to the expiry of the predetermined time period. Thus, the invention ensures that the user is unlikely to accidentally perform an unintentional action by inadvertently acting on a request relating to the second dialog window 206 (e.g. rejecting an incoming call by pressing the "Cancel" button) which appears suddenly in the foreground of the UI on the display screen 202 while he is simultaneously operating the first dialog window 204.
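To make the timing behaviour of this first embodiment concrete, the following sketch shows one possible way to reproduce it on a desktop machine. Python with the standard tkinter toolkit is assumed purely for illustration (the embodiment itself is tied to no particular language or toolkit), and the window text, the three-second ACTIVATION_DELAY_MS value and the printed messages are illustrative placeholders:

```python
import tkinter as tk

# Illustrative value for the "predetermined time period" (3 seconds here).
ACTIVATION_DELAY_MS = 3000

def show_second_dialog(root):
    """Second dialog window whose buttons start inactivated (cf. buttons 210a)."""
    dialog = tk.Toplevel(root)
    dialog.title("Incoming call")
    tk.Label(dialog, text="Accept the incoming call?").pack(padx=20, pady=10)

    ok = tk.Button(dialog, text="OK", state=tk.DISABLED,
                   command=lambda: print("Call accepted"))
    cancel = tk.Button(dialog, text="Cancel", state=tk.DISABLED,
                       command=dialog.destroy)
    ok.pack(side=tk.LEFT, padx=10, pady=10)
    cancel.pack(side=tk.RIGHT, padx=10, pady=10)

    def activate_buttons():
        # The "signal" here is simply the expiry of the predetermined period,
        # turning the inactivated buttons into the activated buttons (cf. 210b).
        ok.config(state=tk.NORMAL)
        cancel.config(state=tk.NORMAL)

    dialog.after(ACTIVATION_DELAY_MS, activate_buttons)

if __name__ == "__main__":
    root = tk.Tk()                      # stands in for the first dialog window
    root.title("First dialog window")
    tk.Label(root, text="Document being edited...").pack(padx=30, pady=30)
    # Simulate the daemon detecting an interrupt event one second after start-up.
    root.after(1000, lambda: show_second_dialog(root))
    root.mainloop()
```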
In a modification of the first embodiment, instead of showing the second dialog window 206 with inactivated buttons 210a when it is initially generated, each of the buttons 210a is replaced by a key combination/sequence button that requires the user to provide a corresponding input through the keyboard or keypad (physical or digital-based) of the digital device. Thus, this particular embodiment is considered more appropriate for devices such as the laptop 102, iMac® 108, mobile phone 106, IBM®-compatible desktop personal computer (PC) 112 or the Tablet PC 104. Thereafter, the user supplies a specific key combination (e.g. a combination of the "E" plus "N" keys, or successive pressing of the keys "E" and "N", to activate the "OK" button; or the "C" plus "A" keys, or successive pressing of the keys "C" and "A", to activate the "Cancel" button of the second dialog window 206) in order to execute the command corresponding to the original button 210a. The password or input key combination may either be predetermined according to a factory configuration or subsequently defined by the user through a control settings applet.
The respective key combination/sequence buttons replacing the buttons 210a may also be converted to the normal activation buttons 210b after the expiry of a predetermined time period. It is also to be appreciated that if this modification is applied to a digital device which is the mobile phone 106, then each of the normal activation buttons 210b corresponds to, and can be activated by, a pre-specified button of the keypad on the mobile phone 106.
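By way of illustration only, the key combination/sequence activation described above may be sketched in Python as follows; the class and identifiers (e.g. KeySequenceButton) are assumptions made for this sketch and are not part of the specification.

```python
# Minimal sketch (assumed names, not from the specification): a button whose
# command only fires after the user types a specific key sequence, e.g. "E"
# then "N" for OK, or "C" then "A" for Cancel.

class KeySequenceButton:
    def __init__(self, label, key_sequence, command):
        self.label = label
        self.key_sequence = [k.upper() for k in key_sequence]
        self.command = command
        self._progress = 0          # how many keys of the sequence were matched

    def feed_key(self, key):
        """Feed one keypress from the keyboard/keypad driver."""
        if key.upper() == self.key_sequence[self._progress]:
            self._progress += 1
            if self._progress == len(self.key_sequence):
                self._progress = 0
                self.command()      # full sequence received: run the original action
        else:
            self._progress = 0      # wrong key: start over


ok = KeySequenceButton("OK", ["E", "N"], lambda: print("call accepted"))
cancel = KeySequenceButton("Cancel", ["C", "A"], lambda: print("call rejected"))

for pressed in ["E", "X", "C", "A"]:   # simulated keypresses; only "C","A" completes
    ok.feed_key(pressed)
    cancel.feed_key(pressed)
```

In practice the feed_key method would be driven by the keyboard or keypad handler of the digital device rather than by the simulated keypresses shown above.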
Moreover, in another modification of the first embodiment, the inactivated buttons 210a are each replaced by a text input box (not shown) which displays a predetermined string of text on one side (e.g. at the top); in order to execute the command corresponding to that particular button, the user is required to type the displayed string of text into the text input box. The technique of this embodiment is analogous to how a software CAPTCHA works, as commonly known in the art.
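A minimal sketch of such a text-challenge button, using hypothetical names such as TextChallengeButton, might look as follows; it is illustrative only and does not reproduce any particular CAPTCHA implementation.

```python
# Illustrative sketch only (names assumed): each button is replaced by a small
# text challenge; the command runs only when the displayed string is typed back.

import random
import string

class TextChallengeButton:
    def __init__(self, label, command, length=4):
        self.label = label
        self.command = command
        # the predetermined string shown next to the input box
        self.challenge = "".join(random.choices(string.ascii_uppercase, k=length))

    def submit(self, typed_text):
        if typed_text.strip().upper() == self.challenge:
            self.command()
            return True
        return False


ok = TextChallengeButton("OK", lambda: print("command executed"))
print("Type", ok.challenge, "to confirm")
ok.submit(ok.challenge)        # correct text -> command executed
ok.submit("WRONG")             # ignored
```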
Figure 3 shows a second embodiment 300 of the UI processing software of the invention. In particular, Figure 3 shows a display screen 302, in which a first dialog window 304 and a second dialog window 306 are processed with respect to each other in different transition states according to different time intervals. The passage of time is represented by the arrow 308. The second embodiment 300 is largely similar to the first embodiment 200 of Figure 2, except that when the second dialog window 306 (corresponding to an interrupt event detected by the daemon) is generated and presented to the user in the foreground together with the first dialog window 304 (the diagram on the left of arrow 308), the buttons 310 are not displayed at all (i.e. not visible to the user), as opposed to Figure 2 where the buttons 210a are merely inactivated. As in the first embodiment 200, the buttons 310 of the second dialog window 306 are not shown to the user for a predetermined time period, until a signal corresponding to the expiry of the time period is received (i.e. the diagram on the right of arrow 308).
Figure 4 shows a third embodiment 400 of the UI processing software of the invention, in which the processing of a first dialog window 402 and a second dialog window 404 with respect to each other in different time intervals on a display screen 406 is illustrated in different transition states. Similarly, the passage of time is represented by the arrow 408. In the first state (i.e. the diagram on the left of arrow 408), when a new interrupt event is detected after the generation of the first dialog window 402, the second dialog window 404 (generated corresponding to a programmed response on event detection) is not displayed in the foreground, in contrast to the previous two embodiments 200, 300 in Figures 2 and 3. Rather, a notification window 410 is displayed to the user. Optionally, an alert 412 may also be generated together with the notification window 410. The alert 412 may be audible-based (e.g. a chime, or the ringing sound of the mobile phone 106 if the interrupt event is triggered by an incoming call), vibration-based, visual-based (e.g. flashing lights from buttons of the digital device) or a combination.
The notification window 410 may comprise an image (e.g. an exclamation mark or a mail-envelope) which would serve to highlight to the user that a corresponding interrupt event has been detected and requests his separate attention from what he is currently working on in the first dialog window 402. The image may also alternatively be pre-configured to be associated with the type of event being detected. Moreover, the image to be used on the notification window 410 may also be configurable by the user according to his preferences, from a control settings applet. Optionally, the notification window 410 may display a message alerting the user to the interrupt event. The notification window 410 is displayed to the user for a predetermined time period, which may be defined according to a factory configuration of the digital device or through a control settings applet, as described above. Alternatively, the user may dismiss the notification window 410 by pressing the close-icon 414 if he does not wish to wait for the expiry of the predetermined time period.
After the time period has expired (i.e. a corresponding signal is received), the notification window 410 is dismissed and the second dialog window 404, with enabled buttons 416, is then displayed together with the first dialog window 402 to the user as shown in the second state (i.e. the diagram on the right of arrow 408) of Figure 4.
Figure 5 shows again, in different transition states, the processing of a first dialog window 502 and a second dialog window 504 with respect to each other in different time intervals on a display screen 506, the passage of time being represented by the arrow 508. A fourth embodiment 500 of the UI processing software in Figure 5 functions largely in the same manner as the third embodiment of Figure 4, except for one difference. For the sake of brevity, workings of the fourth embodiment 500 which are similar to those of the third embodiment 400 described in Figure 4 are therefore not explained again. In comparison with the third embodiment 400, the fourth embodiment 500 in the first state (i.e. the diagram on the left of arrow 508) only generates an alert 510 and waits for a predetermined time period. Consequently, when the time period has expired (i.e. a corresponding signal is received), the second dialog window 504, with enabled buttons 512, is thereafter displayed alongside the first dialog window 502, as clearly shown in the second state (i.e. the diagram on the right of arrow 508) of Figure 5.
It is also to be further appreciated that in any of the afore-described embodiments, if the UI processing software is used on a digital device such as the mobile phone 106 or any other device equipped with a physical keypad, the keypad buttons which are mapped to the respective enabled/activated buttons 210b, 310, 416, 512 of the second dialog windows 206, 306, 404, 504 likewise only become active after the time period has expired. This means that during the time period, the UI processing software or operating system ignores any signals received as a result of pressing the related keypad buttons, or the related buttons are simply inactivated briefly (i.e. signal generation is disabled) for this purpose.
Further, Figure 6 shows a fifth embodiment 600 of the UI processing software, where the processing of a first dialog window 602 and a second dialog window 604 with respect to each other on a display screen 606 is depicted in different transition states according to the elapse of time, as represented by the arrow 608. In the first state (i.e. the diagram on the left of arrow 608), when a new interrupt event is detected after the generation of the first dialog window 602, a user response dialog window 610 is shown to the user. The user response dialog window 610 requires the user to provide either a password or a specific input keys combination (e.g. a combination of the "A" plus "B" keys or a successive pressing of the keys "A" and "B") to the operating system in order to bring the second dialog window 604 into the display foreground. The password or input keys combination may either be predetermined according to a factory configuration or subsequently defined by the user through a control settings applet. When the correct password/input keys combination is received and successfully authenticated by the operating system (i.e. a corresponding signal is received), the user response dialog window 610 disappears and the second dialog window 604 (together with the buttons 612, which may be pressed immediately) is then displayed to the user as shown in the second state (i.e. the diagram on the right of arrow 608).
Figure 7 shows a sixth embodiment 700 of the UI processing software, where the processing in different transition states of a first dialog window 702 and a second dialog window 704 with respect to one another on a display screen 706 is illustrated. The elapse of time is indicated by the arrow 708. In the first state (i.e. the diagram on the left of arrow 708), on receiving a new interrupt event after the first dialog window 702 has been generated and in use by the user, an unlock dialog window 710 is then displayed on a designated area on the display screen 706. The unlock dialog window 710 may comprise a slider 712 and an associated gesture-cue indicator 714. The display of the gesture-cue indicator 714 may also optionally be omitted according to another alternative embodiment.
In order to bring up the display of the second dialog window 704, the user needs to actuate the slider 712 by providing a gesture, i.e. by sliding the slider 712 towards the extreme right side. Specifically, the gesture-cue indicator 714 provides an indication of how the slider 712 is activated should the user not be knowledgeable or well-informed on the use of such a feature. The actuation of the slider 712 may be performed by using fingers or a stylus if the digital device is touch-screen based such as the Tablet PC 104 or other input devices (e.g. a mouse) if the digital device is a conventional machine such as the IBM®-compatible desktop personal computer (PC) 112. Alternatively, other types of response mechanisms may be used in place of the slider 712, such as a circular pad or a diagonal slider (all not shown).
The type of gesture to be provided by the user may differ; for example, a sweeping gesture or a circular gesture may be required. This is correspondingly reflected in the gesture-cue indicator 714 shown in the unlock dialog window 710. Furthermore, the gesture-cue indicator 714 may also be located outside, but in proximity to, the unlock dialog window 710. When the user correctly actuates the slider 712 (i.e. a corresponding signal is received), the unlock dialog window 710 disappears and the second dialog window 704, with response buttons 716, is then displayed (refer to the diagram on the right of arrow 708). In addition, the sixth embodiment 700 is also applicable for use in the embodiments 200, 300 as described in Figures 2 and 3 respectively, where the unlock dialog window 710 then replaces the wait for the predetermined time period.
Figure 8 shows seventh and eighth embodiments 800a, 800b of the UI processing software, where the second dialog window 802 appears with respect to the first dialog window 804 without any time delay on the display screen 806 upon detection of an interrupt event. Either one of the embodiments 800a, 800b may be adopted depending on the specific implementation of the UI processing software.
Importantly, instead of conventional buttons, sliders 808a, 808b corresponding to similar key functions (e.g. "OK" and "Cancel") are used. More specifically, the seventh embodiment 800a uses two sliders 808a, each in place of a conventional button, whereas the eighth embodiment 800b utilizes only one slider 808b to represent the typical buttons. Particular instructions on how to activate the sliders 808a, 808b may be provided through a message box 810 located in proximity to the sliders 808a, 808b (e.g. at the top as shown). An example of the instructions contained in the message box 810 in this case may be: "Please drag the left slider to indicate an affirmative answer or the right slider to indicate a negative answer".
Alternatively, the message box 810 may not be provided as the user is able to intuitively figure out how to operate the sliders 808a, 808b. In addition, graphics, animations or other optional means as known to the skilled person may also be used to instruct the user on how to operate the sliders 808a, 808b.
Alternatively, other types of response mechanisms may be used in place of the sliders 808a, 808b, such as circular pads or diagonal sliders (all not shown).
Figure 9 shows a ninth embodiment 900, which is more applicable to touch-screen-based digital devices such as the smartphone 110, Tablet PC 104 or some laptops 102. The ninth embodiment 900 is considered a variation of the embodiments in Figure 8, where the second dialog window 902 also appears with respect to the first dialog window 904 without any time delay on the display screen 906 upon detection of an interrupt event. The difference between the ninth embodiment and those of Figure 8 is that the sliders 808a, 808b of Figure 8 are now replaced by rectangular response areas 908a, 908b as shown, in which sliding/sweeping gestures are to be provided by the user, either using a stylus or his fingers. A message box 910 which displays the type of interrupt event may also optionally be displayed (e.g. "An incoming call from Norton"). The response areas 908a, 908b may be located proximal to each other. The response areas may be rectangular as shown or of a different shape (e.g. circular).
In addition, the gesture required for each of the rectangular response areas 908a, 908b is pre-programmed to be distinctly different so that the user is unlikely to mistakenly trigger any one of them (e.g. a right sliding gesture is required to trigger the "OK" command while a left sliding gesture is required for the "Cancel" command). The type of gesture required may also be indicated through an animation or image, which is preferably displayed within the response areas 908a, 908b. However, such images may alternatively be provided in proximity to the response areas 908a, 908b. The main differences between the embodiments in Figures 8 and 9 and that in Figure 7 are that no unlock dialog window 710 is displayed to accept a correct response before the second dialog window is shown to the user, and that the buttons in the second dialog window are replaced by sliders or response areas.
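The gesture-distinguishing behaviour of the response areas 908a, 908b may be sketched in Python as below; SwipeArea and the pixel threshold are assumptions made for illustration, not limitations of the embodiment.

```python
# Sketch under assumed names: a rectangular response area that accepts a swipe
# trace (a list of x,y points) and fires its command only when the swipe goes
# far enough in the direction configured for that area.

class SwipeArea:
    def __init__(self, label, direction, command, min_distance=80):
        self.label = label
        self.direction = direction          # "right" or "left"
        self.command = command
        self.min_distance = min_distance    # pixels the finger/stylus must travel

    def handle_trace(self, points):
        dx = points[-1][0] - points[0][0]
        if self.direction == "right" and dx >= self.min_distance:
            self.command()
        elif self.direction == "left" and dx <= -self.min_distance:
            self.command()
        # any other trace (e.g. a tap or a short drag) is ignored


accept = SwipeArea("OK", "right", lambda: print("call accepted"))
reject = SwipeArea("Cancel", "left", lambda: print("call rejected"))

tap = [(100, 50), (102, 51)]                   # accidental tap: ignored
swipe_right = [(40, 50), (90, 52), (150, 55)]  # deliberate gesture: accepted
accept.handle_trace(tap)
accept.handle_trace(swipe_right)
```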
Additionally, in a tenth embodiment (not shown), when a new interrupt event is detected, the user is notified by a balloon message or a status icon appearing in a designated notification area of the operating system, for example the notification area of the Windows taskbar in Microsoft Windows®. A generated dialog window with enabled response buttons corresponding to the detected event is then displayed only if the user specifically clicks on the related balloon message or status icon. Optionally, the buttons may initially be inactivated for a certain time interval, after which they become active. More importantly, it is to be appreciated that this embodiment is particularly suitable for implementation in a digital device installed with an operating system with relatively complex functionalities, such as those in the Tablet PC 104, smartphone 110, laptop 102, iMac® 108 or IBM®-compatible PC 112.
One variation of the tenth embodiment (not shown) is that the balloon message that appears in the notification area contains a text message and buttons of a typical dialog window, such as any one of those described in the foregoing embodiments. However, the buttons are initially disabled or not displayed until the expiry of a predetermined time period, after which they are activated or made visible to the user. Therefore, instead of displaying a dialog window with a text message and inactivated buttons (as in the embodiment of Figure 2) or without buttons (as in the embodiment of Figure 3) for a predefined time period, in this particular embodiment the text and buttons are displayed in a notification interface, such as a balloon message that appears in the notification area of the operating system. The predefined time period may range in this case from 0 seconds to 30 seconds; the 0 seconds option is used when the buttons are intended to be operational and to appear immediately, without delay, at the same time as the balloon message. Optionally, the buttons contained in the balloon message are accessible via a computer mouse or other like input methods (e.g. haptic-based) and not through the keyboards typically used by personal computers or laptops. Moreover, when the balloon message initially appears, an alert may also optionally be generated; the alert may be audible-based, vibration-based, visual-based (e.g. flashing lights from buttons of the digital device) or a combination. This variation of the tenth embodiment is more appropriately used on digital devices such as the Tablet PC 104, smartphone 110, laptop 102, iMac® 108 and the IBM®-compatible PC 112.
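A simplified sketch of this variation, with an assumed BalloonMessage class and a 0-30 second delay check, is given below; a real notification area (e.g. a taskbar) would of course be supplied by the operating system.

```python
# Minimal sketch (assumed names): a notification-area balloon whose buttons are
# created disabled and become clickable only after a configurable delay of
# 0-30 seconds; 0 means the buttons work immediately.

import time

class BalloonMessage:
    def __init__(self, text, buttons, delay_seconds=3):
        if not 0 <= delay_seconds <= 30:
            raise ValueError("delay must be between 0 and 30 seconds")
        self.text = text
        self.buttons = buttons                  # e.g. {"OK": callback, ...}
        self.enabled_at = time.monotonic() + delay_seconds

    def click(self, label):
        if time.monotonic() < self.enabled_at:
            return False                        # too early: the click is ignored
        self.buttons[label]()
        return True


balloon = BalloonMessage("Incoming call from Norton",
                         {"OK": lambda: print("accepted"),
                          "Cancel": lambda: print("rejected")},
                         delay_seconds=1)
print(balloon.click("OK"))      # False: still inside the delay window
time.sleep(1.1)
print(balloon.click("OK"))      # True: delay expired, command runs
```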
According to an eleventh embodiment (also not shown), a second dialog window corresponding to an interrupt event is generated and immediately displayed, overlaying the other prior generated windows on the desktop of the operating system. However, the second dialog window is generated such that it is not selected or not active (i.e. it lacks the current window focus). Therefore, if the keys of a keyboard/keypad that correspond to the buttons on the second dialog window are depressed, they are ignored by the operating system until the second dialog window is explicitly selected by the user and has the current window focus. It is envisaged that this specific embodiment may not protect against clicks made using a computer mouse (e.g. clicks made in the area where the buttons of the unfocused window are located will inevitably lead to pressing of those buttons), but might protect against unwanted or unintended commands given by the user through the keyboard/keypad. This embodiment is considered to be more applicable to digital devices with keyboards/keypads, including those which also have a touch-screen, such as the Nokia® N97 mobile phone.
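The focus-dependent filtering of keyboard input may be illustrated by the following sketch; NoFocusDialog and its methods are assumed names, and a real operating system would perform the focus check in its input routing rather than in application code.

```python
# Sketch with assumed names: the second dialog is created without window focus,
# so keyboard shortcuts mapped to its buttons are dropped until the user
# explicitly selects the window; mouse clicks are deliberately not blocked.

class NoFocusDialog:
    def __init__(self, key_bindings):
        self.key_bindings = key_bindings    # e.g. {"ENTER": ok_cb, "ESC": cancel_cb}
        self.has_focus = False              # created without the current focus

    def give_focus(self):                   # user explicitly selects the window
        self.has_focus = True

    def on_key(self, key):
        if not self.has_focus:
            return                          # ignore keys while unfocused
        callback = self.key_bindings.get(key)
        if callback:
            callback()

    def on_click(self, button_callback):
        button_callback()                   # mouse clicks are not protected here


dialog = NoFocusDialog({"ENTER": lambda: print("accepted"),
                        "ESC": lambda: print("rejected")})
dialog.on_key("ENTER")      # ignored: the window was never focused
dialog.give_focus()
dialog.on_key("ENTER")      # now handled -> "accepted"
```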
In a twelfth embodiment (not shown) of the UI processing software, the second dialog window appears, together with activated buttons, without any time delay. Consequent to triggering one of the buttons, a confirmation window appears which queries the user to confirm whether he really wishes to proceed with the indicated action. When the user confirms by submitting an affirmative response through the confirmation window, the specific command is then promptly executed. Otherwise, the indicated action is cancelled and the system returns to a previous state. Hence, the advantage of this embodiment lies in that, if the user accidentally clicks a button or presses a key on the keyboard which acts on a button of the second dialog window, he would still be able to remediate this subsequently by cancelling the action through the confirmation window. The confirmation window can optionally be positioned on a different area of the display screen of the digital device, away from where the second dialog window is displayed. This prevents the user from performing a quick succession of double-clicks, or clicks in two spots close together, which might otherwise act on buttons in both the second dialog window and the confirmation window.
Alternatively, a predetermined time period may be defined so that there is a visible time lag from when the buttons in the second dialog window are clicked until the confirmation window is displayed. Furthermore, during this time period, the second dialog window may either briefly disappear or continue to be displayed, but with inactivated buttons. This particular embodiment is considered more suitable for digital devices running substantially complex operating systems (e.g. the IBM®-compatible PC 112, iMac® 108 or laptop 102), touch-screen based devices or phones with keypads.
For phones with a keypad, the buttons of the dialog windows correspond to specific keys. When the buttons in the dialog window are inactive, pressing the corresponding keys on the keypad does not trigger any action.
It is also to be further highlighted that, according to any of the afore-described embodiments, the first dialog window may also be the main window of a program (e.g. Adobe Acrobat®) or the desktop user-interface of an operating system installed on any one of the digital devices, and the second dialog window is typically a window which comprises buttons.
Figure 10 is a flow diagram outlining the steps of a scheme 1000 in accordance with which the first embodiment 200 in Figure 2 is operated. In a step 1002, the UI processing software monitors to detect new interrupt events generated in the background, the detection being performed via a specific daemon as preconfigured in the operating system of the associated digital device. Upon detection of a new event, the scheme 1000 then inactivates the buttons 210a of the second dialog window 206 in a step 1004, the second dialog window 206 being generated by a program corresponding to the new event. The buttons 210a are to be inactivated for a predetermined time period, which is either defined according to a factory configuration or adjustable by the user. Next, the second dialog window 206 is displayed to the user, but with inactivated buttons 210a, in a step 1006. Consequent to the expiry of the predetermined time period (i.e. receipt of a corresponding signal), the buttons are then activated as buttons 210b in a step 1008, so that the user may now appropriately respond to the event through the second dialog window 206.
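As a concrete illustration of scheme 1000, the following sketch uses Python's tkinter toolkit (chosen here purely for illustration; the scheme is not limited to any particular toolkit) to create a dialog whose buttons start disabled and are activated when the predetermined period expires.

```python
# Runnable sketch of scheme 1000 using tkinter: the dialog's buttons start
# disabled and are switched to normal once the predetermined period expires.

import tkinter as tk

DELAY_MS = 3000                     # predetermined period; factory- or user-set

def accept():
    print("call accepted")

def reject():
    print("call rejected")

root = tk.Tk()
root.title("Incoming call")
tk.Label(root, text="Accept the incoming call?").pack(padx=20, pady=10)

ok = tk.Button(root, text="OK", command=accept, state=tk.DISABLED)
cancel = tk.Button(root, text="Cancel", command=reject, state=tk.DISABLED)
ok.pack(side=tk.LEFT, padx=10, pady=10)
cancel.pack(side=tk.RIGHT, padx=10, pady=10)

def activate_buttons():             # the "signal": expiry of the time period
    ok.config(state=tk.NORMAL)
    cancel.config(state=tk.NORMAL)

root.after(DELAY_MS, activate_buttons)
root.mainloop()
```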
Another flow diagram in Figure 11 outlines the steps of a scheme 1100, which is adopted by the second embodiment 300 in Figure 3. In a step 1102, the UI processing software similarly monitors for and detects new interrupt events generated in the background, the detection being performed via a specific daemon as preconfigured in the operating system of the associated digital device. Upon detection of a new event, the scheme 1100 does not show the buttons 310 of the second dialog window 306 (which is generated corresponding to the new event) for a predetermined time period in a step 1104. The predetermined time period is either defined according to a factory configuration or adjustable by the user. In a next step 1106, the second dialog window 306 is then displayed to the user. When the predetermined time period expires (i.e. on receipt of an associated signal), the buttons 310 are then shown and displayed to the user in a final step 1108.
In a further embodiment shown in Figure 12, a flow diagram outlines the steps of a scheme 1200, in accordance with which the third embodiment 400 in Figure 4 is operated. In a step 1202, the UI processing software monitors and detects new interrupt events generated in the background. The detection is preferably performed via a specific daemon as preconfigured in the operating system installed on the digital device. When a new event is detected, the scheme 1200 delays the display of the second dialog window 404 in a step 1204. Thereafter in a step 1206, an alert 412 is optionally generated in the digital device. The alert 412 may be audible-based, visual-based, vibration-based or a combination. In a subsequent step 1208, a notification window 410 is then displayed to the user for alerting him that a new event has been received. The notification window 410 is displayed for a predetermined time period according to a step 1210, which either is preconfigured in the factory or is adjustable by the user. Finally, in a step 1212, the notification window 410 is dismissed and the second dialog window 404 (complete with response buttons 416) is then displayed to the user when an associated signal is received. The signal is generated either when the predetermined time period expires or when the user dismisses the notification window 410 by pressing the close-icon 414.
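Scheme 1200 may be sketched as the following event flow; the names NotificationFlow, show_notification and show_dialog are assumptions standing in for the actual display routines of the operating system.

```python
# Sketch of scheme 1200 with assumed names: on an interrupt event the dialog is
# withheld, a notification window is shown instead, and the dialog (with active
# buttons) appears only when the period expires or the user closes the notice.

import threading

class NotificationFlow:
    def __init__(self, show_notification, show_dialog, period_seconds=5.0):
        self.show_notification = show_notification
        self.show_dialog = show_dialog
        self.period_seconds = period_seconds
        self._dismissed = threading.Event()

    def on_interrupt_event(self, description):
        self.show_notification(description)            # step 1208
        # steps 1210/1212: wait for expiry OR an early dismissal by the user
        self._dismissed.wait(timeout=self.period_seconds)
        self.show_dialog(description)                  # buttons already enabled

    def close_notification(self):                      # user presses the close icon
        self._dismissed.set()                          # called from the UI thread


flow = NotificationFlow(
    show_notification=lambda d: print("NOTICE:", d),
    show_dialog=lambda d: print("DIALOG:", d, "[OK] [Cancel]"),
    period_seconds=2.0,
)
flow.on_interrupt_event("Incoming call from Norton")
```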
In another alternative embodiment shown in Figure 13, the flow diagram outlines the steps of a scheme 1300 adopted by the fourth embodiment 500 in Figure 5. In a step 1302, the UI processing software monitors and detects new interrupt events generated in the background, the detection being performed through a specific daemon as preconfigured in the operating system of the digital device operated by the user. When a new event is detected, the scheme 1300 delays the display of the second dialog window 504 in a step 1304. In a further step 1306, an alert 510 is generated in the digital device. The alert 510 may be audible-based, visual-based, vibration-based or a combination. The scheme 1300 then waits for a predetermined time period in a step 1308. The predetermined time period may be adjustable by the user or may follow a factory pre-configuration. In a step 1310, the second dialog window 504, along with the buttons 512, is then displayed to the user when the time period expires (i.e. a signal is received).
Yet another embodiment is shown in Figure 14 where the steps of a scheme 1400 are outlined in the flow diagram. The scheme 1400 is adopted by the fifth embodiment 600 in Figure 6. In a step 1402, the UI processing software monitors and detects new interrupt events generated in the background, performing the detection using a specific daemon as preconfigured in the operating system of the digital device used by the user. When a new event is detected, the scheme 1400 delays the display of the second dialog window 604 in a step 1404. Instead, a user response dialog window 610 is displayed, which requests and waits for a response from the user in a step 1406. The response may be in the form of providing a password or supplying an input keys combination to the operating system. Until the operating system successfully authenticates the provided response, the user response dialog window 610 remains. In a last step 1408, the user response dialog window 610 is dismissed and the second dialog window 604 is subsequently displayed to the user when a signal is received by the operating system. The signal in this case corresponds to the receipt and successful authentication of a response provided by the user.
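The authentication step of scheme 1400 may be sketched as follows; UnlockByPassword is an assumed name, and the hash comparison merely stands in for whatever authentication the operating system actually performs.

```python
# Sketch of scheme 1400 (names assumed): the second dialog stays hidden behind a
# response window until the password or key combination set by the user (or at
# the factory) is authenticated.

import hmac
import hashlib

class UnlockByPassword:
    def __init__(self, configured_password, show_dialog):
        # store only a digest, as a real operating system would
        self._digest = hashlib.sha256(configured_password.encode()).hexdigest()
        self.show_dialog = show_dialog
        self.pending_event = None

    def on_interrupt_event(self, description):
        self.pending_event = description
        print("Enter the unlock password/key combination to view:", description)

    def submit_response(self, attempt):
        offered = hashlib.sha256(attempt.encode()).hexdigest()
        if self.pending_event and hmac.compare_digest(offered, self._digest):
            self.show_dialog(self.pending_event)       # authentication succeeded
            self.pending_event = None
            return True
        return False                                    # response window remains


ui = UnlockByPassword("AB", show_dialog=lambda d: print("DIALOG:", d, "[OK] [Cancel]"))
ui.on_interrupt_event("Incoming call")
ui.submit_response("XY")    # wrong: ignored, the response window stays
ui.submit_response("AB")    # correct: the second dialog window is displayed
```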
In an alternative embodiment shown in Figure 15, where the steps of a scheme 1500 are outlined in the flow diagram, in a step 1502 the UI processing software constantly monitors the background to detect new interrupt events. The detection is carried out using a specific daemon as preconfigured in the operating system of the digital device used by the user. When a new event is detected, display of the second dialog window 704 is delayed in a step 1504. This is then followed by the display of an unlock dialog window 710 in a step 1506, which waits for an appropriate response from the user. The type of response required from the user is the same as described above with reference to Figure 7 and is not repeated here. When the correct response is eventually received by the operating system (i.e. the corresponding signal is generated), the unlock dialog window 710 disappears and the second dialog window 704 is consequently displayed in a step 1508.
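A sketch of scheme 1500 with an assumed UnlockSlider class is given below; the travel threshold of 90% is an illustrative choice, not a feature of the embodiment.

```python
# Sketch of scheme 1500 (names assumed): an unlock slider must be dragged close
# to its right-hand end before the withheld second dialog is displayed.

class UnlockSlider:
    def __init__(self, track_length, show_dialog, threshold=0.9):
        self.track_length = track_length        # pixels of travel available
        self.threshold = threshold              # fraction of travel required
        self.show_dialog = show_dialog
        self.pending_event = None

    def on_interrupt_event(self, description):
        self.pending_event = description        # step 1504: dialog delayed
        print("Slide to answer:", description)  # step 1506: unlock window shown

    def on_release(self, position):
        """Called when the finger/stylus/mouse releases the slider knob."""
        if self.pending_event and position >= self.threshold * self.track_length:
            self.show_dialog(self.pending_event)            # step 1508
            self.pending_event = None
        # otherwise the knob snaps back and the unlock window stays


slider = UnlockSlider(200, show_dialog=lambda d: print("DIALOG:", d, "[OK] [Cancel]"))
slider.on_interrupt_event("Incoming call")
slider.on_release(60)       # half-hearted drag: nothing happens
slider.on_release(195)      # dragged to the far right: dialog appears
```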
The embodiment in Figure 16 shows the steps of another scheme 1600 outlined in the illustrated flow diagram, which is adopted by the embodiments in Figures 8 and 9. In a step 1602, the UI processing software constantly monitors the background in order to detect new interrupt events. The detection is carried out using a specific daemon as preconfigured in the operating system of a digital device in use. When a new event is detected, the second dialog window 802, 902 is displayed in a step 1604. The second dialog window contains the sliders 808a, 808b of Figure 8 or the response areas 908a, 908b of Figure 9. Further, in a step 1606, the scheme 1600 then waits for the user to submit a response. The response may consist of dragging the sliders 808a, 808b to a desired opposite end or activating the response areas 908a, 908b using an appropriate gesture, such as a sliding gesture or a sweeping gesture. Hence, when the correct response is received, the associated commands mapped to the respective sliders 808a, 808b or response areas 908a, 908b are then executed in a final step 1608.
Figure 17 shows the steps of a scheme 1700 as illustrated, in which, in a step 1702, the UI processing software monitors the background to detect the occurrence of new interrupt events. The detection is performed using a specific daemon as preconfigured in the operating system of a digital device in use. When a new event is detected, the second dialog window corresponding to the detected event is displayed in a step 1704. The second dialog window may contain conventional buttons, the sliders 808a, 808b of Figure 8 or the response areas 908a, 908b of Figure 9. When the user presses one of the buttons to indicate his response, a confirmation window is then displayed to the user in a step 1706. In the confirmation window, the user is queried to confirm whether he wishes to proceed with the indicated action as submitted. The scheme 1700 thus waits for any response to be submitted through the confirmation window in a step 1708.
Eventually, when a response is received in a step 1710, depending on the type of response as submitted, the associated commands mapped to the buttons on the confirmation window are executed, such as proceeding with the indicated action or cancelling the indicated action to return to the previous state.
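Scheme 1700 may be sketched as follows; ConfirmedDialog and its two-step press/confirm interface are assumptions made for this illustration.

```python
# Sketch of scheme 1700 (names assumed): pressing a button in the second dialog
# only opens a confirmation window; the command runs when the user confirms and
# is dropped when the user cancels, returning to the previous state.

class ConfirmedDialog:
    def __init__(self, actions):
        self.actions = actions                  # e.g. {"OK": cb, "Cancel": cb}
        self.pending_action = None

    def press_button(self, label):              # steps 1704 -> 1706
        self.pending_action = label
        print(f"Are you sure you want to '{label}'? [Yes] [No]")

    def confirm(self, yes):                     # steps 1708 -> 1710
        if self.pending_action is None:
            return
        if yes:
            self.actions[self.pending_action]() # proceed with the indicated action
        else:
            print("action cancelled, previous state restored")
        self.pending_action = None


dlg = ConfirmedDialog({"OK": lambda: print("call accepted"),
                       "Cancel": lambda: print("call rejected")})
dlg.press_button("Cancel")      # accidental press...
dlg.confirm(False)              # ...remediated through the confirmation window
dlg.press_button("OK")
dlg.confirm(True)               # intended command executed
```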
Furthermore, in another embodiment of the invention, each of the previously presented embodiments and their variations may be realized as computer readable code (i.e. programming instructions) on a computer readable storage medium. The computer readable storage medium is any data storage device that can store data which can thereafter be read by a computer system, including both transfer and non-transfer devices. Examples of the computer readable storage medium include read-only memory, random-access memory, CD-ROMs, Flash memory cards, DVDs, Blu-ray Discs, magnetic tapes, optical data storage devices, and carrier waves. The computer readable storage medium can also be distributed over network-coupled computer systems so that the computer readable code is stored and executed in a distributed fashion.
Figure 18 shows a schematic of an embodiment of an apparatus 1800 for performing the process of the invention. The apparatus 1800 comprises a number of functional modules which may be implemented in hardware, software, or as a combination of the two. In the present embodiment, the apparatus 1800 shown in Figure 18 is part of a multi-functional device such as the mobile phone 106, with a processor 1802 being controlled by a global controller (not shown) of the device.
The apparatus 1800 of the present embodiment comprises an interrupt event detector 1804, which may be coupled to components/sensors 1806 of the device, such as the signal receiver of the mobile phone 106, and monitors for the detection of an interrupt event. The monitoring may be performed continually or at predetermined polling intervals. The apparatus 1800 also comprises an interface generator 1808. The interface generator 1808 is coupled to a storage device 1810, such as a ROM, which stores predetermined interface layouts and instructions enabling the layouts to be generated by the interface generator 1808; the ROM may also store additional information which specifies the event and device operations to which the interface layouts correspond. For example, the ROM stores an interface for an interrupt event, such as an "incoming call", together with interfaces for non-interrupt events, such as the interface for generation of a text message.
On detection of an interrupt event, a signal is generated by the interrupt event detector 1804 and transferred to the processor 1802. The processor 1802 interprets the received signal and transfers a command to the interface generator 1808 to extract instructions from the ROM for generating a new interface which corresponds to the detected event.
The interface generator 1808 generates the appropriate new interface and sends a signal to the processor 1802 so that it can be provided to a display unit (not shown) and presented to a user, either separately, or together with a previously generated interface. The processor 1802 controls the provision of the interface to the display unit in one or more ways as described above. For example, the generated interface may be modified so that various components of the interface are disabled for a predetermined time, or are not made visible until a predetermined event such as a password input occurs. The processor may be coupled to a condition monitor, which determines whether the condition under which the generated interface can be displayed has occurred. If the condition has occurred, a signal is provided to the processor 1802, and the processor 1802, on receipt of the signal, controls the display of the generated interface. The condition monitor may be implemented as part of the processor 1802, which may be associated with its own in-built clock, for example, in order to determine whether a predetermined time has elapsed. The condition under which the generated interface can be displayed can be set by the user, through an input means 1812, or can be programmed according to a factory setting of the device.
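The cooperation of the modules of Figure 18 may be sketched structurally as below; all class names are assumptions, and the condition monitor is reduced to a simple elapsed-time wait for brevity.

```python
# Structural sketch of the apparatus of Figure 18 (all class names assumed): an
# event detector feeds the processor, which asks the interface generator for a
# stored layout and only releases it to the display once the condition monitor
# reports that the display condition (here, simple elapsed time) has occurred.

import time

class InterruptEventDetector:
    def __init__(self, processor):
        self.processor = processor
    def sensor_input(self, event):              # e.g. the phone's signal receiver
        self.processor.handle_event(event)

class InterfaceGenerator:
    LAYOUTS = {"incoming call": "Accept the incoming call? [OK] [Cancel]"}
    def generate(self, event):
        return self.LAYOUTS.get(event, f"Event: {event} [OK] [Cancel]")

class ConditionMonitor:
    def __init__(self, delay_seconds):
        self.delay_seconds = delay_seconds
    def wait_for_condition(self):
        time.sleep(self.delay_seconds)          # in-built clock: a fixed delay

class Processor:
    def __init__(self, generator, monitor, display):
        self.generator = generator
        self.monitor = monitor
        self.display = display
    def handle_event(self, event):
        interface = self.generator.generate(event)
        self.monitor.wait_for_condition()       # withheld until the signal occurs
        self.display(interface)


processor = Processor(InterfaceGenerator(), ConditionMonitor(0.5), print)
detector = InterruptEventDetector(processor)
detector.sensor_input("incoming call")
```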
The present invention can also be used in digital devices which use operating systems or software programs that are adapted to, are specific to, or which support 3D gestures as an input method. Furthermore, it can also be used in all digital devices which currently have touch-screen technology or will have this technology in the future, and not just the ones explicitly referred to in this application.
The technology can also be adapted to be used with digital devices which will use new methods of input that will be developed in the future. The embodiments of the technology which will be used in such cases will be relatively similar to the ones described above, being adapted to the respective input method.
Although some of the embodiments presented may appear to not have many common elements, they all serve the same purpose, namely to prevent unwanted commands from being given by a user to a digital device, and as such are undoubtedly part of the same technology.
While the invention has been illustrated and described in detail in the drawings and foregoing description, such illustration and description are to be considered illustrative or exemplary, and not restrictive; the invention is not limited to the disclosed embodiments.
Other variations to the disclosed embodiments can be understood and effected by those skilled in the art in practising the claimed invention. In the claims, the term 'comprising' does not exclude other elements or steps and the indefinite article 'a' or 'an' does not exclude a plurality. A single processor or other unit may fulfil the functions of several items recited in the claims. The mere fact that certain measures are recited in different dependent claims does not mean that a combination of these measures cannot be used to advantage.
Any reference signs in the claims should not be construed as limiting the scope of the claims.

Claims
1. A method for processing at least one new dialog window with respect to a prior generated user-interface displayed in an operating system of a digital device, the operating system being used by a user, the method comprises:
detecting an interrupt event which leads to generation of the at least one new dialog window; and
delaying or hiding the display of at least a portion of the at least one new dialog window in response to detection of the interrupt event until a signal is received.
2. A method for processing at least one new dialog window with respect to a prior generated user-interface displayed in an operating system of a digital device, the operating system being used by a user, the method comprises:
detecting an event which leads to generation and display of the at least one new dialog window; and
disabling at least a portion of the at least one new dialog window in response to detection of the interrupt event until a signal is received.
3. A method according to claim 1 or claim 2 in which the portion of the at least one new dialog window comprises activation buttons.
4. The method according to any one of the preceding claims, wherein the prior generated user-interface is one of a dialog window, the desktop of a personal computer or the main screen displayed by the operating system running on a digital device and a main window of a program executed on the operating system.
5. The method according to any one of the preceding claims, wherein the receipt of the signal corresponds to one of an expiry of a predetermined time period and a response action submitted through the operating system by the user.
6. The method according to claim 5, wherein the predetermined time period is adjustable by the user.
7. The method according to claim 5, wherein the predetermined time period is defined according to a factory configuration.
8. The method according to claim 6 or claim 7, wherein the predetermined time period is defined to be in a range of between one millisecond and thirty seconds.
9. The method according to claim 5, wherein the response action submitted by the user comprises providing at least one of a password and an input keys combination, the password and input keys combination being predetermined.
10. The method according to claim 5, wherein the response action submitted by the user comprises providing a predetermined gesture on a designated area on the digital device.
11. The method according to claim 10, wherein an image is located proximal to the designated area for indicating to the user the required gesture.
12. The method according to claim 10 or claim 11, wherein the gesture is one of a sliding gesture and a sweeping gesture.
13. The method according to any one of claims 10 to 12, wherein the designated area is on the display screen of the digital device.
14. The method according to any one of claims 1 to 3, further comprising generating an alert in response to the detection of the interrupt event.
15. The method according to claim 14, wherein the alert is at least one of audible-based, visual-based and vibration-based.
16. The method according to claim 15, wherein the audible-based alert is one of a chime and a ringing tone.
17. The method according to any one of claims 14 to 16, further comprising displaying a notification window in response to the detection of the interrupt event, wherein the notification window informs the user that the interrupt event is detected.
18. The method according to claim 17, wherein the notification window comprises a display image configurable by the user.
19. The method according to any one of claim 17 or claim 18, further providing the user with the possibility of closing the notification window that informs the user of the detected interrupt event, wherein the closing of the notification window generates the signal which determines the display of the new dialog window.
20. The method according to claim 15, wherein the visual alert consists in a balloon message or a status icon and the signal determining the display of the new dialog window is generated by a user clicking the balloon message or status icon.
21. The method according to any of the preceding claims, wherein the at least one new dialog window is displayed together with the prior generated user-interface when the signal is received.
22. A method for processing at least one new dialog window with respect to a prior generated user-interface displayed in an operating system of a digital device, the operating system being used by a user, the method comprising:
detecting an interrupt event which leads to generation of the at least one new dialog window; and
displaying the at least one new dialog window,
wherein the at least one new dialog window has the form of a balloon message or notification interface displayed on the prior generated user-interface.
23. A method according to claim 22, wherein buttons contained by the balloon message or notification interface are displayed or become active at a predetermined time after the balloon message or notification interface is displayed.
24. A method according to claim 23, wherein the predetermined time is between zero and 30 seconds after detection of the interrupt event.
25. A method for processing at least one new dialog window with respect to a prior generated user-interface displayed in an operating system of a digital device, the operating system being used by a user, the method comprises:
detecting an interrupt event which leads to generation of the at least one new dialog window; and
displaying the at least one new dialog window,
wherein the at least one new dialog window comprises at least one button configured to receive a non-click-based input.
26. A method according to claim 25, in which the display of at least a portion of the at least one new dialog window is delayed or hidden, or at least a portion of the at least one new dialog window is disabled, in response to detection of the interrupt event until a signal is received, wherein the signal is represented by the expiry of a predetermined time period which may range from one millisecond to thirty seconds.
27. A method for processing at least one new dialog window with respect to a prior generated user-interface displayed in an operating system of a digital device, the operating system being used by a user, the method comprises:
detecting an interrupt event which leads to generation of the at least one new dialog window;
displaying the at least one new dialog window,
wherein the at least one new dialog window comprises at least one button configured to receive a non-click-based input, and
replacing all the buttons configured to receive a non-click based input with activation buttons after the expiry of a predetermined time period,
wherein the predetermined time period is defined to be in a range of between one millisecond and thirty seconds.
28. The method according to any one of claims 25 to 27, wherein the at least one button configured to receive a non-click-based input is a key combination/ sequence button that requires the user to provide a corresponding input through a keyboard or keypad of the digital device.
29. The method according to any one of claims 25 to 27, wherein the at least one button configured to receive a non-click-based input consists in a text input box, which on one side displays a predetermined string of text, where in order to execute the command corresponding to that particular button, the user is required to subsequently provide the predetermined string of text into the text input box.
30. The method according to any one of claims 25 to 27, wherein the at least one button configured to receive a non-click-based input is one of a slider and a gesture-based button.
31. The method according to claim 30, wherein the gesture-based button is activated using one of a sliding gesture and a sweeping gesture.
32. The method according to one of claim 30 and claim 31, wherein an image is located proximal to the gesture-based button for indicating to the user the required gesture.
33. The method according to any of the preceding claims, wherein the at least one new dialog window is one of a modal or modeless window.
34. A method for processing at least one new dialog window with respect to a prior generated user-interface displayed in an operating system of a digital device, the operating system being used by a user, the method comprises:
detecting an interrupt event which leads to generation of the at least one new dialog window which comprises at least one button;
displaying the at least one new dialog window; and
displaying a confirmation window in response to the receipt of an input from the user provided through the at least one button.
35. A method for processing at least one new dialog window with respect to a prior generated user-interface displayed in an operating system of a digital device, the operating system being used by a user, the method comprising:
detecting an interrupt event which leads to generation of the at least one new dialog window; and
displaying the at least one new dialog window immediately, the window lacking a current window focus.
36. The method according to any of the preceding claims, wherein the interrupt event is triggered by at least one of hardware and software.
37. The method according to any one of the preceding claims, wherein the digital device is one of a mobile phone and a computing device.
38. A computer program being configured to perform a method according to any one of claims 1 to 37.
39. An apparatus for processing at least one new dialog window with respect to a prior generated user-interface displayed in an operating system of a digital device, the apparatus comprising:
an event detector for detecting an interrupt event;
an interface generator for generating at least one new dialog window in response to detection of the interrupt event; and
a processor for delaying or hiding the display of at least a portion of the at least one new dialog window until a signal is received.
40. An apparatus for processing at least one new dialog window with respect to a prior generated user-interface displayed in an operating system of a digital device, the apparatus comprising:
an event detector for detecting an interrupt event;
an interface generator for generating at least one new dialog window in response to detection of the interrupt event; and
a processor for disabling at least a portion of the at least one new dialog window until a signal is received.
41. An apparatus according to claim 39 or claim 40 in which the portion of the at least one new dialog window comprises activation buttons.
42. An apparatus according to any one of claims 39 to 41, wherein the apparatus is one of a mobile phone and a computing device.
43. An apparatus according to one of claims 39 to 42, wherein the receipt of the signal corresponds to one of an expiry of a predetermined time period and a response action submitted through the operating system by a user.
PCT/EP2011/054273 2010-12-16 2011-03-21 User interface WO2012079779A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
ROA201001358 2010-12-16
ROA201001358A RO127448A0 (en) 2010-12-16 2010-12-16 Natural communication technology

Publications (1)

Publication Number Publication Date
WO2012079779A1 true WO2012079779A1 (en) 2012-06-21

Family

ID=44021815

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2011/054273 WO2012079779A1 (en) 2010-12-16 2011-03-21 User interface

Country Status (2)

Country Link
RO (1) RO127448A0 (en)
WO (1) WO2012079779A1 (en)

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070294627A1 (en) * 2006-06-16 2007-12-20 Microsoft Corporation Suppressing Dialog Boxes

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105468221A (en) * 2014-08-18 2016-04-06 腾讯科技(深圳)有限公司 Window control method and device
CN105468221B (en) * 2014-08-18 2020-12-08 腾讯科技(深圳)有限公司 Window control method and device
US10268489B2 (en) 2016-09-20 2019-04-23 International Business Machines Corporation Adaptive successive warning message handling
CN112918251A (en) * 2019-12-06 2021-06-08 丰田自动车株式会社 Display control device, vehicle, display control method, and recording medium having program recorded thereon
CN112918251B (en) * 2019-12-06 2024-07-05 丰田自动车株式会社 Display control device, vehicle, display control method, and recording medium having program recorded thereon
CN115174504A (en) * 2022-06-07 2022-10-11 青岛海信移动通信技术股份有限公司 Interface display method, terminal equipment and storage medium
CN115174504B (en) * 2022-06-07 2024-03-15 青岛海信移动通信技术股份有限公司 Interface display method, terminal equipment and storage medium
CN115174504B8 (en) * 2022-06-07 2024-04-05 青岛海信移动通信技术有限公司 Interface display method, terminal equipment and storage medium

Also Published As

Publication number Publication date
RO127448A0 (en) 2012-05-30

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 11712205

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 11712205

Country of ref document: EP

Kind code of ref document: A1