AU2008261147A1 - Hierarchical authoring system for creating workflow for a user interface - Google Patents
- Publication number
- AU2008261147A1
- Authority
- AU
- Australia
- Prior art keywords
- screen
- user interface
- group
- transition
- state
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F9/00—Arrangements for program control, e.g. control units
- G06F9/06—Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
- G06F9/44—Arrangements for executing specific programs
- G06F9/451—Execution arrangements for user interfaces
Landscapes
- Engineering & Computer Science (AREA)
- Software Systems (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Stored Programmes (AREA)
- User Interface Of Digital Computer (AREA)
Description
S&F Ref: 849925

AUSTRALIA
PATENTS ACT 1990
COMPLETE SPECIFICATION FOR A STANDARD PATENT

Name and Address of Applicant: Canon Kabushiki Kaisha, of 30-2, Shimomaruko 3-chome, Ohta-ku, Tokyo, 146, Japan
Actual Inventor(s): Gerard Anthony Hill, Andrew John Shellshear, Michael James Webster
Address for Service: Spruson & Ferguson, St Martins Tower, Level 35, 31 Market Street, Sydney NSW 2000 (CCN 3710000177)
Invention Title: Hierarchical authoring system for creating workflow for a user interface

The following statement is a full description of this invention, including the best method of performing it known to me/us:

HIERARCHICAL AUTHORING SYSTEM FOR CREATING WORKFLOW FOR A USER INTERFACE

TECHNICAL FIELD

The current invention relates to the field of User Interface (UI) creation and, in particular, to providing a representation for User Interface Workflow that is able to be used in a graphical design environment by a designer.

BACKGROUND

The development of Graphical User Interfaces (GUIs) for complex applications is often difficult, time consuming and prone to errors. GUIs are displays, or presentations, of information (screens) that include user selectable controls, icons or menus that are able to be manipulated by a user to achieve some effect or mode of operation of a computerised application. GUIs are used in a wide variety of computerised applications including, but not limited to, word processor and spreadsheet applications typically executed on general purpose computers, specialised computerised control for manufacturing or process control, consumer devices such as portable audio/video players, and components such as televisions and DVD player/recorders, to name but a few. A GUI may form part of, or a component of, a larger GUI. For example, a word processing application may present to the user as a GUI, but nevertheless may include a number of individual GUIs that are related in some form of hierarchy of functionality within the application.
Selectable or manipulable icons within a GUI are often colloquially called "buttons". GUIs represent a class of user interfaces that are typically distinguished from otherwise "simple" user interfaces (UIs) in that, in a GUI, the controls or menus are displayed or otherwise presented and the display or presentation does not, of itself, perform the control. Hence, GUIs may dynamically alter in position, function or some other parameter. The GUI is also typically manipulated with a general purpose user input device, such as a keyboard, keypad, and/or a pointer device such as a mouse pointer or trackball. GUIs also lend themselves to implementation with touch panels, where the display and the input device are unitarily formed.

GUIs contrast with "simple" user interfaces, where each control has a fixed function and generally physically performs the function. Such is the case with a manually operable "ON/OFF" switch which actually performs an electrical switching connection, or a dial labelled "VOLUME", for example as coupled to a potentiometer which is used to adjust the amplitude of an audio signal in an amplifier. By contrast, a graphical On/Off switch which may be displayed on a computer display could be varied in size, position or colour, for example. Similarly, a graphical volume control may be represented upon a computer display as either a slider or a dial, either of which, when manipulated, may cause a sound processor of the computer to adjust the volume of an audio output.

Other forms of user interfaces exist which include adaptable functionality similar to a GUI, but which are not implemented to depend entirely upon a graphical (display) presentation. One example is a speech or spoken word interface, where voice commands or responses are interpreted to cause some desired function. For example, a motor vehicle control system may audibly reproduce the question "Do you wish to start the engine?".
The system could then await a voice response "Yes" or "No". Tactile interfaces may be configured to operate in a similar fashion. Speech and tactile interfaces are highly desirable for, and sought after by, sight-impaired persons.

It follows from the above that the "screens" of a GUI represent one state of that user interface as presented to the user at that point in time. From that state, the user may provide controlling commands and/or responses to prompts presented in that state. The same applies for other functionally adaptable interfaces such as speech or tactile interfaces. Each presents or represents a "state" to which or with which the user can interact.

For the purposes of the remainder of the present description, unless otherwise stated, any reference to a "user interface" or "UI" is to be interpreted as a reference to a graphical user interface (GUI) or to another user interface, such as a speech or tactile interface, of adaptable functionality. A reference to a "screen" of a GUI may similarly be inferred as a reference to a state of a speech or tactile UI.

A typical process 100 for Graphical User Interface development is illustrated in Fig. 1. Initially, in step 101, a Human Computer Interface (HCI) specialist will prepare documentation, known as the User Interface Workflow document, of the proposed user interface. The document would generally include screens that are required to perform each task, some information about the controls and operations available in each screen, and the navigation between screens. Next, in step 102, a graphical designer takes the document containing the User Interface Workflow specification, and designs the look and feel of the controls and behaviours for each screen. The output may be supplied as a series of bitmap files for icons, windows, backgrounds and controls, or in some other graphical format.
A programmer then writes, in step 103, computer executable User Interface software code to collect the separate artwork pieces for each screen, and to add behavioural functionality to that artwork. Adding behaviour could include such things as adding the ability to receive a mouse press event and to generate a button event, with button artwork, to create an operational widget. Step 104 then involves the programmer from step 103 and the designer from step 102 checking that the coding work of the programmer matches the artistic and functional vision of the designer. Steps 102, 103 and 104 may iterate several times if a problem is found in step 104.

Once the screen designs are as desired, in step 105, a (or the) programmer integrates the User Interface code for each of the screens with the back end application, i.e. the application that actually implements the word processing, the spreadsheet, or audio/video database management and reproduction, for example. At step 106, a (the) programmer integrates or links all of the screens together with navigational elements according to the navigation provided or otherwise defined in the User Interface Workflow document. The User Interface is tested in step 107, and the process 100 iterates back to step 101 (or perhaps step 104, not illustrated) as required to correct any deficiencies in operation or performance (i.e. de-bugging).

The above procedure has many problems. One significant problem is that the process 100 is time consuming and error prone, because many quality problems associated with any one stage will not be detected at the time or stage they are introduced.
The above procedure may be improved by supplying tools that: (i) allow a designer to produce a functional user interface screen without much coding requirement on the programmer; or (ii) allow a single person, such as an engineer performing the roles of designer and programmer, to build a user interface using a widget toolkit that has its own look and feel.

By taking the first improvement, and allowing the designer to add functionality to artwork in a graphical editing tool, the process in Fig. 1 is shortened by removing steps 103 and 104. Fig. 2 shows a corresponding revised process 200 of a UI development process with a tool for use by the designer, where steps 201, 203, 204 and 205 correspond with steps 101, 105, 106 and 107 respectively. Steps 102, 103 and 104 are now all performed by the designer in step 202. In order to perform all of these steps, the designer must be able to preview the design with functionality attached in the tool. The process 200 of Fig. 2 still has the step of HCI design of UI Workflow, 201, requiring the designer to interpret a design document. Additionally, steps 203 and 204 still require much programming work.

If the second improvement is followed, the procedure 100 in Fig. 1 no longer needs the steps 102, 103 and 104. Fig. 3 shows a revised process 300 of UI development with a tool for the programmer, in which steps 301, 303, 304 and 305 correspond with steps 101, 105, 106 and 107 respectively, and which replaces steps 102, 103 and 104 by step 302. Once again the procedure still includes the step 301 of HCI design in a document form, integration of screens into an application workflow at step 304, as well as the testing step 305.

Both the above improvements optimise the design process for individual screens within an application by allowing screen graphics layout and functionality to be reasonably easily created and tested.
An addition to the above optimisations is to add another tool to the tool chain that supports the design of the User Interface Workflow in such a way that the output of the tool can directly integrate with the screen designs that are created in another stage of the UI development process.

Fig. 4 shows what a typical Graphical User Interface Workflow tool, itself a GUI, might look like in an Editing mode. A window 400 represents a Workflow editor GUI. Icons representing each of the screens in the application being interfaced, or a part of the application, are labelled 410, 420, 430, 440, 450 and 460. The OK and CANCEL buttons on Screen 1 (410) are labelled 411 and 412 respectively. Each of these buttons is authored to cause a screen transition when activated, by drawing a link from the corresponding button to the target of the screen transition that is caused by activation of the button. The drawing may be performed electronically within the GUI by using a pointer device, such as a mouse, to trace a line across the display screen from one location to another. The OK button 411 has a transition 413 leaving it and going to Screen 2 (420). This indicates that when the user "presses" or otherwise selects the OK button 411, the next screen that should be loaded is Screen 2 (420). Similarly, the transition 414 links the CANCEL button 412 to the screen that it causes to be loaded, Screen 3 (430). Similar linkages are shown for the radio buttons 421, 422 and 423, which indicate the next screen to be loaded after Screen 2 (420) when any of these buttons is activated and the OK button 425 is pressed. The CANCEL button 424 on Screen 2 (420) has a transition 429 drawn back to Screen 1 (410), which indicates that activating the CANCEL button 424 returns the user to Screen 1 (410).
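The button-to-screen links drawn in the editor of Fig. 4 can be captured in a small data model. The sketch below is illustrative only: the class and field names (Screen, Button, target) are assumptions for the example, not the patent's own naming.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Button:
    name: str
    target: Optional[str] = None  # name of the screen loaded when the button is activated

@dataclass
class Screen:
    name: str
    buttons: List[Button] = field(default_factory=list)

# Reconstruct some links of Fig. 4: OK (411) -> Screen 2, CANCEL (412) -> Screen 3,
# and CANCEL (424) on Screen 2 back to Screen 1.
screen1 = Screen("Screen 1", [Button("OK", target="Screen 2"),
                              Button("CANCEL", target="Screen 3")])
screen2 = Screen("Screen 2", [Button("CANCEL", target="Screen 1")])

def next_screen(screen: Screen, button_name: str) -> Optional[str]:
    """Return the name of the screen a button's transition loads, if any."""
    for button in screen.buttons:
        if button.name == button_name:
            return button.target
    return None

print(next_screen(screen1, "OK"))      # Screen 2
print(next_screen(screen2, "CANCEL"))  # Screen 1
```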
The HCI specialist creates the User Interface Workflow by using representations of the screens that occur in the application, and drawing directional links from controls on a source screen to a target screen. The directional link indicates that the activation of the control causes the UI Application to replace the source screen with the target screen.

Fig. 5 shows a Workflow GUI window 500 in a situation where screens 510, 520, 530 and 540 each require a transition to a single screen 550. This situation might occur in the exemplary case where the screen 550 is a print dialog screen, and each of the screens 510, 520, 530 and 540 has a print functionality. Another case where many screens need a transition to a single screen is in the detection of software or hardware exceptions, where a single screen warning message may result. For example, an interface on a device that allows USB key memory to be attached and accessed might require a screen to appear when the USB key is physically attached or detached. In the worst case, a screen such as the screen 550 may require a transition from every screen in the application. This case can make the Workflow tool difficult to use because of the complexity of the display.

SUMMARY

Disclosed is a method of authoring a User Interface workflow that allows an author to design a User Interface in a graphical editing environment. The output of the authoring tool is suitable for use, along with other tools, in a User Interface development tool chain. The UI being authored may be a GUI, but alternatively may be another type of UI with adaptable functionality.

The method is performed using an authoring GUI which allows grouping of UI components (e.g. screens for a GUI), with some commonality in behaviour, into UI component groups (e.g. screen groups for a GUI). With reference to the authoring of a GUI, the screen groups can contain functionality that is common to the contained screens.
"Screen groups" are also known simply as "groups". A group can also contain one or more other groups. The author of a GUI can draw a directional link within the authoring GUI from a source screen icon to a target screen icon, indicating that a screen transition can occur from the source screen to the target screen. The author can select what event will cause the transition to be taken, and attach a description of the event to the transition.

Directional links can also be drawn from and to groups. A directional link from a screen group to a group or screen can be made by the author in the same way that the author defines a screen transition. A directional transition from a group applies to all the screens contained in the group. A transition to a group is interpreted as a transition to one of the screens contained inside the group.

The use of the group construct allows a simpler workflow to be constructed by removing the redundant requirement for transitions to a screen X to be placed on every screen that must be able to transition to screen X. The group construct allows a single transition to be authored from a parent group of the screens that must transition to X.

The method may optionally include the concept of a priority group. The priority group is different to a normal (simple) group in a couple of ways. Firstly, the priority group has a priority field that indicates how high a priority is required to interrupt the workflow in the priority group. Secondly, the priority group may be used to define a history stack, and operations available in the authoring environment for manipulating the history stack. A history stack is a stack structure that allows the recording of which screens were visited during a user interaction, so that a "back" navigation function is available to the user.
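The "back" behaviour that a history stack provides can be sketched as follows. This is a minimal illustration of the stack semantics described above, assuming invented method names; it is not the patent's implementation.

```python
class HistoryStack:
    """Records visited screens so that a "back" navigation function is available."""

    def __init__(self):
        self._visited = []

    def push(self, screen_name):
        # Record the screen the user is leaving, so it can be returned to later.
        self._visited.append(screen_name)

    def back(self):
        # Return the most recently visited screen, or None if there is no history.
        return self._visited.pop() if self._visited else None

history = HistoryStack()
history.push("Main Menu")
history.push("Settings")
print(history.back())  # Settings
print(history.back())  # Main Menu
```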
In accordance with one aspect of the present disclosure, there is provided a method for authoring an executable user interface workflow component for a user interface. The component defines navigation between a plurality of representable states of a corresponding authored user interface. The method displays an iconic representation of a plurality of representable states in an authoring environment and receives a user input to group a plurality of the representable states to form at least one state group representation. A user input is also received to generate a directional link between two of the representations, wherein at least one of the two representations is a state group representation and the link symbolises a state transition between the two representations. The method then displays the directional link between the two representations in the authoring environment to represent a component of the authored user interface workflow, the component being executable to implement the state transition in the authored graphical user interface.

Other aspects are also disclosed.

BRIEF DESCRIPTION OF THE DRAWINGS

At least one embodiment of the present invention will now be described with reference to the following drawings, in which:

Fig. 1 is a flowchart diagram of a prior art Graphical User Interface development methodology;

Fig. 2 is a flowchart diagram of a prior art Graphical User Interface development methodology that incorporates a tool that allows a designer to add functionality to artwork;

Fig. 3 is a flowchart diagram of a prior art Graphical User Interface development methodology that incorporates a tool, such as a widget toolkit, that allows a programmer to add library functionality and artwork to a User Interface;

Fig. 4 is a representation of an editing view of a prior art UI Workflow authoring tool;

Fig. 5 is a representation of an editing view of a prior art UI Workflow authoring tool showing a problematic design case;

Fig. 6 is a representation of an editing view of a UI Workflow authoring tool according to the present disclosure;

Fig. 7 is a graphical view of an example of a UI Workflow case that illustrates the concepts of groups, screens, and transitions;

Fig. 8 is a schematic block diagram representation of a general purpose computer system upon which the arrangements presently described may be implemented;

Fig. 9 is a flowchart diagram of a process used to create and store a User Interface Workflow according to the present invention;

Fig. 10 is an example of a nested state machine used for controlling widget behaviour;

Fig. 11 is a context level diagram of a UI authoring system;

Fig. 12 is a document tree diagram; and

Fig. 13 is a dataflow diagram of an embodiment of a runtime environment for executing a user interface.

DETAILED DESCRIPTION INCLUDING BEST MODE

Disclosed is a tool for the authoring of UIs. The tool utilizes a GUI presentation to provide for interactive authoring by a user. The authored UI may be a GUI or another UI of adaptable functionality. Unless otherwise stated, in the description and examples to follow, it is to be assumed that the authored UI is a GUI in which UI components or representable states include displayable "screens" with which a user may interact.

Fig. 6 shows an example of the use of a screen group to define UI behaviour that is common to a number of screens, and which provides a solution to the problematic design case described above with respect to Fig. 5. In Fig. 6, an authoring GUI window 600 is shown including an iconic representation of each of a number of screens 620, 630, 640 and 650. Each screen 620, 630, 640 and 650 is required to transition to a further single screen 660. This required behaviour is equivalent to that shown in Fig. 5.
In this case, however, the required behaviour is achieved by placing each of the screens 620, 630, 640 and 650 into a group, Group 1 (610). The required transition is represented as a transition 611 from the group 610 to the screen 660. Therefore, the transition 611 applies to all screens, as well as groups, that are contained in the group 610, and to all descendants of group 610 in a containment hierarchy formed by the screens, groups and transitions. The transition 611 is a navigational element that represents and provides for navigating between a plurality of screens in the GUI. The transition 611, together with the group 610 and screen 660, forms and represents a user interface workflow component for the GUI being authored. Each iconic screen representation 620, 630, 640, 650 and 660 corresponds to an individual GUI of the GUI being authored. The workflow is that specific sequence of functionality afforded by the various GUI components. The window 600 forms part of a GUI authoring environment, and the transition 611 is seen as a link that symbolises the transition between the screens to be represented in the GUI being authored. The authored GUI is executable to provide for the desired overall workflow. The individual UI workflow components are similarly each executable to provide for the specific workflow functionality desired. Typically, execution is performed by a computer or computing device, as described below.

The methods of UI development and workflow integration described herein are typically implemented using a computer system 800, such as that shown in Fig. 8, wherein the processes of Figs. 6, 7 and 10 to 13 may be implemented using software, such as one or more application programs executable within the computer system 800. In particular, the steps or method of authoring a user interface workflow are effected by instructions in the software that are carried out or executed within the computer system 800.
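Because the transition 611 is authored once on the group 610 yet applies to every screen the group contains, a runtime can resolve an event by searching from the current screen up through its enclosing groups. The following sketch illustrates that lookup under assumed names (Node, parent, transitions); the patent does not prescribe this data layout.

```python
class Node:
    """A screen or group in the containment hierarchy."""
    def __init__(self, name, parent=None):
        self.name = name
        self.parent = parent       # enclosing group, if any
        self.transitions = {}      # event name -> target Node

def resolve_transition(screen, event):
    """Walk from the screen up the containment hierarchy until some node
    defines a transition for the event. A transition authored on a group
    thereby applies to all the screens (and groups) the group contains."""
    node = screen
    while node is not None:
        if event in node.transitions:
            return node.transitions[event]
        node = node.parent
    return None

# Model Fig. 6: screens placed in Group 1 (610), with a single
# transition 611 from the group to screen 660.
group1 = Node("Group 1 (610)")
screen_660 = Node("Screen 660")
group1.transitions["warning"] = screen_660  # transition 611
screen_620 = Node("Screen 620", parent=group1)

print(resolve_transition(screen_620, "warning").name)  # Screen 660
```

The single dictionary entry on the group replaces one explicit link per screen, which is exactly the simplification the group construct provides over Fig. 5.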
The instructions may be formed as one or more code modules, each for performing one or more particular tasks. The software may also be divided into two separate parts, in which a first part and the corresponding code modules perform the workflow authoring methods, and a second part and the corresponding code modules manage a graphical user interface between the first part and the user performing the authoring tasks and implementing the workflow. The software may be stored in a computer readable medium, including the storage devices described below, for example. The software is loaded into the computer system 800 from the computer readable medium, and then executed by the computer system 800. A computer readable medium having such software or computer program recorded on it is a computer program product. The use of the computer program product in the computer system 800 preferably effects an advantageous apparatus for authoring a user interface workflow.

As seen in Fig. 8, the computer system 800 is formed by a computer module 801, input devices such as a keyboard 802, a mouse pointer device 803 and a microphone 819, and output devices including a printer 815, a display device 814 and loudspeakers 817. An external Modulator-Demodulator (Modem) transceiver device 816 may be used by the computer module 801 for communicating to and from a communications network 820 via a connection 821. The network 820 may be a wide area network (WAN), such as the Internet or a private WAN. Where the connection 821 is a telephone line, the modem 816 may be a traditional "dial-up" modem. Alternatively, where the connection 821 is a high capacity (e.g. cable) connection, the modem 816 may be a broadband modem. A wireless modem may also be used for wireless connection to the network 820.
The computer module 801 typically includes at least one processor unit 805, and a memory unit 806, for example formed from semiconductor random access memory (RAM) and read only memory (ROM). The module 801 also includes a number of input/output (I/O) interfaces including an audio-video interface 807 that couples to the video display 814, loudspeakers 817 and microphone 819, an I/O interface 813 for the keyboard 802 and mouse 803 and optionally a joystick (not illustrated), and an interface 808 for the external modem 816 and printer 815. In some implementations, the modem 816 may be incorporated within the computer module 801, for example within the interface 808. The computer module 801 also has a local network interface 811 which, via a connection 823, permits coupling of the computer system 800 to a local computer network 822, known as a Local Area Network (LAN). As also illustrated, the local network 822 may also couple to the wide network 820 via a connection 824, which would typically include a so-called "firewall" device or similar functionality. The interface 811 may be formed by an Ethernet™ circuit card, a wireless Bluetooth™ or an IEEE 802.11 wireless arrangement.

The interfaces 808 and 813 may afford both serial and parallel connectivity, the former typically being implemented according to the Universal Serial Bus (USB) standards and having corresponding USB connectors (not illustrated). Storage devices 809 are provided and typically include a hard disk drive (HDD) 810. Other devices such as a floppy disk drive and a magnetic tape drive (not illustrated) may also be used. An optical disk drive 812 is typically provided to act as a non-volatile source of data. Portable memory devices, such as optical disks (e.g. CD-ROM, DVD), USB RAM, and floppy disks for example, may then be used as appropriate sources of data to the system 800.
The components 805 to 813 of the computer module 801 typically communicate via an interconnected bus 804, and in a manner which results in a conventional mode of operation of the computer system 800 known to those in the relevant art. Examples of computers on which the described arrangements can be practised include IBM-PCs and compatibles, Sun Sparcstations, Apple Mac™ or alike computer systems evolved therefrom.

Typically, the application programs discussed above are resident on the hard disk drive 810 and read and controlled in execution by the processor 805. Intermediate storage of such programs and any data fetched from the networks 820 and 822 may be accomplished using the semiconductor memory 806, possibly in concert with the hard disk drive 810. In some instances, the application programs may be supplied to the user encoded on one or more CD-ROMs and read via the corresponding drive 812, or alternatively may be read by the user from the networks 820 or 822. Still further, the software can also be loaded into the computer system 800 from other computer readable media. Computer readable storage media refers to any storage medium that participates in providing instructions and/or data to the computer system 800 for execution and/or processing. Examples of such media include floppy disks, magnetic tape, CD-ROM, a hard disk drive, a ROM or integrated circuit, a magneto-optical disk, or a computer readable card such as a PCMCIA card and the like, whether or not such devices are internal or external to the computer module 801. Examples of computer readable transmission media that may also participate in the provision of instructions and/or data include radio or infra-red transmission channels, as well as a network connection to another computer or networked device, and the Internet or Intranets including e-mail transmissions and information recorded on Websites and the like.
The second part of the application programs and the corresponding code modules mentioned above is executed to implement one or more graphical user interfaces (GUIs) to be rendered or otherwise represented upon the display 814. Through manipulation of input devices such as the keyboard 802 and the mouse 803, a user of the computer system 800 and the GUI authoring application may manipulate the interface to provide controlling commands and/or input to the applications associated with the UI authoring tool.

In the presently disclosed arrangements, the workflow authoring tool is a software application that, when executed by the processor 805, results in the presentation of an authoring GUI environment upon the display 814. The authoring GUI may itself include a number of screens for performing a number of authoring tasks. Those screens and tasks are used to create a GUI which, depending upon the intended purpose, may also be executable within the computer system 800 or another computer system or device. The execution of an authored GUI may be referred to as "running" the GUI, or "runtime", during which the screens of the authored GUI are displayed to enable actual user interaction.

Fig. 11 shows a data flow of a system 1100 for authoring a user interface workflow according to the present disclosure. The system 1100 may be implemented using the computer system 800. An authoring tool 1101, being a GUI application executable within the computer 800, is used to generate a screen tree representation 1102, comprising a description of a hierarchical state machine, or a description of a hierarchy of state machines, such as that shown in Fig. 10 to be described, for representing the behaviour within the screens of the GUI being authored. The UI Workflow Tool editor 600 described in Fig. 6 can be used to generate the screen tree representation 1102, as will be described.
The authoring tool 1101 also generates a widget tree representation 1103, comprising the description of a hierarchy of widgets that implement the behaviour of the widgets within the screens being authored. The authoring tool 1101 also generates presentation data 1104 that describes any graphical or non-graphical feedback that is to be presented by the GUI to the user. An example of such presentation data may include the SVG for a visible widget that animates in response to user or system events. The presentation data 1104, screen tree representation 1102, and widget tree representation 1103 are all preferably represented in XML in the manner described herein. The screen tree representation 1102 is linked to the widget tree representation 1103, representing that each screen contains a widget tree.

Fig. 13 shows a user interface framework 1300 that executes the authored user interface from the presentation data 1104, the screen tree representation 1102, and the widget tree representation 1103 that were created by the authoring tool shown in 1100. An Interaction Manager 1303 starts the process by creating in memory the user interface objects 1302 from the data in the widget tree representation 1103 and the screen tree representation 1102, with the help of a parser 1301. Within the screen tree representation 1102, there is a designated initial screen which should be loaded as the current screen. The interaction manager 1303 sends commands to a modality manager 1305 indicating the presentation data 1104 that should be loaded for the current screen. The modality manager 1305 is operable to manage the presentation of the GUI. The presentation data 1104 can include information indicating the selection of graphical or non-graphical user interface elements that are required for the presentation of the current screen in the GUI.
The elements are then supplied to an audio synthesiser 1306, a renderer 1310 or to an "other" modality unit 1308. The synthesiser 1306 generates output for audio reproduction via the loudspeakers 817, seen in Fig. 8. The other modality unit 1308 represents an arbitrary technology for providing user interface feedback to a user, such as a force feedback device in a game, or a set of LEDs configured as a meter. The other modality unit 1308 is configured to generate output and send that output to an appropriate I/O device 1309. The graphical presentation data supplied to the renderer 1310 is rendered and output to the display 814. The renderer 1310 may receive input from a position indicating device such as the mouse 803, or text from the keyboard 802. The renderer 1310 may generate events indicating some information about the state of the graphics being displayed, and send event messages to an event queue 1315, which stores the events until they can be processed in a serial fashion by the interaction manager 1303. The microphone 819 can be used to generate an audio input that is fed to a speech analyser 1314 to detect user spoken commands that may be used to supplement events in the event queue 1315.

The screen tree representation 1102 can be authored using the process 900 shown in Fig. 9. The author uses a tool such as the editor 600 to create the screen workflow representation. The first task is to decide on what screens are required in the application and to author and display those screens in the authoring application (905). The second task is to group screens based on requirements for common screen transition behaviour amongst the screens being grouped (910). The screen to screen transitions within the workflow are then created by authoring and displaying the transitions between groups and screens (915).
The transitions that are added include screen to screen, screen to group, group to screen and group to group transitions. Once the author is satisfied with the workflow that they have constructed, the workflow may be saved to a disk (920) in a format that is described below. The saved workflow may be read in by the authoring tool so that the workflow can be modified. The saved workflow can also be used in tools that allow designers to add presentation and functionality within the screens that are specified.

The present inventors have determined that a tool that supports authoring workflow using groups, transitions and screens desirably has an output format that contains a serialised representation of the authored workflow. This representation is desirably also an interchange format, to allow the screen workflow authoring tool to be part of a larger user interface tool chain. In the preferred implementations this output format is an XML document format that is both human and machine readable and writable. The XML documents produced by the authoring tool contain a machine readable definition of a complete user interface. The UI Workflow XML document is in a format that contains XML elements for screens, transitions, groups, priority groups, variables, and variable assignment, as indicated in Table 1 below:

Table 1
  <uiworkflow> : Root element for a UI workflow.
  <screen> : A screen.
  <group> : A group of screens. This can be used to specify behaviour common to multiple screens.
  <prioritygroup> : This is also a group, but has slightly different behaviour from a normal <group> so as to be able to handle exceptions.
  <transition> : Specifies which screen to transition to on a given event.
  <var> : Declaration of a variable.
  <assign> : An action to set a <var> to a value.
  <exit> : Exit from a screen or group and return the workflow to the previously loaded screen.

The <uiworkflow> element is at the base of the screen flow hierarchy.
It contains all the screens, groups, transitions and other elements that are required to describe the GUI being authored. There can be only one <uiworkflow> element in such an XML document, and all other elements are descendants of the <uiworkflow> element. The <uiworkflow> element has the attributes indicated below in Table 2:

Table 2
  initial — required, IDREF, default none. ID (identity) of the initial screen or group.

The <screen> element represents a screen in the Graphical User Interface being authored. A <screen> element can be a child of a <uiworkflow>, a <group> or a <prioritygroup> element. A <screen> element can contain <transition> elements that represent transitions out of the screen that contains them. The <transition> elements can target other <screen> elements, or <group> or <prioritygroup> elements. A <screen> element has the attributes indicated below in Table 3:

Table 3
  id — required, ID, default none. ID (identity) of the screen.
  uri — required, anyURI, default none. URI to refer to the resource that describes this screen.
  history — optional, string, default "keep". Indication as to whether the destination screen is recorded in the history stack.
    "none" : Indicates that history is not recorded for the destination of the transition.
    "keep" : Indicates that history should be recorded (the default).

A <group> element is a container for one or more <screen>, <group> or <prioritygroup> elements. A <group> element can contain <transition> elements that represent transitions to <screen>, <group> or <prioritygroup> elements that are external to the group containing the transition. The <transition> elements in a group are global to all <screen>, <group> and <prioritygroup> elements that are descendants of the <group> element. The transitions of a group can be triggered by an event that does not trigger a transition in any of the currently loaded child elements of the group.
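The lookup order implied by this grouping — a screen's own transitions are tried first, and an unhandled event then bubbles out through each enclosing group — can be sketched as follows. This is an illustrative model only; the dictionary-based data structures and the function name are assumptions, not taken from the disclosed tool.

```python
# Sketch of group-level transition lookup: an event is first offered to the
# current screen; if no transition there handles it, the event bubbles out
# through the enclosing groups until a handler is found or the root is passed.
def find_transition(node, event, parent_of, transitions_of):
    """Walk from `node` up the screen/group hierarchy and return the
    "to" target of the first transition that handles `event`, else None."""
    while node is not None:
        for trans in transitions_of.get(node, []):
            if trans["event"] == event:
                return trans["to"]
        node = parent_of.get(node)   # bubble out to the enclosing group
    return None
```

Using the element identifiers of the Fig. 7 example, an ONPRINT event that no screen handles is caught by the transition declared on Group_1 and so applies to every screen inside that group.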
The <group> element, as seen below, has an attribute "initial" that indicates the child of the group that is to be loaded when the group is first transitioned into during an execution of the GUI. The initial attribute can indicate a child <screen>, <group> or <prioritygroup> element. During execution of the GUI, a group must keep track of any currently active child screen, child group or child priority group. A <group> element may have the attributes described below in Table 4:

Table 4
  id — required, ID, default none. ID (identity) of the group.
  initial — required, IDREF, default none. Identifies the initial screen or sub-group of this group. If a screen transition visits this group, and no child of this group has been recorded in the history, then it immediately goes to the initial screen or sub-group specified by the "initial" attribute. If the "initial" is a sub-group, after going to the sub-group, the transition immediately goes to the sub-group's initial screen or sub-group. This is repeated until reaching a screen. In the case that the group contains history, the screen transition to the group is interpreted as a transition to the child of the group that is recorded in the group's history.
  uri — optional, anyURI, default none. Specifies the content of the group. If the uri is specified, any children of the group are ignored. The URI must contain a single uiworkflow element. The children and "initial" attribute of the uiworkflow element are treated as though they belong to the group. If both the group and the uiworkflow elements have the "initial" attribute, the uiworkflow value overrides the group value.
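The entry behaviour described for the "initial" attribute — follow recorded history where it exists, otherwise the initial child, repeating until a screen is reached — can be sketched minimally as follows. The dictionary-based structures are illustrative assumptions, not the tool's actual representation.

```python
# Sketch of how a transition targeting a group resolves to a screen:
# follow the recorded history entry if present, otherwise the "initial"
# attribute, repeating until something that is not a group is reached.
def resolve_entry(target, groups, history):
    """`groups` maps a group id to its "initial" child; `history` maps a
    group id to the child recorded on a previous visit. Any id not in
    `groups` is taken to be a screen."""
    while target in groups:
        target = history.get(target, groups[target])
    return target
```

With the Fig. 7 identifiers, entering Group_0 with no history cascades through Group_1 to the Start Screen, whereas a recorded history entry for Group_1 takes precedence over its initial attribute.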
A <prioritygroup> element is equivalent to a <group> element except for the following differences:
(i) a <prioritygroup> element has an integer priority value;
(ii) a transition into a <prioritygroup> raises the priority level of the workflow so that transitions within the current priority group take precedence over all other workflow transitions except transitions to a <prioritygroup> that has a higher integer priority value. This behaviour persists until the <prioritygroup> is exited;
(iii) a <prioritygroup> cannot contain another <prioritygroup> as a descendant;
(iv) a <prioritygroup> contains no transitions to screen elements outside the priority group except the <exit> transition that terminates the operation of the <prioritygroup>.

The reason a <prioritygroup> cannot contain another priority group as a child is demonstrated by considering two <prioritygroup> elements A and B, where A contains B as a descendant. The following three cases are considered:
1. A has priority greater than B;
2. A has the same priority as B;
3. A has priority less than B.

In case 1, we have a higher priority <prioritygroup> A that contains a lower priority group B inside it. If we say that the priority of A is "high" and the priority of B is "low", then it is unclear what the priority of the transition within <prioritygroup> A into <prioritygroup> B is. As all transitions in <prioritygroup> A have the same priority as A itself, that is "high", the transition into <prioritygroup> B should also have priority "high". But the transition into a priority group has the same implied priority as that of the <prioritygroup> being entered. This means that the transition within A into B should have the priority "low", which is the same priority as <prioritygroup> B has. We could legislate that the transition has one of the values "low" or "high", but either choice leads to further possible problems in the workflow.
In case 2, we know what the priority of the transition from <prioritygroup> A into <prioritygroup> B is, because A has the same priority as B. The transition therefore has that same priority as both A and B. In this case, however, the <prioritygroup> B will operate in precisely the same manner as a normal <group>, so we have no need to increase the design's complexity by using <prioritygroup> B inside <prioritygroup> A; we can just replace <prioritygroup> B by a normal <group> B.

In case 3, we are faced with a similar problem to case 1. As all transitions within <prioritygroup> A have the same priority, "low", the priority of the transition from A into B should also be "low". But the priority of a transition into a <prioritygroup> is the same as the priority of that group by implication. So there are two possible priorities for the transition from <prioritygroup> A into <prioritygroup> B. Once again, arbitrarily defining the transition from A to B to have priority of "high" or "low" leads to inconsistency in the interpretation of the workflow.

For the above reasons, and the lack of any advantage of nesting priority groups, we do not allow a priority group to contain another priority group as a descendant. Priority groups are used to represent parts of the workflow that should not be interrupted, and may be entered from multiple places in the workflow. One such use would be a Wizard that helps a user to configure a newly added piece of hardware. A <prioritygroup> element desirably has the attributes shown in Table 5:

Table 5
  id — required, ID, default none. ID of the priority group.
  initial — required, IDREF, default none. Identifies the initial screen or sub-group of this group. If a screen transition visits this group, and no child of this group has been recorded in the history, then it immediately goes to the initial screen or sub-group specified by the "initial" attribute.
If the "initial" is a sub-group, after going to the sub-group, the transition immediately goes to the sub-group's initial screen or sub-group. This is repeated until reaching a screen. In the case that the group contains history, the screen transition to the group is interpreted as a transition to the child of the group that is recorded in the group's history.
  priority — optional, integer, default "0". The priority of the priority group. A higher number indicates a higher priority. The default priority is 0. The number must be an integer and negative priorities are disallowed.
  uri — optional, anyURI, default none. Specifies the content of the prioritygroup. If the uri is specified, any children of the prioritygroup are ignored. The URI must contain a single uiworkflow element. The children and "initial" attribute of the uiworkflow element are treated as though they belong to the prioritygroup. If both the prioritygroup and the uiworkflow elements have the "initial" attribute, the uiworkflow value overrides the prioritygroup value.

A <transition> element represents a screen transition from a current screen to a destination screen. A <transition> element has an "event" attribute that defines the event that will cause the transition to be triggered. A transition from a source screen S1 to a destination screen S2 is authored to occur when the screen S1 receives an event X, and the screen S1, or any of its parent groups or priority groups, has a transition with the attribute event="X" and a destination attribute to="S2". A <transition> element has the attributes of Table 6:

Table 6
  event — required, NMTOKEN, default none. Specifies the event type of the event which triggers the transition.
  observer — optional, IDREF, default none. Specifies the identity of the element at which to listen to the event which
triggers the transition.
  target — optional, IDREF, default none. Specifies the event target of the event which triggers the transition.
  to — optional, IDREF or "_BACK" or "_NONE", default none. The identifier of a screen, a group, or a priority group, to transition to. If a group or a priority group is specified as "to", the transition follows the "initial" attribute of the group, as explained in the description of the initial attribute in the group and priority group elements.
  cond — optional, string, default none. Guard condition for this transition. If it is present, the transition is taken only if the guard condition evaluates to true. The guard condition is tested on the value of a <var> element.
  history — optional, string, default "keep". Indication as to whether the destination screen is recorded in the history stack. Recording a screen in the history stack allows that screen to be returned to from the immediately following screen, by following a transition that has its "to" attribute equal to "_BACK".
    "none" : Indicates that history is not recorded for the destination of the transition.
    "keep" : Indicates that history should be recorded (the default).

Fig. 7 shows a UI Workflow Example which illustrates the workflow of an authored GUI as displayed in a GUI of the authoring tool. The workflow shows a set of GUI screens in an application. The "screens" as described in relation to a "workflow" are representations of the final or ultimate screens that are to be displayed in a functional, operating GUI authored using the software application tool described herein. It will be appreciated that the final functional screens may be adorned with artwork and structure such as drop-down menu boxes, images etc. which are not illustrated in either the present description or in the GUI authoring tool, whose purpose is to establish and define the working interrelationship between the various screens that may be present in a GUI.
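The interplay of the "history" and "to" attributes described in Table 6 above can be sketched as a simple stack model. This is a hedged illustration: the class name and the exact push/pop policy are assumptions consistent with the description, under which each screen entered with history="keep" is recorded so that a later to="_BACK" transition returns to the previously recorded screen.

```python
# Illustrative model of the history stack: screens entered with
# history="keep" are pushed; a transition whose "to" is "_BACK" pops the
# current screen and returns to the one recorded below it.
class ScreenHistory:
    def __init__(self, initial):
        self.stack = [initial]       # the initial screen is recorded
        self.current = initial

    def take(self, to, history="keep"):
        if to == "_BACK":
            if len(self.stack) > 1:
                self.stack.pop()
            self.current = self.stack[-1]
        else:
            self.current = to
            if history == "keep":
                self.stack.append(to)
```

A transition authored with history="none" leaves no record, so a subsequent "_BACK" skips over that screen.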
A Start Screen 719 contains a menu that can cause screen transitions authored to a Save As screen 726 and a Help Screen 735. The screens 726 and 735 are each within a group, Group 1 710. Fig. 7 also shows an ONPRINT transition 716 to Group 4 740, where the transition 716 can occur from any screen in Group 1 710. In particular, the Start Screen 719 and the Help Screen 735 have print functionality, as seen by print icons 713 and 731 respectively, so the ONPRINT transition 716 can apply to them.

The example includes an outermost group, Group 0 700, which encompasses or envelops all other groups as seen in Fig. 7. A marker 701 indicates the initial subgroup of Group 0 700, which is Group 1 710, mentioned above. When Group 0 700 is loaded at runtime, being the time at which the application forming the authored GUI is executed to reproduce the authored GUI, the subgroup of Group 0 700 that is loaded is Group 1 710. Group 1 710 has an initial attribute that is indicated by a marker 717, and sets the Start Screen 719 as the screen to be loaded initially when Group 1 710 is loaded.

When the menu item Save As 711 is selected by a user action, such as a click of the mouse 803 in the Start Screen 719, a screen transition 712 is taken to display a Save As dialog 726. Once a user has selected the Directory and Filename for the Save As operation, clicking an icon SAVE 721 causes a transition 723 to an EXIT element 725. The EXIT element 725 indicates that a screen transition should take place that returns the user to a display of the screen that was present before the immediate parent group of the EXIT element was entered. In this case, since the Save As screen 726 is within a Group 2 720, the EXIT element 725 indicates that the screen to transition to is the last screen before the group 720 was entered, that is, the next screen should be the screen 719.
The CANCEL button 722 also causes a transition 724 to the EXIT element 725, so the CANCEL button 722 also causes the screen to transition back to the screen 719.

When the Menu Item Help icon 760 is selected in the screen 719, a transition 718 is activated that sets the current screen to be the Help Screen 735 in Group 3 730. When the user presses or otherwise selects the CANCEL button 732 within the Help Screen 735, a transition 733 occurs to another EXIT element 734. Once again, the EXIT element 734 indicates that the next screen is the screen that was loaded before the current group, Group 3 730, was entered. Accordingly, when the user selects CANCEL 732, the screen that is transitioned to is the Start Screen 719.

The Start Screen 719 and the Help Screen 735 are iconic representations that both contain print functionality. Instead of providing a separate transition for each of the Print Menu entry 713 and the PRINT button 731, a single or common transition 716 is provided in the corresponding parent group 710 for both screens 719 and 735. The transition 716 symbolises the desired screen transition and is authored to occur when the ONPRINT event has been propagated to Group 1 710, and takes either screen 719 or screen 735 to the Group 4 740.

Group 4 740 contains a print workflow that can be activated by both screens 719 and 735. Group 4 740 has an initial screen indicated by a marker 752. The initial screen is the Select Printer screen 741. The screen 741 shows a drop down list 742 of selectable printers, and has an OK button 743 and a CANCEL button 744. Once the user has selected a printer in the drop down list 742, in the illustrated case a printer called "Peasoup", pressing or otherwise selecting the OK button 743 causes a screen transition 745 to a Select Paper screen 747. Once the user has selected a paper size using the drop down list 748, the print can be performed by selecting the PRINT button 749.
Pressing the PRINT button 749 activates a transition 753 to an EXIT element 751. The CANCEL button 744 and a CANCEL button 750 in the screen 747 also transition to the EXIT element 751, via transitions 746 and 754 respectively.

Once again the EXIT element causes the next screen to be loaded to be the screen that was present when the group containing the currently encountered EXIT element was entered. For the EXIT element 751, the parent group is Group 4 740. The transition that entered Group 4 740 was the transition 716. This transition can be activated when the print function is activated from either of screens 719 and 735. So the screen that is loaded when any of transitions 753, 754 or 746 are taken is whichever of screens 719 and 735 that was operating when the transition 716 was taken.

The UI Workflow Example in Fig. 7 can be marked up using the XML markup as indicated in Code Sample 1 below. Specifically, the XML markup of Code Sample 1 represents a document that can be interpreted by an interpreter application, to thus form an application that represents or otherwise forms the authored GUI corresponding to the workflow of Fig. 7. This is equivalent to execution of the workflow.

Code Sample 1

<?xml version="1.0" encoding="UTF-8"?>
<uiworkflow xmlns="http://www.canon.com/ns/squid/screenflow"
            initial="Group_0">

  <!-- Group_0 (700) -->
  <group id="Group_0" initial="Group_1">

    <!-- Group_1 (710) -->
    <group id="Group_1" initial="StartScreen">
      <!-- The global group transition to the print dialog (716). -->
      <transition event="ONPRINT" to="Group_4"/>

      <!-- The Start Screen (719). -->
      <screen id="StartScreen" uri="StartScreen.xml">
        <!-- Transition on Save As menu item to SaveAsScreen (712). -->
        <transition event="SaveAsEvent" to="SaveAsScreen"/>
        <!-- Transition on Help menu item to HelpScreen (718). -->
        <transition event="HelpEvent" to="HelpScreen"/>
        <!-- Transition on Exit (714). -->
        <transition event="Exit" to="Group1_Exit"/>
      </screen>

      <!-- Group_2 (720). -->
      <group id="Group_2" initial="SaveAsScreen">
        <!-- The Save As Screen (726). -->
        <screen id="SaveAsScreen" uri="SaveAs.xml">
          <!-- Transition on press Save Button (723). -->
          <transition event="OnSave" to="Group2_Exit"/>
          <!-- Transition on press Cancel Button (724). -->
          <transition event="OnCancel" to="Group2_Exit"/>
        </screen>
        <!-- Exit Element (725). -->
        <exit id="Group2_Exit"/>
      </group> <!-- End Group_2 -->

      <!-- Group_3 (730). -->
      <group id="Group_3" initial="HelpScreen">
        <!-- The Help Screen (735). -->
        <screen id="HelpScreen" uri="HelpScreen.xml">
          <!-- Transition on press Cancel Button (733). -->
          <transition event="OnCancel" to="Group3_Exit"/>
        </screen>
        <!-- Exit Element (734). -->
        <exit id="Group3_Exit"/>
      </group> <!-- End Group_3 -->

      <!-- The Group_1 Exit Element (715). -->
      <exit id="Group1_Exit"/>
    </group> <!-- End Group_1 -->

    <group id="Group_4" initial="SelectPrinterScreen">
      <!-- The Select Printer Screen (741). -->
      <screen id="SelectPrinterScreen" uri="SelectPrinter.xml">
        <!-- Transition on press OK Button (745). -->
        <transition event="OnOk" to="SelectPaperScreen"/>
        <!-- Transition on press Cancel Button (746). -->
        <transition event="OnCancel" to="Group4_Exit"/>
      </screen>

      <!-- The Select Paper Screen (747). -->
      <screen id="SelectPaperScreen" uri="SelectPaper.xml">
        <!-- The transition on press Print button (753). -->
        <transition event="OnPrintButton" to="Group4_Exit"/>
        <!-- The transition on press Cancel button (754). -->
        <transition event="OnCancel" to="Group4_Exit"/>
      </screen>

      <!-- The Group_4 Exit Element (751). -->
      <exit id="Group4_Exit"/>
    </group> <!-- End Group_4 -->

  </group> <!-- End Group_0 -->

</uiworkflow> <!-- End uiworkflow Element -->

The above discussion is concentrated on the creation of the workflow for the inter-screen behaviour of the GUI.
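As a sketch of what the parser step might do with such a document, the following uses Python's standard xml.etree module to resolve a workflow's "initial" attribute down to the first screen to display. The markup is a simplified, namespace-free fragment in the style of Code Sample 1, and the function is illustrative rather than the patent's actual parser.

```python
import xml.etree.ElementTree as ET

# A simplified, namespace-free uiworkflow fragment (illustrative only).
SAMPLE = """
<uiworkflow initial="Group_0">
  <group id="Group_0" initial="Group_1">
    <group id="Group_1" initial="StartScreen">
      <transition event="ONPRINT" to="Group_4"/>
      <screen id="StartScreen" uri="StartScreen.xml"/>
    </group>
  </group>
</uiworkflow>
"""

def initial_screen(xml_text):
    """Parse the workflow and follow "initial" attributes through nested
    groups until a <screen> element is reached, returning its id."""
    root = ET.fromstring(xml_text)
    by_id = {el.get("id"): el for el in root.iter() if el.get("id")}
    node = by_id[root.get("initial")]
    while node.tag != "screen":
        node = by_id[node.get("initial")]
    return node.get("id")
```

For the real document, a parser would additionally resolve the XML namespace and the history semantics described earlier; this sketch keeps only the cascading-initial resolution.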
The following discussion describes the specification and authoring of the intra-screen behaviour of the GUI. Table 3 above contains a row that describes the <screen> element attribute "uri". This attribute allows the screen definition to locate the definition of the screen elements (widgets) that implement the intra-screen behaviour. The value of this uri attribute refers to a Widget Class Definition Markup document. This document is an XML document that describes the widgets that belong to the screen, as well as how the widgets behave and how they are organised hierarchically.

The Widget Class Definition Markup can describe both an atomic widget and a container widget. An atomic widget is a leaf node of a widget hierarchy. A container widget contains other widgets, such as atomic widgets and container widgets, again as part of a hierarchy. The Widget Class Definition Markup can also describe widget deployment and behaviour on a screen. The root widget of a screen's widget tree is contained in a Widget Class Definition Markup file that is referred to by the uri attribute of the <screen> element in the screen tree 1102.

The Widget Class Definition Markup document has the <WidgetClass> element as a root element, which has the child elements <statemachine> and <WidgetTree>. The <statemachine> element describes widget behaviour based on a state machine. The <WidgetTree> element describes a child widget tree of a container widget. The <WidgetTree> element contains one or more <WidgetInstance> elements. <WidgetInstance> elements can be nested to describe the hierarchy and structure of the widget tree. Additionally, <WidgetInstance> elements can in turn refer to other Widget Class Definition Markup documents for the behavioural and structural definition of the class of widget that is to be instantiated.
The widget instantiated as a result of a <WidgetInstance> element definition is positioned in the widget tree at a place that is defined by the relative position of the <WidgetInstance> element within the set of <WidgetInstance> elements that are specified inside the <WidgetTree> element. The ability to instantiate a widget that contains a widget tree, as part of another widget tree, highlights the equivalence between the concepts of a container widget and a widget tree. The <WidgetTree> element is needed only if the Widget Class Markup document describes a container widget. The following Example Code 2 shows the structure of a widget class for a container widget:

Example Code 2

<WidgetClass name="example">
  <var> ... </var>

  <statemachine>
    <state>
      <transition> ... </transition>
    </state>
  </statemachine>

  <WidgetTree>
    <WidgetInstance> ... </WidgetInstance>
    <WidgetInstance> ... </WidgetInstance>
  </WidgetTree>

</WidgetClass>

The <WidgetClass> element is the root element of the Widget Class Definition Markup. There can be multiple XML widget class definition markup files that are referred to by the <WidgetInstance> elements. The <WidgetClass> element has the attributes shown in Table 7:

Table 7
  name — required, NMTOKEN, default none. The name of the widget class. This attribute is not required when the document is the root document, and should not cause an error if included in the root document.

The <WidgetClass> element has the following child elements:
  <statemachine> : Defines the state machine for the class.
  <var> : A variable declaration for use within the widget class.
  <WidgetTree> : If the widget class is a container, this element defines a child widget tree.

The Widget Tree 1103 is described using the <WidgetClass> element. The tree 1103 can contain instances of widget classes defined in separate Widget Class XML Markup documents, which are referred to in the <WidgetInstance> elements.
The <WidgetTree> element has no attributes. The child element of the <WidgetTree> element is the <WidgetInstance> element. The <WidgetInstance> element indicates a widget instance in the widget tree. The <WidgetInstance> element can have the attributes of Table 8:

Table 8
  id — required, ID, default none. ID value of the widget instance.
  class — optional, NMTOKEN, default none. The widget class which the widget instance belongs to. This attribute is only required when the uri attribute is not present. This allows a class to be instantiated based on the class name value of the class attribute.
  uri — optional, anyURI, default none. URI of the XML widget class definition document. This attribute is only required if there is no class attribute.

The <WidgetInstance> element has the following child elements:
  <WidgetInstance> : Defines a child widget instance.
  <override> : Overrides the default value of a variable defined in the widget class definition.

Within each screen there are one or more widgets. The widgets contained in a screen are the objects that provide the behaviour of the GUI while that screen is loaded. A widget may therefore be any control that affords functionality, and can include a text entry area, a drop down menu, a button, a slider or an area for drawing. The widgets are organised in a tree, where each widget can have 0 or more children, and each of the children can have 0 or more children. The preferred embodiment has the widget tree root being a single root widget which contains all other widgets as descendants. Each widget has states, such as Yes-No or On-Off for example, and there are directional transitions between states triggered by events. The states within a widget can be either simple states or nested states. A nested state contains other simple or nested states. A simple state does not contain other states.
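A minimal sketch of building such a widget tree from nested <WidgetInstance> elements follows; the markup is simplified and namespace-free, and the helper names and the (id, children) tuple shape are illustrative assumptions.

```python
import xml.etree.ElementTree as ET

# Illustrative container widget with two atomic child widgets.
SAMPLE_TREE = """
<WidgetTree>
  <WidgetInstance id="panel" class="container">
    <WidgetInstance id="ok" class="button"/>
    <WidgetInstance id="cancel" class="button"/>
  </WidgetInstance>
</WidgetTree>
"""

def instantiate(el):
    """Build an (id, [children]) tuple for a <WidgetInstance>, preserving
    the document order that fixes each widget's position in the tree."""
    return (el.get("id"),
            [instantiate(child) for child in el.findall("WidgetInstance")])

def widget_tree(xml_text):
    root = ET.fromstring(xml_text)
    return [instantiate(el) for el in root.findall("WidgetInstance")]
```

In the full scheme a <WidgetInstance> carrying a uri attribute would trigger loading of another Widget Class Definition Markup document at that point in the recursion.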
The state machine description in the Widget Class Definition Markup document includes the XML elements <statemachine>, <state>, <history> and <transition>. A description of the elements and their attributes follows.

The <statemachine> element is the child element of the <WidgetClass> element and contains the state machine description. It has child elements of <state> and <var>. The <statemachine> element has the attributes indicated in Table 9 below:

Table 9
  initialstate — required, IDREF, default none. ID of the initial state.

The <state> element holds the description of a state. There can be multiple <state> elements within a <statemachine>. The state can also have sub-states. The <state> element has the following child elements:
  <state> : Defines a substate of the parent state.
  <transition> : Defines an outgoing transition from this <state>.
  <onentry> : Holds executable content to be run upon entering this state.
  <onexit> : Holds executable content to be run upon exiting this state.
  <history> : An indication that the current substate of this state is to be remembered when this state is exited by a transition.
  <var> : A variable declaration for use within the state.

The <state> element has the attributes indicated in Table 10:

Table 10
  id — required, ID, default none. ID value of the state.
  initial — optional, IDREF, default none. Identifies the initial sub-state of this state. If a state transition visits this state, and no child of this state has been recorded in the history, then it immediately goes to the initial sub-state specified by the "initial" attribute. In the case that the state contains history, the state transition to this state is interpreted as a transition to the child sub-state of this state that is recorded in the state's history. This attribute is required if and only if the state has sub-states.
A <transition> element defines an outgoing transition to a destination widget state. Transitions between states are triggered by events. The optional attribute "to" specifies the destination of the transition, which is a <state> element. If the "to" attribute is omitted, then taking the transition has the effect of leaving the state machine in the same state after invoking any action specified by the child elements of the <transition>. Note that this is different to a <transition> whose "to" is the same as its source state. In the latter case, the state is exited and re-entered, triggering execution of any corresponding <onentry> and <onexit> executable content.

Any executable content contained in a transition is executed after the <onexit> handlers of the source state and before the <onentry> handlers of the target state. The <transition> element desirably has the attributes shown in Table 11:

Table 11
  event — optional, NMTOKEN, default none. Specifies the event type of the event which triggers the transition.
  observer — optional, IDREF, default none. Specifies the id of the element at which to listen to the event which triggers the transition. The element can be a <WidgetInstance> or <var>.
  target — optional, IDREF, default none. Specifies the event target of the event which triggers the transition. The element can be a <WidgetInstance> or <var>.
  propagate — optional, NMTOKEN, default "continue". Specifies whether, after processing all listeners at the current node, the event is allowed to continue to propagate.
    "continue" : event propagation continues.
    "stop" : event propagation stops.
  to — optional, IDREF, default none. The identifier of the state to transition to.
  cond — optional, NMTOKEN, default none. Guard condition for this transition. If it is present, the transition is taken only if the guard condition evaluates to true.
The guard condition is tested on the value of a <var> element.

The "observer"/"target" attribute can refer to:
- a <WidgetInstance> element (a child widget in the container widget class definition); or
- a <var> element.

If the "observer" or "target" attribute is omitted, it means that the <transition> element is triggered by an event observed at the widget that contains the transition. An example of this could be a transition between a state "NORMAL" and a state "CLICKED" in a button widget that receives a mouse click event. The state transition occurs only when the mouse click event is received by the button widget that contains the state transition.

Immediate children of <transition> elements are executable content that is run just before the transition occurs. Executable content includes the following actions:

<operate> : An action to send a message to a child of the widget whose state machine contains this transition.
<updateView> : An action to send a message to the modality manager for updating the presentation of the GUI to display a change that may have occurred in the state of the widget. This provides the linkage between a widget and the modality manager 1305 in Fig. 13.
<assign> : An action to set a value to a variable.
<postevent> : An action to post an event to another module.
<action> : An application specific action.
<switch> : A container of conditional actions.

There may be other executable actions defined to occur on transitions.

If a <state> element contains a <history> element as a child, then this state retains knowledge of its current sub-state whenever this state is exited. Upon re-entry of the state due to a subsequent state transition, the state resumes operation from the sub-state that was recorded when the state exited. This feature allows for 'pause and resume' control flow.
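One way a runtime might combine the "event" and "cond" attributes of Table 11 when deciding which transition to take can be sketched as below. The selection strategy and the modelling of a guard as a (var id, expected value) pair are assumptions for illustration; the specification above only states that the guard is tested on a <var> value.

```python
# Illustrative sketch only: one way a runtime could pick which <transition>
# fires for an event, honouring the optional "event" and "cond" attributes.
# The attribute names mirror Table 11; the evaluation strategy is assumed.

def select_transition(transitions, event, variables):
    """Return the first transition matching 'event' whose guard holds.

    transitions: list of dicts with optional keys "event", "cond", "to".
    variables:   mapping of <var> ids to current values, used by guards.
    """
    for t in transitions:
        if t.get("event") is not None and t.get("event") != event:
            continue
        cond = t.get("cond")
        # A guard is tested on the value of a <var> element; here a guard
        # is modelled as a (var_id, expected_value) pair for simplicity.
        if cond is not None:
            var_id, expected = cond
            if variables.get(var_id) != expected:
                continue
        return t
    return None  # no transition taken


ts = [
    {"event": "E1", "cond": ("ready", True), "to": "S2"},
    {"event": "E1", "to": "S3"},
]
assert select_transition(ts, "E1", {"ready": False})["to"] == "S3"
assert select_transition(ts, "E1", {"ready": True})["to"] == "S2"
assert select_transition(ts, "E9", {"ready": True}) is None
```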
The <history> element must be placed immediately under the composite <state> element in the Widget Class Markup document. If no state is recorded as history (that is, if the composite <state> that contains the history state has never been entered before), a transition to the history state results in a transition to the initial sub-state of the composite state that is the parent of the history element. The <history> element has no attributes and no children.

The <updateView> element mentioned above provides the linkage between the widget tree and the modality manager 1305. The <updateView> element is one of the actions specified in the executable content that is specified in the XML children of the <transition> element. The <updateView> element describes an action to send a message to the modality manager for updating the presentation. The <updateView> element has the attributes of Table 12:

Table 12

| Name | Required | Type | Default Value | Description |
|---|---|---|---|---|
| message | required | string | none | Name of the message to the modality manager |

Below is a simple example of sending a message to the presentation layer to update itself to reflect the new state of data in a widget:

<updateView message="updatedata"/>

The above <updateView> message is received by the modality manager 1305.

An example of transitions between simple states and nested states, including usage of the history state, can be seen in Fig. 10, which shows an example of a nested widget state machine 1000. In Fig. 10, rectangles shown with rounded corners are states Sn (eg. states 1001, 1002, 1011, 1012, 1021, 1022, 1023, 1031 and 1032, where n represents an identifier for the states, such as S11 for state 1011) and "H" is a history state (eg. 1005). Arrows with attached filled circles in Fig. 10 are used to indicate initial states (1003, 1033, 1024 and 1004), and the other arrows are used to show transitions (eg. 1042, 1043, 1044, 1045, 1046 and 1041).
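Before stepping through Fig. 10, the 'pause and resume' behaviour of a history state can be sketched in isolation. This is hypothetical Python modelling the semantics just described, not part of the described markup runtime: a composite state records its current sub-state on exit and resumes from it on re-entry, falling back to its initial sub-state on first entry.

```python
# Sketch (assumed semantics, mirroring the description above): a composite
# state with a <history> child records its current sub-state on exit and
# resumes from it on re-entry; a first entry falls back to the initial
# sub-state of the history element's parent composite state.

class Composite:
    def __init__(self, initial):
        self.initial = initial   # id of the initial sub-state
        self.history = None      # sub-state recorded when last exited

    def enter(self):
        # Transition to the history state: the recorded sub-state if any,
        # otherwise the composite's initial sub-state.
        return self.history if self.history is not None else self.initial

    def exit(self, current_substate):
        self.history = current_substate  # remember for 'pause and resume'


s1 = Composite(initial="S11")
assert s1.enter() == "S11"   # never entered before: initial sub-state
s1.exit("S12")               # leave while in sub-state S12
assert s1.enter() == "S12"   # re-entry resumes the recorded sub-state
```

Nesting two such composites (one per <history> element) gives the deep resume behaviour traced in the Fig. 10 walkthrough that follows.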
The outermost arrow with an attached filled circle is 1004 and denotes the outermost initial state. Since the initial state 1004 is the outermost initial state, the widget state machine 1000 starts in state S2 1002. In this example, we assume that the state machine 1000 is newly initialised and is in its initial state S2 1002, and receives events in the order E1, E2, E3, E4, E5, E6, E1. State changes would occur as follows.

- The initial state is S2, 1002.
- On receipt of the event E1, the transition 1041 is taken and the state machine 1000 goes to the history state "H" 1005 in S1 1001. Since state S1 1001 has never been entered before, "H" 1005 records no state. As a consequence, the machine 1000 goes to S11 1011, which is the initial sub-state of S1 1001 as denoted by the filled-circle symbol 1003. Since S11 1011 has sub-states, the machine 1000 goes to the corresponding initial sub-state S111 1031, denoted by the filled-circle symbol 1033. S111 1031 thus becomes the next current state.
- On receipt of the event E2, the transition 1042 is taken and the state changes from S111 1031 to S112 1032.
- On receipt of the event E3, the transition 1043 is taken, since the current state S112 1032 is a sub-state of S11 1011. Therefore the state machine 1000 goes to state S12 1012. Since S12 1012 has sub-states, the machine 1000 goes to the corresponding initial sub-state S121 1021, denoted by the filled-circle symbol 1024. S121 1021 thus becomes the next current state.
- On receipt of the event E4, the transition 1044 is taken and the state changes from S121 1021 to S122 1022.
- On receipt of the event E5, the transition 1045 is taken and the state changes from S122 1022 to S123 1023.
- On receipt of the event E6, the transition 1046 is taken, as the current state S123 1023 is a sub-sub-state of S1 1001. Therefore the state machine 1000 goes to S2 1002.
Since S12 1012 is exited by this transition, the source state S123 1023 is recorded by the history state "H" 1050. Similarly, since S1 1001 is exited by this transition, the source state S12 1012 is recorded by the history state "H" 1005 of S1 1001. Upon the next encountering of the event E1, the transition 1041 is taken and the machine 1000 goes to the history state "H" 1005 of S1 1001. Since S12 1012 is recorded by "H" 1005, the transition goes to the state 1012. Similarly, as S123 1023 is recorded by "H" 1050, the machine 1000 goes to S123 1023 and awaits the next event.

In this fashion, the history states "H" provide for a transition to accurately reference a nested UI group component. Following from this, referring back to Fig. 6 for example, a transition may be configured from or to a screen group, or between screen groups.

In a preferred implementation, the widget's nested state machine is represented using the <statemachine> element that forms part of a document constructed to describe a widget and which uses the Widget Class Definition Markup that is described below. For the example shown in Fig. 10, the XML that describes the state machine for the widgets is shown in Code Sample 3 below.

Code Sample 3

<statemachine initialstate="S2">
  <state id="S1" initial="S11">
    <!-- history state -->
    <history id="H"/>
    <transition event="E6" to="S2"/>
    <state id="S11">
      <state id="S111">
        <transition event="E2" to="S112"/>
      </state>
      <state id="S112"> ...... </state>
      <transition event="E3" to="S12"/>
    </state>
    <state id="S12">
      <!-- history state -->
      <history id="H1"/>
      <state id="S121">
        <transition event="E4" to="S122"/>
      </state>
      <state id="S122">
        <transition event="E5" to="S123"/>
      </state>
      <state id="S123"> ...... </state>
    </state>
  </state>
  <state id="S2">
    <transition event="E1" to="H"/>
  </state>
</statemachine>

Turning to Fig.
12, there is a diagram showing a document tree 1200 and how the layers of the markup described above are linked together. The document tree 1200 represents a combination of the screen tree 1102 and the widget tree 1103 described above. The root of the tree 1200 is a screen state machine 1210, and the markup for this is contained in a UI Workflow XML document. The state machine 1210 has three screens 1211, 1212 and 1213, and shows the transitions between each of the screens, namely from 1211 to 1212, 1212 to 1213 and 1213 to 1211. The transitions between screens can be taken upon receipt of events not shown in Fig. 12. Each screen has an associated widget tree along with a widget state machine. Screen 1213 has widget tree 1220, indicated by arrow 1214. Screen 1211 has widget tree 1239, indicated by arrow 1216. Screen 1212 has widget tree 1240, indicated by arrow 1215. The arrows 1214, 1215 and 1216 from the screen state machine 1210 to the corresponding widget state machines 1220, 1230 and 1240 indicate that the widgets become active when the associated screen is activated.

There are two types of widget: an atomic widget, as in the widgets 1220, 1240, 1250 and 1260, and a container widget, as in the widget 1239. The atomic widgets each have an associated state machine. The atomic widget 1220 has two states 1221 and 1222. The transitions between the two states are from state 1221 to 1222 and from state 1222 to 1221. Upon a transition from state 1222 to 1221 there is an associated message 1223 sent to the modality manager 1105. The atomic widget 1240 is associated with screen 1212 and has two states 1241 and 1242. Upon a transition from state 1242 to 1241, a message 1243 is sent to the modality manager 1105.

A container widget consists of a state machine and child widgets. In the present case of the container widget 1239, there is a state machine 1230 and child widgets 1233 and 1234.
The state machine 1230 has two states 1231 and 1232. The transitions between the two states are from 1231 to 1232 and from 1232 to 1231. Upon a state transition from state 1231 to state 1232, the child widget 1233 is activated. Similarly, upon a transition from state 1232 to state 1231, the child widget 1234 is activated. In the case of child widget 1233 there is an associated state machine 1250, as shown by arrow 1235, with two states labelled 1251 and 1252. There are transitions between these two states. In the case of the transition from state 1252 to 1251, there is an associated message 1253 that is sent to the modality manager 1105. In the case of child widget 1234 there is an associated state machine 1260, as shown by arrow 1236. The state machine 1260 has two states 1261 and 1262. There are transitions between these two states. In the case of the state transition from 1262 to 1261, there is an associated message 1263 that is sent to the modality manager 1105.

The modality manager 1105 can receive messages 1223, 1253, 1263 and 1243 from the widget state machines. The modality manager 1105 performs actions on the presentation depending on the message received and the presentation that is authored for the element, which can be read from the presentation data 1104. The changes to the presentation are made by calling functions in, or sending messages to, the synthesiser 1106 via path 1271 or the renderer 1110 via path 1272. Other modalities not shown on the diagram could also be the destination of the modality manager operations.

The presentation data 1104 can be organised in separate files depending on the modality addressed by that data, or in a single file. The presentation for a widget is indexed by the widget's instance identifier, which should be unique for each widget in a screen's tree of widgets.
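The message flow just described can be sketched as follows. All class, method and data names here are invented for illustration; the described system's actual interfaces are not specified at this level of detail. The sketch shows a modality manager receiving a named message, looking up the authored presentation by widget instance identifier, and recording the operation it would forward to the appropriate modality (synthesiser or renderer).

```python
# Hypothetical sketch of the message flow described above: widget state
# machines post named messages; a modality manager looks up the authored
# presentation for the widget instance and forwards an operation to the
# appropriate modality. Names are invented for illustration only.

class ModalityManager:
    def __init__(self, presentation_data):
        # presentation_data: widget instance id -> (modality, detail),
        # standing in for the per-widget presentation data 1104.
        self.presentation_data = presentation_data
        self.log = []  # records operations in place of calling a renderer

    def receive(self, widget_id, message):
        modality, detail = self.presentation_data[widget_id]
        # In the described system this would call the synthesiser or
        # renderer; here the forwarded operation is simply recorded.
        self.log.append((modality, widget_id, message, detail))


mm = ModalityManager({
    "okButton": ("renderer", "button.svg"),
    "prompt": ("synthesiser", "voice-en"),
})
mm.receive("okButton", "updatedata")
assert mm.log == [("renderer", "okButton", "updatedata", "button.svg")]
```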
Industrial Applicability

The arrangements described are applicable to the computer and data processing industries and particularly for the development and implementation of systems utilizing GUIs.

The foregoing describes only some embodiments of the present invention, and modifications and/or changes can be made thereto without departing from the scope and spirit of the invention, the embodiments being illustrative and not restrictive.

(Australia Only) In the context of this specification, the word "comprising" means "including principally but not necessarily solely" or "having" or "including", and not "consisting only of". Variations of the word "comprising", such as "comprise" and "comprises", have correspondingly varied meanings.
Claims (17)
1. A method for authoring an executable user interface workflow component for a user interface, said component defining navigation between a plurality of representable states of a corresponding authored user interface, the method comprising the steps of:
(i) displaying an iconic representation of a plurality of representable states in an authoring environment;
(ii) receiving a user input to group a plurality of the representable states to form at least one state group representation;
(iii) receiving a user input to generate a directional link between two of said representations, wherein at least one of said two representations is one said state group representation and said link symbolises a state transition between said two representations; and
(iv) displaying said directional link between said two representations in said authoring environment to represent a component of the authored user interface workflow, the component being executable to implement the state transition in the authored graphical user interface.
2. A method according to claim 1 wherein the authored user interface comprises a graphical user interface (GUI) and the representable states comprise screens of the GUI.
3. A method according to claim 1 wherein the user input operates to group the displayed iconic representations of the representable states, from which a grouping of the representable states is formed.
4. A method according to claim 2 wherein each screen in the authored user interface refers to a separate plurality of widgets that implement the behaviour within that screen.
5. A method according to claim 4 wherein user input operates to create the widget structure including widget state machines hierarchy, and class references, and add them to a screen.
6. A method according to claim 5 wherein a user operates to allow the addition of presentation to the widget behaviours to form screens and widgets that provide an interactive capability with an application user.
7. A method according to claim 1 wherein said iconic representation is associated with presentation material by receiving a user input.
8. A computer readable storage medium having a computer program recorded thereon, the program being executable by computer apparatus to author an executable user interface workflow component for a user interface, said component defining navigation between a plurality of representable states of a corresponding authored user interface, the program comprising:
code for displaying on a display device an iconic representation of a plurality of representable states in an authoring environment;
code for receiving a user input from a user interface to group a plurality of the representable states to form at least one state group representation;
code for receiving a user input from said user interface to generate a directional link between two of said representations, wherein at least one of said two representations is one said state group representation and said link symbolises a state transition between said two representations; and
code for displaying on the display device said directional link between said two representations in said authoring environment to represent a component of the authored user interface workflow, the component being executable to implement the state transition in the authored graphical user interface.
9. A computer readable storage medium according to claim 8 wherein the authored user interface comprises a graphical user interface (GUI) and the representable states comprise screens of the GUI.
10. A computer readable storage medium according to claim 8 wherein the user input operates to group the displayed iconic representations of the representable states, from which a grouping of the representable states is formed.
11. A computer readable storage medium according to claim 9 wherein each screen in the authored user interface refers to a separate plurality of widgets that implement the behaviour within that screen.
12. A computer readable storage medium according to claim 11 wherein user input operates to create the widget structure including widget state machines hierarchy, and class references, and add them to a screen.
13. A computer readable storage medium according to claim 12 wherein a user operates to allow the addition of presentation to the widget behaviours to form screens and widgets that provide an interactive capability with an application user.
14. A computer readable storage medium according to claim 8 wherein said iconic representation is associated with presentation material by receiving a user input.
15. Computer apparatus adapted to perform the method of any one of claims 1 to 7.
16. A method for authoring an executable user interface workflow component for a user interface, said method being substantially as described herein with reference to any one of the embodiments as that embodiment is illustrated in the drawings.
17. Computer apparatus comprising a user interface formed according to the method of any one of claims 1 to 7 or 16.

Dated this 19th day of December 2008
CANON KABUSHIKI KAISHA
Patent Attorneys for the Applicant
Spruson & Ferguson
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
AU2008261147A AU2008261147A1 (en) | 2008-12-19 | 2008-12-19 | Hierarchical authoring system for creating workflow for a user interface |
Publications (1)
Publication Number | Publication Date |
---|---|
AU2008261147A1 true AU2008261147A1 (en) | 2010-07-08 |
Family
ID=42313398
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
AU2008261147A Abandoned AU2008261147A1 (en) | 2008-12-19 | 2008-12-19 | Hierarchical authoring system for creating workflow for a user interface |
Country Status (1)
Country | Link |
---|---|
AU (1) | AU2008261147A1 (en) |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP3015964A1 (en) * | 2014-10-30 | 2016-05-04 | Amadeus S.A.S. | Controlling a graphical user interface |
WO2016066250A1 (en) * | 2014-10-30 | 2016-05-06 | Amadeus S.A.S. | Controlling a graphical user interface |
US10044785B2 (en) | 2014-10-30 | 2018-08-07 | Amadeus S.A.S. | Controlling a graphical user interface |
US11645047B2 (en) | 2019-09-13 | 2023-05-09 | Axure Software Solutions, Inc. | Focused specification generation for interactive designs |
US20220129118A1 (en) * | 2020-10-28 | 2022-04-28 | Axure Software Solutions, Inc. | Stateful widget container management for interactive designs |
US11762531B2 (en) * | 2020-10-28 | 2023-09-19 | Axure Software Solutions, Inc. | Stateful widget container management for interactive designs |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
MK4 | Application lapsed section 142(2)(d) - no continuation fee paid for the application |