AU2010249319A1 - Conditional optimised paths in animated state machines - Google Patents

Conditional optimised paths in animated state machines

Info

Publication number
AU2010249319A1
Authority
AU
Australia
Prior art keywords
state
sequence
animation
widget
animations
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
AU2010249319A
Inventor
Ping Leung Chan
Andrew R. Coker
Bin LIAO
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Canon Inc
Original Assignee
Canon Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Canon Inc filed Critical Canon Inc
Priority to AU2010249319A priority Critical patent/AU2010249319A1/en
Publication of AU2010249319A1 publication Critical patent/AU2010249319A1/en
Abandoned legal-status Critical Current


Abstract

CONDITIONAL OPTIMISED PATHS IN ANIMATED STATE MACHINES

Disclosed is a method of performing a sequence of animations from a first state to a final state via a plurality of states of a state machine based animated widget. Each of the plurality of states has a defined visual appearance. The method determines a plurality of state transitions from the first state to the final state and evaluates if a corresponding animation associated with each of the transitions is mandatory or non-mandatory based on a first predetermined criteria. The method selects a first sequence of state transitions from the plurality of state transitions having a corresponding first sequence of animations evaluated as non-mandatory, said first sequence of animations having a start and end state. An alternative sequence of animation between the start and end states of the first sequence of animations is determined, wherein the alternative sequence is selected to optimise a second predetermined criteria. The method then selects a second sequence of state transitions from the plurality of state transitions having a corresponding second sequence of animations evaluated as mandatory. At least the alternative sequence of animations and the second sequence of animations are then executed (405, 406) to perform the sequence of animations.

[Abstract figure: flowchart with the steps "determine sequence of state transitions", "evaluate if each animation is mandatory", "identify sequence of non-mandatory transitions", "determine optimised animation sequence", "identify sequence of mandatory transitions" and "execute alternate and second animation".]

Description

S&F Ref: 970375

AUSTRALIA
PATENTS ACT 1990
COMPLETE SPECIFICATION FOR A STANDARD PATENT

Name and Address of Applicant: Canon Kabushiki Kaisha, of 30-2, Shimomaruko 3-chome, Ohta-ku, Tokyo, 146, Japan
Actual Inventor(s): Ping Leung Chan, Andrew R Coker, Bin Liao
Address for Service: Spruson & Ferguson, St Martins Tower Level 35, 31 Market Street, Sydney NSW 2000 (CCN 3710000177)
Invention Title: Conditional optimised paths in animated state machines

The following statement is a full description of this invention, including the best method of performing it known to me/us:

CONDITIONAL OPTIMISED PATHS IN ANIMATED STATE MACHINES

TECHNICAL FIELD

The current invention relates to animated state machines for use within user interfaces, and in particular to the computation of a conditionally optimised route from a current visual appearance to a target visual appearance for an animated state machine.

BACKGROUND

In modern user interfaces that feature animated controls and rich functionality, it is important to provide a user experience in which the various animated elements within the interface are highly responsive to input from a user of the device. When a user interface control, such as an on-screen button, is subject to user interaction, it is important that the button responds immediately and without undue delay to confirm that the user's intention was successfully received. Any effect that causes a lack of responsiveness - such as a delay, or missing confirmation feedback - may lead to confusion in the user, raise the incidence of user error, or detract from the satisfactory experience of using the device.

Using animated state machines for the control and animation of elements within a user interface is a well-established method. The most basic implementation of such a technique simply queues up all input that is received by the user interface element, and plays, in sequence, the animation associated with the receipt of each input in turn. However, this can lead to a lack of responsiveness in the user interface element, as animations from previous inputs may still be playing when new input is received. Hence, the user interface element does not immediately acknowledge, via some sort of feedback mechanism (such as a change in visual appearance), when user input is received, leaving the user unable to immediately determine if his or her input to the user interface element has actually been received.

In a further refinement of this technique, an animated user interface element may intentionally drop some of the queued animations where it is determined that repetitive or unnecessary sequences of animation exist, such as in the case where it can be determined that an alternate, quicker sequence of animation throughout the visual states of the animated user interface element exists that leads to the current logical state. However, if universally applied, this technique may also reduce the responsiveness of the user interface, by dropping animations of the user interface element that provide important visual feedback and confirmation to the user of the device. Again, this situation leads to confusion in the user of the device, leaving the user unable to determine correctly if the intended interaction with the user interface was actually received by the device.
It is therefore an aim of the present invention to reduce the impact of these and other deficiencies within the prior art, by providing a framework in which animations associated with state changes in a user interface element can be more finely controlled.

SUMMARY

According to one aspect of the present disclosure there is provided a method of performing a sequence of animations from a first state to a final state via a plurality of states of a state machine based animated widget, each of the plurality of states having a defined visual appearance, the method comprising the steps of:

determining a plurality of state transitions from the first state to the final state;

evaluating if a corresponding animation associated with each of the transitions is mandatory or non-mandatory based on a first predetermined criteria;

selecting a first sequence of state transitions from the plurality of state transitions having a corresponding first sequence of animations evaluated as non-mandatory, said first sequence of animations having a start and end state;

determining an alternative sequence of animation between the start and end states of the first sequence of animations, wherein the alternative sequence is selected to optimise a second predetermined criteria;

selecting a second sequence of state transitions from the plurality of state transitions having a corresponding second sequence of animations evaluated as mandatory; and

executing at least the alternative sequence of animations and the second sequence of animations to perform the sequence of animations.

Other aspects are also disclosed.

BRIEF DESCRIPTION OF THE DRAWINGS

One or more embodiments of the invention will now be described with reference to the following drawings, in which:

Figs. 1A and 1B collectively show a computing environment in which the arrangements to be described may be implemented;

Fig. 2a is a diagram of a state machine representing the logical states of an on-screen control element of a user interface according to the embodiments of the invention;

Figs. 2b, 2c and 2d are line drawings depicting the example appearance of an on-screen control element of a user interface for the logical states of the state machine of Fig. 2a;

Fig. 3 is a schematic flow diagram illustrating a method of recording a sequence of state transitions undergone by a widget within a user interface;

Figs. 4 and 5 comprise a schematic flow diagram illustrating a method of determining a sequence of animation based on the recorded state transitions undergone by a widget within a user interface;

Fig. 6 is a diagram of an example state machine for an animated widget with an optimised animation path;

Fig. 7 depicts a part of an example state machine showing the contents of queues associated with transitions of the state machine;

Fig. 8 is a schematic flow diagram illustrating a method for determining a sequence of animation in a state machine based animated widget;

Fig. 9a depicts an example state machine for a rotating widget;

Figs. 9b and 9c depict example visual configurations of the rotating widget for two of its states; and

Fig. 10 depicts an example state machine for a button widget in which some transitions are indicated as being optimisable, and yet others are indicated as non-optimisable.

DETAILED DESCRIPTION INCLUDING BEST MODE

Figs. 1A and 1B depict a general-purpose computer system 100, upon which the various arrangements described can be practiced.
As seen in Fig. 1A, the computer system 100 includes: a computer module 101; input devices such as a keyboard 102, a mouse pointer device 103, a scanner 126, a camera 127, and a microphone 180; and output devices including a printer 115, a display device 114 and loudspeakers 117. An external Modulator-Demodulator (Modem) transceiver device 116 may be used by the computer module 101 for communicating to and from a communications network 120 via a connection 121. The communications network 120 may be a wide-area network (WAN), such as the Internet, a cellular telecommunications network, or a private WAN. Where the connection 121 is a telephone line, the modem 116 may be a traditional "dial-up" modem. Alternatively, where the connection 121 is a high capacity (e.g., cable) connection, the modem 116 may be a broadband modem. A wireless modem may also be used for wireless connection to the communications network 120.

The computer module 101 typically includes at least one processor unit 105, and a memory unit 106. For example, the memory unit 106 may have semiconductor random access memory (RAM) and semiconductor read only memory (ROM). The computer module 101 also includes a number of input/output (I/O) interfaces including: an audio-video interface 107 that couples to the video display 114, loudspeakers 117 and microphone 180; an I/O interface 113 that couples to the keyboard 102, mouse 103, scanner 126, camera 127 and optionally a joystick or other human interface device (not illustrated); and an interface 108 for the external modem 116 and printer 115. In some implementations, the modem 116 may be incorporated within the computer module 101, for example within the interface 108. The computer module 101 also has a local network interface 111, which permits coupling of the computer system 100 via a connection 123 to a local-area communications network 122, known as a Local Area Network (LAN). As illustrated in Fig. 1A, the local communications network 122 may also couple to the wide-area network 120 via a connection 124, which would typically include a so-called "firewall" device or device of similar functionality. The local network interface 111 may comprise an Ethernet™ circuit card, a Bluetooth™ wireless arrangement or an IEEE 802.11 wireless arrangement; however, numerous other types of interfaces may be practiced for the interface 111.

The I/O interfaces 108 and 113 may afford either or both of serial and parallel connectivity, the former typically being implemented according to the Universal Serial Bus (USB) standards and having corresponding USB connectors (not illustrated). Storage devices 109 are provided and typically include a hard disk drive (HDD) 110. Other storage devices such as a floppy disk drive and a magnetic tape drive (not illustrated) may also be used. An optical disk drive 112 is typically provided to act as a non-volatile source of data. Portable memory devices, such as optical disks (e.g., CD-ROM, DVD, Blu-ray Disc™), USB-RAM, portable external hard drives, and floppy disks, for example, may be used as appropriate sources of data to the system 100.

The components 105 to 113 of the computer module 101 typically communicate via an interconnected bus 104 and in a manner that results in a conventional mode of operation of the computer system 100 known to those in the relevant art. For example, the processor 105 is coupled to the system bus 104 using a connection 118.
Likewise, the memory 106 and optical disk drive 112 are coupled to the system bus 104 by connections 119. Examples of computers on which the described arrangements can be practised include IBM-PCs and compatibles, Sun Sparcstations, Apple Mac™ or like computer systems.

The method of performing a sequence of animations from a first state to a final state via a plurality of states of a state machine based animated widget may be implemented using the computer system 100, wherein the processes of Figs. 2 to 10, to be described, may be implemented as one or more software application programs 133 executable within the computer system 100. In particular, the steps of the method of performing a sequence of animations from a first state to a final state via a plurality of states of a state machine based animated widget are effected by instructions 131 (see Fig. 1B) in the software 133 that are carried out within the computer system 100. The software instructions 131 may be formed as one or more code modules, each for performing one or more particular tasks. The software may also be divided into two separate parts, in which a first part and the corresponding code modules perform the animation methods, and a second part and the corresponding code modules manage a user interface between the first part and the user.

The software may be stored in a computer readable medium, including the storage devices described below, for example. The software is loaded into the computer system 100 from the computer readable medium, and then executed by the computer system 100. A computer readable medium having such software or computer program recorded on the computer readable medium is a computer program product. The use of the computer program product in the computer system 100 preferably effects an advantageous apparatus for performing a sequence of animations from a first state to a final state via a plurality of states of a state machine based animated widget.

The software 133 is typically stored in the HDD 110 or the memory 106. The software is loaded into the computer system 100 from a computer readable medium, and executed by the computer system 100. Thus, for example, the software 133 may be stored on an optically readable disk storage medium (e.g., CD-ROM) 125 that is read by the optical disk drive 112. A computer readable medium having such software or computer program recorded on it is a computer program product. The use of the computer program product in the computer system 100 preferably effects an apparatus for performing a sequence of animations from a first state to a final state via a plurality of states of a state machine based animated widget.

In some instances, the application programs 133 may be supplied to the user encoded on one or more CD-ROMs 125 and read via the corresponding drive 112, or alternatively may be read by the user from the networks 120 or 122. Still further, the software can also be loaded into the computer system 100 from other computer readable media. Computer readable storage media refers to any non-transitory tangible storage medium that provides recorded instructions and/or data to the computer system 100 for execution and/or processing.
Examples of such storage media include floppy disks, magnetic tape, CD-ROM, DVD, Blu-ray Disc, a hard disk drive, a ROM or integrated circuit, USB memory, a magneto-optical disk, or a computer readable card such as a PCMCIA card and the like, whether or not such devices are internal or external of the computer module 101. Examples of transitory or non-tangible computer readable transmission media that may also participate in the provision of software, application programs, instructions and/or data to the computer module 101 include radio or infra-red transmission channels, as well as a network connection to another computer or networked device, and the Internet or Intranets including e-mail transmissions and information recorded on Websites and the like.

The second part of the application programs 133 and the corresponding code modules mentioned above may be executed to implement one or more graphical user interfaces (GUIs) to be rendered or otherwise represented upon the display 114. Through manipulation of typically the keyboard 102 and the mouse 103, a user of the computer system 100 and the application may manipulate the interface in a functionally adaptable manner to provide controlling commands and/or input to the applications associated with the GUI(s). Other forms of functionally adaptable user interfaces may also be implemented, such as an audio interface utilizing speech prompts output via the loudspeakers 117 and user voice commands input via the microphone 180.

Fig. 1B is a detailed schematic block diagram of the processor 105 and a "memory" 134. The memory 134 represents a logical aggregation of all the memory modules (including the HDD 109 and semiconductor memory 106) that can be accessed by the computer module 101 in Fig. 1A.

When the computer module 101 is initially powered up, a power-on self-test (POST) program 150 executes. The POST program 150 is typically stored in a ROM 149 of the semiconductor memory 106 of Fig. 1A. A hardware device such as the ROM 149 storing software is sometimes referred to as firmware. The POST program 150 examines hardware within the computer module 101 to ensure proper functioning and typically checks the processor 105, the memory 134 (109, 106), and a basic input-output systems software (BIOS) module 151, also typically stored in the ROM 149, for correct operation. Once the POST program 150 has run successfully, the BIOS 151 activates the hard disk drive 110 of Fig. 1A. Activation of the hard disk drive 110 causes a bootstrap loader program 152 that is resident on the hard disk drive 110 to execute via the processor 105. This loads an operating system 153 into the RAM memory 106, upon which the operating system 153 commences operation. The operating system 153 is a system level application, executable by the processor 105, to fulfil various high level functions, including processor management, memory management, device management, storage management, software application interface, and generic user interface.

The operating system 153 manages the memory 134 (109, 106) to ensure that each process or application running on the computer module 101 has sufficient memory in which to execute without colliding with memory allocated to another process. Furthermore, the different types of memory available in the system 100 of Fig. 1A must be used properly so that each process can run effectively.
Accordingly, the aggregated memory 134 is not intended to illustrate how particular segments of memory are allocated (unless otherwise stated), but rather to provide a general view of the memory accessible by the computer system 100 and how such is used.

As shown in Fig. 1B, the processor 105 includes a number of functional modules including a control unit 139, an arithmetic logic unit (ALU) 140, and a local or internal memory 148, sometimes called a cache memory. The cache memory 148 typically includes a number of storage registers 144 - 146 in a register section. One or more internal busses 141 functionally interconnect these functional modules. The processor 105 typically also has one or more interfaces 142 for communicating with external devices via the system bus 104, using a connection 118. The memory 134 is coupled to the bus 104 using a connection 119.

The application program 133 includes a sequence of instructions 131 that may include conditional branch and loop instructions. The program 133 may also include data 132 which is used in execution of the program 133. The instructions 131 and the data 132 are stored in memory locations 128, 129, 130 and 135, 136, 137, respectively. Depending upon the relative size of the instructions 131 and the memory locations 128-130, a particular instruction may be stored in a single memory location as depicted by the instruction shown in the memory location 130. Alternately, an instruction may be segmented into a number of parts, each of which is stored in a separate memory location, as depicted by the instruction segments shown in the memory locations 128 and 129.

In general, the processor 105 is given a set of instructions which are executed therein. The processor 105 waits for a subsequent input, to which the processor 105 reacts by executing another set of instructions. Each input may be provided from one or more of a number of sources, including data generated by one or more of the input devices 102, 103, data received from an external source across one of the networks 120, 122, data retrieved from one of the storage devices 106, 109 or data retrieved from a storage medium 125 inserted into the corresponding reader 112, all depicted in Fig. 1A. The execution of a set of the instructions may in some cases result in output of data. Execution may also involve storing data or variables to the memory 134.

The disclosed animation arrangements use input variables 154, which are stored in the memory 134 in corresponding memory locations 155, 156, 157. The animation arrangements produce output variables 161, which are stored in the memory 134 in corresponding memory locations 162, 163, 164. Intermediate variables 158 may be stored in memory locations 159, 160, 166 and 167.

Referring to the processor 105 of Fig. 1B, the registers 144, 145, 146, the arithmetic logic unit (ALU) 140, and the control unit 139 work together to perform sequences of micro-operations needed to perform "fetch, decode, and execute" cycles for every instruction in the instruction set making up the program 133. Each fetch, decode, and execute cycle comprises:

(a) a fetch operation, which fetches or reads an instruction 131 from a memory location 128, 129, 130;

(b) a decode operation in which the control unit 139 determines which instruction has been fetched; and

(c) an execute operation in which the control unit 139 and/or the ALU 140 execute the instruction.
Thereafter, a further fetch, decode, and execute cycle for the next instruction may be executed. Similarly, a store cycle may be performed by which the control unit 139 stores or writes a value to a memory location 132.

Each step or sub-process in the processes of Figs. 2 to 10 is associated with one or more segments of the program 133 and is performed by the register section 144, 145, 146, the ALU 140, and the control unit 139 in the processor 105 working together to perform the fetch, decode, and execute cycles for every instruction in the instruction set for the noted segments of the program 133.

The method of performing a sequence of animations from a first state to a final state via a plurality of states of a state machine based animated widget may alternatively be implemented in dedicated hardware such as one or more integrated circuits performing the functions or sub-functions of animation. Such dedicated hardware may include graphic processors, digital signal processors, or one or more microprocessors and associated memories.

Description of animated state machines

The present invention relates to the animation of user interface widgets that are defined according to a state machine. Fig. 2a shows a diagram representing an example state machine for the logical states of a widget in a user interface. In this example, logical states are shown for a button widget. The circle 201 represents a logical state in which the example button widget is in a NORMAL state. The circle 202 represents a logical state in which the example button widget is in a FOCUS state, such as when a movable cursor is moved to the displayed location of the button widget. The circle 203 represents a logical state in which the example button widget is in an ACTIVE state, such as when the button is being pressed. In the preferred embodiment of the invention, a widget within a user interface has exactly one state as the current state at all times. Figs. 2b, 2c and 2d show example associated visual appearances 212, 213 and 214 for an example button widget in the NORMAL, FOCUS and ACTIVE states respectively.

A directed arrow 204 indicates a state transition from the NORMAL state 201 to the FOCUS state 202. A state transition denotes that it is possible for the current state of a widget to undergo an instantaneous change between the two states indicated, and in the direction indicated. This transition 204 is labelled with a state change command name, GAIN FOCUS. Similarly, other transitions 208, 209, 210 and 211 within this example button state machine are labelled with the state change command names LOSE FOCUS, DEACTIVATE and ACTIVATE respectively. A transition may be labelled with any number of such state change command names. When a state change command name is associated with a transition, it carries the meaning that when the current state of the widget is the source state of the transition, and the designated state change command is received by the widget, then the current state of the widget is updated to become the target state of that transition. The computer program for controlling the user interface in which the animated widgets are deployed determines when the various state change commands such as GAIN FOCUS are to be applied to the various widgets within the user interface, by monitoring commands input by the user of the interface via input devices such as a keyboard, mouse or touch-sensitive screen.
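By way of illustration only (the specification itself provides no source code), a state machine of the kind shown in Fig. 2a could be encoded as a lookup from a (current state, state change command) pair to a target state, as in the following Python sketch; the names BUTTON_TRANSITIONS and apply_command are invented for this example.

# A minimal sketch of the Fig. 2a button state machine; names are illustrative.
NORMAL, FOCUS, ACTIVE = "NORMAL", "FOCUS", "ACTIVE"

# (current state, state change command) -> target state
BUTTON_TRANSITIONS = {
    (NORMAL, "GAIN_FOCUS"): FOCUS,
    (FOCUS, "LOSE_FOCUS"): NORMAL,
    (FOCUS, "ACTIVATE"): ACTIVE,
    (ACTIVE, "DEACTIVATE"): FOCUS,
}

def apply_command(current_state: str, command: str) -> str:
    """Return the new current state, or the old one if no transition matches."""
    return BUTTON_TRANSITIONS.get((current_state, command), current_state)

# Example: a widget in the NORMAL state receiving GAIN_FOCUS moves to FOCUS.
assert apply_command(NORMAL, "GAIN_FOCUS") == FOCUS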
There is associated with each transition of a widget's state machine an accompanying animation, with a defined duration. When a widget undergoes a state change, the current logical state changes immediately, and subsequent state change commands that are directed to the widget are evaluated according to the updated current state. However, the animation associated with a particular state transition in a widget state machine, such as state transition 204, has a defined time duration which is generally non-zero. In the present embodiment, when a state change occurs, or a sequence of state changes occurs for a widget, the widget's appearance shall be animated by interpolating between defined visual appearances for states of the widget (as exemplified in Figs. 2b, 2c and 2d) corresponding to pairs of states defined for the state machine of the widget. This sequence of animations is performed in series, in a chain of animations each taking a certain duration, until finally the animation sequence is complete when the visual appearance of the widget is equal to the visual appearance associated with the current logical state of the widget.
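The chained, interpolated animation described above can likewise be sketched in Python. The single-valued appearance model (one opacity value per state), the 400 ms duration and the helper names are assumptions made purely for illustration and are not taken from the specification.

# Minimal sketch: each state has a defined visual appearance (here a single
# opacity value); each transition's animation interpolates between the
# appearances of its source and target states over a fixed duration.
APPEARANCE = {"NORMAL": 0.6, "FOCUS": 0.8, "ACTIVE": 1.0}   # illustrative values

def interpolate(start: float, end: float, t: float) -> float:
    """Linear interpolation, with t in [0, 1]."""
    return start + (end - start) * t

def play_chain(states, duration_ms=400, step_ms=100):
    """Play the animations for a chain of states in series."""
    for src, dst in zip(states, states[1:]):
        for elapsed in range(0, duration_ms + 1, step_ms):
            value = interpolate(APPEARANCE[src], APPEARANCE[dst], elapsed / duration_ms)
            print(f"{src}->{dst} at {elapsed}ms: opacity={value:.2f}")

# The chain ends when the appearance equals that of the current logical state.
play_chain(["NORMAL", "FOCUS", "ACTIVE"])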
Overview of record and play modes, and optimised routes

Figs. 3, 4 and 5 show a series of flowcharts for carrying out the preferred embodiment. The processing steps of Fig. 3 are for making a record of the actual state changes performed by a widget. The processing steps of Figs. 4 and 5 are then carried out, and will determine an equivalent series of animation steps throughout the state machine. Here, a series of animation steps is considered to be equivalent to the series of state changes if they have the same start and end states, and the animation follows a sequence of connected steps that follow the transitions provided by the definition of the widget's state machine. An equivalent series of animation steps may indeed follow a different path throughout the states of the state machine, and visit states that were not encountered in the actual series of state changes recorded during the execution of the steps of Fig. 3. Typically, this situation will occur when the chosen animation path is considered to be optimised in some fashion compared to the animation that would occur by simply following precisely the animation steps corresponding to the sequence of state changes. For example, an optimised path of animation may be found that has a smaller total duration, or takes fewer steps between states, than the original route. As will be explained, it is sometimes but not always advantageous to allow animation paths to be optimised in this way. Therefore it is desirable to permit optimisation of animation paths only for certain transitions between certain states of the state machine for an animated widget, or under certain conditions. The embodiment operates by recording not only the series of actual state changes experienced by the widget, but accompanying this with a record of whether the animation step corresponding to each state transition is optimisable or not optimisable. Then, when determining a sequence of animation steps, this information is used to determine where optimisations will occur.

Fig. 6 shows an example state machine for a widget, for which an optimised animation path may be identified. In this example, the widget begins in state 601, and undergoes a first state transition 612 to state 602 with animation duration 400 ms, followed immediately by a second state transition 623 to state 603 with animation duration 400 ms. At the point in time immediately after these state transitions have been received by the widget, the current visual appearance of the widget is that corresponding to the state 601, while the current logical state of the widget is state 603. In this example, an animation path is defined to be optimised if it takes the shortest possible total duration from the current visual appearance to the target visual appearance (that associated with the current logical state). The total duration of the unoptimised animation path is 800 ms, but the duration of transition 613 is only 650 ms. The sequence of animation performed by following only transition 613 has the same start and end states as the original unoptimised animation sequence; therefore, an optimised animation path exists, following transition 613 instead.

In the present embodiment, if an animation step corresponding to a state change of a widget is non-optimisable, it means that if that state change actually occurs, then the corresponding animation step must also occur. Otherwise, it is permitted to substitute that animation step by finding an equivalent optimised animation path. Throughout this description the term "mandatory" is synonymous with "non-optimisable", and the term "non-mandatory" is synonymous with "optimisable" when used to refer to individual animation segments associated with state changes of a state machine based widget.

The following description of the processing steps of Figs. 3, 4 and 5 makes reference to queues associated with the transitions for widgets within the user interface. These queues are a data structure implemented as an array of memory, wherein each cell within the array of memory may store a single recorded data entry. A queue starts off as the empty queue, with no entries. As entries are added to the queue, the length of the queue increases. Entries can be read from the queue, in an operation where the least recently added entry in the queue is read and then removed from the queue. Hence, all queued entries are read in the same order that they are added to the queue. In the following description, entries within these queues can be any of: a) an entry representing a record of following an optimisable state transition; b) an entry representing a record of a non-optimisable state transition; or c) an empty entry that indicates that this transition was not followed at this time (but instead, some other outgoing transition from the same source state was followed).
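The shortest-total-duration criterion used in the Fig. 6 example can be expressed as a small check, as in the sketch below; the graph encoding and the names are illustrative only and do not come from the specification.

# Fig. 6 style graph: (source, target) -> animation duration in milliseconds.
DURATIONS = {("601", "602"): 400, ("602", "603"): 400, ("601", "603"): 650}

def path_duration(path):
    """Total animation duration of a path given as a list of states."""
    return sum(DURATIONS[(a, b)] for a, b in zip(path, path[1:]))

recorded_path = ["601", "602", "603"]          # what the widget actually did
alternative   = ["601", "603"]                 # a candidate optimised path

# An alternative is equivalent if it shares the recorded path's start and end
# states; it is preferred here when its total duration is shorter.
assert alternative[0] == recorded_path[0] and alternative[-1] == recorded_path[-1]
assert path_duration(alternative) < path_duration(recorded_path)   # 650 < 800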
Record mode

Fig. 3 shows a flowchart of operations for carrying out the steps of making a record of a state transition of a widget controlled by an animated state machine. The process begins at step 301, where a state change command is received by a widget. The process then flows to step 302, in which an outgoing transition from the current state of the widget is set as the currently examined transition. For the first time that step 302 is carried out for a particular state change command received at step 301, the first outgoing transition shall be set as the currently examined transition. For subsequent times that step 302 is carried out for a particular state change command received at step 301, the next outgoing transition is examined in sequence. The process then flows to decision step 303, where it is determined if the currently examined transition is labelled with a state change command corresponding to that received at step 301. If that is the case, then the process flows to decision step 304. Otherwise, the process flows to step 307 instead.

At decision step 304, it is determined if the corresponding animation for this transition is permitted to be optimised. In one embodiment of the invention, this determination is made according to a setting assigned to each transition within the state machine of the widget by a designer using an authoring environment. That is, the designer can designate any or all of the transitions within a widget as being non-optimisable. In the case that the determination is made that the animation is not permitted to be optimised, the process will flow to step 305, and to step 306 otherwise.

At step 305, after having determined that the animation corresponding to the current transition is not optimisable, a data value representing a non-optimisable animation is appended onto the queue for the current transition identified at step 302. Similarly, at step 306, after having determined that the animation corresponding to the current transition is optimisable, a data value representing an optimisable animation is added to the queue for the current transition identified at step 302. At step 307, after determining that the current transition identified at step 302 does not correspond to the state change command received at step 301, an empty entry is added to the queue for the current transition.

After any of steps 305, 306 and 307, the process flows to decision step 308. In decision step 308, it is determined if there remain any outgoing transitions that have yet to be examined by step 302. If there are such outgoing transitions remaining, then the process returns to step 302. Otherwise, the process flows to step 309. It can therefore be seen that an iterative loop is performed in which all outgoing transitions of the current state of the widget are examined. For each such outgoing transition, the process will flow through exactly one of steps 305, 306 and 307, in which an entry is added to the queue of the currently examined outgoing transition. Therefore, by carrying out the steps of Fig. 3, the queues of all outgoing transitions from the current state of a widget are increased in size by exactly one entry.

At step 309, the current state of the widget is updated. In this step, the outgoing transition, if any, identified in decision step 303 as having a state change command matching that received in step 301, is followed. The destination state of that transition is then set as the current state of the widget. Accordingly, any new state change command received by the widget is then processed by a new instance of the steps depicted in Fig. 3. It can be seen that by repeated application of the steps of Fig. 3, a series of state change commands is processed, and the current state of the widget can be updated several times in response to those state change commands.
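A minimal Python sketch of this Fig. 3 record-mode bookkeeping follows, assuming each transition carries its own queue (here a deque) and a designer-assigned optimisable flag. The Transition class, the record_command function and the entry values "M", "NM" and "E" are invented for illustration and compress the flowchart's per-step detail.

from collections import deque

class Transition:
    def __init__(self, source, target, command, optimisable=True, duration_ms=0):
        self.source, self.target, self.command = source, target, command
        self.optimisable = optimisable      # designer-assigned flag (decision step 304)
        self.duration_ms = duration_ms
        self.queue = deque()                # recorded entries for this transition

def record_command(widget_state, transitions, command):
    """Fig. 3 sketch: add one entry to the queue of every outgoing transition
    of the current state, then update the current state (step 309)."""
    new_state = widget_state
    for t in (t for t in transitions if t.source == widget_state):
        if t.command == command:                            # decision step 303
            t.queue.append("NM" if t.optimisable else "M")  # steps 306 / 305
            new_state = t.target                            # remembered for step 309
        else:
            t.queue.append("E")                             # step 307: empty entry
    return new_state

# Usage mirroring Fig. 7: following one transition leaves "NM" in its queue and
# "E" in the queues of the other outgoing transitions of the same state.
ts = [Transition("701", "702", "CMD_A"), Transition("701", "703", "CMD_B"),
      Transition("701", "704", "CMD_C")]
state = record_command("701", ts, "CMD_A")
# state == "702"; the queues now hold ["NM"], ["E"] and ["E"] respectively.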
Furthermore, the exact path of state 3193685_1 970375_specilodge -19 changes that was taken throughout the connected graph that represents the states of the widget can be reconstructed by careful examination of the entries in the queues for the transitions within that widget. Fig. 7 depicts a part of an example state machine indicating the contents of 5 queues associated with outgoing transitions from an original current state 701. In this example, the processing steps of Fig. 3 have been performed for the widget undergoing the transition 712 from an original current state 701 to a new current state 702. There also exist other transitions 713 leading to state 703, and 714 leading to state 704. Following the application of the processing steps of Fig. 3, the queues 720, 730 and 740 10 associated with the outgoing transitions of state 701 each have an entry added. In this example, all queues are initially blank, so there is now only one entry in these queues. In the case where the sequence of transitions revisits the same state multiple times, there will typically be more than one entry in each queue associated with a transition of the state machine. In this example, as it was actually transition 712 that was followed, the 15 corresponding queue 720 has an entry "NM" representing the fact that a state transition with non-mandatory animation was recorded. (If instead this transition had been evaluated as having mandatory animation, then a different entry "M" would have been added to the queue 720.) Other queues 730 and 740 have an entry "E" indicating an empty entry, signifying that this particular transition was not followed at this stage. 20 Playback mode Fig. 4 is a flowchart illustrating the steps for determining a sequence of animation throughout the states of an animated state machine. The processing steps of Fig. 4 refer to animation entries recorded in queues according to the process described in Fig. 3. 3193685_1 970375_specilodge -20 The process of Fig. 4 begins at step 401, where the widget state corresponding to the current appearance of the widget is set as the state to be examined for the remaining processing steps of Fig. 4. Whenever there is pending animation to perform, the widget state corresponding to the current appearance of the widget will be some state other than 5 the current logical state of the widget. In that case, the processing steps of Fig. 4 perform the task of determining a sequence of animation throughout the animated state machine to bring the current visual appearance of the state machine up to date with the current logical state. The process then flows to step 402, in which an animation target state, and a routing mode is determined. The animation target state is an intermediate 10 destination representing some subset of the remaining animation required for bringing the current appearance of the widget up to date with the current state. The routing mode is a binary choice between "optimisable" mode, and "non-optimisable" mode. Processing steps for determining the animation target, and routing mode are shown in Fig. 5. As a result of performing the processing steps of Fig. 5, a target animation state 15 and mode will be set only if further animation is required to make the current appearance of the widget up to date with the current logical state of the widget. After determining an animation target state and animation mode at step 402, the process flows to decision step 403, in which it is determined if an animation target state was set. 
If no such state was identified, the process ends, as this means that the current widget appearance is up to date with the current logical state of the widget. If a target state has been set, the process flows to decision step 404 in which it is determined if the routing mode is set as "optimisable". If it is not, then the process flows to step 406, in which the widget is animated to the target state via a route specified exactly by the sequence of state changes that were recorded by the widget. Otherwise, the process flows to step 405, in which the widget is animated to the target via a route that may be optimised - that is, be directed through a sequence of intermediate states that differs from the exact sequence of state changes that were recorded by the widget. The sequence of optimised animation must begin at the state corresponding to the current widget appearance, and must end at the animation target state.

Following either of steps 405 or 406, the process returns to step 401. At this stage, the animation performed by either of steps 405 or 406 will typically result in the current widget appearance having progressed to a new state of the widget. It can be seen that the processing steps of Fig. 4 form an iterative loop, in which the widget is animated to a current logical state, via a series of intermediate states identified at successive applications of step 402. This iterative loop terminates when the widget's visual appearance corresponds to the current logical state, as determined by decision step 403. When the logical state of the widget once again changes, as may occur when the widget receives a state change command according to the processing steps of Fig. 3, the processing steps of Fig. 4 will once again be executed in order to animate the widget to the visual appearance for the new current logical state.
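The outer loop of Fig. 4 might be condensed as in the following sketch. The widget attribute names, the callable parameters and the determine_target helper (a sketch of which is given later for Fig. 5) are assumptions made for illustration.

def playback(widget, animate_exact, animate_optimised):
    """Fig. 4 sketch: animate until the appearance matches the logical state."""
    while True:
        examined = widget.appearance_state                              # step 401
        target, mode = determine_target(widget.transitions, examined)   # step 402 (Fig. 5)
        if target is None:                                              # step 403: up to date
            break
        if mode == "optimisable":                                       # step 404
            animate_optimised(widget, examined, target)                 # step 405: any equivalent route
        else:
            animate_exact(widget, examined, target)                     # step 406: the recorded route
        widget.appearance_state = target                                # appearance has advanced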
Fig. 5 shows a flowchart of the processing steps for determining an animation target state and routing mode, as referenced in step 402 of Fig. 4. The process begins at step 501, in which an outgoing transition from the examined state of the widget is set as the active transition. When the processing steps of Fig. 5 are first entered (as from processing step 402), the examined state of the widget will be that of the current widget appearance. However, after subsequent operations of the processing steps of Fig. 5, processing step 501 shall be performed for other widget states as set by processing step 505.

For the first time that processing step 501 is performed for a particular examined state of the widget, the first outgoing transition of that state is made the active transition. For subsequent times that processing step 501 is performed for a particular examined state of the widget, the remaining outgoing transitions are made the active transition in sequence. After step 501, the process flows to decision step 502 in which it is determined if the entry at the head of the queue for the active outgoing transition is an empty entry. If it is an empty entry, then the process flows to decision step 506 in which it is determined if there are any outgoing transitions yet to be made the active transition at step 501. If there are, then the process returns to step 501 to make active the next outgoing transition. It can be seen that steps 501, 502 and 506 form an iterative loop that processes all of the outgoing transitions of the examined state of the widget, proceeding until either a non-empty entry is found at the head of a transition's queue, or until the loop has examined the queues of all the outgoing transitions without such a non-empty entry being found.

At the point where a non-empty entry is found at the head of a queue for an outgoing transition, the process flows to step 503 where it is determined if that entry is for an optimisable animation. If it is, then the process flows to step 504 where all of the queues for outgoing transitions of the widget's current state have the entry at the head of the queue discarded. This operation is performed even for outgoing transitions that have not been considered at step 501 - hence, the queues for the outgoing transitions are kept synchronised. Additionally, at step 504, the routing mode is set to optimised, as at this stage it is now known that the eventual animation target state is for a sequence of optimisable animation. After this operation, the process flows to processing step 505 in which the identified outgoing transition is followed, and the destination state of that transition is set as the currently examined widget state for the processing steps of Fig. 5. The process then returns to step 501, in which the previously described operations are performed for the new widget state.

It can be noted that steps 501, 502, 503, 504 and 505 form an iterative loop that is followed for as long as the processing steps follow a series of optimisable animation entries in an uninterrupted sequence. This loop is executed iteratively until a non-optimisable animation entry is found at step 503. Alternatively, the loop may terminate if the process arrives at a state in which all of the outgoing transitions have an empty entry at the head of the queue - a situation in which no outgoing transitions will remain at decision step 506, with the processing flowing to decision step 507.

If, at decision step 503, the queue entry represents a non-optimisable animation, then the process flows to decision step 509. At decision step 509, it is determined if the routing mode is optimised - that is, if the current execution of the processing steps of Fig. 5 has already performed step 504. If the routing mode is optimised, then the process flows to processing step 508. This situation occurs when the iterative loop formed by steps 501, 502, 503, 504 and 505 has followed a contiguous sequence of optimisable animations, and finally at step 503 it is determined that this contiguous sequence has been interrupted by a non-optimisable animation. At processing step 508, the animation target state is set to the currently examined state, after which the process of Fig. 5 ends.

If, at decision step 509, the routing mode has not been set as optimised, then the process flows to processing step 510. This situation occurs when the very first non-empty queue entry for an outgoing transition of the examined widget state is a non-optimisable animation entry - that is, no contiguous sequence of optimisable animations has been found for the current execution of the processing steps of Fig. 5. At processing step 510, all of the queues for outgoing transitions of the widget's current state are popped, that is, the entry at the head of the queue is discarded.
The process then flows to processing step 511, in which the animation target state is set as the destination state of the currently examined transition. Additionally, the routing mode is set as non-optimisable, and the process of Fig. 5 ends.

If, at decision step 506, there remain no outgoing transitions, then the process flows to decision step 507. At decision step 507, it is determined if the routing mode is optimisable. In the situation that the routing mode is optimisable, the process flows to step 508 where the animation target state is set equal to the examined state. This situation occurs when the processing steps of Fig. 5 are following a contiguous sequence of optimisable animation that is terminated due to running out of recorded animations. Alternately, if at decision step 507 the routing mode has not been set as optimisable, the process ends immediately without setting any animation target state at all. This situation occurs for any attempt to carry out the steps of Fig. 5 when the current widget appearance is up to date with the current logical state of the widget. As no animation target state will be set, processing step 403 will find no animation target state and end.

It can be seen, then, that the task of determining an animation target state and routing mode is performed by following a contiguous sequence of optimisable animation entries, or following a single non-optimisable animation entry, and setting the animation target state as the state that is found at the end of that sequence of animation or single animation step. By repeated application of this process, a sequence of animation is determined from a current widget appearance, following possible animation paths that correspond to transitions between states of the widget, until the widget appearance is brought up to date with the logical state of the widget.

In the preferred embodiment of the invention, the processes for recording a sequence of state changes as shown in Fig. 3, and for determining a sequence of animation as shown in Figs. 4 and 5, are carried out simultaneously and continuously. This means that even when performing a sequence of animation as determined by the steps of Figs. 4 and 5, if a state change command is received by the widget, then the state change will be recorded according to the steps of Fig. 3, adding animation entries to the various queues for the transitions of the widget, and resulting in a change to the widget's current state. The determination of a sequence of animation according to Figs. 4 and 5 will then be carried out using the updated animation queue entries and current state.
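A condensed sketch of the Fig. 5 procedure is given below, reusing the illustrative Transition objects and "M"/"NM"/"E" entries from the earlier record-mode sketch. It compresses the per-transition iteration of steps 501, 502 and 506 into a single scan and is not a definitive implementation of the flowchart.

def determine_target(transitions, examined_state):
    """Fig. 5 sketch: return (animation_target_state, routing_mode)."""
    mode = None                                            # routing mode not yet set
    while True:
        outgoing = [t for t in transitions if t.source == examined_state]
        entry, followed = None, None
        for t in outgoing:                                 # steps 501 / 502 / 506
            if t.queue and t.queue[0] != "E":
                entry, followed = t.queue[0], t
                break
        if entry is None:                                  # all heads empty: step 507
            return (examined_state, "optimisable") if mode == "optimisable" else (None, None)
        if entry == "NM":                                  # optimisable entry: step 503
            for t in outgoing:                             # step 504: pop all heads
                if t.queue:
                    t.queue.popleft()
            mode = "optimisable"
            examined_state = followed.target               # step 505: follow the transition
            continue
        # A non-optimisable ("M") entry was found at step 503.
        if mode == "optimisable":                          # step 509 -> step 508
            return examined_state, "optimisable"
        for t in outgoing:                                 # step 510: pop all heads
            if t.queue:
                t.queue.popleft()
        return followed.target, "non-optimisable"          # step 511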
For the former case, a corresponding sequence of optimised animation is determined that has the same start and end states as for the contiguous sequence of recorded optimisable state transitions. For the latter 3193685_1 970375_speci_lodge -26 case, a single animation is determined according to the non-optimisable recorded transition. Conditional statements for optimisability In Fig. 3, decision step 304 determines if an animation shall be recorded as 5 optimisable or non-optimisable according to a conditional expression. As discussed above, this evaluation may simply refer to a Boolean data value associated with the particular transition of the state machine, representing a simple flag that stores whether or not the animation associated with the transition is permitted to be optimised. However, more complex conditional expressions are possible. In one embodiment of 10 the invention, a transition within the state machine for a widget may be designated as non-optimisable in the situation in which less than a certain number of animations are already recorded for that transition, and optimisable otherwise. For example, for a certain transition in a widget, it is defined there shall be at most one pending non optimisable animation entry at any particular point in time. Therefore at decision step 15 304, the process refers to the animation entries that already exist in the queue for that transition. If there is already one or more non-optimisable animation entries recorded in that queue, then this animation will be recorded using an optimisable animation entry. But if no non-optimisable animation entries are found in the queue, the animation shall be recorded as non-optimisable. Other examples can be constructed in which the limit 20 of non-optimisable entries is some other value. It is clear to one skilled in the art that the construction of other conditional expressions for the evaluation of decision step 304 is not limited by the given examples, but may refer to any other information, status, state or data value associated with the widget or otherwise. 3193685_1 970375_specilodge - 27 It can be noted that the steps 504 and 510 of Fig. 5, in which entries at the head of outgoing queues are removed, performs the exact inverse operation to that performed in steps 305 and 306 of Fig. 3. This operation is "inverse" in the sense that every queue entry added to a queue in steps 305 and 306 at the time in which the sequence of 5 animations is recorded by the steps of Fig. 3 shall eventually, at the time in which the sequence of animations is played back (or optimised away), be removed from the queues. Thus, when the steps of Fig. 4 end, there having been no target set in processing step 403, all queues associated with transitions of the state machine of the widget are empty. In this manner, the queues do not accumulate entries associated with 10 state transitions that have already been handled. The method is then suitable for an implementation in which a constant memory size is assigned for storing each queue of transition entries. When in use for each of the multiple of widgets typically employed within a user interface, the processing steps and storage queues are performed separately for each 15 widget within the user interface. Fig. 
Fig. 8 shows a schematic flow diagram indicating a process for performing a sequence of animations from a first state to a final state via a plurality of states of a state machine based animated widget, in which each of the plurality of states has a defined visual appearance. The process begins at step 801, in which a sequence of state transitions undergone by the state machine based animated widget is determined. This occurs as previously described in accordance with the processing steps of Fig. 3. The process then flows to step 802, where it is evaluated if each corresponding animation for the transitions within the sequence of step 801 is mandatory, again according to the steps of Fig. 3 in processing step 304. This evaluation is performed according to a first predetermined criteria, such as the value of a stored flag associated with the state transition. The process then flows to step 803, in which a sequence of transitions, selected from amongst the sequence of transitions of step 801, is identified such that the corresponding animation for each such transition was previously evaluated as non-mandatory in step 802. This step is performed according to the processing steps of Figs. 4 and 5. The process then flows to step 804, in which an alternate sequence of animation is determined that is optimised, but has equivalent start and end states to the sequence identified in step 803. The optimisation is performed with regard to a second predetermined criteria, such as the previously described example criterion of shortest possible total duration for the animation sequence.

The process then flows to step 805 in which a sequence of transitions is selected from amongst the sequence of transitions of step 801. The sequence of step 805 is identified such that the corresponding animation for each such transition was previously evaluated as mandatory in step 802, again according to the processing steps of Figs. 4 and 5. Finally, the process flows to step 806 in which the identified animation sequences (the alternate sequence from step 804, and the sequence from step 805) are executed. Typically, when determining a sequence of animation in response to a sequence of state changes undergone by a state machine based animated widget, the executed animation sequences of Fig. 8 are a subset of many such identified sequences of mandatory and non-mandatory animation.

Advantages

The disclosed embodiments possess several advantages over the prior art, as shall be described with reference to Figs. 9 and 10. Fig. 9a depicts a state machine for a widget within a user interface that operates a rotating menu of buttons. Each of state A 901, state B 902, state C 903 and state D 904 is associated with a distinct layout of a set of controls 906, 907, 908 and 909 that form a rotating menu. Fig. 9b shows one configuration of menu items 906, 907, 908 and 909 for state A 901 of Fig. 9a. Fig. 9c shows another configuration of these menu items for state B 902 of Fig. 9a. It can be seen that these configurations differ in that the positions of the menu items have been rotated by ninety degrees around the central point of the menu. When any of the state-to-state transitions such as 905 are carried out, the configuration of the menu items is linearly interpolated, giving the appearance of rotating motion for the set of menu items.
State change transitions 905 are triggered by a command that has the meaning "rotate clockwise" or "rotate anti-clockwise". Thus, when such a state change command is received by the widget, a state change occurs from a current state to a new state, according to the steps of Fig. 3. Correspondingly, an animated change of appearance occurs for the widget, in a rotating manner as shown in Figs. 9b and 9c.

For the case in which all of the state transitions of Fig. 9a are defined to be for optimisable animation, a rapid sequence of three "rotate clockwise" commands sent to the widget results in, for example, the widget quickly transitioning from state A 901 to state D 904. According to the steps of Figs. 4 and 5, a sequence of optimisable animation is then identified, in which a single animation step from state A 901 directly to state B 902 is performed. However, this animation has the visual appearance of the menu items rotating in an anti-clockwise direction, and this may not be the animation result intended by the creator of this widget - it is reasonable to expect that the animation will instead follow a clockwise direction of rotation that matches the state change commands that were given to the menu widget.

The disclosed arrangements allow the creator of the user interface to instead declare that the state transitions 905 are for non-optimisable animation. In this case, a rapid sequence of three "rotate clockwise" commands sent to the widget will result in non-optimisable animation entries being recorded in the queues for the menu widget state machine according to the steps of Fig. 3. Then, according to Figs. 4 and 5, a matching sequence of animation will result, in which optimisations are not permitted to occur. Hence, the sequence of animation will always match the intended direction of rotation, rather than taking an unexpected optimised animation path. The widget controlled by the state machine of Fig. 9a may additionally have states other than those for controlling rotation, and transitions to and from such states may be designated as optimisable or non-optimisable as required. For example, there may be one or more states integrated into the state machine that represent an inactive state for the rotating widget.

Fig. 10 depicts a state machine for a button-style widget as may be used in some interactive graphical user interface. This state machine has a NORMAL state 1001, an OVER state 1002, and a DOWN state 1003. These states are linked together with transitions for optimisable animation 1012, 1023, 1021, and a transition for non-optimisable animation 1032. There are state change commands associated with these transitions, such that the widget responds to a mouse enter event that takes it from the NORMAL state 1001 to the OVER state 1002; a mouse button down event that takes it from the OVER state 1002 to the DOWN state 1003; a mouse button release event that takes it from the DOWN state 1003 to the OVER state 1002; and a mouse exit event that takes it from the OVER state 1002 to the NORMAL state 1001. In Fig. 10, the creator of this button widget has designated the transition 1032 from the DOWN to OVER states as being non-optimisable - that is, if the state change from DOWN to OVER occurs, then the corresponding animation must occur without being optimised out. In Fig. 10, a bold line represents a non-optimisable transition between states, and a fine line represents an optimisable transition.
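The designations of Fig. 10 might be declared as in the following sketch, which reuses the illustrative Transition class from the earlier fragment; the StateMachine container, its method names and the event identifiers are likewise assumptions made only for this example.

    class StateMachine:
        """Minimal container mapping (state, event) pairs to transitions (illustrative)."""

        def __init__(self, initial):
            self.state = initial
            self.transitions = {}

        def add(self, src, event, dst, optimisable=True, limit=1):
            # Transition is the illustrative class defined in the earlier sketch.
            self.transitions[(src, event)] = (
                dst, Transition(optimisable=optimisable, non_optimisable_limit=limit))

        def handle(self, event):
            # Record the state change and its animation entry (cf. steps 305, 306).
            dst, transition = self.transitions[(self.state, event)]
            transition.record_animation((self.state, dst))
            self.state = dst

    button = StateMachine(initial="NORMAL")
    button.add("NORMAL", "mouse_enter",   "OVER")                     # transition 1012
    button.add("OVER",   "mouse_exit",    "NORMAL")                   # transition 1021
    button.add("OVER",   "mouse_down",    "DOWN")                     # transition 1023
    button.add("DOWN",   "mouse_release", "OVER", optimisable=False)  # transition 1032

With such a declaration, a rapid burst of mouse_enter and mouse_exit events accumulates only optimisable entries, which the playback steps are free to collapse, whereas a pending DOWN-to-OVER entry is recorded as non-optimisable (up to the limit discussed below) and is therefore always played as confirmation feedback.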
For the button governed by the state machine of Fig. 10, if a rapid sequence of alternate mouse enter and mouse exit commands is received, then the current state will rapidly oscillate between the NORMAL state 1001 and the OVER state 1002. The queued sequence of recorded state transitions can accumulate at a faster rate than the animations can be played. However, as the transitions 1012, 1021 connecting these states are designated as being optimisable, unnecessary repeated animation sequences will be dropped. This is advantageous as it is not desired to show a long sequence of oscillating animation between the NORMAL 1001 and OVER 1002 states; such an oscillating sequence of animation may take a long time to be displayed. In this situation, it is acceptable for the creator of the button widget to designate these superfluous animations to be discarded, as they are not important for the usability or utility of the widget.

Similarly, if the button governed by the state machine of Fig. 10 receives a rapid sequence of mouse button down and mouse button release events starting at the OVER state 1002, then the current state will rapidly oscillate between the OVER state 1002 and the DOWN state 1003. In this example, the creator of the button widget observes that the animation from the DOWN state 1003 to the OVER state 1002 occurs when a user interacting with the button has actually pressed the button, and expects some interaction with the user interface to occur. Therefore, it is important for the corresponding animation sequence to be displayed, as this animation acts as visual feedback to confirm that the desired interaction has occurred. In order to accomplish this, the creator of the button widget has designated the transition 1032 from the DOWN state 1003 to the OVER state 1002 as non-optimisable, so that it is not possible for such animation to be dropped by the optimisation process. Furthermore, the non-optimisable transition 1032 has been designated with an optimisation limit 1006 with the value 1. This means that the conditional expression of decision step 304 is evaluated in reference to the animation queue of transition 1032: if pending animation for this transition has not yet been queued (that is, the optimisation limit of 1 has not yet been reached), then the animation is considered to be non-optimisable, and if pending animation for this transition is already recorded in the queue, then the animation is considered to be optimisable. In order to determine if such an optimisation limit has been reached, the queue for the transition 1032 is examined, and the number of entries representing recorded optimisable or non-optimisable transitions is counted. This arrangement has the advantage that the important user-confirmation animation associated with the state transition 1032 will usually not be dropped by the optimisation process, but if a rapid oscillating sequence of state changes occurs, then it will not be necessary to queue up a long sequence of animation that may otherwise cause the button to play a repeated sequence of animation.

It is therefore an advantage of the present invention that an additional degree of control is imparted to the creator of a widget for a user interface, giving the ability to selectively define which transitions between states of a widget are permitted to be optimised and which ones are not.

INDUSTRIAL APPLICABILITY

The arrangements described are applicable to the computer and data processing industries and particularly for animation.
The foregoing describes only some embodiments of the present invention, and modifications and/or changes can be made thereto without departing from the scope and spirit of the invention, the embodiments being illustrative and not restrictive.

(Australia Only) In the context of this specification, the word "comprising" means "including principally but not necessarily solely" or "having" or "including", and not "consisting only of". Variations of the word "comprising", such as "comprise" and "comprises", have correspondingly varied meanings.

Claims (4)

1. A method of performing a sequence of animations from a first state to a final state via a plurality of states of a state machine based animated widget, each of the plurality of states having a defined visual appearance, the method comprising the steps of:
determining a plurality of state transitions from the first state to the final state;
evaluating if a corresponding animation associated with each of the transitions is mandatory or non-mandatory based on a first predetermined criteria;
selecting a first sequence of state transitions from the plurality of state transitions having a corresponding first sequence of animations evaluated as non-mandatory, said first sequence of animations having a start and end state;
determining an alternative sequence of animation between the start and end states of the first sequence of animations wherein the alternative sequence is selected to optimise a second predetermined criteria;
selecting a second sequence of state transitions from the plurality of state transitions having a corresponding second sequence of animations evaluated as mandatory; and
executing at least the alternative sequence of animations and the second sequence of animations to perform the sequence of animations.
2. A method of performing a sequence of animations from a first state to a final state via a plurality of states of a state machine based animated widget, said method being substantially as described herein with reference to the drawings.
3. Computerized apparatus configured to perform the method of claim 1 or 2.
4. A computer readable storage medium having a program recorded thereon, the program being executable by computerized apparatus to perform the method of claim 1 or 2.

Dated this thirteenth day of December 2010
CANON KABUSHIKI KAISHA
Patent Attorneys for the Applicant
Spruson & Ferguson