WO1998045767A2 - Method of sequencing computer controlled tasks based on the relative spatial location of task objects in a directional field - Google Patents

Method of sequencing computer controlled tasks based on the relative spatial location of task objects in a directional field Download PDF

Info

Publication number
WO1998045767A2
Authority
WO
WIPO (PCT)
Prior art keywords
objects
task
master
sequence
sequencing
Prior art date
Application number
PCT/US1998/006086
Other languages
English (en)
Other versions
WO1998045767A3 (fr)
Inventor
Fred Steven Isom
Original Assignee
Fred Steven Isom
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US08/905,701 external-priority patent/US6948173B1/en
Application filed by Fred Steven Isom filed Critical Fred Steven Isom
Priority to EP98913211A priority Critical patent/EP1031079A4/fr
Publication of WO1998045767A2 publication Critical patent/WO1998045767A2/fr
Publication of WO1998045767A3 publication Critical patent/WO1998045767A3/fr

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/46Multiprogramming arrangements
    • G06F9/48Program initiating; Program switching, e.g. by interrupt
    • G06F9/4806Task transfer initiation or dispatching
    • G06F9/4843Task transfer initiation or dispatching by program, e.g. task dispatcher, supervisor, operating system
    • G06F9/4881Scheduling strategies for dispatcher, e.g. round robin, multi-level priority queues
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F8/00Arrangements for software engineering
    • G06F8/30Creation or generation of source code
    • G06F8/34Graphical or visual programming

Definitions

  • the present invention relates generally to graphical user interfaces for operating a programmed computer and more particularly, to a graphical user interface for sequencing tasks to be performed by the computer.
  • the traditional method for sequencing a series of tasks is to use a program, script, or graphical user interface (GUI).
  • the program or script consists of a series of instructions which tells the computer to perform certain tasks.
  • most programming or scripting languages require an understanding of programming methods and of the syntax of the programming language used.
  • Most computer users do not have the time or skill needed to build constructs to make these tools useful. Therefore this approach is not viable for such users.
  • Another problem with this method for sequencing tasks is that one must rewrite the code if it is desired to modify the sequence of tasks.
  • the tasks to be performed may be represented as objects. After the default behavior of the objects is defined, the user must manually connect the objects to define the sequence of the tasks. The process of sequencing the tasks is typically done by the user manually connecting the objects together in the desired sequence. A problem with this method is that it is time consuming and error prone.
  • many GUI based programs allow the objects to be sequenced by manually selecting a first object referred to as the source object, manually selecting a second object referred to as the target object, and creating a link between the source and target objects. This link typically appears as a line extending between the source object and the target object. This line may also include an arrow to indicate the direction of sequencing. This process is repeated over and over to create a continuous chain, where the previous target object becomes the source object with the new object chosen as the target. Once constructed, the entire sequence or chain may be triggered and the underlying tasks executed in the order specified by the user.
  • the present invention provides a graphical method for sequencing computer controlled tasks.
  • the tasks which are controlled by the computer are represented as objects in a graphical user interface.
  • the task objects are placed by the user in a directional field in the user interface.
  • the directional field includes a directional attribute which is represented in the user interface by a directional indicator.
  • the directional attribute specifies how the order of the tasks within the field is determined.
  • the tasks are automatically sequenced by the computer when the sequence is executed based on the relative location of the objects in the directional field with respect to one another and the directional attribute of the directional field.
  • the user does not need to explicitly link one object to another. Instead, links are created dynamically when the sequence is executed.
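Although the patent discloses no source code, the sequencing idea above can be sketched in a few lines. The following Python fragment is a hypothetical illustration only: the object names, coordinates, and the projection-based ordering rule are assumptions, not the patented implementation.

```python
# Hypothetical sketch of automatic sequencing: order task objects by
# projecting each object's location point onto a direction vector.
# All names and coordinates are invented for illustration.

def sequence(objects, direction):
    """Sort objects along `direction` without any explicit links."""
    dx, dy = direction
    return sorted(objects, key=lambda o: o["x"] * dx + o["y"] * dy)

# UPPER LEFT TO LOWER RIGHT roughly corresponds to direction (1, 1)
# in screen coordinates (x grows rightward, y grows downward).
tasks = [
    {"name": "Object3", "x": 50, "y": 60},
    {"name": "Object1", "x": 10, "y": 10},
    {"name": "Object2", "x": 30, "y": 20},
]
order = [o["name"] for o in sequence(tasks, (1, 1))]
print(order)  # ['Object1', 'Object2', 'Object3']
```

Moving an object, or negating the direction vector, changes the order the next time `sequence` is called, with no re-linking step.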
  • the user can modify the sequence of tasks in one of two ways.
  • the sequence of tasks can be changed by moving objects in the directional field so that the relative location of the objects is changed.
  • the directional attribute of the directional field can be changed to change the sequence.
  • the links between objects will be dynamically re-created the next time the sequence is executed.
  • links between objects are drawn on the screen as the sequence is executed.
  • the links appear in the interface as a line connecting two objects together.
  • the links form geometric patterns which provide information regarding the nature of the underlying sequence, irrespective of the particular application any individual, organization or group chooses to use. For example, the pattern reflects both the ordering of tasks comprising the sequence and the spatial preferences of the individual user who created the sequence.
  • Man's innate pattern recognition ability provides a basis for a uniform methodology for creating systems to order a sequence of operations.
  • the user will intuitively know how that object will be incorporated into the chain without the need to manually draw links connecting the new object to existing objects.
  • the advantages of such a system are many. Not only does such a system provide consistent reliable information in the form of the geometric patterns which naturally result from the sequencing, but it allows meaning to be passed among individuals without the need for common language or shared cultural background.
  • the directional field indicator permits the knowledgeable user to intuitively estimate the proper placement of objects within the directional field to achieve the desired result.
  • the user does not need to explicitly create a link between objects. Instead, the links will be created dynamically when the sequence is executed. Eliminating the need to explicitly create links between objects makes sequence creation more efficient. Just as important, eliminating the need to explicitly create links allows the user to focus attention on the overall sequence.
  • a primary advantage of the present invention is the improvement in efficiency in creating complex sequences of tasks. There is no longer any need to manually create or modify sequences as was the case with prior art programs. A user of any skill level, whether novice or experienced, can quickly and easily modify sequences by rearranging the objects in the directional field. Since the user does not need to explicitly define links between objects, fewer actions are required.
  • Another advantage of the present invention lies in the loose coupling of the spatial arrangements of objects in the user interface and the resulting geometric patterns which are formed by lines connecting the objects together.
  • These geometric patterns provide the user with information regarding the nature of the underlying sequence, irrespective of the particular application in use.
  • the user can readily interpret the meaning of the pattern without the necessity for prerequisite knowledge of the context in which the application was originally constructed.
  • the user can easily transfer patterns from one environment to the next without spending time and energy gaining expertise in the particular program in which its original pattern was constructed.
  • a knowledgeable user, moving from one program to another, can become productive, faster and without the need for retraining to learn the nuances of the new program.
  • Another advantage is that the present invention allows expertise to be transferred via task sequence patterns.
  • the geometric patterns have meaning to those with knowledge of the context in which they were created. Therefore, a knowledgeable user can more easily understand and modify sequences created by another.
  • Figure 1 illustrates a plurality of task objects and a single master object arranged in a directional field having its directional attribute set to UPPER RIGHT TO LOWER LEFT.
  • Figure 2 is an illustration of the task sequence pattern produced by the arrangement of objects in Figure 1.
  • Figure 3 illustrates the same objects as shown in Figure 1 wherein one of the task objects has been moved.
  • Figure 4 shows the task sequence for the arrangement of objects shown in Figure 3.
  • Figure 5 shows the same objects as shown in Figure 1 wherein the directional attribute has been changed to UPPER LEFT TO LOWER RIGHT.
  • Figure 6 shows the task sequence pattern for the arrangement of objects shown in Figure 5.
  • Figure 7 is an illustration of two viewers observing a task sequence pattern.
  • Figure 8 shows a plurality of task objects and a master object placed in a directional field wherein the master object has a limited region of influence.
  • Figure 9 shows the same task objects and master object as shown in Figure 8 wherein the master object and its associated region of influence has been moved.
  • Figure 10 shows a plurality of task objects and two master objects placed in a directional field wherein each master object has a limited region of influence, and further wherein each limited region of influence has its Interaction property set to NONE.
  • Figure 11 shows the same task objects and master objects as seen in Figure 10, however, the Interaction Property of each region of influence of each master object is set to CALL OTHER MASTER.
  • Figure 12 shows the same task objects and master objects as seen in Figure 10, however, the Interaction property of each region of influence is set to CALL ALL OBJECTS.
  • Figure 13 shows a plurality of task objects of different types and a plurality of type specific master objects placed in a directional field.
  • Figure 14 is a flow diagram describing the sequencing method of the present invention.
  • Figure 15 is an illustration of a typical user interface for a software program incorporating the sequencing method of the present invention.
  • Figure 16 is an illustration showing the various spatial sequence indicators for a 2-dimensional directional field.
  • Figure 17 is an illustration of a 3-dimensional directional field having a plurality of task objects and master objects with limited regions of influence.
  • Figure 18 is an exterior view of an out-in menu object.
  • Figure 19 is an interior view of an out-in menu object.
  • Figure 20 is an illustration of a virtual office incorporating the sequencing method of the present invention.

Detailed Description of The Invention
  • the method is implemented through a user interface 10 in which the computer controlled tasks are graphically represented as task objects 14 in a spatial field 12 on a computer display.
  • the tasks are automatically sequenced by the computer based on the relative location of the task objects 14 in the spatial field 12.
  • the spatial field 12 includes a directional attribute which specifies how the order of the tasks is determined.
  • the user creates task objects and places the task objects in the spatial field 12.
  • the task objects 14 represent specific tasks in a sequence or procedure. Task objects may be represented as push-buttons or as icons which inform the user of the task associated with the particular object instance.
  • Once a task object 14 is created or instantiated, its default behavior or functionality is set by the user.
  • the behavior of the task object 14 may, for example, be set through property sheets which provide access to the various properties and methods which are encapsulated by the task object.
  • By setting or changing the properties of the task object 14, the user can specify the functions or tasks which the task object 14 performs.
  • a task object 14 may represent virtually any task which can be performed or controlled by a computer.
  • a task object 14 may be used to execute other programs on the computer and to send keystrokes to the application.
  • properties affecting the basic functionality of the task objects 14 are accessible to the user, once the object has been instantiated.
  • a computer-controlled process or procedure will typically include multiple tasks which are represented as a series of task objects 14 in the user interface.
  • the tasks represented as task objects 14 are sequenced and performed automatically by the computer.
  • a line known as a sequence line 20 may be drawn between each task object 14 in the sequence.
  • the sequence lines 20 extend between points on the objects referred to herein as the object location point 24.
  • the object location point 24 lies on the upper left corner of each of the objects in the user interface 10.
  • the object location point 24 could, however, be placed in a location external to the object.
  • the object location point 24, whether internal or external is used by the computer to determine the sequence of the task objects 14. That is, it is the position of the object location point 24 which is used to determine the sequencing of objects.
  • sequence lines 20 are drawn from the object location point 24 of one task object 14 to the object location point 24 of the next task object 14 in the sequence.
  • sequence lines 20 also serve as a form of progress indicator.
  • the sequence lines 20 form a pattern referred to herein as the task sequence pattern 22.
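The sequence lines 20 and the resulting task sequence pattern 22 can be modeled as consecutive pairs of object location points. This is a hypothetical sketch; the function name and coordinates are invented for illustration.

```python
# Hypothetical sketch: sequence lines 20 connect consecutive object
# location points 24, and together they form the task sequence pattern 22.

def task_sequence_pattern(location_points):
    """Pair consecutive location points into drawable line segments."""
    return list(zip(location_points, location_points[1:]))

# Upper-left corners of three objects, already in sequence order.
points = [(10, 10), (30, 20), (50, 60)]
pattern = task_sequence_pattern(points)
print(pattern)  # [((10, 10), (30, 20)), ((30, 20), (50, 60))]
```

A sequence of N objects thus yields N - 1 sequence lines, which is why the pattern changes shape whenever an object is moved or the ordering changes.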
  • the user can sequence the computer controlled tasks in one of two ways.
  • the sequence of tasks can be changed by moving the corresponding task objects 14 in the spatial field 12 to a new location. That is, the tasks' relative position in the sequence can be changed by simply moving the iconic representation of the task (i.e., task object) in the spatial field 12.
  • the computer automatically re-sequences the tasks whenever one of the task objects 14 is moved in the spatial field 12 without the user having to explicitly re-link the object.
  • the second way to change the sequence of tasks is to change the directional attribute of the spatial field 12.
  • the directional attribute specifies how the tasks are sequenced based on their location in the spatial field 12. For example, in a two-dimensional spatial field 12, a directional attribute could specify that tasks are sequenced from top-right-to-bottom-left based on the corresponding task object's location in the spatial field 12. If the directional attribute were changed to specify a bottom-right-to-top-left sequence, the order of tasks would change even though the task objects 14 all remain in the same location.
  • the directional attribute is represented as an icon called the spatial sequence indicator 18 which is displayed on the user interface.
  • the directional attribute of the spatial field is set by accessing a property sheet, for example, by right clicking in the spatial field 12.
  • the property sheet would allow the user to set properties for the spatial field.
  • One of the properties is the directional attribute of the spatial field. This property specifies how objects placed within the spatial field will be sequenced.
  • the directional attribute has six possible settings. Each setting is represented by a different spatial sequence indicator, which is shown in Figure 16.
  • the values of the directional attribute include UPPER LEFT TO LOWER RIGHT, LOWER RIGHT TO UPPER LEFT, LOWER LEFT TO UPPER RIGHT, UPPER RIGHT TO LOWER LEFT, CURRENT POSITION OUTWARD, and OUTER MOST POINTS INWARD.
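One plausible way to realize the six settings is as sort keys over each object's location point. This sketch is an assumption, not the disclosed implementation: screen coordinates (y increasing downward) and the choice of the master object's position as the reference point for the two radial settings are invented.

```python
import math

# Hypothetical mapping of the four linear directional-attribute settings
# to sort keys over an object's location point (x, y).
LINEAR_KEYS = {
    "UPPER LEFT TO LOWER RIGHT": lambda x, y: x + y,
    "LOWER RIGHT TO UPPER LEFT": lambda x, y: -(x + y),
    "LOWER LEFT TO UPPER RIGHT": lambda x, y: x - y,
    "UPPER RIGHT TO LOWER LEFT": lambda x, y: y - x,
}

def radial_key(origin, inward=False):
    """The two radial settings (CURRENT POSITION OUTWARD and OUTER MOST
    POINTS INWARD) sort by distance from a reference point, assumed here
    to be the master object's location."""
    ox, oy = origin
    sign = -1.0 if inward else 1.0
    return lambda x, y: sign * math.hypot(x - ox, y - oy)

pts = {"A": (0, 0), "B": (40, 0), "C": (0, 40)}
key = LINEAR_KEYS["UPPER RIGHT TO LOWER LEFT"]
print(sorted(pts, key=lambda n: key(*pts[n])))  # ['B', 'A', 'C']

inward = radial_key((0, 0), inward=True)
print(sorted(pts, key=lambda n: inward(*pts[n])))  # ['B', 'C', 'A']
```

Because only the key function changes, switching the directional attribute re-orders the same objects without moving any of them, exactly as the Figure 5/Figure 6 example describes.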
  • the spatial field 12 may also have other attributes such as color or font which can be set by the user.
  • Figure 5 shows the same task objects 14 as seen in Figure 1 in the same locations; however, the directional attribute of the spatial field 12 has been changed to specify an UPPER LEFT TO LOWER RIGHT sequence.
  • the order of execution of the tasks has been affected and a different task sequence pattern 22 results.
  • the new sequence begins with Object1, then proceeds to Object2, Object3, Object4, and Object5 in that order.
  • the task sequence pattern 22 for this sequence is shown in Figure 6. This example demonstrates how the sequence of tasks can be changed without changing the relative locations of the objects in the spatial field 12.
  • Figures 8 and 9 illustrate how the region of influence 26 can be used in sequencing.
  • five task objects 14 and one master object 16 are placed in a spatial field 12.
  • the region of influence 26 of the master object 16 is shown by a boundary line 26 which is visible in the user interface 10.
  • Two task objects 14, namely Object1 and Object2, fall within the region of influence 26 of the master object 16.
  • Object3, Object4, and Object5 lie outside the region of influence 26 of the master object 16.
  • Object1 and Object2 are included in the sequence.
  • Object3, Object4, and Object5 are excluded since they lie outside of the region of influence 26 of the master object 16.
  • Figure 9 shows the same spatial arrangement of task objects 14 as shown in Figure 8, however the region of influence 26 of the master object 16 has been shifted to the right.
  • By moving the region of influence 26, the tasks represented by Object1 and Object2 are excluded while the tasks represented by Object3, Object4, and Object5 are included.
  • the new order of execution would be Object3, Object4, and Object5.
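The region of influence can be sketched as a spatial filter applied before sorting. The axis-aligned rectangular region, names, and coordinates below are assumptions for illustration; the patent does not fix the region's shape.

```python
# Hypothetical sketch of a master object's limited region of influence,
# modeled as an axis-aligned rectangle (x0, y0, x1, y1).

def in_region(obj, region):
    x0, y0, x1, y1 = region
    return x0 <= obj["x"] <= x1 and y0 <= obj["y"] <= y1

def sequence_in_region(objects, region):
    """Only objects inside the region are sequenced (cf. Figures 8 and 9);
    ordering here uses an UPPER LEFT TO LOWER RIGHT key."""
    included = [o for o in objects if in_region(o, region)]
    return [o["name"] for o in sorted(included, key=lambda o: o["x"] + o["y"])]

tasks = [
    {"name": "Object1", "x": 10, "y": 10},
    {"name": "Object2", "x": 25, "y": 15},
    {"name": "Object3", "x": 60, "y": 10},
    {"name": "Object4", "x": 75, "y": 20},
    {"name": "Object5", "x": 90, "y": 30},
]
print(sequence_in_region(tasks, (0, 0, 40, 40)))    # ['Object1', 'Object2']
# Shifting the region to the right (cf. Figure 9) swaps the included set:
print(sequence_in_region(tasks, (50, 0, 100, 40)))  # ['Object3', 'Object4', 'Object5']
```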
  • the method for sequencing computer controlled tasks of the present invention supports multiple master objects 16, each having its own region of influence 26.
  • the present invention also supports interactions between master objects 16.
  • Each master object 16 has properties that can be set by the user. One of the properties already mentioned is the directional attribute. Another property of the master object 16 which can be set by the user is the Mode property.
  • the Mode property specifies the interaction mode between one master object 16 and other master objects 16. In the present invention, three modes are available: NONE, CALL OTHER MASTER, and CALL ALL OBJECTS. If the interaction mode is set to NONE, the master object 16 will sequence task objects 14 within its own region of influence 26 and will ignore task objects 14 outside of its own region of influence 26.
  • Figures 10-12 illustrate the interaction between different master objects 16.
  • Figure 10 shows two master objects with intersecting regions of influence 26.
  • Each master object 16 has two task objects 14.
  • Object1 and Object2 belong to Master1.
  • Object3 and Object4 belong to Master2.
  • When the first master object, Master1, is triggered, the tasks represented by Object1 and Object2 are executed.
  • When the second master object, Master2, is triggered, the tasks represented by Object3 and Object4 are executed. It should be noted that Object2 is not triggered by Master2 even though it appears to fall within the region of influence 26 of Master2. This is because Object2 is a child of Master1 and not of Master2.
  • Figure 11 shows the same master objects 16 and task objects 14 as seen in Figure 10.
  • the interaction mode property for each master object 16 is set to CALL OTHER MASTER.
  • When the first master object 16, Master1, is triggered, the tasks represented by Object1 and Object2 are executed.
  • Once the first master object 16, Master1, has completed the sequencing of objects within its region of influence 26, it calls the second master object 16, Master2.
  • Master2 must have its Reactivity property set to respond to Master1. Master2 then sequences and executes the processes represented by the objects within its region of influence 26. Specifically, Master2 causes the tasks associated with Object3 and Object4 to be executed.
  • the CALL OTHER MASTER property allows a master object 16 to respond to other master objects 16 as if it were a task object 14. All master objects 16 have a Reactivity property which can be set by the user so that one master object 16 will respond to other master types or to a particular master by name. The Reactivity property identifies a particular master object 16, by type or by name, to which it will respond. By setting this property, one master object 16 may be called by another master object 16. When a master object 16 has its default behavior triggered by another master object 16 which is on its reactivity list, it is referred to as a servant-master object. The reactivity list serves as a safeguard in large programs in which there may be hundreds of master and task objects.
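The Mode and Reactivity machinery described above might be sketched as follows. The class and method names are invented, and the CALL ALL OBJECTS semantics shown (the caller also runs the other masters' task objects directly) is an assumption based on Figure 12, not a disclosed implementation.

```python
# Hypothetical sketch of master-object interaction modes and the
# Reactivity safeguard; all names are invented for illustration.

class MasterObject:
    def __init__(self, name, tasks, mode="NONE", reactivity=()):
        self.name = name
        self.tasks = list(tasks)           # task objects in this master's region
        self.mode = mode                   # NONE | CALL OTHER MASTER | CALL ALL OBJECTS
        self.reactivity = set(reactivity)  # masters this one will respond to

    def trigger(self, others=(), caller=None):
        # A servant-master ignores callers not on its reactivity list.
        if caller is not None and caller not in self.reactivity:
            return []
        executed = list(self.tasks)        # its own children run first
        if self.mode == "CALL OTHER MASTER":
            for other in others:
                executed += other.trigger(caller=self.name)
        elif self.mode == "CALL ALL OBJECTS":
            for other in others:
                executed += other.tasks    # assumed: run the other region's objects directly
        return executed

master2 = MasterObject("Master2", ["Object3", "Object4"], reactivity={"Master1"})
master1 = MasterObject("Master1", ["Object1", "Object2"], mode="CALL OTHER MASTER")
print(master1.trigger(others=[master2]))
# ['Object1', 'Object2', 'Object3', 'Object4']
```

With the reactivity set empty, Master2 would ignore Master1's call, which is the safeguard role the reactivity list plays in large programs.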
  • Figure 14 is a flow diagram illustrating the process by which the computer sequences tasks represented as task objects 14 in the user interface.
  • the process is normally triggered by an event (block 100).
  • the event triggers the default behavior of a master object 16.
  • master objects 16 are not an essential part of the triggering process.
  • Other techniques may be used to trigger the sequence such as timer events or external input.
  • a dynamic data structure is created (block 102) to store information about the objects in the user interface. The information stored would include the location of the objects.
  • a function is called to return the number of objects to be sequenced (block 104).
  • the computer then iterates through all the task objects 14 within the region of influence 26 of the master object 16 (block 106).
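Blocks 100 through 106 of Figure 14 can be rendered as a short sketch. The data structure and helper names are invented for illustration, and the ordering key is assumed to be an UPPER LEFT TO LOWER RIGHT setting.

```python
# Hypothetical rendering of the sequencing loop of Figure 14.

def run_sequence(master, objects):
    # Block 100: the process is triggered by an event (e.g. triggering
    # the master object's default behavior) -- represented here simply
    # by calling this function.
    # Block 102: create a dynamic data structure storing information
    # about the objects, including their locations.
    x0, y0, x1, y1 = master["region"]
    table = [
        {"name": o["name"], "x": o["x"], "y": o["y"]}
        for o in objects
        if x0 <= o["x"] <= x1 and y0 <= o["y"] <= y1
    ]
    # Block 104: a function returns the number of objects to be sequenced.
    count = len(table)
    # Block 106: iterate through the task objects within the master's
    # region of influence, in directional-attribute order.
    table.sort(key=lambda row: row["x"] + row["y"])
    return count, [row["name"] for row in table]

master = {"region": (0, 0, 50, 50)}
objects = [
    {"name": "Object2", "x": 30, "y": 20},
    {"name": "Object1", "x": 10, "y": 10},
    {"name": "Object3", "x": 80, "y": 80},  # outside the region; excluded
]
print(run_sequence(master, objects))  # (2, ['Object1', 'Object2'])
```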
  • the interface includes a main window 200 comprised of a number of common Windows components which are typical in most Windows applications.
  • the main window 200 comprises a window frame 202 which encloses the other parts of the main window 200.
  • a title bar 206 stretches across the top of the main window 200.
  • the system menu and application icon 204 are disposed in the upper left corner of the window at the end of the title bar 206.
  • Three title bar buttons are disposed at the right end of the title bar 206. The left most of these buttons is the minimize window button 208 which allows the user to minimize the window.
  • the button to the immediate right of the minimize window button 208 is the maximize window button 210.
  • the main functions of the user application are accessed through a menu bar 224 and a tool bar 226.
  • the menu bar 224 lies just below the title bar 206 and provides numerous menu options, such as File and Help.
  • When File is selected, a list of menu options is presented (e.g. New, Open, Save As, Exit).
  • the Help menu activates a help file. It will be understood by those skilled in the art that each menu may include many menu items as well as submenus.
  • the construction of menus for application programs is well known to those skilled in the art.
  • the tool bar 226 typically consists of a series of buttons, some of which provide access to the same functions as the menu bar 224. For example, at the left end of the tool bar 226 are three buttons which duplicate functions of the menu bar 224.
  • the File Open button 228 opens a common windows dialog box for opening files.
  • the File Save button 230 opens a common windows dialog box for saving files.
  • the Exit button 232 closes the application.
  • buttons 240, 242, 244, 246, 248, 250, and 252 are used to instantiate task objects 14 in the user interface 10.
  • Buttons 260 and 262 are used to instantiate master objects 16 in the user interface 10.
  • the user selects one of the buttons (usually by clicking on the button with a pointing device 222 such as a mouse), positions the cursor 222 over the visible client area 218, and then clicks at the desired location with the cursor 222.
  • An object of the selected type will be instantiated where the cursor 222 is positioned.
  • This method of using buttons in combination with the cursor 222 to instantiate objects in a user interface 10 is common in Windows applications and is well known to those skilled in the art.
  • the default behavior of the object is set through a property sheet.
  • the property sheet may be accessed, for example, by right clicking or double clicking on the object with a pointing device. From the property sheet, the user can set or change the properties of the object instantiated in the user interface 10. The property sheet is used, for example, to set the default behavior or tasks performed by the object.
  • the Inclusion property is also accessed through the property sheet.
  • Another property which is useful is the Hide property of the object.
  • the Hide property allows instantiated objects to be hidden from view. This property is useful, for instance, to control access to object settings. For example, if a new employee is hired, certain object instances may be hidden making such instances inaccessible while allowing the new employee to activate the program.
  • As the employee gains experience, the employee can be given greater access. Hiding an object instance prevents users from interacting with it, i.e., changing property attributes, spatial location, etc.; however, the object is still included within the sequence.
  • the Hide property does provide an extra measure of control, flexibility, and security to the program.
  • buttons 240-252 allow the user to instantiate task objects 14 in the user interface 10. Each of these buttons represents a different type of task object 14.
  • button 240 is used to instantiate an Exit Button object.
  • the Exit Button object provides various means for exiting the current instance of an application.
  • Button 242 allows the user to instantiate a Run Button object.
  • Button 244 allows the user to instantiate a Run Picture object. Both the Run Button object and Run Picture object are used to launch other applications and to send keystrokes to those applications. The primary difference lies in how the objects appear in the interface.
  • the Run Button object appears as a simple button in user interface 10, whereas the Run Picture object appears as a bitmap image or icon.
  • Button 246 allows the user to instantiate an SQL Button object.
  • the SQL Button object differs from the Run Button object and Run Picture object in that its default behavior permits access to databases via ODBC (Open Database Connectivity), JDBC (Java Database Connectivity) and SQL (Structured Query Language).
  • Button 248 allows the user to instantiate a QBE button object which permits access to third party query engines incorporating query by example. SQL, QBE, JDBC, and ODBC are well known to those skilled in the art.
  • Button 250 allows the user to access an Automatic Object Maker (AOM).
  • the AOM permits the user to attach to a pre-existing database file and to select those fields in the database file corresponding to the properties required to instantiate a particular object type.
  • the top-level menu for the AOM presents a drop-down list of available object types which may be instantiated.
  • the user selects records from a database which are used to construct an object instance and to place it on the visible client area 218 as if it had been manually instantiated.
  • Button 252 allows the user to instantiate an Override Line Object (OLO). This object is like any other task object, except that it has the attributes of a line and displays itself as a line. When an OLO object is created, it is linked to other task objects 14 in the user interface 10. The primary reason for using an OLO is when the automatically generated pattern is not the pattern desired to be shown. OLOs provide a means to make exceptions in the automatically generated pattern.
  • Figure 17 shows two master objects 16 each having a limited Region of Influence 26.
  • Each Region of Influence 26 includes a plurality of task objects 14.
  • a spatial sequence indicator 18 is associated with each Region of Influence 26.
  • the spatial sequence indicator 18 indicates how the objects within the particular Region of Influence 26 are sequenced.
  • the spatial sequence indicators 18 reflect the directional attribute as being FRONT TO BACK - UPPER LEFT TO LOWER RIGHT.
  • the three-dimensional embodiment of the present invention has several objects which do not exist in the 2-D embodiment which may be generally described as out-in objects.
  • An out-in object is a three dimensional object which may be entered by the user. In the disclosed embodiment, the out-in object may serve several purposes.
  • An out-in menu object for the 3-D directional field is located at one corner of the field.
  • Out-in menu objects for the task objects 14 and master objects 16 are located in the interior of these objects. It should be understood that the task objects 14 and master objects 16 are also out-in objects.
  • Figure 18 shows an out-in menu object viewed from the exterior thereof.
  • Figure 19 shows an out-in menu object viewed from the interior thereof.
  • the menu items appear on both the interior and exterior surfaces of the object so that interaction with the menu object is possible whether the user is inside or outside the objects.
  • Figure 20 shows a practical application of the sequencing method previously discussed.
  • Figure 20 is a virtual office display which allows sequencing and re-sequencing to occur by moving objects within the virtual office space.
  • the practical purpose of the virtual office environment is to present a means of sequencing documents to be faxed.
  • the background image 300 is simply a graphic image stored in an instance of a Picture object containing the office background. By right clicking anywhere on this background, the user can access the properties of the object and make whatever changes are needed, including changing the background by loading a new graphic image.
  • the graphic image used illustrates a conventional office with a virtual desk and a door. The door has a leave sign which is actually an Exit Button object 302.
  • the first object 304 has the appearance of a folder on the desktop.
  • the other three objects 306, 308, and 310 have an appearance that resembles a letter.
  • the Start Button 312 in the lower left of the virtual office is a master object having an external location point 314 located on the image of the fax machine.
  • the spatial sequence indicator 18 indicates that the directional attribute is set to outermost points inward.

Landscapes

  • Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Stored Programmes (AREA)
  • User Interface Of Digital Computer (AREA)
  • Measuring Or Testing Involving Enzymes Or Micro-Organisms (AREA)

Abstract

The present invention concerns a graphical method of sequencing computer-controlled tasks that uses objects to represent the tasks to be performed by the computer. The objects are placed in a directional field having a directional attribute that specifies how the tasks are to be sequenced. The sequence of tasks to be performed collectively defines a procedure. Once a procedure is started, the computer automatically sequences the tasks within the procedure according to the relative spatial distribution of the task objects and the directional attribute. Re-sequencing can be accomplished by changing the distribution of the task objects or by changing the directional attribute.
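The abstract's key point is that the run order is re-derived from object positions each time a procedure starts, so either moving an object or flipping the directional attribute re-sequences the procedure. A minimal sketch, with hypothetical names (`run_order`, the task coordinates) invented for illustration:

```python
import math

def run_order(positions, origin=(0.0, 0.0), innermost_first=True):
    """Return task names ordered by distance from the origin point of
    the directional field; the directional attribute picks the sense."""
    def dist(name):
        x, y = positions[name]
        return math.hypot(x - origin[0], y - origin[1])
    return sorted(positions, key=dist, reverse=not innermost_first)

tasks = {"scan": (1, 1), "compress": (4, 0), "fax": (7, 2)}
print(run_order(tasks))                         # sequence derived from positions
tasks["scan"] = (9, 9)                          # drag the object farther out...
print(run_order(tasks))                         # ...and the sequence changes
print(run_order(tasks, innermost_first=False))  # flip the directional attribute
```

No task list is edited directly in either case: the sequence is a pure function of the spatial layout and the directional attribute.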
PCT/US1998/006086 1997-04-04 1998-03-27 Procede de sequencement de taches informatisees en fonction de la repartition spatiale des differents objets de taches dans un champ oriente WO1998045767A2 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
EP98913211A EP1031079A4 (fr) 1997-04-04 1998-03-27 Procede de sequencement de taches informatisees en fonction de la repartition spatiale des differents objets de taches dans un champ oriente

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US4337197P 1997-04-04 1997-04-04
US60/043,371 1997-04-04
US08/905,701 US6948173B1 (en) 1997-08-04 1997-08-04 Method of sequencing computer controlled tasks based on the relative spatial location of task objects in a directional field
US08/905,701 1997-08-04

Publications (2)

Publication Number Publication Date
WO1998045767A2 true WO1998045767A2 (fr) 1998-10-15
WO1998045767A3 WO1998045767A3 (fr) 1999-01-07

Family

ID=26720347

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US1998/006086 WO1998045767A2 (fr) 1997-04-04 1998-03-27 Procede de sequencement de taches informatisees en fonction de la repartition spatiale des differents objets de taches dans un champ oriente

Country Status (3)

Country Link
EP (1) EP1031079A4 (fr)
CN (1) CN1266510A (fr)
WO (1) WO1998045767A2 (fr)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6615211B2 (en) * 2001-03-19 2003-09-02 International Business Machines Corporation System and methods for using continuous optimization for ordering categorical data sets
JP4117352B2 (ja) * 2002-11-12 2008-07-16 株式会社ソニー・コンピュータエンタテインメント ファイル処理方法とこの方法を利用可能な装置
DE102010000929A1 (de) * 2010-01-15 2011-07-21 Robert Bosch GmbH, 70469 Ausrichtungssensor

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4920514A (en) * 1987-04-13 1990-04-24 Kabushiki Kaisha Toshiba Operational information display system
US5136705A (en) * 1988-06-14 1992-08-04 Tektronix, Inc. Method of generating instruction sequences for controlling data flow processes
US5212771A (en) * 1990-04-27 1993-05-18 Bachman Information Systems, Inc. System for establishing concurrent high level and low level processes in a diagram window through process explosion and implosion subsystems
US5363482A (en) * 1992-01-24 1994-11-08 Interactive Media Corporation Graphical system and method in which a function is performed on a second portal upon activation of a first portal
US5442746A (en) * 1992-08-28 1995-08-15 Hughes Aircraft Company Procedural user interface
US5586243A (en) * 1994-04-15 1996-12-17 International Business Machines Corporation Multiple display pointers for computer graphical user interfaces
US5724492A (en) * 1995-06-08 1998-03-03 Microsoft Corporation Systems and method for displaying control objects including a plurality of panels
US5745109A (en) * 1996-04-30 1998-04-28 Sony Corporation Menu display interface with miniature windows corresponding to each page

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR2662009B1 (fr) * 1990-05-09 1996-03-08 Apple Computer Icone manupulable a faces multiples pour affichage sur ordinateur.
JP2768412B2 (ja) * 1992-07-15 1998-06-25 財団法人ニューメディア開発協会 ユ−ザ適応型システムおよびその適応方法
US5623592A (en) * 1994-10-18 1997-04-22 Molecular Dynamics Method and apparatus for constructing an iconic sequence to operate external devices
AU712491B2 (en) * 1995-04-07 1999-11-11 Sony Electronics Inc. Method and apparatus for improved graphical user interface with function icons
GB9606791D0 (en) * 1996-03-29 1996-06-05 British Telecomm Control interface

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP1031079A2 *

Also Published As

Publication number Publication date
CN1266510A (zh) 2000-09-13
EP1031079A4 (fr) 2004-09-15
WO1998045767A3 (fr) 1999-01-07
EP1031079A2 (fr) 2000-08-30

Similar Documents

Publication Publication Date Title
US6948173B1 (en) Method of sequencing computer controlled tasks based on the relative spatial location of task objects in a directional field
JP2675987B2 (ja) データの処理方法及び処理システム
US5414806A (en) Palette and parts view of a composite object in an object oriented computer system
CA2051180C (fr) Services de validation independants de l'application pour hypermedias
US6097391A (en) Method and apparatus for graphically manipulating objects
JP3656220B2 (ja) スマートオブジェクトを有する対話形データ視覚化
US5442788A (en) Method and apparatus for interfacing a plurality of users to a plurality of applications on a common display device
AU755715B2 (en) Browser for hierarchical structures
EP0752640B1 (fr) Représentation des relations entre objets graphiques dans un dispositif d'affichage d'ordinateur
US5862379A (en) Visual programming tool for developing software applications
US6232968B1 (en) Data processor controlled display system with a plurality of switchable customized basic function interfaces for the control of varying types of operations
JPH10507286A (ja) グラフィカル・ユーザ・インターフェース
JPH07134765A (ja) データのグラフ表示の方法
JPH06208448A (ja) ブラウザ項目を有する集合ブラウザをアプリケーションに供給させる方法およびコンピュータ制御表示装置
WO1994024657A1 (fr) Interface utilisateur interactive
EP0873548B1 (fr) Interaction graphique et retroaction de selection extensibles
US5848429A (en) Object-oriented global cursor tool which operates in an incompatible document by embedding a compatible frame in the document
EP0558223B1 (fr) Système de gestion de fenêtres dans une station de travail d'ordinateur
WO1998045767A2 (fr) Procede de sequencement de taches informatisees en fonction de la repartition spatiale des differents objets de taches dans un champ oriente
US6122558A (en) Aggregation of system settings into objects
EP0693192B1 (fr) Outil a curseur oriente objet
EP0572205B1 (fr) Système d'affichage orienté objets
Tudoreanu et al. Legends as a device for interacting with visualizations
WO2023069068A1 (fr) Procédés et systèmes basés sur une gui permettant de travailler avec de grands nombres d'articles interactifs
AU2002301073B2 (en) Browser For Hierarichial Structures

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase

Ref document number: 98805711.5

Country of ref document: CN

AK Designated states

Kind code of ref document: A2

Designated state(s): CN IL JP RU

AL Designated countries for regional patents

Kind code of ref document: A2

Designated state(s): AT BE CH DE DK ES FI FR GB GR IE IT LU MC NL PT SE

AK Designated states

Kind code of ref document: A3

Designated state(s): CN IL JP RU

AL Designated countries for regional patents

Kind code of ref document: A3

Designated state(s): AT BE CH DE DK ES FI FR GB GR IE IT LU MC NL PT SE

DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
121 Ep: the epo has been informed by wipo that ep was designated in this application
WWE Wipo information: entry into national phase

Ref document number: 1998913211

Country of ref document: EP

NENP Non-entry into the national phase in:

Ref document number: 1998542820

Country of ref document: JP

WWP Wipo information: published in national office

Ref document number: 1998913211

Country of ref document: EP