GB2474053A - Graphical user interface using three dimensional objects


Info

Publication number
GB2474053A
Authority
GB
United Kingdom
Prior art keywords
dimensional object
data
user interface
dimensional
processing resource
Prior art date
Legal status
Withdrawn
Application number
GB0917293A
Other versions
GB0917293D0 (en)
Inventor
Paul Mulvanny
Current Assignee
Individual
Original Assignee
Individual
Priority date
Filing date
Publication date
Application filed by Individual
Priority to GB0917293A
Publication of GB0917293D0
Publication of GB2474053A
Status: Withdrawn

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00 Arrangements for program control, e.g. control units
    • G06F9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44 Arrangements for executing specific programs
    • G06F9/451 Execution arrangements for user interfaces
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20 Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/24 Querying
    • G06F16/248 Presentation of query results
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20 Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/26 Visual data mining; Browsing structured data
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048 Indexing scheme relating to G06F3/048
    • G06F2203/04802 3D-info-object: information is displayed on the internal or external surface of a three dimensional manipulable object, e.g. on the faces of a cube that can be rotated by the user

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • Human Computer Interaction (AREA)
  • Databases & Information Systems (AREA)
  • Data Mining & Analysis (AREA)
  • Computational Linguistics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A user interface apparatus comprises a graphical user interface and a processing resource. The processing resource is arranged to control, when in use, the graphical user interface for navigating through a data structure associated with data stored in a database, the data structure having a topology. The GUI is presented as one or more three-dimensional objects (908, 912, 922, 934, 936, 938, 940). The three-dimensional objects (908, 912, 922, 934, 936, 938, 940) are selectable in order to enable visual navigation through the topology of the data structure. A second three-dimensional object (928) may only be displayed in response to a selection from the first object (918). A third object may be displayed in response to a selection from the second object. The first object may be displayed as a non-geometric shape or as a representation of a machine. The second object may subtend from the first and be a polyhedron rotatable about an axis. The objects may be manipulated by a motion sensitive controller capable of detecting a movement gesture of the user.

Description

USER INTERFACE APPARATUS AND METHOD THEREFOR
[0001] The present invention relates to a user interface apparatus of the type that, for example, facilitates navigation through a data structure, such as a data structure for data stored in a database, via a graphical user interface. The present invention also relates to a method of navigating through a data structure, the method being of the type that, for example, facilitates navigation through the data structure, such as a data structure for data stored in a database, using a graphical user interface.
[0002] In the field of data processing, it is known to store information in an electronic form, accessible through one or more computing devices. For example, it is known to provide a data structure, such as in a database, to support storage of information. In the field of computing, many types of database exist, employing operating principles centred around different techniques for organising and/or storing data. Furthermore, it is not uncommon to find that each database software application has its own operational idiosyncrasies. However, a common objective of data storage and/or retrieval systems is the ability to enable one or more users to access the information stored as easily as possible.
[0003] One traditional technique employed in respect of database software applications is a forms-based approach. Typically, when designing and building a database, the design process will include the design of so-called "forms" that provide fields, which are usually labelled, to enable data retrieved from a database to be presented to a user for review or editing. Similarly, forms can also be used to complete fields for a new record to be added to the database. However, the forms serve as a "window" into a part of a data structure and exploration of the data structure in order to retrieve data is extremely limited due to the fixed pre-defined nature of the forms. Such limitations can lead to the context, for example the causal origins, of the data being lost to the user and the integrity of the data being consequently undermined.
[0004] It is also known to store attributes of an object to be graphically represented via an output device, such as a display, by a computing apparatus. In one example, the object can be an object featured in a video game that is manipulated on-screen by a computer game application, details relating to the shape and/or orientation of the object being stored in a data structure.
[0005] A menu of a website, for example, is also a form of crude data structure.
A user of the menu is able to navigate through the structure of the menu using a mouse. In this respect, the menu can be expressed as a tree structure having multiple branches and sub-branches that can be selectively expanded by selection using a mouse. However, "drilling-down" through the menu structure is not particularly visually intuitive and does not provide the user with an overview that is easy to regard and understand.
[0006] For some applications, for example technology enterprise management applications, it is necessary to support very complex data structures and large amounts of information, and the above technologies for exploring the data structures with a view to accessing data are not scalable for such complex data structures.
[0007] In this respect, the application of practical sciences has become more complex for a variety of reasons. A significant reason is the amount of knowledge which must be coordinated in order to enable highly developed technology to function in today's society. It is often the case that costs associated with the delivery of a solution, which needs highly developed technology, cannot be justified in a business case for a single deployment of the solution. It is sometimes not appreciated that certain aspects of the solution are inextricably, and sometimes explicitly, linked to aspects internal and/or external to an organisation that are not perceived as directly relating to the engineering of the solution.
[0008] In the context of product development, a challenge exists in matching market needs on one side with capabilities of suppliers on another side. In this respect, on the market side, for a given market sector associated with an enterprise, there is a considerable number of projects in progress to satisfy various market scenarios. The projects translate into an even larger number of technology applications that have to be implemented in order to achieve completion of the projects. As a result, an extremely large number of performance attributes or parameters, for example those characterising social, technical and business aspects of the engineering, exist in connection with the various projects. On the supply side, a given supply organisation has a number of operations that support delivery of the projects, the various operations performing a larger number of tasks in their respective roles supporting the organisation. A considerable number of technology attributes or parameters are associated with the tasks. In the light of the vast numbers of both performance attributes and technology attributes, the challenge in matching the market needs with the capabilities of suppliers is therefore complex. Indeed, a given enterprise will have many concurrent technology applications at different stages of development at any one time.
[0009] In an attempt to simplify matters, enterprise modelling represents, typically computationally, structures, activities, processes, information, resources, people, behaviours, goals, and/or constraints of a business, government, or other enterprises, entity or programme. Such representation includes not only relationships with internal structures of the organisation, but also with the environment with which the organisation interacts. The purpose of such representations is to serve as a tool to enable effective management of an enterprise in the light of the complex structures, processes and inter-relationships associated with the enterprise.
[0010] Enterprise models can be broadly categorised into two classes: static and dynamic. A static enterprise model is a "snapshot" of the organisation at a particular instant in time. The snapshot can include, inter alia, information concerning the structure of the organisation, boundaries with external entities, process information, strategic objective information, information concerning external influences on the organisation and/or Strength-Weakness-Opportunity-Threats (SWOT) analysis results.
[0011] In contrast, a dynamic enterprise model, as its name suggests, represents changes of the organisation with time.
[0012] Enterprise models are employed for many different purposes. One example of an application of an enterprise model is for business strategic purposes, where it is necessary to identify and implement business strategies for an organisation to succeed. Organisation and process design is another application of the enterprise model and is particularly useful where it is necessary to implement management and/or operational improvements in an organisation.
Enterprise models are also used in relation to Information Technology (IT) planning where cost sensitivity is a concern in relation to most or all aspects of the IT, but particularly procurement and deployment. The enterprise model can also extend to organisational structures to support the IT and policies to support development coordination.
[0013] It is known to employ enterprise models in relation to so-called requirements and system development. In this context, the enterprise model is a technology enterprise model and is used to define requirements for applications, system developments, and planning of transitioning and/or bridging from legacy systems to new systems.
[0014] One known type of technology enterprise modelling apparatus and method is described in UK patent application number GB0815652.3, the contents of which are hereby incorporated by reference.
[0015] Unfortunately, as mentioned above, existing user interface techniques, for example the forms-based approach, do not lend themselves well to data entry and/or access. Typically, the information that a user may wish to access may not be intuitively accessible due to the two-dimensional nature of the user interface.
[0016] It is therefore an object of the present invention to obviate, or at least mitigate, the above-described difficulties.
[0017] According to the first aspect of the present invention, there is provided a user interface apparatus comprising: a graphical user interface; a processing resource arranged to control, when in use, the graphical user interface for navigating through a data structure associated with data stored in a database, the data structure having a topology; wherein the processing resource controls the graphical user interface in order to present three-dimensional objects, the three-dimensional objects being selectable in order to enable visual navigation through the topology of the data structure.
[0018] The data structure may comprise data sub-structures arranged according to the topology; the processing resource may be arranged to provide a visual indication between a number of the three-dimensional objects where relationships respectively exist between a number of the data sub-structures associated with the number of the three-dimensional objects in accordance with the topology.
[0019] The three-dimensional objects may comprise a first three-dimensional object and a second three-dimensional object; the processing resource may be arranged to reveal selectively the second three-dimensional object in response to a selection in relation to the first three-dimensional object.
[0020] The data structure may comprise a first data sub-structure associated with the data structure in accordance with the topology, and the second three-dimensional object may be drawn as logically subtending from the first three-dimensional object so as to represent a relationship between the second three-dimensional object and the first data sub-structure.
[0021] The topology may be hierarchical.
[0022] The three-dimensional objects may comprise a third three-dimensional object; the processing resource may be arranged to reveal selectively the third three-dimensional object in response to another selection in relation to the second three-dimensional object.
[0023] The data structure may comprise a third data sub-structure associated with the second data sub-structure in accordance with the topology, and the third three-dimensional object may logically subtend from the second three-dimensional object so as to represent the relationship between the third data sub-structure and the second data sub-structure according to the topology.
[0024] The topology may comprise a most senior node corresponding to the data structure as a whole, and the data structure may comprise a data sub-structure corresponding to a sub-node in the topology that subtends from the most senior node; wherein the processing resource may be arranged to control, when in use, the graphical user interface in order to present a first three-dimensional object visually representing the data stored in the database; the first three-dimensional object may comprise a graphical feature selectable via the graphical user interface that visually represents the data sub-structure.
[0025] The processing resource may be arranged to control, when in use, the graphical user interface in order to present a second three-dimensional object subtending from the first three-dimensional object in response to selection of the graphical feature.
[0026] The second three-dimensional object may subtend from the graphical feature of the first three-dimensional object.
[0027] The second three-dimensional object may be a polyhedron.
[0028] The second three-dimensional object may be rotatable about at least one axis.
[0029] The first three-dimensional object may be a non-geometric shape.
[0030] The first three-dimensional object may be a graphical representation of a machine.
[0031] The second three-dimensional object may comprise a plurality of selectable elements.
[0032] The processing resource may be arranged to control, when in use, the graphical user interface in order to present a third three-dimensional object subtending from the second three-dimensional object in response to another selection in respect of the second three-dimensional object.
[0033] The third three-dimensional object may be presented in response to the another selection via the graphical user interface in respect of an element of the plurality of selectable elements.
[0035] The plurality of selectable elements may be a plurality of facets.
[0036] The first three-dimensional object may be rotatable about at least one axis.
[0037] The first three-dimensional object may be scalable in response to a scaling command provided via the graphical user interface.
[0038] A portion of the data may be categorisable into a set of attributes; the second three-dimensional object may correspond to a primary set of attributes.
[0039] The processing resource may be further arranged to access, when in use, the database in order to retrieve attribute data associated with an attribute of the primary set of attributes in response to a selection via the graphical user interface in relation to the second three-dimensional object.
[0040] Attribute data associated with the attribute of the primary set of attributes may be categorisable into a secondary set of attributes; the third three-dimensional object may correspond to the secondary set of attributes, and the secondary set of attributes may subtend from the attribute of the primary set of attributes.
[0041] The third three-dimensional object may be manipulatable in order to control access of data in the database associated with an attribute of the secondary set of attributes.
[0042] The second and third three-dimensional objects may have different appearances.
[0043] The processing resource may be arranged to control, when in use, the graphical user interface in order to present a first terminus three-dimensional object.
[0044] The third three-dimensional object may be the first terminus three-dimensional object and may be revealed in response to the another selection in relation to the second three-dimensional object.
[0045] The first terminus three-dimensional object may subtend from the third three-dimensional object and may be revealed in response to a further selection in relation to the third three-dimensional object.
[0046] The processing resource may be arranged to control, when in use, the graphical user interface in order to present a second terminus three-dimensional object; the processing resource may be arranged to reveal selectively the second terminus three-dimensional object substantially at the same time as the first terminus three-dimensional object.
[0047] Both the first and second terminus objects may subtend from the third three-dimensional object.
[0048] Both the first and second terminus objects may subtend from the second three-dimensional object.
[0049] The processing resource may be arranged to retrieve, when in use, data from the database in response to a yet further selection in relation to the first terminus three-dimensional object.
[0050] The processing resource may be arranged to control, when in use, the graphical user interface in order to present a plurality of terminus three-dimensional objects subtending from the third three-dimensional object in response to a further selection in respect of the third three-dimensional object.
[0051] The processing resource may be arranged to retrieve, when in use, data from the database in response to a yet further selection in relation to a first terminus three-dimensional object of the plurality of terminus three-dimensional objects.
[0052] The first terminus three-dimensional object may be rotatable about at least one axis.
[0053] The data retrieved may be presented in a two-dimensional format.
[0054] The data retrieved may be presented as a form or a report.
[0055] At least one of the three-dimensional objects may be spatially manipulatable.
[0056] The processing resource may support drilling-down through the data structure by selection in respect of a number of the three-dimensional objects.
[0057] The processing resource may support drilling-down through the data structure by spatial manipulation of a number of the three-dimensional objects.
[0058] Presentation of a three-dimensional object of the three-dimensional objects may be subject to a user access privilege.
[0059] Presentation of the second three-dimensional object may be subject to a user access privilege.
[0060] The database may comprise retrievable management information in respect of a three-dimensional object to facilitate optimisation of interpretation, manipulation, sequencing and/or communication of data structures associated with a three-dimensional object.
[0061] The apparatus may further comprise an image comprising a two-dimensional representation of the first three-dimensional object.
[0062] The second three-dimensional object may be revealed in response to positioning a cursor over the two-dimensional representation of the first three-dimensional object.
[0063] According to the second aspect of the present invention, there is provided a user interface system comprising: a user interface apparatus as set forth above in relation to the first aspect of the invention; and a motion sensitive controller translatable, when in use, by a movement gesture, the motion sensitive controller being capable of detecting the movement gesture and communicating movement gesture data associated with the movement gesture to the processing resource; wherein the processing resource is arranged to manipulate a selected three-dimensional object in accordance with the movement gesture detected.
[0064] The movement gesture may be a rotation gesture.
[0065] A view of the first three-dimensional object may be controllable using the movement gesture data from the motion sensitive controller.
[0066] A view of the first three-dimensional object and the second three-dimensional object may be controllable using the movement gesture data from the motion sensitive controller.
[0067] The motion sensitive controller may be a controller for a games console.
[0068] The motion sensitive controller may be a portable communications device comprising a motion sensing device.
[0069] The motion sensitive controller may be a Personal Digital Assistant (PDA) comprising a motion sensing device.
[0070] The motion sensitive controller may be an iPhone®.
[0071] According to the third aspect of the present invention, there is provided a method of navigating through a data structure, the method comprising: controlling a graphical user interface for navigating through a data structure associated with data stored in a database, the data structure having a topology; wherein controlling the graphical user interface comprises presenting three-dimensional objects, the three-dimensional objects being selectable in order to enable visual navigation through the topology of the data structure.
[0072] It is thus possible to provide an apparatus and method that enables easy and intuitive navigation through a data structure of data stored, for example, in a database. The apparatus and method also supports provision of data in context. It is also therefore possible to increase the integrity of information access: access to data in one context can be permitted whilst access by the same accessor to the same data in a different context can be reasonably excluded.
[0073] Additionally, it is possible for data presented in a context generated by the apparatus or in accordance with the method to have processes, instructions and tools associated with the data and to be organised and presented by the apparatus or in accordance with the method to allow users with access rights to aggregate or otherwise manipulate data, read and update data and, when the context allows, exchange data between authorised users.
[0074] At least one embodiment of the invention will now be described, by way of example only, with reference to the accompanying drawings, in which:
Figure 1 is a schematic diagram of a technology enterprise management system;
Figure 2 is a schematic diagram of an application development model supported by the system of Figure 1;
Figure 3 is a schematic diagram of a technique for assessing attributes supported by the system of Figure 1;
Figure 4 is a schematic diagram of a stage and gateway process supported by the system of Figure 1;
Figure 5 is a schematic diagram of tasks and resources arranged in relation to the stage and gateway process of Figure 4;
Figure 6 is a schematic diagram of a multidimensional data structure associated with the stage and gateway process of Figure 4;
Figure 7 is a schematic diagram of a technology application life cycle in the context of the stage and gateway methodology depicted in Figure 4;
Figure 8 is a schematic diagram of tasks associated with one or more stages of the stage and gateway process of Figure 4;
Figure 9 is a schematic diagram of a user interface apparatus supported by an application server of Figure 1;
Figure 10 is a schematic diagram of a navigation display generated by the apparatus of Figure 9 and constituting an embodiment of the invention;
Figure 11 is a perspective view of the navigation display of Figure 10;
Figure 12 is a schematic diagram of attribute classes in the context of an aerial urban surveillance system;
Figure 13 is a flow diagram of a method of navigating through a data structure using the apparatus of Figure 9;
Figure 14 is a schematic diagram of an expanded navigation display generated by the apparatus of Figure 9; and
Figure 15 is a perspective view of the navigation display of Figure 14.
[0075] Throughout the following description identical reference numerals will be used to identify like parts.
[0076] Referring to Figure 1, a technology enterprise management system 100 provides a data layer 102 interfaced with a logical layer 104, the logical layer 104 being interfaced with a presentation layer 106.
[0077] The data layer 102 is supported by a database server 108 having access to a data store 110, for example a storage medium, such as a Hard Disc Drive (HDD). Although shown external to the database server 108, the skilled person should appreciate that the data store 110 can be, and is in this example, part of the database server 108. The database server 108 is supported by any suitable hardware and runs any suitable operating system, for example a Microsoft Windows Server 2007 or a distribution of Linux, for example a suitable version of Ubuntu. A database server application, for example a relational database application, is supported by the database server 108, for example: a Structured Query Language (SQL) database, such as that available from Oracle Corporation or Microsoft Corporation. In the event that the operating system of the database server 108 is the Linux distribution as mentioned above, the database server application can be a MySQL or HSQL database server application. Although not shown in Figure 1, the database server 108 using the data store 110 serves to store and provide access to information recorded in relation to a technology enterprise model, details of which will be described later herein. The information is stored within tables as records; the records including, in this example, one or more fields, the records being associated with a respective table created and stored in the data store 110 by the database server 108. The records are used to record and track progress of tasks being performed or to be performed by one or more organisations and/or individuals in relation to the technology enterprise model.
The records are therefore organised in a manner suitable to support the technology enterprise model. Management of the structure of the records and/or inter-relationship between the records is performed in the logical layer 104.
[0078] The logical layer 104 is, in this example, supported by a cluster of inter-communicating servers 111, for example a first application server 112, a second application server 114, a third application server 116 and a fourth application server 118. However, the skilled person should appreciate that a greater or smaller number of servers can be employed, for example a single suitably powerful server. Similarly, the skilled person should appreciate that one or more of the first, second, third and fourth application servers 112, 114, 116, 118 need not be co-located. The logical layer 104, as supported by the cluster of servers 111, serves to implement the functionality of the technology enterprise model.
[0079] The first and second application servers 112, 114 of the cluster of servers 111 support the presentation layer 106. In this example, the first application server 112 is used to visually generate and therefore render visible forms, for example using a HyperText Mark-up Language (HTML) engine application (not shown). As mentioned above, the second application server 114 also supports the presentation layer 106 by providing a graphical representation of one or more aspects of the technology enterprise model, for example using a Computer Aided Design (CAD) application and/or a Computer Aided Engineering application and/or Computer Aided Manufacture (CAM), such as CATIA available from IBM®. The third application server 116 supports the data processing functionality used in relation to the technology enterprise model. In this respect, the third application server 116 is arranged to generate queries in order to mine data stored in the data store 110 and acquire data. The data captured by the third application server 116 is organised by the third application server 116 as one or more forms and/or reports that are presented by the first application server 112. The forms are pre-coded in order to support a framework of the technology enterprise model and comprise one or more code fragments in order to implement logic that supports user-interaction with the technology enterprise management system 100. The fourth application server 118 is arranged to interact with the database server 108 in order to support functionality of the technology enterprise model in relation to life-cycle management of technology applications. In this respect, the fourth application server 118 is configured to manage technology maturity levels of any one information set with respect to another information set so that the consequences of mismatches and inappropriate connections between data sets are avoided. The fourth application server 118 temporarily stores data from the database server 108 in structured form and is capable of re-structuring the data temporarily stored in response to manipulation of the GUI by the user.
[0080] As mentioned above, the presentation layer 106 is supported by the first and second application servers 112, 114. The first and second application servers 112, 114 support a Graphical User Interface (GUI) to present information content in two different manners: the first application server 112, as mentioned above, presents first information content in a first manner as forms or reports, whereas the second application server 114 presents second information content in a second manner as time-varying and/or manipulatable graphics, details of which will be described later herein in relation to Figures 9 et seq. Through the GUI, users of the technology enterprise management system 100 can retrieve information, provide new information and/or update information. The GUI enables the users, where appropriate access privileges apply, to customise the forms, for example form and/or field titles in an organisation-specific manner and in relation to tasks of the technology enterprise model mentioned above. Similarly, forms can be completed and/or existing forms stored by the database server 108 can be maintained by users as the tasks progress or do not progress.
[0081] As mentioned above, the structure and operation of the technology enterprise model is supported by the logical layer 104. Referring to Figure 2, the technology enterprise model supports an organisation in realising a desired performance in a market scenario 200. The market scenario 200 is a top-level description of aspects of a market engagement to be undertaken by an organisation or organisations. The market scenario 200 is proposed by the organisation or by co-operation of a number of organisations. In order to record the scenario 200 in the context of the technology enterprise model, the market scenario 200 is firstly characterised using a number of categories. The categories are, in this example, expressed in forms generated by the third application server 116 and presented by the first application server 112.
[0082] In this example, a first category is players. Information content recorded in relation to the first category is a description of people-oriented entities involved in a marketplace, for example competitors, partners, suppliers and bystanders, those entities that indirectly operate in the marketplace, such as so-called "shapers" that influence a market environment but not a given product, for example a standards body and/or a trade union. A second category is context, the context being recorded as a description of a physical environment of the market scenario 200, for example interaction with other products; physical requirements, such as temperature requirements; geospatial requirements, such as line of sight, terrestrial deployment or spatial deployment. A third category is tactics, the tactics being recorded as a description of behaviour of the first category, for example marketing practices, such as branding to promote higher prices; protection of sources of supply to reduce competition and/or reduce cost; and/or collaboration techniques that are public and/or private. A fourth category is goals, the goals being recorded as a description of objectives that the organisation wishes to achieve in the market scenario 200, for example, profit, market share and/or ethical targets. A fifth category is one or more measures of success, the measures of success being recorded as a description of metrics and/or methods of measurement to determine a degree of achievement of the fourth category, for example audits, surveys, financial criteria, environmental criteria and/or social criteria. As mentioned above, the categories are recorded in order to record a characterisation of the market scenario 200. In this respect, the forms are used to capture the characterisation of the market scenario 200 in the technology enterprise model. Reports can then be used to present the recorded information to communicate the market scenario 200 to users of the technology enterprise management system 100.
[0083] From the market scenario 200, one or more projects are devised by an analysis, for example a gap analysis, of the status of the organisation at a present point in time with respect to the market scenario 200, which is inherently predictive. Indeed, it should be appreciated that more than one market scenario can be devised and taken into account when performing the analysis in order to determine a most probable combination of parameters respectively associated with the different scenarios. One or more technology applications are then defined that are to be undertaken by the organisation in order to bring about the market scenario 200. When a market scenario requires one or more technology applications, the technology applications are sequenced and coordinated as a project.
[0084] In order to arrive at the one or more technology applications mentioned above, market capabilities 202 and market engagement needs 204 are cross-referenced and correlated 206 with the market scenario 200 to yield a set of performance attributes 208 that characterise what is required to achieve the market scenario 200. The set of performance attributes 208 is expressed in technology independent language to the extent that the detailed means of solution provision is unconstrained, for example: the aircraft will have to fly at supersonic speeds, detect other aircraft to avoid collision or be capable of earth-orbiting space flight. In addition to the considerations of the market, consideration is also given to organisation and capability to supply in relation to the market. Consequently, in relation to supply, there exists a large number of contracts, ways of contracting and suppliers, hereinafter referred to as contract scenarios 210. Whilst the contract scenarios 210 are of a different nature to the market scenario 200, there is synergy between the respective manners in which both types of information are processed. Thus, supply networks 212 and supply capabilities 214 associated with the contract scenarios 210 are cross-referenced and correlated 216 with the contract scenario 210 to yield a set of technology attributes 218 that characterise what is required to achieve the contracting scenario 210. The set of technology attributes is expressed in a language which constrains the technology options available, for example: the resulting aircraft will be a derivative of an existing type having a new propulsion system, inertial navigation system and a modified flight control system and/or fuselage, fuel system and external control surfaces will be carried over from the existing aircraft.
[0085] As suggested above, in order to realise the market scenario 200, a number of technology applications 220 typically need to be developed, each technology application 220 being characterised by a number of the performance attributes from the set of performance attributes 208 determined above and a number of technology attributes from the set of technology attributes 218 determined above. The output of each technology application 220 is characterised by augmentation in capabilities of people, new and/or improved processes, improved information quality and/or new and/or improved equipment or tools in the enterprise, and the output resulting from actual implementation of an application constitutes an increase in inventory 222 of the enterprise and hence growth 224 of the enterprise.
[0086] Referring to Figure 3, in relation to the set of performance attributes 208, the skilled person should appreciate that an overall collection of performance attributes 300 comprises the set of performance attributes 208, which were determined above in relation to the market scenario 200. Hence, it should be understood that other market scenarios also contribute to the overall collection of performance attributes 300. Similarly, an overall collection of technology attributes 302 comprises the set of technology attributes 218, which were determined above in relation to the contract scenario 210. Hence, it should be understood that other contract scenarios also contribute to the overall collection of technology attributes 302. The overall collection of performance attributes 300 and the overall collection of technology attributes 302 can be grouped into so-called "module classes" 304 in order to improve accessibility to performance attributes and technology attributes.
In this respect, each module class constitutes a building block of a project, where an instance of a module class constitutes at least part of an application 320.
Seven exemplary module classes are: a Platform module class 306, a Facilities module class 308, a Sensors module class 310, a Tools module class 312, an Effectors module class 314, a Communications module class 316, and a Human Platform module class 318. However, other module classes exist and can be derived, for example: a People Performance module class, a People Health module class, a People Organisation module class, a Process Administration module class, a Process Operation module class, a Process Education module class, an Information Security module class, an Information Storage module class, an Information Communication module class, and an Information Reconstitution module class. The skilled person should appreciate that the performance attributes 300 and the technology attributes 302 mentioned herein for use by the technology enterprise model are not exhaustive and a greater or fewer number of attributes can be employed. In this respect, when configuring the technology enterprise model for a particular purpose, for example a particular business sector, the available attributes, both performance attribute-related and technology attribute-related can be classified using a plurality of decision elements 320 in order to group the available attributes into module classes appropriate to a given attribute. The module classes are stored in respective data structures associated therewith.
[0087] Referring to Figure 4, once the set of module classes 304 have been determined, the technology enterprise model is used to implement a stage and gateway process 400 supported by the logical layer 104, constituting a processing resource. The stage and gateway process 400 comprises a plurality of stages interlaced with a plurality of gateways, the process 400 being for managing the life cycle of a first application. The stage and gateway process 400 comprises a first gateway 402. The first gateway is an initiating or "kick-off" gateway and is used to determine whether a first, resource evidence, check list 404 has been reviewed and all items on the first resource evidence check list have been provided. The first resource evidence check list is a form stored in the data store 110 by the database server 108. At the first gateway 402, the timescales are also agreed in relation to development of a project.
[0088] A first stage 406 is located adjacent the first gateway 402. Whilst all the items on the first resource evidence check list do not have to be satisfied to permit processing in relation to the first stage 406, the logical layer 104 provides one or more warnings in relation to unsatisfied items on the first resource evidence check list. It should be appreciated that, in this example, the logical layer 104 polices compliance with the first, and other, resource evidence check lists. However, the skilled person should appreciate that operators of the technology enterprise management system 100 can determine whether or not progression to a subsequent stage in the stage and gateway process 400 should be permitted.
[0089] The purpose of the first stage 406 is to perform appropriate tasks in order to determine whether underlying scientific principles are known and are parameterised. The technology validation stage 406 is sometimes referred to as a "Laws of Physics" stage on the basis that all technology is considered to reduce to a matter of physics, including for example the fields of chemistry and molecular biology.
[0090] A second gateway 408 is located adjacent the first stage 406 opposite the first gateway 402. The second gateway 408 serves to provide verification as to whether the first stage 406 has been sufficiently completed, for example scientifically to validate technology associated with a proposed design configuration. The completion of the first stage 406 is measured using a second evidence check list 410 associated with the activities required in relation to the first stage 406, for example viability of the technology being developed and that no further technology needs to be developed to be able to realise a given module class and/or technology application. Whilst all the items on the second, proof of science, evidence check list 410 do not need to be satisfied for the logical layer 104 to permit progression to a second stage 412, in this example the logical layer 104 provides one or more warnings in relation to unsatisfied items of the second evidence check list 410.
[0091] The second stage 412 is located adjacent the second gateway 408. In this example, the second stage 412 is a sociological validation stage, sometimes referred to as a "Laws of Society" stage. The purpose of the second stage 412 is to perform appropriate tasks in order to establish a range of parameters required and allowed by standards, codes of practice and legislation relating to the markets mentioned above, thereby ensuring that any aspect of the given technology application and/or module class does not contravene the standards, codes of practice or legislation mentioned above, for example statutes or moral or other standards of society. A third gateway 414 is located adjacent the second stage 412 and opposite the second gateway 408. The third gateway 414 serves to provide verification as to whether the second stage 412 has been sufficiently completed, for example use of a technology is permitted or unconstrained in respect of a proposed design configuration in the light of sociological restraint data. The completion of the second stage 412 is measured using a third evidence check list 416 associated with the activities required in relation to the second stage 412, for example that the technology associated with the given technology application and/or module class does not offend any laws of society as described above. Whilst all the items on the third, proof of concept, evidence check list 416 do not need to be satisfied for the logical layer 104 to permit progression to a third stage 418, in this example the logical layer 104 provides one or more warnings in relation to unsatisfied items of the third evidence check list 416.
[0092] The third stage 418 is located adjacent the third gateway 414. In this example, the third stage 418 is a design capability validation stage. The purpose of the third stage 418 is to perform appropriate tasks in order to establish that all stakeholders, for example parties involved in end-user and manufacturing communities, have quantified resources required to perform their respective downstream roles in relation to the stage and gateway process 400. A fourth gateway 420 is located adjacent the third stage 418 and opposite the third gateway 414. The fourth gateway 420 serves to provide verification as to whether the third stage 418 has been sufficiently completed, for example to determine whether one or more stakeholders are capable of supporting development of a proposed design configuration. The completion of the third stage 418 is measured using a fourth evidence check list 422 associated with the activities required in relation to the third stage 418, for example that, as mentioned above, all stakeholders have quantified resources required to perform their respective downstream roles. Whilst all the items on the fourth, enterprise evidence, check list 422 do not need to be satisfied for the logical layer 104 to permit progression to a fourth stage 424, in this example the logical layer 104 provides one or more warnings in relation to unsatisfied items of the fourth evidence check list 422.
[0093] The fourth stage 424 is located adjacent the fourth gateway 420. In this example, the fourth stage 424 is a supply capability validation stage. The purpose of the fourth stage 424 is to perform appropriate tasks in order to manage project design resources, for example people, processes, information and tools, in order to provide solution specifications. A fifth gateway 426 is located adjacent the fourth stage 424 and opposite the fourth gateway 420. The fifth gateway 426 serves to provide verification as to whether the fourth stage 424 has been sufficiently completed, for example to determine whether project design resources are able to specify a solution for a proposed design configuration. The completion of the fourth stage 424 is measured using a fifth evidence check list 428 associated with the activities required in relation to the fourth stage 424, for example that, as mentioned above, all necessary design specifications are completed, mutually compatible and collectively capable of operating to meet prioritised requirements mentioned later herein. Whilst all the items on the fifth, project evidence, check list 428 do not need to be satisfied for the logical layer 104 to permit progression to a fifth stage 430, in this example the logical layer 104 provides one or more warnings in relation to unsatisfied items of the fifth evidence check list 428.
[0094] The fifth stage 430 is located adjacent the fifth gateway 426. In this example, the fifth stage 430 is a design verification stage. The purpose of the fifth stage 430 is to perform appropriate tasks to manage manufacturing and delivery of resources in order to get design compliant goods and/or services to a point of sale.
A sixth gateway 432 is located adjacent the fifth stage 430 and opposite the fifth gateway 426. The sixth gateway 432 serves to provide verification as to whether the fifth stage 430 has been sufficiently completed, for example to determine whether an organisation is able to manufacture a product in accordance with a proposed design configuration. The completion of the fifth stage 430 is measured using a sixth evidence check list 434 associated with the activities required in relation to the fifth stage 430, for example that, as mentioned above, all necessary manufacturing and delivery resources are in place in order to ensure that the goods and/or service will reach the point of sale. Whilst all the items on the sixth, product evidence, check list 434 do not need to be satisfied for the logical layer 104 to permit progression to a sixth stage 436, in this example the logical layer 104 provides one or more warnings in relation to unsatisfied items of the sixth evidence check list 434.
[0095] The sixth stage 436 is located adjacent the sixth gateway 432. In this example, the sixth stage 436 is a product satisfaction stage. The purpose of the sixth stage 436 is to perform appropriate tasks to ensure that support is provided, for example in respect of a proposed design configuration, to customers by way of, for example, product education and warranty support. A seventh gateway 438 is located adjacent the sixth stage 436 and opposite the sixth gateway 432. The seventh gateway 438 serves to provide verification as to whether the sixth stage 436 has been sufficiently completed. The completion of the sixth stage 436 is measured using a seventh evidence check list 440 associated with the activities required in relation to the sixth stage 436, for example that, as mentioned above, all necessary support has been provided for customers. Whilst all the items on the seventh, customer usage evidence, check list 440 do not need to be satisfied for the logical layer 104 to permit progression to a seventh stage (not shown), in this example the logical layer 104 provides one or more warnings in relation to unsatisfied items of the seventh evidence check list 440.
[0096] Although not shown in Figure 4, a seventh stage is located adjacent the seventh gateway 438. In this example, the seventh stage is a recycling stage. The purpose of the seventh stage is to perform appropriate tasks to ensure negative social and commercial impact resulting from operation of goods and/or services, for example according to a proposed design configuration, in the marketplace is minimised. An eighth gateway (also not shown) is located adjacent the seventh stage and opposite the seventh gateway 438. The eighth gateway serves to provide verification as to whether the seventh stage has been sufficiently completed. The completion of the seventh stage is measured using an eighth evidence check list (also not shown) associated with the activities required in relation to the seventh stage, for example that, as mentioned above, all necessary provisions have been made for recycling of the goods and/or services. Whilst all the items on the eighth, recycling provision evidence, check list do not need to be satisfied for the logical layer 104 to deem progression to be complete, in this example the logical layer 104 provides one or more warnings in relation to unsatisfied items of the eighth check list.
[0097] For each stage of the stage and gateway process 400, as described above, the technology enterprise management system 100 supports a respective set of tasks of the technology enterprise model, the results of which are monitored and managed by the technology enterprise management system 100 according to the technology enterprise model in the manner described above, in order to attempt to ensure completion of each stage to a required degree. In this respect, each task of the set of tasks, details of which will be described later herein, permeate through the stages of the stage and gateway process 400 and constitute a "Requirements Management" function 442, a "Solutions" function 444, a "Critical Parameters" function 446, a "Performance Assessment" function 448, an "Intellectual Capital" function 450 and a "Stakeholder Management" function 452.
Each function is formed by taking an aggregate view of an instance of the same type of task from each stage across the stage and gateway process 400; for example, the Requirements Management function 442 comprises the aggregate of the Requirements task 520, which is undertaken at each of the first, second, third, fourth, fifth and sixth stages 406, 412, 418, 424, 430, 436.
[0098] Referring to Figure 5, in this example, each of the above six functions 442, 444, 446, 448, 450, 452 is supported by four types of resources: People 500, Information 502, Processes 504, and Tools 506, the collection and configuration of resources for each task differing between stages of the stage and gateway process 400.
[0099] In the context of the project mentioned above, and referring to Figure 7, in this example the project comprises a number of applications, each application being defined in the manner described above. A first application 700 is at the first stage 406 of the stage and gateway process 400 and hence a first stage of a first life cycle associated with the first application 700. A second application 702 is at the second stage 412 of the stage and gateway process 400 and hence a second stage of a second life cycle associated with the second application 702. A third application 704 is at the third stage 418 of the stage and gateway process 400 and hence a third stage of a third life cycle associated with the third application 704. A fourth application 706 is at the fourth stage 424 of the stage and gateway process 400 and hence a fourth stage of a fourth life cycle associated with the fourth application 706. A fifth application 708 is at the fifth stage 430 of the stage and gateway process 400 and hence a fifth stage of a fifth life cycle associated with the fifth application 708. A sixth application 710 is at the sixth stage 436 of the stage and gateway process 400 and hence a sixth stage of a sixth life cycle associated with the sixth application 710. The skilled person should hence appreciate that a given stage in each respective lifecycle of a given application can vary depending upon the project, but that the above example of a distribution of applications at various stages of respective life cycles is provided for illustrative purposes only.
[0100] Turning to Figure 8, in order to develop the applications and/or module classes, the stage and gateway process 400 comprises an application development process having a stage engineering module structure; the application development process constitutes a repeatable process structure to construct complex, highly developed technology applications. The application development process is repeated for each stage of the life cycle of a given application. The application development process is an organisation of the tasks mentioned above, sequenced to receive inputs and to produce outputs in a systematic and scalable manner. The application development process contains the set of tasks 520, 522, 524, 526, 528, 530 mentioned above. Resources, which include the People 500, Information 502, Processes 504, and Tools 506, are applied to each of the appropriate tasks 520, 522, 524, 526, 528, 530 to produce an output from performance of the task, the output constituting evidence of delivery of the task, irrespective of degree, each time the application development process is executed. The nature of the resources applied at each stage is determined by the purpose of each stage, for example resources capable of undertaking research at the Laws of Physics stage 406, and by the current status of access rights to those resources, for example the intellectual capital access rights accessible to the enterprise as determined by the contract scenario 210. In this example, the application development process, in particular the stage engineering module structure, is expressed as a so-called "control level network" 600 for the sake of visualisation and ease of understanding and comprises a plurality of task modules, hereinafter referred to as "tasks".
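By way of illustration only, the following Python sketch models one way a task module of the control level network might be represented in software: a named task with the four resource types, inputs that it consumes and outputs that constitute evidence of delivery. The class names, fields and sample values are assumptions introduced for this example and are not taken from the specification.

from dataclasses import dataclass, field
from typing import Any, Dict, List

@dataclass
class Resources:
    # The four resource types applied to each task.
    people: List[str] = field(default_factory=list)
    information: List[str] = field(default_factory=list)
    processes: List[str] = field(default_factory=list)
    tools: List[str] = field(default_factory=list)

@dataclass
class TaskModule:
    name: str
    resources: Resources
    inputs: Dict[str, Any] = field(default_factory=dict)   # e.g. performance attributes
    outputs: Dict[str, Any] = field(default_factory=dict)  # evidence produced on execution

    def execute(self) -> Dict[str, Any]:
        # Performing the task converts its inputs, using its resources,
        # into outputs that constitute evidence of delivery of the task.
        self.outputs = {f"evidence:{key}": value for key, value in self.inputs.items()}
        return self.outputs

requirements = TaskModule(
    name="Requirements",
    resources=Resources(people=["requirements engineer"], tools=["House of Quality"]),
    inputs={"performance_attributes": {"max_altitude_m": 3000}},
)
print(requirements.execute())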
[0101] The control level network 600 is implemented in the logical layer 104. The Requirements Management function 442 is supported in the control level network 600 by a Requirements task 520 that comprises a first input 602 for performance attributes 208 and a first output 604 for design verification results, the design verification results serving as evidence for an Intellectual Capital Management task 522 that supports the Intellectual Capital function 450 mentioned above. The derivation of the design verification results will be described later herein. The Requirements task 520 is a collection of resources characterised by people, information, processes and tools, and regulated by the logical layer 104, in order to optimise the range over which a given application will operate for each attribute of the performance attributes 208. The optimisation of the range serves to specify further aspects of the performance attributes 208 in order to direct a design process. By way of example, in order to constrain the design to certain staple engineering solutions, one can specify that the aircraft has to fly below a height of 3000 m, thereby avoiding certain design complexities and costs associated with aircraft that are capable of flying above 3000 m. Optimisation can involve further iterations of the cross-reference and correlation 206 derived from the market engagement needs 204, the market scenario information 200 and the market capability information 202. The Requirements task 520 also comprises a second input 606, the purpose of which will become apparent later herein, and a second output 609 for providing first prioritised requirements 608 in respect of a design configuration, the first prioritised requirements 608 serving as evidence for a Solutions task 524. The requirements for the proposed design configuration can be prioritised using a so-called House of Quality tool. The Solutions task 524 supports, in part, the Solutions function 444 of the stage and gateway process 400.
[0102] The Solutions task 524 has a first input 610 for receiving the first prioritised requirements 608 and a first output 612 for providing second prioritised requirements 613 (mentioned above in relation to the fourth stage 424 of the stage and gateway process 400) to the Intellectual Capital Management task 522 and a Critical Parameters task 526 that supports, in part, the Critical Parameters function 446. The Solutions task 524 is a set of resources that are characterised by people, information, processes and tools and that relate to scientific and/or engineering techniques necessary to provide technical specifications for products. The Solutions task 524 also comprises a second input 614 for technology attributes and a second output 616 for providing proposed configuration data 617 for a design, the proposed configuration data 617 serving as evidence for the Critical Parameters task 526 and the Requirements task 520, the second input 606 of the Requirements task 520 serving to receive the proposed configuration information 617. The Solutions task 524 also comprises a third input 618, the purpose of which will be described later herein.
[0103] The Critical Parameters task 526 comprises a first input 620 for receiving the second prioritised requirements 613 mentioned above, a second input 622 for receiving the proposed configuration information 617 mentioned above, and a third input 624 for receiving current inventory information 625. The current inventory information 625 represents those inventory items already in existence and accessible to an enterprise, to which the proposed configuration must interface in order to function satisfactorily. The Critical Parameters task 526 is a set of people, information, processes and tools arranged to provide critical analysis of the proposed configuration information 617 provided by the Solutions task 524. The Critical Parameters task 526 implements any suitable critical analysis methodology, for example a parameter design methodology, a design of experiments methodology and/or a Taguchi methodology, with the aim of effective partitioning of parameters to make designs insensitive to noise sources.
[0104] The Critical Parameters task 526 also has a first output 626 for providing Requirements validation report information 627 and a second output 628 for providing configuration validation report information 629. A third output 630 of the Critical Parameters task 526 is used to provide verification test plan information 631 for the design and a fourth output 632 of the Critical Parameters task 526 is used to provide final configuration information 633 for the design. The requirements validation report information 627 serves as evidence for the Requirements task 520, the requirements validation report information 627 being received by the Requirements task 520 via the second input 606 thereof. The configuration validation report information 629 serves as evidence for the Solutions task 524, the configuration validation report information 629 being received by the Solutions task 524 via the third input 618 thereof. The verification test plan information 631 and the final configuration information 633 serve as respective evidence for a Performance Assessment task 528, the Performance Assessment task 528 supporting, in part, the Performance Assessment function 448.
[0105] In overview, the Critical Parameters task 526 relates to use of, inter alia, information generated by the Requirements and Solutions tasks 520, 524 with the aim, as mentioned above, of effective partitioning of parameters to make designs insensitive to noise sources. The aim of the Critical Parameters task 526 is to find an effective balance between data generated by the Requirements task 520 and the Solutions task 524 by growing Intellectual Capital, for example Intellectual Property relating to the solution configuration which works best in the presence of noise factors.
[0106] The Performance Assessment task 528 comprises a first input 634 for receiving the final configuration information 633 and a second input 636 for receiving the verification test plan information 631. The Performance Assessment task 528 is a set of people, information, processes and tools arranged to provide design of test methods and equipment to verify the performance of the proposed design, as specified by the final configuration information 633 following critical parameter analysis, against the prioritised requirements 613 contained within the verification test plan 631. A third input 638 of the Performance Assessment task 528 is arranged to receive relevant assessment capability information 639. The Performance Assessment task 528 also comprises a first output 640 for providing design verification results information 641, the design verification results information 641 serving as evidence for the Requirements task 520 and being received by the second input 606 thereof. The design verification results information 641 can then be provided to the Intellectual Capital Management task 522 via the first output 604 of the Requirements task 520, the design verification results information 641 being released to the Intellectual Capital Management task 522 by the Requirements task 520 once the Requirements task 520 has correlated the design verification results information 641 with the requirements validation report information 627 obtained from the Critical Parameters task 526 mentioned above. A second output 642 of the Performance Assessment task 528 provides final configuration information 643 that serves as evidence for the Intellectual Capital Management task 522. The Performance Assessment task 528 also comprises a third output 644 for providing verification test plan information 645 that serves as evidence for the Intellectual Capital Management task 522. A fourth output 646 of the Performance Assessment task 528 provides test equipment, methods and procedures information 647 that also serves as evidence for the Intellectual Capital Management task 522.
[0107] The Intellectual Capital Management task 522 comprises a first input 468 for receiving the design verification results information 605 described above. The Intellectual Capital Management task 522 is a set of people, information, processes and tools arranged to provide identification of and/or access to intellectual capital, and/or protection of intellectual capital, sometimes protected by intellectual property laws, the intellectual capital relating to delivery of a solution as verified by the Performance Assessment task 528 for the purpose of providing a competitive advantage; rights can subsist, for example, in a proposed design configuration. Intellectual Capital can be generated as a result of implementation of any of the six tasks 520, 522, 524, 526, 528, 530 described herein. As mentioned above, Intellectual Capital can be protected by law, for example but not limited to laws relating to trade marks, industrial designs, copyright and patents. A second input 470 of the Intellectual Capital Management task 522 serves to receive the final configuration information 643, the verification test plan information 645 and the test equipment, methods and procedures information 647 mentioned above. A third input 472 is arranged to receive the second prioritised requirements information 613 produced by the Solutions task 524 as mentioned above. A first output 474 of the Intellectual Capital Management task 522 provides Intellectual Capital access rights information 475 to be stored by the data layer 102 for use in a subsequent stage and for evidence access in the current stage. A number of further outputs 476 are provided for a number of items of evidence 477, which comprise, for example, the design verification results 605, the second prioritised requirements 613, the final configuration information 643, the verification test plan information 645 and the test equipment, methods and procedures information 647, to be communicated with a Stakeholder Management task 530, the Stakeholder Management task 530 supporting, in part, the Stakeholder Management function 452.
[0108] The Stakeholder Management task 530 comprises a number of corresponding inputs 478 and a first output 482 for providing investment estimate information 483 for an investment case to be stored by the data layer 102 for use in a subsequent stage. The Stakeholder Management task 530 is a set of people, information, processes and tools arranged to manage access to suitable Intellectual Capital for use in each stage of the life cycle for the technology application. The Stakeholder Management task 530 manages links to enterprises that own parameters and functions relating to development of a given module class and/or technology application and includes, for example, legal, financial, and human resources organisations. The Stakeholder Management task 530 also comprises a number of outputs 484 for storing a number of items of evidence 486 comprising, for example, the design verification results 605, the second prioritised requirements 613, the final configuration information 643, the verification test plan information 645 and the test equipment, methods and procedures information 647 within the data layer 102.
[0109] The information structures produced by the continuous enterprise functions 442, 444, 446, 448, 450, 452 at each stage when, in this example, the presentation layer 106, the logical layer 104 and the data layer 102 interact, form multidimensional metadata definitions 800 (Figure 6) for which industrial and commercial ownership and authorities are established, in this example, by the logical layer 104. The enterprise metadata structure 800 is maintained by the fourth application server 118. An operator of the technology management system 100 is able to interrogate the metadata structure 800 using forms presented by the first application server 112 via the presentation layer 106. The operator is therefore able to view resource data along a set of defined axes 801. Each axis represents a particular point of view required by an enterprise. In this example, the defined axes are resources by module class 802, resources by stage 804 and resources by function 806. An operator responsible for the requirements function in an enterprise might, for example, wish to know the cost of the requirements task for each stage of the stage and gateway process 400 with a view to optimising the use of resources for each requirements task at each stage to produce the most cost effective and timely provision of the overall requirements function 442. The skilled person should appreciate that the point of view can be pivoted around each axis to provide information relevant to other users of the technology enterprise management system 100. In order to support this functionality, the fourth application server 118 supports a structured data server module that interrogates the database server 108 along a point of view mentioned above and aggregates the resulting information. The aggregated results are then rendered by the second application server 114 for display by the presentation layer 106. The continuous enterprise functions allow innovations in methods of delivery of the tasks at each stage, and the gateways manage the accessibility of information from one stage to the next. These concepts provide a mechanism to bridge the infamous "valley of death" where investment in R & D fails to achieve a suitable return on investment. The application servers 112, 114, 116, 118 and the database server 108 cooperate to enable operators to store information in the form of, for example, forms, documents and checklists that constitute evidence, and to use the information stored by providing access thereto by relevant operators in order to participate in operation of the control level network 600. The evidence can take any suitable form for use in relation to the project and the system 100 is sufficiently flexible to permit custom recordal of evidence irrespective of form.
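As an illustrative sketch only, the following Python fragment shows the kind of pivot-and-aggregate operation described above, in which resource cost records are summed along one of the defined axes (module class, stage or function). The field names and the sample records are assumptions for the example; the specification does not prescribe any particular implementation.

from collections import defaultdict

records = [
    {"module_class": "sensor", "stage": 1, "function": "Requirements", "cost": 12000},
    {"module_class": "sensor", "stage": 2, "function": "Requirements", "cost": 18000},
    {"module_class": "platform", "stage": 1, "function": "Solutions", "cost": 25000},
]

def aggregate(records, axis):
    # Pivot the resource records along one axis and sum the cost.
    totals = defaultdict(float)
    for record in records:
        totals[record[axis]] += record["cost"]
    return dict(totals)

# e.g. the cost of the Requirements function aggregated across all stages
print(aggregate(records, "function"))
# ...or the same records pivoted by stage instead
print(aggregate(records, "stage"))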
[0110] The above examples have been described in the context of a technology application. However, the skilled person should appreciate that the stage and gateway process 400 can be, and is in this example, applied at a greater level of granularity. Consequently, the stage and gateway process 400 is applied to module classes relating to the technology application.
[0111] Furthermore, the above examples have been provided in order to illustrate a context for at least one embodiment of a user interface apparatus, system and method therefor described herein. However, the skilled person should appreciate that the user interface apparatus and system can be employed in respect of other applications and/or data structures where a user needs to be able to navigate through a complex data structure.
[0112] Referring to Figure 9, the second application server 114 comprises a processing resource 900, for example a microprocessor or a number of processing devices, coupled to a video driver circuit 902. The video driver circuit 902 is coupled to an output device 904, for example a display device, such as a liquid crystal display (LCD) device, or a video projector. Of course, the skilled person should appreciate that any suitable output device can be employed in order to provide a visible output to a user.
[0113] Turning to Figures 10 and 11, the database server 108 stores data arranged according to the data structure mentioned above. The data structure comprises a number of data sub-structures, for example a first data sub-structure, a second data sub-structure, a third data sub-structure, a fourth data sub-structure, a fifth data sub-structure, and a sixth data sub-structure. The data structure has a topology associated with inter-relationships between the number of data sub-structures. In order to illustrate operation of the second application server 114 in further detail, the following example will be described in the context of an aviation market. In particular, the example is directed to an aerial surveillance system in urban areas scenario, the data structure being associated with the aerial surveillance system.
[0114] Referring to Figure 12, the context of the example is established by reference to the data structure being associated with the aerial urban surveillance system 782. The aerial surveillance system 782 relates to a complex object, comprising for example, an Uninhabited Aerial Vehicle (UAV) 781 per se, as well as other complex objects, for example a ground control station and an operator.
The UAV 781 was predetermined previously as a preferred combination of a platform module 306, a sensor module 310 and a communications module 316 for the aerial surveillance scenario. The skilled person should, however, appreciate that other complex objects relating to an aerial surveillance scenario can be employed, the complex object being defined by the module classes associated with that aerial surveillance scenario. Hence, for example, a platform module class other than that relating to the UAV can be associated with the complex object.
[0115] In this respect, the first data sub-structure relates to the UAV 781, the second data sub-structure relating to the platform module 306. The third data sub-structure relates to the sensor module 310 and the fourth data sub-structure relates to the communications module 316.
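Purely by way of illustration, a minimal Python sketch of the hierarchical data structure and its topology follows: each node owns a data sub-structure and may have sub-nodes. The class name, field names and the arrangement of nodes are assumptions made for the example, guided by the sub-structures identified above.

from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class DataNode:
    name: str
    fields: Dict[str, str] = field(default_factory=dict)
    children: List["DataNode"] = field(default_factory=list)

# The aerial surveillance system as a whole is the most senior node; the UAV
# sub-structure subtends from it, and the platform, sensor and communications
# sub-structures subtend from the UAV.
surveillance_system = DataNode("aerial_surveillance_system_782", children=[
    DataNode("uav_781", children=[
        DataNode("platform_module_306"),
        DataNode("sensor_module_310"),
        DataNode("communications_module_316"),
    ]),
    DataNode("ground_control_station"),
    DataNode("operator"),
])

def list_nodes(node, depth=0):
    # Print the topology as an indented tree.
    print("  " * depth + node.name)
    for child in node.children:
        list_nodes(child, depth + 1)

list_nodes(surveillance_system)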
[0116] In order to drill-down through the data structure associated with the aerial surveillance system 782, the processing resource 900 controls the video driver circuit 902 and hence the GUI in order that three-dimensional objects 906 can be presented in a selectable manner in order to enable visual navigation through the topology of the data structure.
[0117] Referring to Figure 13, in this example and as mentioned above, the complex object is the UAV 781 and so, via the GUI, the second application server 114 models the UAV 781 and presents the UAV 781 as a three-dimensional (3D) representation 908 of the UAV 781 (Step 750). The 3D representation is spatially manipulatable (Step 752) by the user, for example scalable and/or rotatable (Step 754) about at least one axis, such as an x-, y-and/or z-axis, using one or more input devices (not shown), for example a keyboard, a mouse and/or a motion sensitive controller; details relating to the motion sensitive controller will be set out later herein. To scale the representation of the complex object, a scaling command, for example zoom in or zoom out, can be provided by the user via the keyboard, mouse and/or motion sensitive controller. The complex object is not, in this example, geometric as the use of polyhedra is reserved for navigation cursors.
The complex object is usually a machine, device or other macroscopic functional unit. Although, in this example, the complex object has been modelled in three dimensions, the complex object can be represented in two dimensions.
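The following is a hedged sketch, not the specification's implementation, of how scaling and rotation commands received from a keyboard, mouse or motion sensitive controller might be applied to the displayed 3D representation. The Transform class, command names and step sizes are assumptions for illustration.

from dataclasses import dataclass

@dataclass
class Transform:
    scale: float = 1.0
    rot_x: float = 0.0  # rotation in degrees about each axis
    rot_y: float = 0.0
    rot_z: float = 0.0

def apply_command(view: Transform, command: str, amount: float) -> Transform:
    if command == "zoom_in":
        view.scale *= 1.0 + amount
    elif command == "zoom_out":
        view.scale /= 1.0 + amount
    elif command in ("rotate_x", "rotate_y", "rotate_z"):
        axis = command[-1]
        current = getattr(view, f"rot_{axis}")
        setattr(view, f"rot_{axis}", (current + amount) % 360.0)
    return view

view = Transform()
apply_command(view, "zoom_in", 0.25)   # e.g. a zoom-in command from the mouse
apply_command(view, "rotate_y", 15.0)  # e.g. a rotation about the y-axis
print(view)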
[0118] As will be appreciated, the complex object comprises, in this example, a number of elements, for example the communications system and/or the sensor system. The data structure is arranged to record data relating to these and any other elements, for example a propulsion system (not shown in Figure 12, but part of the platform module class 306). In topological terms, the data structure as a whole constitutes a node that corresponds to the aerial surveillance system 782.
[0119] The processing resource 900, using the video driver circuit 902 and hence the GUI, includes, in the model of the complex object, representations of the elements, for example a graphical feature 910 to represent the sensor module 310 and hence the third data sub-structure. In this regard, the graphical feature 910 is arranged to be visibly recognisable as relating to the sensor system. The graphical feature 910 is selectable by the user using the input device. In topological terms, the first, second, third, and fourth data sub-structures constitute sub-nodes of the data structure and, in this example, the third data sub-structure constitutes a sub-node of the sub-node associated with the first data sub-structure.
[0120] The processing resource 900 is capable of detecting selection (Step 756) of the graphical feature 910 with, for example, the mouse. In response to the selection in relation to the graphical feature 910, the third application server 116 determines that attributes are available and then interrogates the database server 108 in order to identify (Step 758) a set of primary attribute classes associated with the third data sub-structure. The processing resource 900 controls, with information structured by the fourth application server 118, the GUI to display a second three-dimensional object 912, for example a polyhedron, such as a tetrahedron, having a number of selectable elements corresponding to the number of attribute classes in the set of primary attribute classes (Steps 760 and 762), for example a first facet 916, a second facet 918 and a third facet 920, corresponding to fields of the third data sub-structure relating to the graphical feature 910. In order to identify a causal, for example contextual, relationship between the facets of the tetrahedron 912 and the third data sub-structure associated with the graphical feature 910, a first link line 914 is generated by the GUI and extends from the graphical feature 910 to the tetrahedron 912. The first link line 914, and any others described herein, serves to provide a visual indication between a number of the three-dimensional objects where relationships respectively exist between a number of the data sub-structures respectively associated with the number of the three-dimensional objects in accordance with the topology. The first link line 914 logically subtends from the graphical feature 910 to the tetrahedron 912 to represent the relationship therebetween. In this example, the tetrahedron 912 is employed by virtue of the three side-facets thereof, each facet representing the technical attribute class, the performance attribute class and the inventory attribute class, respectively.
[0121] In this example, the technical 218, performance 208 and inventory 222 attribute classes constitute the primary attribute classes of the sensors module class 310 associated with the third data sub-structure. Although not shown in Figures 10 and 11, the selectable elements, or visible facets, of the tetrahedron 912 are labelled according to the names of the primary attribute classes. The facets of the tetrahedron 912 can be spatially manipulated, for example rotated, such as about a z-axis relative to a base facet of the tetrahedron 912, by use of left- and right-arrow keys of the keyboard. In this example, the left-arrow key can be used to rotate the tetrahedron 912 anticlockwise and the right-arrow key can be used to rotate the tetrahedron 912 clockwise. Rotation of the tetrahedron 912 enables the user to reveal a facet of the tetrahedron 912 that is not visible from a current point of view being displayed by the GUI. Additionally, the rotation of a facet to a frontmost position when viewed enables the facet to be in a position where it can be selected. In this regard, pressing the spacebar of the keyboard results in the frontmost-facing facet, or selectable element, of the tetrahedron 912 being selected. Additionally or alternatively, the mouse can be employed to rotate the tetrahedron 912 by placing a mouse cursor over a corner of the tetrahedron 912 and selecting or "grabbing" the corner by clicking on the left button of the mouse.
The user can then rotate the tetrahedron 912 in a clockwise direction or an anticlockwise direction by a dragging or flicking movement of the mouse. When the left button of the mouse is released, the tetrahedron 912 is no longer rotatable until a corner of the tetrahedron is selected or grabbed again. If the user wishes to select (Step 764) one of the visible facets of the tetrahedron 912, the mouse cursor is, in this example, placed over the facet to be selected and the left button of the mouse is double-clicked.
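A minimal sketch, in Python, of the keyboard interaction just described follows: the arrow keys rotate the tetrahedron about its z-axis and the spacebar selects whichever facet currently faces the front. The facet names and the 120-degree step per side facet are assumptions for illustration.

class Tetrahedron:
    STEP = 120.0  # three side facets, so 120 degrees per facet

    def __init__(self, facets):
        self.facets = facets   # e.g. ["technical", "performance", "inventory"]
        self.rotation = 0.0    # rotation about the z-axis, in degrees

    def handle_key(self, key):
        if key == "left":
            self.rotation = (self.rotation - self.STEP) % 360.0  # anticlockwise
        elif key == "right":
            self.rotation = (self.rotation + self.STEP) % 360.0  # clockwise
        elif key == "space":
            return self.frontmost_facet()  # select the frontmost-facing facet
        return None

    def frontmost_facet(self):
        index = int(round(self.rotation / self.STEP)) % len(self.facets)
        return self.facets[index]

tetra = Tetrahedron(["technical", "performance", "inventory"])
tetra.handle_key("right")
print(tetra.handle_key("space"))  # the facet now rotated to the front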
[0122] In response to a selection by a user of one of the facets of the tetrahedron 912, the processing resource 900 interrogates the fourth application server 118 to determine (Step 766) if another data sub-structure subtends from the third data sub-structure associated with the tetrahedron 912 in relation to the attribute class corresponding to the facet selected. If the processing resource 900 determines that the fifth data sub-structure subtends from or relates to the third data sub-structure, which is indeed the case in relation to the data structure described herein, the processing resource 900 determines (Step 758), from information available from the database server 108, a set of secondary attribute classes associated with the fifth data sub-structure and controls the GUI to display a third 3D object 922. In this example, the third 3D object 922 is another polyhedron, such as a square pyramid, having a number of selectable elements, for example facets, corresponding to the number of attribute classes in the set of secondary attribute classes (Steps 760 and 762). Hence, it can be seen that the second and third three-dimensional objects have different appearances. In order to identify a causal relationship between the primary attribute class of the third data sub-structure associated with the selected, second, facet 918 and the fifth data sub-structure associated with the square pyramid 922, a second link line 924 is drawn as extending from the selected facet 918 to the square pyramid 922 in order to show that the set of secondary attribute classes subtends from the primary attribute class associated with the selected, second, facet 918. Hence, it can be seen that the tetrahedron 912, or any other polyhedron described herein, provides a degree of control over access to data stored by the database server 108.
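The following Python fragment is a hedged sketch of this drill-down step: when a facet is selected, the logic checks whether a further data sub-structure subtends from the corresponding attribute class and, if so, creates the next polyhedron together with a link line back to the selected facet. The function query_subtending_structure() stands in for the database and application-server interrogation and is an assumption, as are the facet and attribute-class names used.

def query_subtending_structure(facet):
    # Placeholder for interrogating the database server for a sub-structure
    # that subtends from the attribute class represented by this facet.
    catalogue = {"performance": ["market scenario", "market capability",
                                 "engagement needs", "version control"]}
    return catalogue.get(facet)

def on_facet_selected(facet, scene):
    attribute_classes = query_subtending_structure(facet)
    if attribute_classes is None:
        return None  # nothing subtends: treat the selection as a data request
    polyhedron = {"facets": attribute_classes}     # e.g. a square pyramid
    link_line = {"from": facet, "to": polyhedron}  # shows the causal relationship
    scene.append(polyhedron)
    scene.append(link_line)
    return polyhedron

scene = []
on_facet_selected("performance", scene)
print(scene)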
[0123] The square pyramid 922 also has a number of selectable elements, for example a first facet 926, a second facet 928, a third facet 930 and a fourth facet 932. As in relation to the tetrahedron 912, the user can use the keyboard and/or mouse and/or the motion sensitive controller in order to manipulate the square pyramid 922, for example to rotate the square pyramid 922, and select (Step 764) one of the facets 926, 928, 930, 932, of the square pyramid 922. In this example, the second, third and fourth facets 928, 930, 932 of the square pyramid 922 represent the market scenario attribute class 200, the market capability attribute class 202 and the engagement needs attribute class 204, which constitute secondary attribute classes of the performance attribute class 208. In this example, the first facet 926 is used for version control wherein the information subset attributes and the data substructure associated with the first facet 926 are arranged to be identical to those associated with the second facet 918 of the tetrahedron 912. The first facet 926 is derived from the datasets associated with the second, third and fourth facets 928, 930, 932 by the application server 118.
Although the attributes and dataset structures are identical, the data values in relation to the second facet 918 and the first facet 926 can differ. The second link line 924 signifies the dataset attributes and data substructures necessary for effective communication between the owners of the datasets in relation to the tetrahedron 912 and the owners of the datasets in relation to the square pyramid 922.
[0124] The second link line 924 can have user account privileges associated therewith, the account privileges being stored on the database server 108 such that a user who has read access to the data subsets at one end of the second link line 924 may or may not have access to the data contained in the data subsets at the other end of the second link line 924.
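A minimal sketch, assuming a simple per-user, per-dataset privilege table, of how read access could differ between the two ends of a link line is set out below. The data layout, user names and dataset names are assumptions; the specification only requires that account privileges be stored by the database server and checked for each end of the link line.

link_privileges = {
    # (user, dataset) -> set of rights
    ("alice", "tetrahedron_912_datasets"): {"read", "write"},
    ("alice", "square_pyramid_922_datasets"): {"read"},
    ("bob", "tetrahedron_912_datasets"): {"read"},
    # bob has no entry for the datasets at the other end of the link line
}

def can_read(user, dataset):
    return "read" in link_privileges.get((user, dataset), set())

print(can_read("bob", "tetrahedron_912_datasets"))     # True
print(can_read("bob", "square_pyramid_922_datasets"))  # False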
[0125] In response to selection of one of first, second, third or fourth facets 926, 928, 930, 932 of the square pyramid 922, for example the fourth facet 932, the third application server 116 determines (Step 766) whether another data sub-structure subtends from the fifth data sub-structure associated with the square pyramid 922 in relation to the attribute class corresponding to the facet selected. If the processing resource 900 determines that the sixth data sub-structure subtends from or relates to the fifth data sub-structure, the third application server 116 determines (Step 758), from information available from the database server 108, a set of tertiary attribute classes associated with the sixth data sub-structure. The processing resource 900 then controls, with information structured by the fourth application server 118, the GUI to display (Steps 760 and 762) at least one fourth 3D object. In this example, the at least one fourth 3D object comprises a first cube 934, a second cube 936, a third cube 938, and a fourth cube 940 corresponding to the number of attribute classes in the set of tertiary attribute classes. In this example, the first, second, third and fourth cubes 934, 936, 938, 940 are revealed substantially at the same time. In order to identify a causal relationship between the secondary attribute class of the fifth data sub-structure associated with the selected, fourth, facet 932 of the square pyramid 922 and the sixth data sub-structure associated with the first, second, third and fourth cubes 934, 936, 938, 940, a third link line 942 extends from the selected, fourth, facet 932 to the first, second, third and fourth cubes 934, 936, 938, 940. In this example, the third link line 942 comprises a first branching portion 944 to signify a relationship between each of the first, second, third and fourth cubes 934, 936, 938, 940 and the selected, fourth, facet 932. The first, second, third and fourth cubes 934, 936, 938, 940 each constitute a terminus three-dimensional object. The skilled person should appreciate that although this example has been described in the context of four cubes, a greater or fewer number of cubes can be employed, for example two cubes, depending upon the nature of the data structure.
[0126] As in relation to the tetrahedron 912 and the square pyramid 922, each of the first, second, third and fourth cubes 934, 936, 938, 940 can be manipulated, for example rotated, in order to reveal and/or select data relating to tertiary attribute classes associated with the secondary attribute class associated with the selected, fourth, facet 932. In this example, each facet of the first cube 934 represents sensor owners, each facet of the second cube 936 represents sensor operators, each facet of the third cube 938 represents sensor legislatory bodies and each facet of the fourth cube 940 represents sensor maintenance. Upon detection by the processing resource 900 of a selection (Step 764) in respect of one of the cubes 934, 936, 938, 940, for example by pressing the "enter" key of the keyboard, the processing resource 900 interrogates the database server 108 in order to retrieve (Step 768) the data associated with the tertiary attribute class selected, for example evidence, such as textual evidence, because the processing resource 900 determines (Step 766) that further attribute classes are not required or do not exist beyond the level in the data structure reached. Typically, the data retrieved is displayed by the GUI in a two-dimensional format, for example a form, report, graph or chart. However, it should be appreciated that the retrieved information can be presented in a three-dimensional format if desired. In another example, a list can be generated adjacent each of the first, second, third, and fourth cubes 934, 936, 938, 940 to represent each facet of each cube, each facet representing a data heading or field in a table. The terminus three-dimensional objects can also be labelled, for example with the names of tables.
[0127] The above process is repeated until exploration of the data structure by the user has ceased. It should be appreciated that where a "branch", for example a polyhedron and any subsidiary objects, subtends from a selected facet of another polyhedron and is currently displayed, re-selection of the selected facet is interpreted by the processing resource 900 as a "collapse" command as opposed to an "expand" command and the branch is no longer displayed.
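By way of a short illustrative sketch only, the expand/collapse behaviour just described can be expressed as a toggle keyed on the selected facet: re-selecting a facet whose branch is already displayed collapses it. The dictionary of displayed branches and the builder callback are assumptions introduced for the example.

displayed_branches = {}

def toggle_branch(facet, build_branch):
    # Expand the branch subtending from the facet, or collapse it if already shown.
    if facet in displayed_branches:
        del displayed_branches[facet]  # collapse: the branch is no longer displayed
        return "collapsed"
    displayed_branches[facet] = build_branch(facet)
    return "expanded"

print(toggle_branch("performance", lambda f: {"polyhedron": "square pyramid"}))
print(toggle_branch("performance", lambda f: {"polyhedron": "square pyramid"}))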
[0128] Although, in the above example, the terminus three-dimensional objects are described as subtending from the third three-dimensional object, namely the square pyramid 922, in another embodiment the number of levels through which the user can drill down can be fewer than described herein and a terminus three-dimensional object can constitute the third three-dimensional object, the terminus three-dimensional object being accompanied by one or more other terminus three-dimensional objects where appropriate in accordance with the data structure. Of course, the converse situation can exist and the data structure can have a greater number of levels, through which the user can drill down, than described herein.
[0129] In this example, the traversal of the levels between data structures and the exploration of attribute classes at any given level have privilege rights associated therewith. For example, access to certain data may need to be restricted due to restrictions of use governed by intellectual property rights associated with such access. The database server 108 can therefore be arranged to have an account structure that links an account of a user to access privileges for data sub-structures. Such access rights can be set with respect to read, write and/or execution as may be required. The three-dimensional objects can each have an associated user community having the skills relevant to the task that the data structures support.
[0130] Referring again to Figure 10, as can be seen from the above examples, a user having appropriate access privileges can interrogate any of the three-dimensional objects that represent an association of attribute classes, for example the secondary attribute class object 922, to access information in relation to how the different classes of attributes of the secondary attribute class object 922 are derived and managed with respect to the second link line 924 and the selected facet of the primary attribute object 912. Additionally, the user can interrogate any three-dimensional object by way of a unique user input action (Step 764), such as depressing the shift key and simultaneously clicking on a left button of the mouse, which causes recognition by the processing resource 900 of the event as a user selection of a data management request (Step 770) and causes management information required by the authorised user to be recovered (Step 772) from the database server 108. The management information in respect of a given three-dimensional object describes how the data of data sub-structures are mapped onto higher structures and vice versa. For example, the Market capability mapping process 206 produces the performance attribute class 208 from the scenario attribute class 200, the engagement needs attribute class 204 and the market capabilities class 202. Hence, in the context of the primary attribute object 912 and the secondary attribute class object 922, the management information describes how resources are used to transform the data contained in the data sub-structures associated with the second, third and fourth facets 928, 930, 932 of the secondary attribute class object 922 into the data contained in the data structure associated with the second facet 918 of the primary attribute object 912.
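A hedged Python sketch of distinguishing an ordinary selection from a data management request (for example, the shift key held while left-clicking an object) is given below. The event fields, function names and returned structures are assumptions for illustration, not the specification's interfaces.

def fetch_management_info(object_id, user):
    # Placeholder for interrogating the database server; in the described
    # system this is restricted to authorised users.
    return {"object": object_id, "mapping": "how sub-structure data maps onto higher structures"}

def select_object(object_id):
    return {"selected": object_id}

def handle_click(event, user):
    if event.get("shift") and event.get("button") == "left":
        # Recognised as a data management request (Step 770):
        # recover the management information for the selected object (Step 772).
        return fetch_management_info(event["object_id"], user)
    return select_object(event["object_id"])

print(handle_click({"shift": True, "button": "left", "object_id": "pyramid_922"}, "alice"))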
[0131] In another example (Figures 14 and 15), the data structure can be explored further using the user interface apparatus to view, for example, other attribute classes and/or data. For example, selection of another facet, such as the second facet 928 of the square pyramid 922, and detection of selection thereof results in the processing resource 900 controlling the GUI to display at least one fifth 3D object. In this example, the at least one fifth 3D object comprises a fifth cube 946, a sixth cube 948, a seventh cube 950 and an eighth cube 952. In order to identify a causal relationship between the second attribute class chosen by selection of the second facet 928 of the square pyramid 922 and another data sub-structure associated with the fifth, sixth, seventh and eighth cubes 946, 948, 950, 952, a fourth link line 954 extends from the selected, second, facet 928 to the fifth, sixth, seventh and eighth cubes 946, 948, 950, 952. Additionally, the fourth link line 954 comprises a second branching portion 956 to signify a relationship between the fifth, sixth, seventh and eighth cubes 946, 948, 950, 952 and the selected, second, facet 928.
[0132] Similarly, by modifying the viewing angle associated with three-dimensional objects 906, the user can select a further facet, such as the third facet 930 of the square pyramid 922, and detection of selection thereof results in the processing resource 900 controlling the GUI to display at least one sixth 3D object.
In this example, the at least one sixth 3D object comprises a ninth cube 958, a tenth cube 960, an eleventh cube 962 and a twelfth cube 964. In order to identify a causal relationship between the second attribute class chosen by selection of the third facet 930 of the square pyramid 922 and a further data sub-structure associated with the ninth, tenth, eleventh and twelfth cubes 958, 960, 962, 964, a fifth link line 966 extends from the selected, third, facet 930 to the ninth, tenth, eleventh and twelfth cubes 958, 960, 962, 964. Additionally, the fifth link line 966 comprises a third branching portion 968 to signify a relationship between the ninth, tenth, eleventh and twelfth cubes 958, 960, 962, 964 and the selected, third, facet 930.
[0133] As can be seen, the data structure is hierarchical and, in this example, comprises levels and branches in accordance with the topology associated with the data structure. Using the user interface apparatus, the user can "drill-down" through the cascaded data structure by selection in respect of a number of the 3D objects 906 in order to access data stored by the database server 108.
[0134] Irrespective of the lowest level of the data structure currently being displayed by the GUI and hence the number of 3D objects currently being displayed, as suggested above, the view of the current state of navigation can be manipulated, for example rotated about at least one axis and/or scaled, by use of the control mentioned above.
[0135] In another embodiment, the above-described user interface functionality can be used in conjunction with a user interface feature that is responsive to a user positioning the cursor over an icon or link associated with a target, for example of the type described in US patent publication no. 2008/0244460. Upon positioning of the cursor over an appropriate element being displayed, for example the graphical feature 910, the user interface described herein can be used to display automatically the first link line 914 and/or the second three-dimensional object 912 for interaction therewith in the manner already described above. In an alternative embodiment, the user interface can be arranged to display the graphical feature 910 in response to the cursor being positioned over an appropriate part of the representation 908 of, for example, the UAV 781. The user interface can then be used in the manner described above to drill down and/or explore the data structure.
[0136] In a further embodiment, an image captured, for example, via a digital camera can be used as a basis for displaying elements of the user interface described above. In this respect, the cursor can be positioned over a part of the image containing, for example, an image of the UAV 781. An image recognition engine (not shown) is thus provided by the second application server 114 to recognise the UAV 781 and to display, for example, the graphical feature 910.
The user interface can then be employed in the manner already described above.
The skilled person should appreciate that the image captured need not be a static image pre-stored and retrieved; the image can be obtained in real-time or near real-time via the digital camera, for example a digital camera of an electronic device, such as a portable electronic device, of any type described later herein.
[0137] As also mentioned in the above examples, control of the user interface apparatus can be achieved by alternative or additional input devices. Referring back to Figure 9, in one embodiment, a motion sensitive controller 903 is employed to detect a movement gesture made by the user. The motion sensitive controller 903 can be any suitable device capable of sensing movement thereof, for example translational and/or rotational movement. Typically, the motion sensitive controller 903 comprises one or more motion sensors, for example accelerometers and/or gyroscopic devices. Suitable devices include: a portable communications device, for example a cellular communications handset equipped with one or more motion sensitive devices, such as an iPhone® available from Apple, Inc., or a Personal Digital Assistant (PDA) equipped with one or more motion and/or direction sensitive devices; or a controller for a games console, such as a Wii® baton controller available from Nintendo®. In addition to the motion sensing device(s), the motion sensitive controller 903 can also comprise a button, keypad or other input device to support expression of a selection by a user.
[0138] The motion sensitive controller 903 is, in this example, wireless and capable of communicating motion-related data and selection-related data to the second application server 114, for example via a Bluetooth® communications link (the second application server 114 comprises a Bluetooth® interface module 901 to support the Bluetooth® communications link). The motion-related data, including any selection-related data, is received by the processing resource 900 and the processing resource 900 controls the GUI to provide any of the manipulation functionality described above.
[0139] One of the 3D objects 906 can consequently be selected by the user using the motion sensitive controller 903 and spatially manipulated in accordance with the movement gesture made by the user. For example, the movement gesture can be a rotating movement about any axis in relation to a view of the polyhedron 912. If more than one polyhedron is visible, the visible polyhedra can be spatially manipulated together. Likewise, the complex object and any visible polyhedra can be spatially manipulated together, for example the first and second 3D objects 908, 912. Similarly, the motion sensitive controller 903 can be used to select a part of one of the polyhedra and rotate the selected polyhedron about at least one axis.
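The following Python fragment is an illustrative sketch of applying motion-related data from the wireless motion sensitive controller to the currently selected 3D object. The packet fields (angular rates and a selection flag) and the sensitivity factor are assumptions; the specification only requires that motion-related and selection-related data be communicated to the processing resource, for example over the Bluetooth® link.

def apply_motion_packet(selected_object, packet, sensitivity=0.5):
    # Map angular-rate readings to incremental rotation of the selected object.
    for axis in ("x", "y", "z"):
        key = f"rot_{axis}"
        selected_object[key] = (selected_object.get(key, 0.0)
                                + packet[f"gyro_{axis}"] * sensitivity) % 360.0
    if packet.get("select"):
        # A button press on the controller expresses a selection.
        selected_object["selected"] = True
    return selected_object

obj = {"name": "tetrahedron_912"}
print(apply_motion_packet(obj, {"gyro_x": 10.0, "gyro_y": 0.0, "gyro_z": -4.0, "select": True}))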
[0140] Alternative embodiments of the invention can be implemented as a computer program product for use with a computer system, the computer program product being, for example, a series of computer instructions stored on a tangible data recording medium, such as a diskette, CD-ROM, ROM, or fixed disk, or embodied in a computer data signal, the signal being transmitted over a tangible medium or a wireless medium, for example, microwave or infrared. The series of computer instructions can constitute all or part of the functionality described above, and can also be stored in any memory device, volatile or non-volatile, such as semiconductor, magnetic, optical or other memory device.

Claims (58)

1. A user interface apparatus comprising: a graphical user interface; a processing resource arranged to control, when in use, the graphical user interface for navigating through a data structure associated with data stored in a database, the data structure having a topology; wherein the processing resource controls the graphical user interface in order to present three-dimensional objects, the three-dimensional objects being selectable in order to enable visual navigation through the topology of the data structure.
2. An apparatus as claimed in Claim 1, wherein the data structure comprises data sub-structures arranged according to the topology, the processing resource being arranged to provide a visual indication between a number of the three-dimensional objects where relationships respectively exist between a number of the data sub-structures associated with the number of the three-dimensional objects in accordance with the topology.
3. An apparatus as claimed in Claim 1, wherein the three-dimensional objects comprise a first three-dimensional object and a second three-dimensional object, the processing resource being arranged to reveal selectively the second three-dimensional object in response to a selection in relation to the first three-dimensional object.
4. An apparatus as claimed in Claim 3, wherein the data structure comprises a first data sub-structure associated with the data structure in accordance with the topology, and the second three-dimensional object is drawn as logically subtending from the first three-dimensional object so as to represent a relationship between the second three-dimensional object and the first data sub-structure.
5. An apparatus as claimed in any one of the preceding claims, wherein the topology is hierarchical.
6. An apparatus as claimed in Claim 3 or Claim 4, wherein the three-dimensional objects comprise a third three-dimensional object, the processing resource being arranged to reveal selectively the third three-dimensional object in response to another selection in relation to the second three-dimensional object.
7. An apparatus as claimed in Claim 4, wherein the data structure comprises a third data sub-structure associated with the second data sub-structure in accordance with the topology, and the third three-dimensional object logically subtends from the second three-dimensional object so as to represent the relationship between the third data sub-structure and the second data sub-structure according to the topology.
8. An apparatus as claimed in Claim 1, wherein the topology comprises a node corresponding to the data structure as a whole, and the data structure comprises a data sub-structure corresponding to a sub-node in the topology that subtends from the most senior node; wherein the processing resource is arranged to control, when in use, the graphical user interface in order to present a first three-dimensional object visually representing the data stored in the database, the first three-dimensional object comprising a graphical feature selectable via the graphical user interface that visually represents the data sub-structure.
9. An apparatus as claimed in Claim 8, wherein the processing resource is arranged to control, when in use, the graphical user interface in order to present a second three-dimensional object subtending from the first three-dimensional object in response to selection of the graphical feature.
10. An apparatus as claimed in Claim 9, wherein the second three-dimensional object subtends from the graphical feature of the first three-dimensional object.
11. An apparatus as claimed in any one of Claims 3 to 7 or 9, wherein the second three-dimensional object is a polyhedron.
12. An apparatus as claimed in any one of Claims 3 to 7 or 9 to 11, wherein the second three-dimensional object is rotatable about at least one axis.
13. An apparatus as claimed in any one of Claims 3 to 12, wherein the first three-dimensional object is a non-geometric shape.
14. An apparatus as claimed in any one of Claims 3 or 8 to 13, wherein the first three-dimensional object is a graphical representation of a machine.
15. An apparatus as claimed in any one of Claims 3 to 7 or 9 to 14, wherein the second three-dimensional object comprises a plurality of selectable elements.
16. An apparatus as claimed in Claim 15, when dependent upon Claim 9, wherein the processing resource is arranged to control, when in use, the graphical user interface in order to present a third three-dimensional object subtending from the second three-dimensional object in response to another selection in respect of the second three-dimensional object.
17. An apparatus as claimed in Claim 16, wherein the third three-dimensional object is presented in response to the another selection via the graphical user interface in respect of an element of the plurality of selectable elements.
18. An apparatus as claimed in Claim 15, when dependent upon Claim 6, wherein the third three-dimensional object is presented in response to the another selection via the graphical user interface in respect of an element of the plurality of selectable elements.
19. An apparatus as claimed in any one of Claims 15 to 18, wherein the plurality of selectable elements is a plurality of facets.
20. An apparatus as claimed in any one of Claims 3 to 19, wherein the first three-dimensional object is rotatable about at least one axis.
21. An apparatus as claimed in any one of Claims 3 to 20, wherein the first three-dimensional object is scalable in response to a scaling command provided via the graphical user interface.
22. An apparatus as claimed in any one of Claims 3 to 21, wherein a portion of the data is categorisable into a set of attributes, the second three-dimensional object corresponding to a primary set of attributes.
23. An apparatus as claimed in Claim 22, wherein the processing resource is further arranged to access, when in use, the database in order to retrieve attribute data associated with an attribute of the primary set of attributes in response to a selection via the graphical user interface in relation to the second three-dimensional object.
24. An apparatus as claimed in Claim 23, when dependent upon Claim 16, wherein attribute data associated with the attribute of the primary set of attributes is categorisable into a set of secondary attributes, the third three-dimensional object corresponding to the secondary set of attributes, the secondary set of attributes subtending from the attribute of the primary set of attributes.
25. An apparatus as claimed in Claim 24, wherein the third three-dimensional object is manipulatable in order to control access of data in the database associated with an attribute of the secondary set of attributes.
26. An apparatus as claimed in any one of Claims 3 to 5 or Claims 8 to 25, wherein the second and third three-dimensional objects have different appearances.
27. An apparatus as claimed in Claim 6 or Claim 16, wherein the processing resource is arranged to control, when in use, the graphical user interface in order to present a first terminus three-dimensional object.
28. An apparatus as claimed in Claim 27, wherein the third three-dimensional object is the first terminus three-dimensional object and is revealed in response to the another selection in relation to the second three-dimensional object.
29. An apparatus as claimed in Claim 27, wherein the first terminus three-dimensional object subtends from the third three-dimensional object and is revealed in response to a further selection in relation to the third three-dimensional object.
30. An apparatus as claimed in Claim 28 or Claim 29, wherein the processing resource is arranged to control, when in use, the graphical user interface in order to present a second terminus three-dimensional object, the processing resource being arranged to reveal selectively the second terminus three-dimensional object substantially at the same time as the first terminus three-dimensional object.
31. An apparatus as claimed in Claim 30, when dependent upon Claim 29, wherein both the first and second terminus objects subtend from the third three-dimensional object.
32. An apparatus as claimed in Claim 30, when dependent upon Claim 28, wherein both the first and second terminus objects subtend from the second three-dimensional object.
33. An apparatus as claimed in Claim 27, wherein the processing resource is arranged to retrieve, when in use, data from the database in response to a yet further selection in relation to the first terminus three-dimensional object.
34. An apparatus as claimed in Claim 6 or Claim 16, wherein the processing resource is arranged to control, when in use, the graphical user interface in order to present a plurality of terminus three-dimensional objects subtending from the third three-dimensional object in response to a further selection in respect of the third three-dimensional object.
35. An apparatus as claimed in Claim 34, wherein the processing resource is arranged to retrieve, when in use, data from the database in response to a yet further selection in relation to a first terminus three-dimensional object of the plurality of terminus three-dimensional objects.
36. An apparatus as claimed in Claim 27 or Claim 35, wherein the first terminus three-dimensional object is rotatable about at least one axis.
37. An apparatus as claimed in Claim 33 or Claim 35, wherein the data retrieved is presented in a two-dimensional format.
38. An apparatus as claimed in Claim 33 or Claim 35, wherein the data retrieved is presented as a form or a report.
39. An apparatus as claimed in Claim 1, wherein at least one of the three-dimensional objects is spatially manipulatable.
40. An apparatus as claimed in Claim 1, wherein the processing resource supports drilling-down through the data structure by selection in respect of a number of the three-dimensional objects.
41. An apparatus as claimed in Claim 39, wherein the processing resource supports drilling-down through the data structure by spatial manipulation of a number of the three-dimensional objects.
42. An apparatus as claimed in Claim 1, wherein presentation of a three-dimensional object of the three-dimensional objects is subject to a user access privilege.
43. An apparatus as claimed in Claim 2, wherein presentation of the second three-dimensional object is subject to a user access privilege.
44. An apparatus as claimed in Claim 1, wherein the database comprises retrievable management information in respect of a three-dimensional object to facilitate optimisation of interpretation, manipulation, sequencing and/or communication of data structures associated with a three-dimensional object.
45. An apparatus as claimed in Claim 3, further comprising an image comprising a two-dimensional representation of the first three-dimensional object.
46. An apparatus as claimed in Claim 45, wherein the second three-dimensional object is revealed in response to positioning a cursor over the two-dimensional representation of the three-dimensional object.
47. A user interface system comprising: a user interface apparatus as claimed in any one of the preceding claims; and a motion sensitive controller translatable, when in use, by a movement gesture, the motion sensitive controller being capable of detecting the movement gesture and communicating movement gesture data associated with the movement gesture to the processing resource; wherein the processing resource is arranged to manipulate a selected three-dimensional object in accordance with the movement gesture detected.
48. A system as claimed in Claim 47, wherein the movement gesture is a rotation gesture.
49. A system as claimed in Claim 47 or Claim 48, when dependent upon Claim 3 or Claim 8, wherein a view of the first three-dimensional object is controllable using the movement gesture data from the motion sensitive controller.
50. A system as claimed in Claim 47 or Claim 48, when dependent upon Claim 3 or Claim 9, wherein a view of the first three-dimensional object and the second three-dimensional object is controllable using the movement gesture data from the motion sensitive controller.
51. A system as claimed in any one of Claims 47 to 50, wherein the motion sensitive controller is a controller for a games console.
52. A system as claimed in any one of Claims 47 to 50, wherein the motion sensitive controller is a portable communications device comprising a motion sensing device.
53. A system as claimed in any one of Claims 47 to 50, wherein the motion sensitive controller is a Personal Digital Assistant (PDA) comprising a motion sensing device.
54. A system as claimed in Claim 52, wherein the motion sensitive controller is an iPhone®.
55. A method of navigating through a data structure, the method comprising: controlling a graphical user interface for navigating through a data structure associated with data stored in a database, the data structure having a topology; wherein controlling the graphical user interface comprises presenting three-dimensional objects, the three-dimensional objects being selectable in order to enable visual navigation through the topology of the data structure.
56. A user interface apparatus substantially as hereinbefore described with reference to Figures 1 and 9 to 11, 13 and 14.
57. A method of interfacing with a user substantially as hereinbefore described with reference to Figures 1 and 9 to 14.
58. A user interface system substantially as hereinbefore described with reference to Figures 1 and 9 to 14.
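
By way of illustration only, and not forming part of the claims or the specification: the sketch below models the claimed navigation idea in Python under invented, hypothetical names (SceneObject, fetch_record). It shows selection of a non-terminus three-dimensional object revealing the objects subtending from it, selection of a terminus object retrieving data for two-dimensional presentation (cf. Claims 33 to 38 and 55), and movement-gesture data, such as that from a motion sensitive controller, rotating a selected object (cf. Claims 36 and 47 to 54).

```python
# Illustrative sketch only; all identifiers are invented for this example and
# are not taken from the specification.
from __future__ import annotations

from dataclasses import dataclass, field
from typing import Optional, Union


def fetch_record(record_id: Optional[str]) -> dict:
    """Stand-in for a database query; a real apparatus would retrieve the
    stored data associated with the terminus object here."""
    return {"id": record_id, "presentation": "two-dimensional form or report"}


@dataclass
class SceneObject:
    """A selectable three-dimensional object bound to a node of the data structure."""
    label: str
    record_id: Optional[str] = None                  # set only on terminus objects
    children: list[SceneObject] = field(default_factory=list)
    rotation: tuple[float, float, float] = (0.0, 0.0, 0.0)

    def select(self) -> Union[list[SceneObject], dict]:
        """Selecting a non-terminus object reveals the objects subtending from it;
        selecting a terminus object retrieves data from the database."""
        return self.children if self.children else fetch_record(self.record_id)

    def rotate(self, dx: float, dy: float, dz: float) -> None:
        """Apply movement-gesture data (e.g. from a motion sensitive controller)."""
        x, y, z = self.rotation
        self.rotation = (x + dx, y + dy, z + dz)


# Drill down from a first object, through a subtending object, to a terminus object.
root = SceneObject("Departments", children=[
    SceneObject("Sales", children=[SceneObject("Order 42", record_id="42")]),
])
revealed = root.select()            # reveals the "Sales" object
terminus = revealed[0].select()[0]  # reveals the terminus "Order 42" object
terminus.rotate(0.0, 15.0, 0.0)     # gesture data rotates the selected object
print(terminus.select())            # retrieves data for two-dimensional presentation
```

Running the sketch prints the stand-in record, mirroring the drill-down and retrieval sequence described in the claims; the hierarchy, labels and rotation values are invented purely for the example.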
GB0917293A 2009-10-02 2009-10-02 Graphical user interface using three dimensional objects. Withdrawn GB2474053A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB0917293A GB2474053A (en) 2009-10-02 2009-10-02 Graphical user interface using three dimensional objects.

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB0917293A GB2474053A (en) 2009-10-02 2009-10-02 Graphical user interface using three dimensional objects.

Publications (2)

Publication Number Publication Date
GB0917293D0 (en) 2009-11-18
GB2474053A (en) 2011-04-06

Family

ID=41393759

Family Applications (1)

Application Number Title Priority Date Filing Date
GB0917293A Withdrawn GB2474053A (en) 2009-10-02 2009-10-02 Graphical user interface using three dimensional objects.

Country Status (1)

Country Link
GB (1) GB2474053A (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0435601A2 (en) * 1989-12-29 1991-07-03 Xerox Corporation Display of hierarchical three-dimensional structures
WO1997036251A1 (en) * 1996-03-28 1997-10-02 Critical Thought, Inc. User interface navigational system and method for interactive representation of information contained within a database
WO2000054139A1 (en) * 1999-03-08 2000-09-14 The Procter & Gamble Company Method and apparatus for interactively displaying three-dimensional representations of database contents
GB2352940A (en) * 1999-07-08 2001-02-07 Gordon Ross Multi-dimensional communications using database linking
US20010028369A1 (en) * 2000-03-17 2001-10-11 Vizible.Com Inc. Three dimensional spatial user interface
WO2002023402A2 (en) * 2000-09-12 2002-03-21 Althea Mitchell Multidimensional database analysis tool

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2891047A4 (en) * 2012-08-29 2016-04-20 Samsung Electronics Co Ltd Performing actions through a user interface
US10048832B2 (en) 2012-08-29 2018-08-14 Samsung Electronics Co., Ltd. Performing actions through a user interface
US10642898B1 (en) * 2017-04-11 2020-05-05 Northrop Grumman Systems Corporation Three-dimensional graph

Also Published As

Publication number Publication date
GB0917293D0 (en) 2009-11-18

Similar Documents

Publication Publication Date Title
Wang et al. Integrating Augmented Reality with Building Information Modeling: Onsite construction process controlling for liquefied natural gas industry
Cheng et al. State-of-the-art review on mixed reality applications in the AECO industry
US10304021B2 (en) Metadata-configurable systems and methods for network services
Martínez-Rojas et al. The role of information technologies to address data handling in construction project management
Seffino et al. WOODSS—a spatial decision support system based on workflows
US20160103903A1 (en) Systems, devices, and methods for generation of contextual objects mapped by dimensional data to data measures
US20170235466A1 (en) System and Method to Generate Interactive User Interface for Visualizing and Navigating Data or Information
US20140152564A1 (en) Presentation selection by device orientation
US20120310602A1 (en) Facilities Management System
US20120162265A1 (en) Computer-implemented method for specifying a processing operation
US20130159036A1 (en) Runtime generation of instance contexts via model-based data relationships
US7672969B1 (en) Context based configuration management system
Lucchi Digital twins for the automation of the heritage construction sector
Camba et al. On the integration of model-based feature information in Product Lifecycle Management systems
CN109952752A (en) For the authorization of having ready conditions of isolation set
Harinath et al. Professional SQL server analysis services 2005 with MDX
GB2474053A (en) Graphical user interface using three dimensional objects.
Hu et al. Creativity-based design innovation environment in support of robust product development
US9268883B2 (en) User interface for presenting information about a product structure for a product
EP2220601A1 (en) Technology enterprise management apparatus and method therefor
Xue et al. A Conceptual Architecture for Adaptive Human‐Computer Interface of a PT Operation Platform Based on Context‐Awareness
EP2992486A1 (en) Systems, devices, and methods for generation of contextual objects mapped by dimensional data to data measures
US10679669B2 (en) Automatic narration of signal segment
US20180204473A1 (en) Sharing signal segments of physical graph
US20230351327A1 (en) Category classification of records of e-procurement transactions

Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)