US20020069072A1 - Augmented-reality system with voice-based recording of information data, in particular of service reports - Google Patents

Augmented-reality system with voice-based recording of information data, in particular of service reports Download PDF

Info

Publication number
US20020069072A1
US20020069072A1 (application US 09/945,774)
Authority
US
Grant status
Application
Patent type
Prior art keywords
data
recording
system
means
voice
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US09945774
Inventor
Wolfgang Friedrich
Wolfgang Wohlgemuth
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Siemens AG
Original Assignee
Siemens AG
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date

Links

Images

Classifications

    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05B - CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00 - Programme-control systems
    • G05B19/02 - Programme-control systems electric
    • G05B19/418 - Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS], computer integrated manufacturing [CIM]
    • G05B19/41875 - Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS], computer integrated manufacturing [CIM] characterised by quality surveillance of production
    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05B - CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00 - Programme-control systems
    • G05B19/02 - Programme-control systems electric
    • G05B19/18 - Numerical control [NC], i.e. automatically operating machines, in particular machine tools, e.g. in a manufacturing environment, so as to execute positioning, movement or co-ordinated operations by means of programme data in numerical form
    • G05B19/409 - Numerical control [NC], i.e. automatically operating machines, in particular machine tools, e.g. in a manufacturing environment, so as to execute positioning, movement or co-ordinated operations by means of programme data in numerical form characterised by using manual input [MDI] or by using control panel, e.g. controlling functions with the panel; characterised by control panel details, by setting parameters
    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05B - CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00 - Programme-control systems
    • G05B19/02 - Programme-control systems electric
    • G05B19/418 - Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS], computer integrated manufacturing [CIM]
    • G05B19/4183 - Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS], computer integrated manufacturing [CIM] characterised by data acquisition, e.g. workpiece identification
    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05B - CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00 - Program-control systems
    • G05B2219/30 - Nc systems
    • G05B2219/31 - From computer integrated manufacturing till monitoring
    • G05B2219/31027 - Computer assisted manual assembly CAA, display operation, tool, result
    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05B - CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00 - Program-control systems
    • G05B2219/30 - Nc systems
    • G05B2219/32 - Operator till task planning
    • G05B2219/32014 - Augmented reality assists operator in maintenance, repair, programming, assembly, use of head mounted display with 2-D 3-D display and voice feedback, voice and gesture command
    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05B - CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00 - Program-control systems
    • G05B2219/30 - Nc systems
    • G05B2219/35 - Nc in input of data, input till input file format
    • G05B2219/35482 - Eyephone, head-mounted 2-D or 3-D display, also voice and other control
    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05B - CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00 - Program-control systems
    • G05B2219/30 - Nc systems
    • G05B2219/35 - Nc in input of data, input till input file format
    • G05B2219/35494 - Online documentation, manual, procedures, operator, user guidance, assistance
    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05B - CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00 - Program-control systems
    • G05B2219/30 - Nc systems
    • G05B2219/35 - Nc in input of data, input till input file format
    • G05B2219/35495 - Messages to operator in multimedia, voice and image and text
    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P - CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00 - Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/02 - Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]
    • Y02P90/04 - Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS] characterised by the assembly processes
    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P - CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00 - Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/02 - Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]
    • Y02P90/10 - Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS] characterised by identification, e.g. of work pieces or equipment
    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P - CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00 - Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/02 - Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]
    • Y02P90/22 - Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS] characterised by quality surveillance of production

Abstract

The invention relates to an augmented-reality system and method for transmitting information data, in particular data relating to a process controlled by automation technology, to a production plant and/or to a machine, from a user, for example a service technician, to a storage medium in the augmented-reality system. The augmented-reality system has recording means for recording voice inputs by the user, in particular service logs, using voice-based input. Service logs can therefore be created quickly and easily, stored centrally and archived, so that the appropriate information is reliably available for later, similar instances of action.

Description

    BACKGROUND OF THE INVENTION
  • The invention relates to an augmented-reality system for transmitting information data from a user, for example from a service technician, to a storage medium in the augmented-reality system. [0001]
  • Such a system and method are used, for example, in the field of automation technology, for production machinery and machine tools, in diagnostic/service support systems, and for complex components, equipment and systems, such as vehicles and industrial machinery and installations. [0002]
  • The technical article Daude R. et al: “Head-Mounted Display als facharbeiterorientierte Unterstützungskomponente an CNC-Werkzeugmaschinen” [Head-mounted display as skilled-worker-oriented support component on CNC machine tools], Werkstattstechnik, DE, Springer Verlag, Berlin, Vol. 86, No. 5, May 1, 1996, pages 248-252, XP000585192 ISSN: 0340-4544, describes the head-mounted display (HMD) as a component for supporting the skilled worker in setting up, running in and managing faults during milling work. The engineering link between the HMD and a modern NC controller is explained, and the results of a laboratory experiment with the HMD are reported. [0003]
  • The invention is based on the object of specifying a system and a method which permit simple and reliable recording and storage of information data, in particular of service reports. [0004]
  • This object is achieved by a system and by a method having the features specified in claims 1 and 9, respectively. The invention is based on the insight that service staff with simple training often have only a limited ability to document a servicing action in writing. As a result, no reliable information is available for future instances of action. The voice-based recording of the service reports, possibly also in connection with information data, i.e. documentation data and/or process data for an installation controlled by automation technology, provides a simple and reliable way of recording and storing the information data, in particular the service reports. The service reports can be stored and associated under voice control or under the control of the other recording means, such as image-recording means, i.e. in a context-dependent fashion based on the respective current operating situation ascertained by the recording means. When recording the action reports, it is also possible to record other data, such as process values, signal values and video images, at the same time as the voice input. The augmented-reality system can provide a service technician who has been given a particular task with in-situ access to suitable, previously stored service logs, in the form of audio and/or image data, on the basis of the problem at hand, so that the service technician profits in a simple manner from service actions already recorded by colleagues, and an expert needs to be called in only in exceptional cases. [0005]
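The context-dependent storage described above can be illustrated with a short sketch. The following Python fragment is purely illustrative and uses invented names (ServiceLogEntry, LogArchive); the patent does not prescribe any particular data structure. It shows a voice-based service report being stored together with process values, optional camera frames and the operating context ascertained by the recording means, so that a later technician can retrieve earlier reports for the same component.

    # Hypothetical sketch: context-dependent storage of voice-based service logs.
    # All names (ServiceLogEntry, LogArchive, ...) are illustrative assumptions,
    # not identifiers from the patent.
    from dataclasses import dataclass, field
    from datetime import datetime
    from typing import Dict, List


    @dataclass
    class ServiceLogEntry:
        technician: str                     # the user, e.g. a service technician
        context_id: str                     # component / operating situation ascertained by the recording means
        audio: bytes                        # the voice-based service report
        process_values: Dict[str, float] = field(default_factory=dict)   # dynamic data, e.g. temperature, pressure
        video_frames: List[bytes] = field(default_factory=list)          # optional images from the camera
        timestamp: datetime = field(default_factory=datetime.utcnow)


    class LogArchive:
        """Central store that associates service reports with their operating context."""

        def __init__(self) -> None:
            self._by_context: Dict[str, List[ServiceLogEntry]] = {}

        def store(self, entry: ServiceLogEntry) -> None:
            self._by_context.setdefault(entry.context_id, []).append(entry)

        def reports_for(self, context_id: str) -> List[ServiceLogEntry]:
            # Earlier reports for the same component, for reuse by the next technician.
            return self._by_context.get(context_id, [])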
  • Advantageous refinements involve the documentation data being static and/or dynamic information data. Examples of such static information are engineering data from handbooks, exploded drawings, maintenance instructions etc. Examples of dynamic information are process values such as temperature, pressure, signals etc. [0006]
  • Rapid situation-related access to the documentation data is supported further by virtue of the recording means having an image-sensing apparatus, by virtue of the evaluation means being provided for evaluating the real information such that a use context, in particular an object of the documentation data, is ascertained from the real information, and by virtue of the system having visualization means for visualizing the documentation data. [0009]
  • Rapid situation-related access to the documentation data is supported further by virtue of the recording means being user-controlled and being, in particular, in the form of voice-controlled recording means and/or recording means controlled by control data. [0010]
  • Augmented-reality techniques on the basis of the static and/or dynamic documentation and/or process data can be used in a manner which is optimum for a large number of application instances by virtue of the recording means and/or the visualization means being in the form of data goggles. [0011]
  • The invention is described in more detail and explained below using the exemplary embodiments shown in the Figures, in which: [0012]
  • FIG. 1 shows a block diagram of a first exemplary embodiment of an augmented-reality system; [0013]
  • FIG. 2 shows another block diagram of an exemplary embodiment of an augmented-reality system; and [0014]
  • FIG. 3 shows an application example of situation-related access to expert knowledge and/or documentation data. [0015]
  • FIG. 1 shows a basic illustration of an augmented-reality system for transmitting information data, in particular data relating to a process controlled by automation technology, to a production plant and/or to a machine, from a user, for example a service technician, to a storage medium in the augmented-reality system. The augmented-reality system has recording means 11 for recording voice inputs by the user 7. Such voice inputs comprise, in particular, service logs entered using voice-based input. For this purpose, the user, who is not shown explicitly in FIG. 1, is equipped with mobile equipment 4, 6. The mobile equipment 4, 6 comprises data goggles 4 holding a video camera 2 and a microphone 11. The data goggles are coupled to a device for wireless communication, for example a radio transceiver apparatus 6, which can communicate with the automation system A1 . . . An via a radio interface 15. The automation system A1 . . . An can be coupled by means of a data link 14 to an augmented-reality system 10, also referred to below as the AR system for short. The AR system contains an information module 1b for storing or accessing information data, an AR base module 8 and an AR application module 9. In one advantageous embodiment, the AR system 10 can be connected to a data network, for example to the Internet 5, by means of a data link 13, with an Internet connection 12 (shown by way of example) permitting access to memory and documentation data 1a. Similarly, the voice data and/or other data from the user can be stored at the second location O2. [0016]
  • The user, equipped with the data goggles 4 and the mobile radio transmission device 7, can move freely in the installation A1 . . . An for maintenance and servicing purposes. If, by way of example, maintenance or repair of a particular subcomponent in the installations A1 . . . An is necessary, then the camera 2 on the data goggles 4 is used, possibly controlled by voice commands recorded by the microphone 11, to set up appropriate access to the relevant documentation data 1a, 1b, for example earlier, already recorded service reports. To this end, the radio interface 15 is used to set up a data link to the installation A1 . . . An or to an appropriate radio transmission module, and to transmit the data to the AR system 10. In the AR system, the data obtained from the user are evaluated in relation to the situation, and information data 1a, 1b are accessed automatically or under interactive control by the user. The relevant documentation data 1a, 1b ascertained are transmitted to the radio transmission device 6 via the data links 14, 15; on the basis of the recorded operating situation, an analysis is thus performed which forms the basis for the selection of data from the available static information. This results in a situation-related, object-oriented or component-oriented selection of relevant knowledge from the most up-to-date data sources 1a, 1b. The information is displayed using the respective visualization component, for example a hand-held PC or data goggles; these are referred to as AR-based technologies. The user in situ is thus provided only with the information which he needs, and this information is always at the most up-to-date level. The service technician is therefore not overloaded with information, for example by a “100-page manual”. [0017]
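As a rough illustration of the retrieval loop just described, the following sketch (all names hypothetical) evaluates the data recorded by the goggles to a use context and returns only the documentation relevant to that context, drawing on both a locally held information module and remotely held documentation data. It is an assumption-laden simplification, not the patent's implementation.

    # Hypothetical sketch of the situation-related retrieval loop of FIG. 1:
    # the operating situation recorded by the data goggles is evaluated to a use
    # context, and only the documentation relevant to that context is returned
    # to the visualization component.  Names are illustrative assumptions.
    from typing import Dict, List, Optional


    def ascertain_context(image: bytes, voice_command: Optional[str]) -> str:
        """Evaluation means: derive a use context (e.g. a component id) from the
        recorded data.  A real system would use image recognition or localization;
        here only the interface is sketched."""
        if voice_command and voice_command.startswith("show "):
            return voice_command.removeprefix("show ").strip()
        return "unknown-component"


    class ARSystem:
        def __init__(self, local_docs: Dict[str, List[str]], remote_docs: Dict[str, List[str]]):
            self.local_docs = local_docs      # e.g. an information module held in the AR system
            self.remote_docs = remote_docs    # e.g. documentation reachable via the Internet

        def relevant_information(self, image: bytes, voice_command: Optional[str]) -> List[str]:
            context = ascertain_context(image, voice_command)
            # Situation-related selection: only documents for the ascertained
            # component are returned, instead of a "100-page manual".
            return self.local_docs.get(context, []) + self.remote_docs.get(context, [])


    ar = ARSystem(
        local_docs={"pump-A1": ["Maintenance instruction: replace seal kit"]},
        remote_docs={"pump-A1": ["Earlier service report: bearing noise, regreased"]},
    )
    print(ar.relevant_information(image=b"", voice_command="show pump-A1"))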
  • FIG. 2 shows another application example of a system for documentation processing for servicing and maintenance. The system comprises an augmented-reality system 10 which contains an information module 1b for storing information data, an AR base system 8 and an AR application module 9. The AR system 10 can be coupled to the Internet 5 by means of link lines 13, 18. From the Internet, an illustrative data link 12 can be used to connect to a remote PC 16 with a remote expert 22. The individual modules of the AR system 10 are coupled together by means of connections 19, 20, 21. The user communication between a user 7 and the AR system takes place via interfaces 8, 23. To this end, the AR system can be coupled to a transceiver apparatus which permits two-way data communication between the AR system 10 and the user 7, using data goggles 4, either directly via the interface 8 or via an interface 23 using a radio transceiver device 17 arranged in the area of the user 7. The connection 23 can be produced using a separate data link or using the electricity mains in the form of a “power-line” modem. Besides a display apparatus arranged in the area of the goggle lenses, the data goggles 4 contain an image-sensing apparatus 2 in the form of a camera, and also a microphone 11. The user 7 can move in the area of the installations A1 . . . An using the data goggles 4 and can carry out servicing or maintenance work. [0018]
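One possible reading of the two-way communication with the remote expert 22 is sketched below. The message types and the queue-based transport are assumptions made purely for illustration; the patent only states that the interfaces 8 and 23 permit two-way data communication.

    # Hypothetical sketch of the remote-expert cooperation of FIG. 2: camera
    # frames from the data goggles are forwarded to a remote expert, whose hints
    # are returned and shown in the technician's display.  The message format
    # and transport are illustrative assumptions.
    import queue
    from dataclasses import dataclass


    @dataclass
    class Frame:
        image: bytes


    @dataclass
    class Hint:
        text: str


    to_expert: "queue.Queue[Frame]" = queue.Queue()
    to_technician: "queue.Queue[Hint]" = queue.Queue()

    # Technician side: send the current view, then display whatever comes back.
    to_expert.put(Frame(image=b"<jpeg bytes>"))

    # Expert side: look at the frame and answer with an instruction.
    frame = to_expert.get()
    to_technician.put(Hint(text="Loosen the left clamp first, then lift the cover."))

    # Technician side again: overlay the hint in the field of view.
    print(to_technician.get().text)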
  • The data goggles 4 and the corresponding radio transceiver apparatuses, for example the radio transceiver apparatus 17 worn by the staff directly on the body, can be used to achieve a preventive functionality: first, the respective operating situation is recorded, for example by the camera 2 or by localization of the staff 7. On the basis of the recorded operating situation, the AR system selects data relevant to the installation A1 . . . An being maintained. The fundamental advantage of the system shown in FIG. 3 is that this system supports the interaction of the individual single functionalities on an application-related basis: thus, a concrete operating situation is first recorded automatically, and this operating situation is then analyzed, with the currently relevant aspects being automatically ascertained from the most up-to-date available static information in combination with the presently recorded dynamic data. This correlates assembly instructions, for example, with current process data. The staff 7 are thus provided with a situation-related display of the relevant information, for example by means of an overlaid visualization of the appropriate data, such that the real operating situation is extended by the ascertained information in the field of view of the staff. This very quickly equips the staff 7 to take action, and hence safeguards necessary machine execution times. The maintenance technician 7 can also obtain in-situ support from the remote expert 22 and the knowledge 16 available at the location of the remote expert 22. [0019]
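The combination of static instructions with currently recorded dynamic data, as described above, could look roughly like the following sketch; the component name, instruction texts and process values are invented examples, not content from the patent.

    # Hypothetical sketch of the "preventive functionality" described above:
    # the recorded operating situation selects the static instructions, which are
    # then correlated with the currently recorded dynamic process data before
    # being overlaid in the staff's field of view.  Names and values are assumptions.
    from typing import Dict, List


    STATIC_INSTRUCTIONS: Dict[str, List[str]] = {
        "valve-A3": [
            "Close inlet before opening the housing.",
            "Torque housing bolts to 35 Nm.",
        ],
    }


    def correlate(context: str, process_data: Dict[str, float]) -> List[str]:
        """Combine static maintenance knowledge with live process values."""
        overlay = list(STATIC_INSTRUCTIONS.get(context, []))
        for name, value in process_data.items():
            overlay.append(f"Current {name}: {value}")
        return overlay


    # Overlay shown to the staff for the component currently in view:
    for line in correlate("valve-A3", {"pressure [bar]": 4.2, "temperature [°C]": 61.0}):
        print(line)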
  • FIG. 3 shows an application example of situation-related access to documentation data. FIG. 3 shows a first screen area B1 showing an installation component. The right-hand screen area B2 shows a user 7 looking at an individual installation component, for example. The user 7 is equipped with data goggles 4 containing a camera 2 as recording means. The data goggles 4 additionally hold a microphone 11 and a loudspeaker 16. The left-hand screen area B1 shows a view of pipelines which can be observed using the data goggles shown in the image window B2. In the left-hand screen area B1, two points P1, P2 are marked which respectively represent two image details observed using the data goggles 4. After observation of the first point P1, i.e. after observation of the pipeline arranged in the area of the point P1, additional information is visualized in the data goggles 4 for the user 7. This additional information 11 comprises documentation data which, for the first point P1, contain work instructions for this pipeline and, for the point P2, contain the installation instruction to be carried out in a second step. In this case, the installation instruction involves the user 7 being informed of the torque and the direction of rotation of the screw connection at the point P2 by means of visualization of the supplementary data 112. The user 7 thus very quickly obtains a situation-related instruction for the object being observed. If an intelligent tool is used which is capable of recording the torque currently being applied, it is additionally possible for the user to be requested to increase or reduce the torque appropriately on the basis of the current torque. Such additional information can be embedded into the AR system in a service report under voice control when service logs are created by the service technician. This means that, if a new fault occurs, maintenance or a service action carried out earlier can be reconstructed exactly. Analysis of the service logs recorded in this manner leads to better recognition of quality deficiencies in an installation and/or in a process controlled by automation technology. [0020]
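The torque guidance mentioned for point P2 can be illustrated as follows; the target value, tolerance and function name are invented example values and not taken from the patent.

    # Hypothetical sketch of the torque guidance described above: an "intelligent
    # tool" reports the torque currently applied, and the user is asked to
    # increase or reduce it toward the target value of the installation
    # instruction.  The 35 Nm target and 1 Nm tolerance are invented examples.
    def torque_advice(current_nm: float, target_nm: float = 35.0, tolerance_nm: float = 1.0) -> str:
        if current_nm < target_nm - tolerance_nm:
            return f"Increase torque ({current_nm:.1f} Nm < {target_nm:.1f} Nm)."
        if current_nm > target_nm + tolerance_nm:
            return f"Reduce torque ({current_nm:.1f} Nm > {target_nm:.1f} Nm)."
        return "Torque within tolerance."


    print(torque_advice(31.5))   # -> Increase torque ...
    print(torque_advice(35.4))   # -> Torque within tolerance.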
  • The text below provides background information on the field of use of the invention: this involves application-oriented requirement analysis and development of AR-based systems for supporting work processes in development, production and servicing of complex engineering products and installations in fabrication and process technology, and also for service support systems, as in the case of motor vehicles, or for maintenance of any engineering equipment. [0021]
  • Augmented reality, AR for short, is a novel type of man-machine interaction with great potential for supporting industrial work processes. With this technology, the observer's field of view is enriched with computer-generated virtual objects, which means that product or process information can be used intuitively. Besides the very simple interaction, the use of portable computers opens up AR application fields with high mobility requirements, for example if process, measurement or simulation data are linked to the real object. [0022]
  • The situation of German industry is characterized by increasing customer requirements in terms of individuality and quality of products and by the development processes taking substantially less time. Especially in developing, producing and servicing complex engineering products and installations, it is possible, by means of innovative solutions to man-machine interaction, both to achieve jumps in efficiency and productivity and to design the work so as to enhance competence and training, by the user's need for knowledge and information being supported in a situation-related manner on the basis of data available in any case. [0023]
  • Augmented reality is a technology with numerous innovative fields of application: [0024]
  • In development for example, a “mixed mock-up” approach based on a mixed-virtual environment can result in a distinct acceleration of the early phases of development. Compared with immersive “virtual reality” (VR) solutions, the user is at a substantial advantage in that the haptic properties can be depicted faithfully with the aid of a real model, whereas aspects of visual perception, e.g. for display variants, can be manipulated in a virtual manner. In addition, there is a major potential for user-oriented validation of computer-assisted models, e.g. for component verification or in crash tests. [0025]
  • In flexible production, it is possible, inter alia, to considerably facilitate the process of setting up machinery for qualified skilled workers by displaying, e.g. via mobile AR components, mixed-virtual clamping situations directly in the field of view. Fabrication planning and fabrication control appropriate to the skilled worker in the workshop is facilitated if information regarding the respective order status is perceived directly in situ in connection with the corresponding products. This also applies to assembly, with the option of presenting the individual work steps to the assembler in a mixed-virtual manner in the actual training phase. In this connection, it is possible, e.g. by comparing real assembly procedures with results of simulations, to achieve comprehensive optimizations which both improve the quality of work scheduling and simplify and accelerate the assembly process in the critical start-up phase. Finally, regarding servicing, conventional technologies are by now barely adequate for supporting and documenting the complex diagnostic and repair procedures. Since, however, these processes in many fields are in any case planned on the basis of digital data, AR technologies provide the option of adopting the information sources for maintenance purposes and of explaining the dismantling process to an engineer, e.g. in the data goggles, by overlaying real objects. Regarding cooperative work, the AR-assisted “remote eye” permits a distributed problem solution by virtue of a remote expert communicating across global distances with the member of staff in situ. This case is particularly relevant for the predominantly medium-sized machine tool manufacturers. Because of globalization, they are forced to set up production sites for their customers worldwide. Neither, however, is the presence of subsidiaries in all the important markets achievable on economic grounds, nor is it possible to dispense with the profound knowledge of experienced service staff of the parent company with respect to the increasingly more complex installations. [0026]
  • The special feature of man-machine interaction in augmented reality is the very simple and intuitive communication with the computer, supplemented, for example, by multimode interaction techniques such as voice processing or gesture recognition. The use of portable computer units additionally enables entirely novel mobile utilization scenarios, with the option of requesting the specific data at any time via a wireless network. Novel visualization techniques permit direct annotation, e.g. of measured data or simulation data, to the real object or into the real environment. In conjunction with distributed applications, a number of users are able to operate in a real environment with the aid of a shared database (shared augmented environments) or to cooperate in different environments with AR support. [0027]
  • Augmented reality has been the subject of intense research only in the last few years. Consequently, only a few applications exist, either at the national or the international level, usually in the form of scientific prototypes in research establishments. [0028]
  • USA: As with many novel technologies, the potential uses of augmented reality were first tapped in North America. Examples include cockpit design or maintenance of mechatronic equipment. The aircraft manufacturer Boeing has already carried out initial field trials using AR technology in the assembly field. The upshot is that in this hi-tech area too, the USA occupy a key position, potentially making them technological leaders. [0029]
  • Japan: Various AR developments are being pushed in Japan, e.g. for mixed-virtual building design, telepresence or “cyber-shopping”. The nucleus is formed by the Mixed Reality Systems Laboratory founded in 1997, which is supported jointly as a center of competence by science and by commerce and industry. Particular stimuli in the consumer goods field are likely in the future from the Japanese home electronics industry. [0030]
  • Europe: So far, only very few research groups have been active in Europe in the AR field. One group at the University of Vienna is working on approaches to mixed-real visualization. The IGD Group, as part of the ACTS project CICC, which has now come to an end, has developed initial applications for the building industry and a scientific prototype for staff training in car manufacturing. [0031]
  • The invention should be seen in particular in the specific context of the fields of application “production machinery and machine tools” (NC-controlled, automation-technology processes) and “diagnostics/service support systems for complex engineering components/equipment/systems” (e.g. vehicles, but also industrial machinery and installations). [0032]
  • In summary, the invention therefore relates to an augmented-reality system and method for transmitting information data, in particular data relating to a process controlled by automation technology, to a production plant and/or to a machine, from a user, for example a service technician, to a storage medium in the augmented-reality system, where the augmented-reality system has recording means for recording voice inputs by the user, in particular service logs, using voice-based input. Service logs can therefore be created quickly and easily, stored centrally and archived, so that the appropriate information is reliably available for later, similar instances of action. The method permits voice-based recording of action reports, in which service logs etc. are stored and managed as voice-based inputs instead of in written form. [0033]
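As a closing illustration of the summarized method, the sketch below shows voice-command-controlled creation of a service log: spoken commands open and close a report, and the dictated text is archived centrally under the current context. The command words and class names are assumptions chosen for the example, not terms used in the patent.

    # Hypothetical sketch of voice-controlled creation of a service log: spoken
    # commands open and close a report, and everything spoken in between is
    # archived centrally under the current context.  Command words and names
    # are illustrative assumptions.
    from typing import Dict, List, Optional


    class VoiceLogRecorder:
        def __init__(self) -> None:
            self.archive: Dict[str, List[str]] = {}   # central, context-keyed archive
            self._open_context: Optional[str] = None
            self._buffer: List[str] = []

        def on_utterance(self, context: str, text: str) -> None:
            """Feed recognized speech; 'start report' / 'end report' act as commands."""
            if text == "start report":
                self._open_context, self._buffer = context, []
            elif text == "end report" and self._open_context is not None:
                self.archive.setdefault(self._open_context, []).append(" ".join(self._buffer))
                self._open_context = None
            elif self._open_context is not None:
                self._buffer.append(text)


    rec = VoiceLogRecorder()
    for utterance in ["start report", "replaced seal on pump A1", "leak stopped", "end report"]:
        rec.on_utterance("pump-A1", utterance)
    print(rec.archive["pump-A1"])   # ['replaced seal on pump A1 leak stopped']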

Claims (16)

    We claim:
  1. A system for recording and storing information data, where the system has recording means for recording a service report in the form of voice inputs by a user, means for recording documentation data and process data for an installation controlled by automation technology, and a storage medium for storing the voice inputs, the documentation data and the process data.
  2. The system according to claim 1, wherein the documentation data and the process data are recorded in connection with the voice inputs by the user.
  3. The system according to claim 1, wherein voice-controlled storage and association of the service reports are provided.
  4. The system according to claim 1, wherein recording means for recording a current operating situation are provided, and wherein storage and association of the service reports on the basis of the current operating situation are provided.
  5. The system according to claim 1, wherein the system has visualization means for visualizing information data stored in the system.
  6. The system according to claim 4, wherein means are provided for displaying the stored information data on the basis of the current operating situation.
  7. The system according to claim 1, wherein the recording means are user-controlled.
  8. The system according to claim 5, wherein the visualization means are in the form of display apparatuses arranged in the area of goggle lenses of data goggles, wherein the recording means provided is an image-recording apparatus arranged on the data goggles, and wherein a microphone arranged on the data goggles is provided for recording voice commands.
  9. A method for recording and storing information data, where recording means are used to record a service report in the form of voice inputs by a user, documentation data and process data for an installation controlled by automation technology are recorded, and the voice inputs, the documentation data and the process data are stored.
  10. The method according to claim 9, wherein the documentation data and the process data are recorded in connection with the voice inputs by the user.
  11. The method according to claim 9, wherein voice-controlled storage and association of the service reports are provided.
  12. The method according to claim 9, wherein recording means are used to ascertain a current operating situation, and wherein storage and association of the service reports on the basis of the current operating situation are provided.
  13. The method according to claim 9, wherein information data stored in the system are visualized using visualization means.
  14. The method according to claim 12, wherein the stored information data are displayed on the basis of the current operating situation.
  15. The method according to claim 9, wherein the recording means are user-controlled.
  16. The method according to claim 13, wherein the visualization means are in the form of display apparatuses arranged in the area of goggle lenses of data goggles, wherein the recording means provided is an image-recording apparatus arranged on the data goggles, and wherein a microphone arranged on the data goggles is provided for recording voice commands.
US09945774 1999-03-02 2001-09-04 Augmented-reality system with voice-based recording of information data, in particular of service reports Abandoned US20020069072A1 (en)

Priority Applications (20)

Application Number Priority Date Filing Date Title
DE19909154 1999-03-02
DE19909011 1999-03-02
DE19909010.6, 1999-03-02
DE19909016 1999-03-02
DE19909011.4, 1999-03-02
DE19909018 1999-03-02
DE19909016.5 1999-03-02
DE19909010 1999-03-02
DE19909012 1999-03-02
DE19909009 1999-03-02
DE19909023.8 1999-03-02
WO00/52538 1999-03-02
DE19909013.0, 1999-03-02
DE19909012.2, 1999-03-02
DE19909013 1999-03-02
DE19909154.4, 1999-03-02
DE19909009.2 1999-03-02
DE19909023 1999-03-02
DE19909018.1, 1999-03-02
PCT/DE2000/000660 WO2000052538A1 (en) 1999-03-02 2000-03-02 Augmented reality system having voice-based recording of information data, in particular, information data contained in service reports

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
PCT/DE2000/000660 Continuation WO2000052538A1 (en) 1999-03-02 2000-03-02 Augmented reality system having voice-based recording of information data, in particular, information data contained in service reports

Publications (1)

Publication Number Publication Date
US20020069072A1 (en) 2002-06-06

Family

ID=27576004

Family Applications (5)

Application Number Title Priority Date Filing Date
US09945771 Abandoned US20020067372A1 (en) 1999-03-02 2001-09-04 Utilizing augmented reality-based technologies to provide situation-related assistance to a skilled operator from remote experts
US09945777 Active US6941248B2 (en) 1999-03-02 2001-09-04 System for operating and observing making use of mobile equipment
US09945776 Abandoned US20020046368A1 (en) 1999-03-02 2001-09-04 System for, and method of, situation-relevant asistance to interaction with the aid of augmented-reality technologies
US09945774 Abandoned US20020069072A1 (en) 1999-03-02 2001-09-04 Augmented-reality system with voice-based recording of information data, in particular of service reports
US11857931 Active 2023-03-26 US8373618B2 (en) 1999-03-02 2007-09-19 Augmented-reality system for situation-related support of the interaction between a user and an engineering apparatus

Family Applications Before (3)

Application Number Title Priority Date Filing Date
US09945771 Abandoned US20020067372A1 (en) 1999-03-02 2001-09-04 Utilizing augmented reality-based technologies to provide situation-related assistance to a skilled operator from remote experts
US09945777 Active US6941248B2 (en) 1999-03-02 2001-09-04 System for operating and observing making use of mobile equipment
US09945776 Abandoned US20020046368A1 (en) 1999-03-02 2001-09-04 System for, and method of, situation-relevant asistance to interaction with the aid of augmented-reality technologies

Family Applications After (1)

Application Number Title Priority Date Filing Date
US11857931 Active 2023-03-26 US8373618B2 (en) 1999-03-02 2007-09-19 Augmented-reality system for situation-related support of the interaction between a user and an engineering apparatus

Country Status (4)

Country Link
US (5) US20020067372A1 (en)
EP (5) EP1157316B1 (en)
JP (5) JP2002538541A (en)
WO (7) WO2000052540A1 (en)

Cited By (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070088526A1 (en) * 2003-11-10 2007-04-19 Wolfgang Friedrich System and method for carrying out and visually displaying simulations in an augmented reality
US20070273557A1 (en) * 2006-05-26 2007-11-29 Itt Manufacturing Enterprises,Inc. Augmented reality-based system and method providing status and control of unmanned vehicles
US20070273610A1 (en) * 2006-05-26 2007-11-29 Itt Manufacturing Enterprises, Inc. System and method to display maintenance and operational instructions of an apparatus using augmented reality
US20080218331A1 (en) * 2007-03-08 2008-09-11 Itt Manufacturing Enterprises, Inc. Augmented reality-based system and method to show the location of personnel and sensors inside occluded structures and provide increased situation awareness
US20100082118A1 (en) * 2008-09-30 2010-04-01 Rockwell Automation Technologies, Inc. User interface display object for logging user-implemented solutions to industrial field problems
US20110221657A1 (en) * 2010-02-28 2011-09-15 Osterhout Group, Inc. Optical stabilization of displayed content with a variable lens
US20130083063A1 (en) * 2011-09-30 2013-04-04 Kevin A. Geisner Service Provision Using Personal Audio/Visual System
US8760471B2 (en) 2010-04-28 2014-06-24 Ns Solutions Corporation Information processing system, information processing method and program for synthesizing and displaying an image
US9091851B2 (en) 2010-02-28 2015-07-28 Microsoft Technology Licensing, Llc Light control in head mounted displays
US9097890B2 (en) 2010-02-28 2015-08-04 Microsoft Technology Licensing, Llc Grating in a light transmissive illumination system for see-through near-eye display glasses
US9097891B2 (en) 2010-02-28 2015-08-04 Microsoft Technology Licensing, Llc See-through near-eye display glasses including an auto-brightness control for the display brightness based on the brightness in the environment
US9129295B2 (en) 2010-02-28 2015-09-08 Microsoft Technology Licensing, Llc See-through near-eye display glasses with a fast response photochromic film system for quick transition from dark to clear
US9128281B2 (en) 2010-09-14 2015-09-08 Microsoft Technology Licensing, Llc Eyepiece with uniformly illuminated reflective display
US9134534B2 (en) 2010-02-28 2015-09-15 Microsoft Technology Licensing, Llc See-through near-eye display glasses including a modular image source
US9182596B2 (en) 2010-02-28 2015-11-10 Microsoft Technology Licensing, Llc See-through near-eye display glasses with the optical assembly including absorptive polarizers or anti-reflective coatings to reduce stray light
US9223134B2 (en) 2010-02-28 2015-12-29 Microsoft Technology Licensing, Llc Optical imperfections in a light transmissive illumination system for see-through near-eye display glasses
US9229227B2 (en) 2010-02-28 2016-01-05 Microsoft Technology Licensing, Llc See-through near-eye display glasses with a light transmissive wedge shaped illumination system
US9285589B2 (en) 2010-02-28 2016-03-15 Microsoft Technology Licensing, Llc AR glasses with event and sensor triggered control of AR eyepiece applications
US9341843B2 (en) 2010-02-28 2016-05-17 Microsoft Technology Licensing, Llc See-through near-eye display glasses with a small scale image source
US9366862B2 (en) 2010-02-28 2016-06-14 Microsoft Technology Licensing, Llc System and method for delivering content to a group of see-through near eye display eyepieces
WO2016108182A1 (en) * 2014-12-29 2016-07-07 Abb Technology Ltd. Method for identifying a sequence of events associated with a condition in a process plant
US9759917B2 (en) 2010-02-28 2017-09-12 Microsoft Technology Licensing, Llc AR glasses with event and sensor triggered AR eyepiece interface to external devices
US9838844B2 (en) 2015-09-25 2017-12-05 Ca, Inc. Using augmented reality to assist data center operators
US9955059B2 (en) 2014-10-29 2018-04-24 Kabushiki Kaisha Toshiba Electronic device, method, and computer program product

Families Citing this family (117)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE10024412A1 (en) * 2000-05-19 2001-11-29 Westfalia Separator Ind Gmbh A method for controlling machines and information systems
DE10027136C2 (en) * 2000-05-31 2002-11-21 Luigi Grasso Mobile system for creating a virtual display
US20120105740A1 (en) 2000-06-02 2012-05-03 Oakley, Inc. Eyewear with detachable adjustable electronics module
DE10127396A1 (en) * 2000-06-13 2001-12-20 Volkswagen Ag Method for utilization of old motor vehicles using a sorting plant for removal of operating fluids and dismantling of the vehicle into component parts for sorting, using augmented reality (AR) aids to speed up and improve sorting
JP4701479B2 (en) * 2000-07-05 2011-06-15 ソニー株式会社 Link information display device and a display method
DE10048743C2 (en) 2000-09-29 2002-11-28 Siemens Ag automation system
DE10048563B4 (en) * 2000-09-30 2010-11-25 Meissner, Werner Device for remote maintenance of technical equipment
US20020044152A1 (en) * 2000-10-16 2002-04-18 Abbott Kenneth H. Dynamic integration of computer generated and real world images
EP1207441A1 (en) * 2000-10-18 2002-05-22 G.D Societa' Per Azioni Method and automatic machine for processing a product
DE10063089C1 (en) * 2000-12-18 2002-07-25 Siemens Ag User controlled linking of information within an augmented reality system
DE10103922A1 (en) * 2001-01-30 2002-08-01 Physoptics Opto Electronic Gmb Interactive data viewing and operating system
DE10108064A1 (en) * 2001-02-20 2002-09-05 Siemens Ag Linked eye tracking information within an augmented reality system
FI20012231A (en) * 2001-06-21 2002-12-22 Ismo Rakkolainen System to create a user interface
US7013009B2 (en) 2001-06-21 2006-03-14 Oakley, Inc. Eyeglasses with wireless communication features
GB0118594D0 (en) * 2001-07-31 2001-09-19 Hewlett Packard Co Method and apparatus for interactive broadcasting
US6871322B2 (en) 2001-09-06 2005-03-22 International Business Machines Corporation Method and apparatus for providing user support through an intelligent help agent
US6973620B2 (en) * 2001-09-06 2005-12-06 International Business Machines Corporation Method and apparatus for providing user support based on contextual information
US6976067B2 (en) * 2001-09-06 2005-12-13 International Business Machines Corporation Method and apparatus for providing entitlement information for interactive support
JP2003080482A (en) * 2001-09-07 2003-03-18 Yaskawa Electric Corp Robot teaching device
US7451126B2 (en) * 2001-10-18 2008-11-11 Omron Corporation State space navigation system, user system and business methods for machine to machine business
US7126558B1 (en) 2001-10-19 2006-10-24 Accenture Global Services Gmbh Industrial augmented reality
DE10159610B4 (en) * 2001-12-05 2004-02-26 Siemens Ag System and method for creation of documentation of operations, particularly in the area of production, installation, service and maintenance
EP1487616B1 (en) * 2002-03-20 2010-06-30 Volkswagen Aktiengesellschaft Automatic process control
DE10320268B4 (en) * 2002-05-31 2012-08-16 Heidelberger Druckmaschinen Ag Apparatus and method for retrieving and displaying information
DE10255056A1 (en) * 2002-11-25 2004-06-03 Grob-Werke Burkhart Grob E.K. Station with operator panel, esp. in processing or manufacturing-line, has portable operator panel wirelessly connected with station or control device of station and/or central control unit
EP1435737A1 (en) * 2002-12-30 2004-07-07 Abb Research Ltd. An augmented reality system and method
DE10305384A1 (en) 2003-02-11 2004-08-26 Kuka Roboter Gmbh Method and apparatus for visualization of computerized information
JP4352073B2 (en) 2003-02-24 2009-10-28 バイエリッシェ モートーレン ウエルケ アクチエンゲゼルシャフトBayerische Motoren Werke Aktiengesellschaft Method and apparatus for visualizing the repair procedures in a vehicle
DE10325894B4 (en) 2003-06-06 2010-12-09 Siemens Ag Tool or production machine with display for visualization of workflows
DE10325895A1 (en) * 2003-06-06 2005-01-05 Siemens Ag Tool or production machine with head-up display
DE10326627A1 (en) * 2003-06-11 2005-01-05 Endress + Hauser Gmbh + Co. Kg A process for the functional display of a field device of process automation technology
US20050022228A1 (en) * 2003-07-21 2005-01-27 Videotest Llc Digital recording-based computer testing and debugging system
DE102004016329A1 (en) * 2003-11-10 2005-05-25 Siemens Ag System and method for performing and visualization of simulations in an augmented reality
US9948885B2 (en) * 2003-12-12 2018-04-17 Kurzweil Technologies, Inc. Virtual encounters
US9971398B2 (en) * 2003-12-12 2018-05-15 Beyond Imagination Inc. Virtual encounters
US8600550B2 (en) * 2003-12-12 2013-12-03 Kurzweil Technologies, Inc. Virtual encounters
US9841809B2 (en) * 2003-12-12 2017-12-12 Kurzweil Technologies, Inc. Virtual encounters
US20050130108A1 (en) * 2003-12-12 2005-06-16 Kurzweil Raymond C. Virtual encounters
DE102005011616B4 (en) * 2004-05-28 2014-12-04 Volkswagen Ag Mobile tracking unit
CN1744098A (en) * 2004-08-31 2006-03-08 希森美康株式会社 Remote control method, remote control system
DE102004044718A1 (en) * 2004-09-10 2006-03-16 Volkswagen Ag Augmented reality help instruction generating system for e.g. aircraft, has control unit producing help instruction signal, representing help instruction in virtual space of three-dimensional object model, as function of interaction signal
DE102004053774A1 (en) * 2004-11-08 2006-05-11 Siemens Ag System for the measurement and interpretation of brain activities
US7715037B2 (en) 2005-03-01 2010-05-11 Xerox Corporation Bi-directional remote visualization for supporting collaborative machine troubleshooting
DE102005009437A1 (en) * 2005-03-02 2006-09-07 Kuka Roboter Gmbh Method and apparatus for superimposing AR objects
US8150666B2 (en) * 2005-03-14 2012-04-03 Holomar, Inc. Methods and systems for combining models of goods and services
JP4933164B2 (en) * 2005-07-01 2012-05-16 キヤノン株式会社 The information processing apparatus, information processing method, program, and storage medium
US7362738B2 (en) * 2005-08-09 2008-04-22 Deere & Company Method and system for delivering information to a user
EP2095178B1 (en) 2006-12-14 2015-08-12 Oakley, Inc. Wearable high resolution audio visual interface
DE102007025796B4 (en) * 2007-06-02 2010-07-15 Koenig & Bauer Aktiengesellschaft Mobile control desk of a rotary printing machine
US20090037378A1 (en) * 2007-08-02 2009-02-05 Rockwell Automation Technologies, Inc. Automatic generation of forms based on activity
WO2009036782A1 (en) 2007-09-18 2009-03-26 Vrmedia S.R.L. Information processing apparatus and method for remote technical assistance
EP2206041A4 (en) * 2007-10-01 2011-02-16 Iconics Inc Visualization of process control data
KR100914848B1 (en) * 2007-12-15 2009-09-02 한국전자통신연구원 Method and architecture of mixed reality system
US8485038B2 (en) 2007-12-18 2013-07-16 General Electric Company System and method for augmented reality inspection and data visualization
US8786675B2 (en) * 2008-01-23 2014-07-22 Michael F. Deering Systems using eye mounted displays
US9812096B2 (en) 2008-01-23 2017-11-07 Spy Eye, Llc Eye mounted displays and systems using eye mounted displays
US20090189830A1 (en) * 2008-01-23 2009-07-30 Deering Michael F Eye Mounted Displays
DE102008009446A1 (en) * 2008-02-15 2009-08-20 Volkswagen Ag Method for examining complex system, particularly motor vehicle, on deviations from quality specifications and on defectiveness, involves entering method state information by data input device in state information storage by testing person
DE102008020772A1 (en) * 2008-04-21 2009-10-22 Carl Zeiss 3D Metrology Services Gmbh Showing results of measurement of workpieces
DE102008020771A1 (en) * 2008-04-21 2009-07-09 Carl Zeiss 3D Metrology Services Gmbh Deviation determining method, involves locating viewers at viewing position of device and screen, such that viewers view each position of exemplars corresponding to measured coordinates of actual condition
US7980512B1 (en) * 2008-06-13 2011-07-19 The Boeing Company System and method for displaying aerial refueling symbology
DE102009021729A1 (en) * 2009-05-11 2010-11-18 Michael Weinig Ag Machine for machining workpieces of wood, plastics and the like
US8555171B2 (en) * 2009-12-09 2013-10-08 Industrial Technology Research Institute Portable virtual human-machine interaction device and operation method thereof
US8351057B2 (en) 2010-03-18 2013-01-08 Xerox Corporation Self-powered user interface providing assembly instructions
CN102667881B (en) 2010-03-30 2013-11-27 新日铁住金系统集成株式会社 Information processing apparatus, information processing method, and program
CN101833896B (en) * 2010-04-23 2011-10-19 西安电子科技大学 Geographic information guide method and system based on augment reality
US20110316845A1 (en) * 2010-06-25 2011-12-29 Palo Alto Research Center Incorporated Spatial association between virtual and augmented reality
KR101363559B1 (en) * 2010-07-23 2014-02-17 주식회사 팬택 Apparatus and Method for providing augment reality using additional information
JP2012043396A (en) * 2010-08-13 2012-03-01 Hyundai Motor Co Ltd System and method for managing vehicle consumables using augmented reality
KR101219933B1 (en) 2010-09-13 2013-01-08 현대자동차주식회사 Devices in the control system and method for a vehicle using the AR
JP5870929B2 (en) * 2010-11-02 2016-03-01 日本電気株式会社 The information processing system, information processing apparatus and information processing method
US8490877B2 (en) 2010-11-09 2013-07-23 Metrologic Instruments, Inc. Digital-imaging based code symbol reading system having finger-pointing triggered mode of operation
CN102116876B (en) * 2011-01-14 2013-04-17 中国科学院上海技术物理研究所 Method for detecting spatial point target space-base on basis of track cataloguing model
US8621362B2 (en) 2011-01-21 2013-12-31 Xerox Corporation Mobile screen methods and systems for collaborative troubleshooting of a device
JP2012155403A (en) * 2011-01-24 2012-08-16 Yokogawa Electric Corp Field apparatus monitoring system
US20120249588A1 (en) * 2011-03-22 2012-10-04 Panduit Corp. Augmented Reality Data Center Visualization
DE102011017305A1 (en) * 2011-04-15 2012-10-18 Abb Technology Ag Control and monitoring system for technical equipment
WO2011137764A3 (en) 2011-05-11 2012-04-19 Huawei Device Co., Ltd. Method and system for implementing augmented reality applications
US20120308969A1 (en) * 2011-06-06 2012-12-06 Paramit Corporation Training ensurance method and system for computer directed assembly and manufacturing
US20120326948A1 (en) * 2011-06-22 2012-12-27 Microsoft Corporation Environmental-light filter for see-through head-mounted display device
US8872852B2 (en) 2011-06-30 2014-10-28 International Business Machines Corporation Positional context determination with multi marker confidence ranking
US9864211B2 (en) 2012-02-17 2018-01-09 Oakley, Inc. Systems and methods for removably coupling an electronic device to eyewear
DE102013010719A1 (en) 2012-07-30 2014-01-30 Heidelberger Druckmaschinen Ag Machine condition-based display of documentation
US8933970B2 (en) 2012-09-11 2015-01-13 Longsand Limited Controlling an augmented reality object
DE102012217570A1 (en) * 2012-09-27 2014-03-27 Krones Ag Method of supporting operating and changeover processes
US9120226B2 (en) 2012-10-23 2015-09-01 Lincoln Global, Inc. System and method for remotely positioning an end effector
US9952438B1 (en) * 2012-10-29 2018-04-24 The Boeing Company Augmented reality maintenance system
US9959190B2 (en) * 2013-03-12 2018-05-01 International Business Machines Corporation On-site visualization of component status
EP2972613A1 (en) * 2013-03-12 2016-01-20 G.D Societa' per Azioni Automatic machine management operator support system and corresponding method and automatic machine
CN205177388U (en) 2013-03-15 2016-04-20 奥克利有限公司 Eyepiece system
CN205691887U (en) 2013-06-12 2016-11-16 奥克利有限公司 Modularization communication system and glasses communication system
ES2525104B1 (en) * 2013-06-17 2015-09-29 Proyectos, Ingeniería Y Gestión, Sociedad Anónima (P.R.O.I.N.G.E., S.A.) System monitoring and support of industrial operations manual assembly using augmented reality and method of use
DE102013211502A1 (en) 2013-06-19 2014-12-24 Robert Bosch Gmbh identification device
JP6355909B2 (en) * 2013-10-18 2018-07-11 三菱重工業株式会社 Inspection recording apparatus and inspection records evaluation method
US9740935B2 (en) * 2013-11-26 2017-08-22 Honeywell International Inc. Maintenance assistant system
US9993335B2 (en) 2014-01-08 2018-06-12 Spy Eye, Llc Variable resolution eye mounted displays
WO2015125066A1 (en) * 2014-02-19 2015-08-27 Fieldbit Ltd. System and method for facilitating equipment maintenance using smartglasses
EP3132390A1 (en) 2014-04-16 2017-02-22 Exxonmobil Upstream Research Company Methods and systems for providing procedures in real-time
EP3145385A4 (en) * 2014-05-22 2018-02-14 Invuity, Inc. Medical device featuring cladded waveguide
DE102014012710A1 (en) * 2014-08-27 2016-03-03 Steinbichler Optotechnik Gmbh Method and apparatus for determining the 3D coordinates of an object
US9746913B2 (en) 2014-10-31 2017-08-29 The United States Of America As Represented By The Secretary Of The Navy Secured mobile maintenance and operator system including wearable augmented reality interface, voice command interface, and visual recognition systems and related methods
US9697432B2 (en) 2014-12-09 2017-07-04 International Business Machines Corporation Generating support instructions by leveraging augmented reality
US9869996B2 (en) * 2015-01-08 2018-01-16 The Boeing Company System and method for using an internet of things network for managing factory production
DE102015201290A1 (en) * 2015-01-26 2016-07-28 Prüftechnik Dieter Busch AG Positioning of two bodies by means of an alignment system with data goggles
WO2016153628A3 (en) * 2015-02-25 2016-12-22 Brian Mullins Augmented reality content creation
EP3073452A1 (en) * 2015-03-26 2016-09-28 Skidata Ag Method for monitoring and controlling an access control system
US9589390B2 (en) 2015-05-13 2017-03-07 The Boeing Company Wire harness assembly
DE102015214350A1 (en) * 2015-07-29 2017-02-02 Siemens Healthcare Gmbh Method for communication between a medical network and a medical operator by means of mobile data goggles, and mobile data goggles
WO2017027682A1 (en) 2015-08-11 2017-02-16 Delta Energy & Communications, Inc. Enhanced reality system for visualizing, evaluating, diagnosing, optimizing and servicing smart grids and incorporated components
US10055966B2 (en) 2015-09-03 2018-08-21 Delta Energy & Communications, Inc. System and method for determination and remediation of energy diversion in a smart grid network
DE102015116401A1 (en) 2015-09-28 2017-03-30 ESSERT Steuerungstechnik GmbH System, in particular augmented reality system, for assisting an operator and/or the maintenance of a technical facility
WO2017070648A1 (en) 2015-10-22 2017-04-27 Delta Energy & Communications, Inc. Augmentation, expansion and self-healing of a geographically distributed mesh network using unmanned aerial vehicle technology
EP3214586A1 (en) 2016-03-04 2017-09-06 Thales Deutschland GmbH Method for maintenance support and maintenance support system
EP3223208A1 (en) * 2016-03-22 2017-09-27 Hexagon Technology Center GmbH Self control
US20170280188A1 (en) * 2016-03-24 2017-09-28 Daqri, Llc Recording Remote Expert Sessions
CN105929948A (en) * 2016-04-14 2016-09-07 佛山市威格特电气设备有限公司 Augmented reality based self-learning type intelligent helmet and running method therefor
CN107358657A (en) * 2017-06-30 2017-11-17 海南职业技术学院 Method and system for achieving interaction based on augmented reality technology

Family Cites Families (70)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0145683B1 (en) * 1983-09-30 1988-01-07 Asea Ab Industrial robot
FR2594968B1 (en) * 1986-02-21 1988-09-16 Alsthom Device for assisting the assembly operations of an assembly with self-checking
US4834473A (en) * 1986-03-26 1989-05-30 The Babcock & Wilcox Company Holographic operator display for control systems
US5003300A (en) * 1987-07-27 1991-03-26 Reflection Technology, Inc. Head mounted display for miniature video display system
US5136526A (en) * 1987-09-03 1992-08-04 Reinhold Baur Determination of the thickness of a magnetic tape
US5121319A (en) * 1989-10-23 1992-06-09 International Business Machines Corporation Method and apparatus for selective suspension and resumption of computer based manufacturing processes
JP2947840B2 (en) * 1989-12-22 1999-09-13 株式会社日立製作所 Plant operation monitoring device
US5717598A (en) * 1990-02-14 1998-02-10 Hitachi, Ltd. Automatic manufacturability evaluation method and system
DE69030879T2 (en) * 1990-07-09 1997-10-02 Bell Helicopter Textron Inc Method and apparatus for semi-automated insertion of conductors in trapezoidal connections
JP2865828B2 (en) * 1990-08-22 1999-03-08 株式会社日立製作所 Method and apparatus for displaying operating procedures
DE4119803A1 (en) * 1991-06-15 1992-12-17 Bernd Dipl Ing Kelle Acoustic prompting method for machine tool operation - using speech synthesiser module with programmed instructions and warnings, coupled to position display via interface
US5781913A (en) * 1991-07-18 1998-07-14 Felsenstein; Lee Wearable hypermedium system
US5450596A (en) * 1991-07-18 1995-09-12 Redwear Interactive Inc. CD-ROM data retrieval system using a hands-free command controller and headwear monitor
US5644493A (en) * 1991-08-30 1997-07-01 Nsk Ltd. Production information processing system
US6081750A (en) * 1991-12-23 2000-06-27 Hoffberg; Steven Mark Ergonomic man-machine interface incorporating adaptive pattern recognition based control system
WO1993014454A1 (en) * 1992-01-10 1993-07-22 Foster-Miller, Inc. A sensory integrated data interface
JPH05324039A (en) * 1992-05-26 1993-12-07 Fanuc Ltd Numerical controller
US5320538A (en) * 1992-09-23 1994-06-14 Hughes Training, Inc. Interactive aircraft training system and method
JP3443121B2 (en) * 1993-04-23 2003-09-02 三菱電機株式会社 Numerical control device for woodworking machinery
US5590062A (en) * 1993-07-02 1996-12-31 Matsushita Electric Industrial Co., Ltd. Simulator for producing various living environments mainly for visual perception
EP1326121B1 (en) * 1993-08-12 2006-09-13 Seiko Epson Corporation Head-mounted image display device and data processing apparatus including the same
US6061064A (en) * 1993-08-31 2000-05-09 Sun Microsystems, Inc. System and method for providing and using a computer user interface with a view space having discrete portions
US5815411A (en) * 1993-09-10 1998-09-29 Criticom Corporation Electro-optic vision system which exploits position and attitude
US6278461B1 (en) * 1993-09-10 2001-08-21 Geovector Corporation Augmented reality vision systems which derive image information from other vision systems
US5475797A (en) 1993-10-22 1995-12-12 Xerox Corporation Menu driven system for controlling automated assembly of palletized elements
US6424321B1 (en) * 1993-10-22 2002-07-23 Kopin Corporation Head-mounted matrix display
US5815126A (en) * 1993-10-22 1998-09-29 Kopin Corporation Monocular portable communication and display system
JPH086708A (en) 1994-04-22 1996-01-12 Canon Inc Display device
JPH07311857A (en) * 1994-05-16 1995-11-28 Fujitsu Ltd Picture compositing and display device and simulation system
JPH085954A (en) * 1994-06-21 1996-01-12 Matsushita Electric Ind Co Ltd Spectacles type picture display device
WO1996003715A1 (en) * 1994-07-22 1996-02-08 Monash University A graphical display system
JP3069014B2 (en) * 1994-10-21 2000-07-24 株式会社東京精密 Operation guidance with coordinate measuring system
JPH08161028A (en) * 1994-12-06 1996-06-21 Mitsubishi Electric Corp Operation support system
US5850352A (en) * 1995-03-31 1998-12-15 The Regents Of The University Of California Immersive video, including video hypermosaicing to generate from multiple video views of a scene a three-dimensional video mosaic from which diverse virtual video scene images are synthesized, including panoramic, scene interactive and stereoscopic images
US6181371B1 (en) * 1995-05-30 2001-01-30 Francis J Maguire, Jr. Apparatus for inducing attitudinal head movements for passive virtual reality
US5745387A (en) * 1995-09-28 1998-04-28 General Electric Company Augmented reality maintenance system employing manipulator arm with archive and comparison device
JPH09114543A (en) * 1995-10-02 1997-05-02 Xybernaut Corp Hands-free computer system
US5742263A (en) * 1995-12-18 1998-04-21 Telxon Corporation Head tracking system for a head mounted display system
DE69736039T2 (en) * 1996-02-26 2007-01-11 Seiko Epson Corp. Portable information display device
US6064749A (en) * 1996-08-02 2000-05-16 Hirota; Gentaro Hybrid tracking for augmented reality using both camera motion detection and landmark tracking
JP3338618B2 (en) * 1996-10-07 2002-10-28 Susumu Tachi Display method and display device for real-space and virtual-space images
US5912650A (en) * 1996-10-16 1999-06-15 Kaiser Electro-Optics, Inc. Dichoptic display utilizing a single display device
JP3106107B2 (en) * 1996-11-20 2000-11-06 株式会社東芝 In-plant information communication system
JP3697816B2 (en) * 1997-01-29 2005-09-21 株式会社島津製作所 Patrol inspection support system
JPH10214035A (en) 1997-01-30 1998-08-11 Canon Inc Backlight device and liquid crystal display device using the same
JPH10222543A (en) * 1997-02-07 1998-08-21 Hitachi Ltd Portable terminal for inspection and maintenance support, and inspection and maintenance method using it
US5912720A (en) * 1997-02-13 1999-06-15 The Trustees Of The University Of Pennsylvania Technique for creating an ophthalmic augmented reality environment
JPH10293790A (en) * 1997-04-17 1998-11-04 Toshiba Corp Power equipment work management device
DE19716327A1 (en) * 1997-04-18 1998-10-29 Branscheid Industrieelektronik Display device for manufacturing information
US6094625A (en) * 1997-07-03 2000-07-25 Trimble Navigation Limited Augmented vision for survey work and machine control
GB2327289B (en) * 1997-07-15 1999-09-15 Honda Motor Co Ltd Job aiding apparatus
JPH1141166A (en) * 1997-07-18 1999-02-12 Omron Corp Radio communication system and terminal equipment therefor
US6064335A (en) * 1997-07-21 2000-05-16 Trimble Navigation Limited GPS based augmented reality collision avoidance system
US6037914A (en) 1997-08-25 2000-03-14 Hewlett-Packard Company Method and apparatus for augmented reality using a see-through head-mounted display
US6522312B2 (en) * 1997-09-01 2003-02-18 Canon Kabushiki Kaisha Apparatus for presenting mixed reality shared among operators
JPH11102438A (en) * 1997-09-26 1999-04-13 Minolta Co Ltd Distance image generation device and image display device
US6037882A (en) * 1997-09-30 2000-03-14 Levy; David H. Method and apparatus for inputting data to an electronic system
DE69840547D1 (en) * 1997-10-30 2009-03-26 Myvu Corp Interface system for glasses
US5980084A (en) * 1997-11-24 1999-11-09 Sandia Corporation Method and apparatus for automated assembly
US6625299B1 (en) * 1998-04-08 2003-09-23 Jeffrey Meisner Augmented reality technology
US6255961B1 (en) 1998-05-08 2001-07-03 Sony Corporation Two-way communications between a remote control unit and one or more devices in an audio/visual environment
US6629065B1 (en) * 1998-09-30 2003-09-30 Wisconsin Alumni Research Foundation Methods and apparata for rapid computer-aided design of objects in virtual reality and other environments
US6356437B1 (en) * 1999-03-29 2002-03-12 Siemens Dematic Postal Automation, L.P. System, apparatus and method for providing a portable customizable maintenance support instruction system
US6697894B1 (en) * 1999-03-29 2004-02-24 Siemens Dematic Postal Automation, L.P. System, apparatus and method for providing maintenance instructions to a user at a remote location
US6574672B1 (en) * 1999-03-29 2003-06-03 Siemens Dematic Postal Automation, L.P. System, apparatus and method for providing a portable customizable maintenance support computer communications system
US6725184B1 (en) * 1999-06-30 2004-04-20 Wisconsin Alumni Research Foundation Assembly and disassembly sequences of components in computerized multicomponent assembly models
US6408257B1 (en) * 1999-08-31 2002-06-18 Xerox Corporation Augmented-reality display method and system
JP3363861B2 (en) * 2000-01-13 2003-01-08 キヤノン株式会社 Mixed reality presentation apparatus, mixed reality presentation method, and storage medium
US20020010734A1 (en) * 2000-02-03 2002-01-24 Ebersole John Franklin Internetworked augmented reality system and method
US6587783B2 (en) * 2000-10-05 2003-07-01 Siemens Corporate Research, Inc. Method and system for computer assisted localization, site navigation, and data navigation

Patent Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4796206A (en) * 1986-06-02 1989-01-03 International Business Machines Corporation Computer assisted vehicle service featuring signature analysis and artificial intelligence
US5712649A (en) * 1991-11-01 1998-01-27 Sega Enterprises Head-mounted image display
US5734569A (en) * 1992-01-06 1998-03-31 Snap-On Technologies, Inc. Computer interface board for electronic automotive vehicle service equipment
US6085428A (en) * 1993-10-05 2000-07-11 Snap-On Technologies, Inc. Hands free automotive service system
US7133845B1 (en) * 1995-02-13 2006-11-07 Intertrust Technologies Corp. System and methods for secure transaction management and electronic rights protection
US6512968B1 (en) * 1997-05-16 2003-01-28 Snap-On Technologies, Inc. Computerized automotive service system
US6195618B1 (en) * 1998-10-15 2001-02-27 Microscribe, Llc Component position verification using a probe apparatus
US7165041B1 (en) * 1999-05-27 2007-01-16 Accenture, Llp Web-based architecture sales tool
US7124101B1 (en) * 1999-11-22 2006-10-17 Accenture Llp Asset tracking in a network-based supply chain environment
US7130807B1 (en) * 1999-11-22 2006-10-31 Accenture Llp Technology sharing during demand and supply planning in a network-based supply chain environment
US6556971B1 (en) * 2000-09-01 2003-04-29 Snap-On Technologies, Inc. Computer-implemented speech recognition system training
US6442460B1 (en) * 2000-09-05 2002-08-27 Hunter Engineering Company Method and apparatus for networked wheel alignment communications and services

Cited By (33)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070088526A1 (en) * 2003-11-10 2007-04-19 Wolfgang Friedrich System and method for carrying out and visually displaying simulations in an augmented reality
US7852355B2 (en) * 2003-11-10 2010-12-14 Siemens Aktiengesellschaft System and method for carrying out and visually displaying simulations in an augmented reality
US20070273557A1 (en) * 2006-05-26 2007-11-29 Itt Manufacturing Enterprises,Inc. Augmented reality-based system and method providing status and control of unmanned vehicles
US20070273610A1 (en) * 2006-05-26 2007-11-29 Itt Manufacturing Enterprises, Inc. System and method to display maintenance and operational instructions of an apparatus using augmented reality
US7920071B2 (en) 2006-05-26 2011-04-05 Itt Manufacturing Enterprises, Inc. Augmented reality-based system and method providing status and control of unmanned vehicles
US9323055B2 (en) 2006-05-26 2016-04-26 Exelis, Inc. System and method to display maintenance and operational instructions of an apparatus using augmented reality
US20080218331A1 (en) * 2007-03-08 2008-09-11 Itt Manufacturing Enterprises, Inc. Augmented reality-based system and method to show the location of personnel and sensors inside occluded structures and provide increased situation awareness
US9324229B2 (en) 2007-03-08 2016-04-26 Exelis, Inc. System and method to display maintenance and operational instructions of an apparatus using augmented reality
US20100082118A1 (en) * 2008-09-30 2010-04-01 Rockwell Automation Technologies, Inc. User interface display object for logging user-implemented solutions to industrial field problems
US9366862B2 (en) 2010-02-28 2016-06-14 Microsoft Technology Licensing, Llc System and method for delivering content to a group of see-through near eye display eyepieces
US9329689B2 (en) 2010-02-28 2016-05-03 Microsoft Technology Licensing, Llc Method and apparatus for biometric data capture
US8814691B2 (en) 2010-02-28 2014-08-26 Microsoft Corporation System and method for social networking gaming with an augmented reality
US9091851B2 (en) 2010-02-28 2015-07-28 Microsoft Technology Licensing, Llc Light control in head mounted displays
US9097890B2 (en) 2010-02-28 2015-08-04 Microsoft Technology Licensing, Llc Grating in a light transmissive illumination system for see-through near-eye display glasses
US9097891B2 (en) 2010-02-28 2015-08-04 Microsoft Technology Licensing, Llc See-through near-eye display glasses including an auto-brightness control for the display brightness based on the brightness in the environment
US9129295B2 (en) 2010-02-28 2015-09-08 Microsoft Technology Licensing, Llc See-through near-eye display glasses with a fast response photochromic film system for quick transition from dark to clear
US9875406B2 (en) 2010-02-28 2018-01-23 Microsoft Technology Licensing, Llc Adjustable extension for temple arm
US9759917B2 (en) 2010-02-28 2017-09-12 Microsoft Technology Licensing, Llc AR glasses with event and sensor triggered AR eyepiece interface to external devices
US9134534B2 (en) 2010-02-28 2015-09-15 Microsoft Technology Licensing, Llc See-through near-eye display glasses including a modular image source
US9182596B2 (en) 2010-02-28 2015-11-10 Microsoft Technology Licensing, Llc See-through near-eye display glasses with the optical assembly including absorptive polarizers or anti-reflective coatings to reduce stray light
US9223134B2 (en) 2010-02-28 2015-12-29 Microsoft Technology Licensing, Llc Optical imperfections in a light transmissive illumination system for see-through near-eye display glasses
US9229227B2 (en) 2010-02-28 2016-01-05 Microsoft Technology Licensing, Llc See-through near-eye display glasses with a light transmissive wedge shaped illumination system
US9285589B2 (en) 2010-02-28 2016-03-15 Microsoft Technology Licensing, Llc AR glasses with event and sensor triggered control of AR eyepiece applications
US20110221659A1 (en) * 2010-02-28 2011-09-15 Osterhout Group, Inc. Augmented reality eyepiece with freeform optic, image source, and optical display
US20110221657A1 (en) * 2010-02-28 2011-09-15 Osterhout Group, Inc. Optical stabilization of displayed content with a variable lens
US9341843B2 (en) 2010-02-28 2016-05-17 Microsoft Technology Licensing, Llc See-through near-eye display glasses with a small scale image source
US8760471B2 (en) 2010-04-28 2014-06-24 Ns Solutions Corporation Information processing system, information processing method and program for synthesizing and displaying an image
US9128281B2 (en) 2010-09-14 2015-09-08 Microsoft Technology Licensing, Llc Eyepiece with uniformly illuminated reflective display
US9128520B2 (en) * 2011-09-30 2015-09-08 Microsoft Technology Licensing, Llc Service provision using personal audio/visual system
US20130083063A1 (en) * 2011-09-30 2013-04-04 Kevin A. Geisner Service Provision Using Personal Audio/Visual System
US9955059B2 (en) 2014-10-29 2018-04-24 Kabushiki Kaisha Toshiba Electronic device, method, and computer program product
WO2016108182A1 (en) * 2014-12-29 2016-07-07 Abb Technology Ltd. Method for identifying a sequence of events associated with a condition in a process plant
US9838844B2 (en) 2015-09-25 2017-12-05 Ca, Inc. Using augmented reality to assist data center operators

Also Published As

Publication number Publication date Type
EP1183578A1 (en) 2002-03-06 application
EP1159657B1 (en) 2003-08-20 grant
US20080100570A1 (en) 2008-05-01 application
US20020049566A1 (en) 2002-04-25 application
EP1157316A1 (en) 2001-11-28 application
WO2000052542A1 (en) 2000-09-08 application
US8373618B2 (en) 2013-02-12 grant
JP2003524814A (en) 2003-08-19 application
EP1159657A1 (en) 2001-12-05 application
WO2000052538A1 (en) 2000-09-08 application
EP1157315B1 (en) 2004-09-22 grant
EP1183578B1 (en) 2003-08-20 grant
EP1157315A1 (en) 2001-11-28 application
WO2000052537A1 (en) 2000-09-08 application
EP1157314B1 (en) 2004-09-22 grant
US20020067372A1 (en) 2002-06-06 application
EP1157316B1 (en) 2003-09-03 grant
EP1157314A1 (en) 2001-11-28 application
JP2002538543A (en) 2002-11-12 application
WO2000052539A1 (en) 2000-09-08 application
JP2002538541A (en) 2002-11-12 application
JP2002538700A (en) 2002-11-12 application
US6941248B2 (en) 2005-09-06 grant
WO2000052541A1 (en) 2000-09-08 application
JP2002538542A (en) 2002-11-12 application
WO2000052540A1 (en) 2000-09-08 application
US20020046368A1 (en) 2002-04-18 application
WO2000052536A1 (en) 2000-09-08 application

Similar Documents

Publication Publication Date Title
Weyrich et al. An interactive environment for virtual manufacturing: the virtual workbench
US6922599B2 (en) System and method for producing an assembly by directly implementing three-dimensional computer-aided design component definitions
US20040098148A1 (en) Industrial control and monitoring method and system
US20080066004A1 (en) Process Plant User Interface System Having Customized Process Graphic Display Layers in an Integrated Environment
US20070075916A1 (en) Generic utility supporting on-demand creation of customizable graphical user interfaces for viewing and specifying field device parameters
Kadir et al. Virtual machine tools and virtual machining—a technological review
Hoffmann et al. Virtual Commissioning of Manufacturing Systems
US20090319058A1 (en) Systems and methods for immersive interaction with actual and/or simulated facilities for process, environmental and industrial control
US6806847B2 (en) Portable computer in a process control environment
US7852355B2 (en) System and method for carrying out and visually displaying simulations in an augmented reality
US20090300535A1 (en) Virtual control panel
US7640007B2 (en) Wireless handheld communicator in a process control environment
US6219583B1 (en) Control system
US20160132046A1 (en) Method and apparatus for controlling a process plant with wearable mobile control devices
WO1998036335A2 (en) Process control system using a layered-hierarchy control strategy distributed into multiple control devices
US7356773B1 (en) Wizard builder, for application software, building a setup wizard which sets up a defacto interface between the application program and monitoring or control equipment
US20060244565A1 (en) Transmission of data into and out of automation components
Friedrich et al. ARVIKA-Augmented Reality for Development, Production and Service.
US20100114549A1 (en) Systems and methods for providing a simulation environment having a simulation user interface
US20070179645A1 (en) Enhanced tool for managing a process control network
US20020089544A1 (en) System and method for combined use of different display/appliance types with system-controlled, context-dependent information display
US20090271012A1 (en) Method or System for Displaying an Internet Page on a Visualization Device of an Industrial Automation Device
US20120046911A1 (en) Handheld field maintenance tool with integration to external software application
DE19953739A1 (en) Apparatus and method for object-oriented marking and assignment of information to selected technological components
US20120259436A1 (en) Methods and apparatus to manage process control resources

Legal Events

Date Code Title Description
AS Assignment

Owner name: SIEMENS AKTIENGESELLSCHAFT, GERMANY

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:FRIEDRICH, WOLFGANG;WOHLGEMUTH, WOLFGANG;REEL/FRAME:012446/0805;SIGNING DATES FROM 20010809 TO 20010911