WO2023110993A1 - Création d'une instruction de travail - Google Patents

Création d'une instruction de travail (Creation of a work instruction)

Info

Publication number
WO2023110993A1
WO2023110993A1 (PCT/EP2022/085809)
Authority
WO
WIPO (PCT)
Prior art keywords
expert
data
action
activity
work
Prior art date
Application number
PCT/EP2022/085809
Other languages
German (de)
English (en)
Inventor
Gero Jacob Corman
Miriam Hermann
Timm Schulz-Isenbeck
Lukas Lehmann
Martin Kumke
Niklas Brandt
Nils Kessler
Christian Kiefer
Yannick Wehmann
Original Assignee
Volkswagen Aktiengesellschaft
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Volkswagen Aktiengesellschaft
Publication of WO2023110993A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/06Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
    • G06Q10/063Operations research, analysis or management
    • G06Q10/0639Performance analysis of employees; Performance analysis of enterprise or organisation operations
    • G06Q10/06398Performance of employee with respect to a job function
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/10Office automation; Time management
    • G06Q10/101Collaborative creation, e.g. joint development of products or services

Definitions

  • The invention relates to a device for creating work instructions for an expert for an activity, and to a corresponding system, method and computer program.
  • Augmented reality technology, i.e. the computer-assisted supplementation of reality with virtual elements, expands human perception in real time through texts, videos, images or three-dimensional animations.
  • Creating an augmented reality application, however, is complex and not very intuitive: a new application is mainly programmed on a computer.
  • US Pat. No. 8,836,768 B1 relates to users of wearable glasses with a pair of two-dimensional cameras that optically capture user gestures, performed with a user object in an interaction zone, in response to viewing displayed images with which the user can interact.
  • The eyewear system intelligently signals, captures and processes optical information to quickly determine a sparse set of locations suitable for identifying user gestures.
  • The displayed images can be generated by the eyewear system and presented on a virtual display on the eyewear, or can be generated and/or viewed without the eyewear.
  • The user may see local views directly, supplemented with images showing web-provided tags that identify viewed objects and/or provide information.
  • Eyewear-mounted systems can communicate wirelessly with cloud servers and with non-eyewear-mounted systems that the user can carry in a pocket or purse. The disadvantage is that this publication gives the person skilled in the art no suggestion as to how augmented reality software can be created.
  • The publication US 2020/0043354 A1 relates to a tabletop system for intuitive guidance in an augmented reality long-distance video communication environment.
  • The tabletop system comprises augmented reality goggles worn by a field worker, equipped with a video camera that captures actual image information on site and shows augmented instructions on a transparent display, and a tabletop unit that receives the actual image information from the augmented reality goggles, displays it on a touch screen, captures hand motion information and instructions given by a remote expert in an upper area of the touch screen, and transmits the hand motion information and instructions to the augmented reality goggles.
  • The disadvantage is that this publication likewise gives the person skilled in the art no suggestion as to how augmented reality software can be created; it only teaches how information can be transferred from an expert to augmented reality goggles.
  • The invention is therefore based on the object of specifying an improved way of creating work instructions, in particular in the form of augmented reality software.
  • Preferably, an intuitive approach should be specified in order to enable rapid and error-free development of the software.
  • This object is achieved by a device for creating work instructions for an expert for an activity, the device having: an input interface for receiving expert data, including operating data and/or sensor data with information on the behavior of the expert during the activity; an analysis unit for determining an action of the expert based on the expert data and for determining an individual instruction from that action; and an output interface for transmitting the work instructions to a storage unit, the analysis unit being designed to determine the work instructions for the activity of the expert from one or more individual instructions.
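The claimed structure of the device can be illustrated with a minimal sketch. This is not the patented implementation, only an illustration of the data flow named in the claim (input interface, analysis unit, output interface, storage unit); all class and field names here are invented:

```python
from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class ExpertData:
    """Operating and/or sensor data describing the expert's behaviour (hypothetical shape)."""
    operating_data: dict = field(default_factory=dict)
    sensor_data: dict = field(default_factory=dict)

@dataclass
class WorkInstructionDevice:
    """Minimal sketch of device 10: input interface -> analysis unit -> output interface."""
    analyse: Callable[[ExpertData], str]              # stands in for the analysis unit
    storage: List[str] = field(default_factory=list)  # stands in for the storage unit

    def receive(self, data: ExpertData) -> None:      # input interface receives expert data
        instruction = self.analyse(data)              # analysis unit determines an instruction
        self.storage.append(instruction)              # output interface transmits it to storage

device = WorkInstructionDevice(
    analyse=lambda d: f"step: {d.operating_data.get('action', 'unknown')}"
)
device.receive(ExpertData(operating_data={"action": "tighten screw"}))
print(device.storage)  # ['step: tighten screw']
```

The work instructions are assembled from the individual instructions accumulated in `storage`, mirroring the claim's "one or more individual instructions".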
  • The object is likewise achieved by a system for creating work instructions for an expert for an activity, comprising: a device as defined above; a sensor for acquiring expert data; a storage unit for storing the work instructions; and preferably an AR display device.
  • The object is also achieved by a method for creating work instructions for an expert for an activity, comprising the steps of: receiving expert data, including operating data and/or sensor data with information about the behavior of the expert during the activity;
  • An input interface enables a cost-effective device that can preferably be used with existing systems, modules and units.
  • The input interface can be wired and/or wireless and preferably supports one or more communication protocols.
  • By means of the analysis unit, an action of the expert can be determined quickly and reliably, and an individual instruction can be derived from it.
  • The work instructions preferably include an augmented reality program. It goes without saying that work instructions can also be created in text form, with the movements and activities of the expert particularly preferably being recorded.
  • An output interface enables a cost-effective device that can preferably be used with existing storage systems, in particular cloud storage.
  • In this way, a technically simple and cost-effective device can be created which, in particular, does not have to be designed to record the information itself.
  • The device can thus be integrated into existing systems. It goes without saying that the device can also be implemented purely in software; in particular, it is conceivable to implement the device as an app on a smartphone.
  • Preferably, the analysis unit is designed to translate the behavior of the expert during the activity into an augmented reality program, and the work instructions include an augmented reality application containing the augmented reality program.
  • Preferably, the analysis unit is designed to interpret an action of the expert and to determine the action based on this interpretation. Determining the action means that the analysis unit records, based on the expert data, what the expert is doing in his work. For example, it can be determined from a turning movement of the hand and the position of the hand that the expert is operating a rotary knob, and the extent of the turning movement can also be detected. From this the action can be determined, such as setting a speed using a rotary control, the speed being increased by X units. Likewise, it can be recognized when a screw is being tightened or the like. In particular, it can be determined from the position of the screw and stored construction plans what kind of screw it is.
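The rotary-knob example above can be sketched as a simple interpretation rule. The function name, the angle-to-units conversion factor, and the output wording are all invented for illustration; the patent does not specify a concrete mapping:

```python
def interpret_hand_rotation(start_angle_deg: float, end_angle_deg: float,
                            units_per_degree: float = 0.5) -> str:
    """Hypothetical rule: a turning movement of the hand on a rotary knob
    is interpreted as a speed change of X units, as in the text's example."""
    delta = end_angle_deg - start_angle_deg
    units = round(delta * units_per_degree, 1)
    direction = "increase" if units >= 0 else "decrease"
    return f"{direction} speed by {abs(units)} units via rotary control"

print(interpret_hand_rotation(0.0, 90.0))   # increase speed by 45.0 units via rotary control
print(interpret_hand_rotation(30.0, 10.0))  # decrease speed by 10.0 units via rotary control
```

A real analysis unit would derive the angles from sensor data (hand position and orientation over time) rather than receive them directly.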
  • Preferably, the analysis unit is designed to determine the action of the expert by means of artificial intelligence.
  • In this way, the determination of the expert's action can be made more precise and improved.
  • An improved determination can take place with each use of the artificial intelligence.
  • Thus a reliable and continuously improving device can be provided.
  • Preferably, the input interface is designed to receive expert data including: camera data from a camera that records an action of the expert; lidar data from a lidar sensor that detects an action of the expert; glove data from a sensor glove that detects an action of the expert; and/or suit data from a sensor suit that detects an action of the expert.
  • An action of the expert can be reliably detected by the aforementioned sensors and devices.
  • Expert data recorded in this way allow the expert's action to be reliably determined.
  • The expert data can be supplemented by operator inputs on a machine that the expert operates.
  • The expert data may also include source code if the expert reprograms a machine.
  • Preferably, the analysis unit is designed to recognize one or more of the following activities of the expert: assembly activities, carrying out work on a machine, eliminating errors and/or setting up a machine. Based on this pre-categorization of the expert's activities, individual expert actions can be recognized and determined more accurately.
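The benefit of pre-categorization can be sketched as follows: knowing the activity narrows the set of plausible actions before the concrete action is determined. The category names and motion labels below are invented for illustration only:

```python
def determine_action(activity_category: str, detected_motion: str) -> str:
    """Hypothetical pre-categorisation: the recognised activity restricts
    which actions the analysis unit will accept as a match."""
    candidate_actions = {
        "assembly": {"grasp part", "join parts", "tighten screw"},
        "setup": {"turn rotary knob", "press button"},
        "error elimination": {"open panel", "replace fuse"},
    }
    allowed = candidate_actions.get(activity_category, set())
    return detected_motion if detected_motion in allowed else "unrecognized"

print(determine_action("setup", "press button"))     # press button
print(determine_action("assembly", "press button"))  # unrecognized
```

Restricting the candidate set this way is one plausible reading of why pre-categorization improves recognition; the patent itself does not prescribe a mechanism.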
  • Preferably, the input interface is designed to receive expert data including recognized gestures of the expert; the analysis unit is designed to evaluate the gestures and to determine a representation corresponding to the gestures; and the output interface is designed to transmit a control command to a VR display device, in particular VR glasses of the expert, the control command causing the determined representation to be output on the VR display device.
  • In this way, the expert can be given an intuitive means of improving the work instructions or specifying them more precisely.
  • Work instructions can thus be created in the form of an augmented reality program, allowing the expert to use the advantages of augmented reality without having any prior knowledge of augmented reality programming.
  • Machine learning is a generic term for the "artificial" generation of knowledge from experience: an artificial system learns from examples and can generalize from them after the learning phase has ended. To do this, machine learning algorithms build a statistical model based on training data. This means that the examples are not simply learned by heart; rather, patterns and regularities are recognized in the learning data. The system can thus also assess unknown data (a so-called learning transfer) or fail on unknown data, for example through overfitting.
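The generalization idea described above can be shown with a deliberately tiny example: a 1-nearest-neighbour classifier that assigns an unseen sample to the closest training example instead of memorizing exact values. The feature pairs and labels are invented and do not come from the patent:

```python
import math

# Toy training data: (wrist_speed, grip_force) feature pairs with action labels.
# All values are invented for illustration only.
TRAINING = [
    ((0.9, 0.2), "turning a knob"),
    ((0.1, 0.8), "tightening a screw"),
    ((0.5, 0.1), "pressing a button"),
]

def classify(sample: tuple) -> str:
    """1-nearest-neighbour: generalise from examples rather than memorise them."""
    return min(TRAINING, key=lambda ex: math.dist(ex[0], sample))[1]

# A sample not in the training data is still assigned the nearest label.
print(classify((0.8, 0.25)))  # turning a knob
```

A production analysis unit would use a far richer model, but the principle, fitting a statistical model to examples and then assessing unknown data, is the same.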
  • Augmented reality is the computer-aided extension of the perception of reality. This information can address all human sensory modalities. Often, however, augmented reality is understood only as the visual presentation of information, i.e. the supplementing of images or videos with computer-generated additional information or virtual objects by means of overlays.
  • An augmented reality headset is a type of semi-transparent display arranged on the head, preferably in front of the eyes of a user, which is intended to give the user a view into augmented reality (AR).
  • AR: Augmented Reality
  • Virtual reality glasses can also be regarded as AR glasses if a video recording of the environment is shown on the displays of the VR glasses and supplemented with AR elements.
  • Work instructions are to be understood in particular as instructions on how work is to be carried out.
  • The work instructions can contain tips and additional information for the work to be carried out.
  • A work instruction can be designed in text form, audio form, video form and/or as an augmented reality application.
  • Figure 1 is a schematic representation of a device for creating a work instruction;
  • Figure 2 is a schematic representation of a system for creating a work instruction;
  • FIG. 3 shows a schematic representation of an expert during an activity;
  • Figure 4 shows a schematic representation of various expert activities;
  • FIG. 5 shows a schematic representation of the steps of a method according to the invention.
  • FIG. 1 schematically shows a device 10 for creating work instructions for an expert for an activity.
  • The device 10 comprises an input interface 12, an analysis unit 14 and an output interface 16.
  • The input interface 12 is designed to receive expert data, including operating data and/or sensor data, with information about the behavior of the expert during the activity.
  • The expert data can include, for example, positions and movements of the expert, recognized actions of the expert and/or operator inputs of the expert, and can come, for example, from a sensor suit, a sensor glove, an AR input device, a smartphone, a radar, laser or lidar sensor, a camera and/or smart glasses.
  • The input interface 12 can be designed for wireless communication or be connected to a proprietary transmission network, for example a wired transmission network.
  • The analysis unit 14 is designed to process and analyze the expert data.
  • The analysis unit 14 determines an action of the expert based on the expert data and derives an individual instruction from it.
  • The work instructions can therefore comprise several individual instructions.
  • The analysis unit 14 preferably translates the individual instructions into work instructions in the form of an augmented reality program. Furthermore, the analysis unit 14 can interpret an action of the expert and determine the action based on this interpretation.
  • For example, the tightening of a screw or the operation of a button can be determined from a recorded and analyzed hand movement of the expert. It goes without saying that the work instructions can be supplemented with this information.
  • The action of the expert is preferably determined by means of artificial intelligence.
  • The analysis unit 14 can recognize various activities of the expert, such as assembly activities, carrying out work on a machine, eliminating errors and/or setting up a machine. It goes without saying that the expert can also specify his activity beforehand by manual input or speech recognition.
  • The output interface 16 is designed to transmit the work instructions to the storage unit 28.
  • The output interface 16 can be designed for communication analogously to the input interface 12. It goes without saying that the input interface 12 and the output interface 16 can also be combined into a single communication interface for sending and receiving.
  • Figure 2 shows a schematic representation of a system 18 according to the invention for creating work instructions for an expert for an activity with a device 10, units for recording expert data, a storage unit 28 and an augmented reality display device 30.
  • the units for acquiring expert data can in particular include an operating unit 20, an acquisition unit 22, in particular with a radar and/or lidar sensor and/or a camera, a sensor suit 24 and/or a sensor glove 26. It goes without saying that other units, such as a smartphone, are also conceivable. For reasons of clarity, no other units are shown in the drawing.
  • The input interface 12 can be configured in particular to receive expert data including recognized gestures of the expert.
  • The analysis unit 14 can evaluate the gestures of the expert and determine a representation corresponding to the gestures.
  • The output interface 16 can transmit a control command to an AR display device or augmented reality display device 30, in particular to the expert's AR glasses.
  • The control command causes the determined representation to be output on the AR display device. This allows the expert to supplement and improve the work instructions. In particular, the expert can influence, directly while carrying out his activity, how the activity is included in the work instructions.
  • Storage unit 28 may include non-volatile data storage, such as hard disk storage, flash memory or the like. Furthermore, the storage unit 28 can also include cloud storage. The work instructions are stored in the storage unit 28. It goes without saying that the analysis unit 14 in particular can also include a buffer memory for processing the expert data.
  • In Figure 3, an expert 32 is shown during an activity. For reasons of readability, the masculine form is used in the text; nevertheless, the information refers to persons of any gender.
  • The expert 32 is in a hall and carries out an activity on a system 34.
  • The expert 32 is recorded by one or more recording units 22, the recording units 22 generating expert data.
  • The expert 32 wears AR glasses and can directly view and/or change the work instruction that he generates during his activity. It goes without saying that the expert 32 can also carry out his activity without AR glasses.
  • The expert 32 can, for example, set up a machine in the system 34 and use the operating unit 20 of the system 34 for this purpose. In addition to the movements and activities of the expert 32, his operating inputs can then also be recorded and stored in the work instructions.
  • Various expert activities and the creation of work instructions for these expert activities are shown schematically in FIG. 4.
  • The expert activity can include, for example, carrying out work on a machine 36, in particular an assembly activity or setting up a machine, correcting errors 38, training an apprentice 40 or another activity 42.
  • Other activities 42 can in particular include maintenance of a system 34, filling of a system 34 with resources, or the like.
  • The activity of the expert 32 can be recorded by one or more recording units 22. Additionally or alternatively, the expert 32 can also use a sensor suit 24, a sensor glove 26 and/or an AR or VR input device. This generates expert data with information about the behavior of the expert 32 during the activity.
  • FIG. 5 shows the steps of a method according to the invention for creating work instructions for an expert 32 for an activity.
  • In a first step S1, expert data including operating data and/or sensor data with information about a behavior of the expert 32 during the activity are received.
  • In a second step S2, an action of the expert 32 is determined based on the expert data.
  • In a third step S3, an individual instruction is determined based on the expert data, and the work instructions for the activity of the expert 32 are determined from one or more individual instructions.
  • In a fourth step S4, the work instructions are transmitted to a storage unit 28.
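The four method steps can be sketched as a single pipeline. The assumption that each received data item is a small dictionary with an optional "action" key is purely illustrative:

```python
def create_work_instruction(expert_data_stream):
    """Sketch of the method steps S1-S4, assuming each item in the stream
    is a dict of already-received operating/sensor data (hypothetical format)."""
    instructions = []
    for data in expert_data_stream:               # S1: receive expert data
        action = data.get("action")               # S2: determine the expert's action
        if action:
            instructions.append(f"Do: {action}")  # S3: derive an individual instruction
    storage = list(instructions)                  # S4: transmit the work instructions to storage
    return storage

print(create_work_instruction(
    [{"action": "mount cover"}, {"noise": 1}, {"action": "tighten screw"}]
))  # ['Do: mount cover', 'Do: tighten screw']
```

Items without a recognizable action are skipped, which stands in for the analysis unit filtering irrelevant sensor data.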
  • To enable intuitive programming, the creator, i.e. the expert, wears a camera or another sensor which can record the environment and the actions of the application creator.
  • The creator can then carry out actions as they should appear in the later application: he can perform assembly work, carry out work on a machine, rectify errors, or the like. All of these gestures and other actions are recognized by the camera or sensor system and translated into an augmented reality program.
  • The creator of a new application thus carries out the activities to be programmed quite normally and thereby intuitively creates a new program.
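The translation of recognized gestures into an augmented reality program can be sketched as a mapping from gesture labels to AR overlay steps. The gesture names and overlay texts below are invented; the patent does not define a concrete gesture vocabulary:

```python
def gestures_to_ar_program(gestures):
    """Hypothetical mapping from recognised gestures to AR overlay steps.
    Unknown gestures fall back to replaying the recorded motion."""
    overlays = {
        "grasp": "Highlight the part to pick up",
        "turn": "Show rotation arrow on the knob",
        "press": "Mark the button to press",
    }
    return [overlays.get(g, f"Show recorded clip for '{g}'") for g in gestures]

program = gestures_to_ar_program(["grasp", "turn", "press", "wipe"])
print(program)
```

Each list entry would correspond to one overlay step of the generated augmented reality application; the fallback for unmapped gestures reflects that the creator's raw recording can always be shown when no symbolic interpretation is available.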

Abstract

The invention relates to a device (10) for creating a work instruction of an expert (32) for an activity, as well as a corresponding system (18), method and computer program. To enable intuitive programming, the creator (32) of a new application (46) wears a camera (22) or another sensor (22) which can detect the environment and the actions of the application creator (32). To create a new application (46), the creator (32) can then carry out activities (36, 38, 40, 42) as they should appear in the later application (46). For example, the creator can perform assembly work, carry out work on a machine (36), rectify faults (38), or the like. All of these gestures and other actions are recognized by the camera or sensor system (22) and translated into an augmented reality program (46). The creator (32) of a new application thus performs the activities to be programmed quite normally and thereby intuitively creates a new program (46).
PCT/EP2022/085809 2021-12-15 2022-12-14 Création d'une instruction de travail WO2023110993A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102021214458.6 2021-12-15
DE102021214458 2021-12-15

Publications (1)

Publication Number Publication Date
WO2023110993A1 (fr) 2023-06-22

Family

ID=84785197

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2022/085809 WO2023110993A1 (fr) 2021-12-15 2022-12-14 Création d'une instruction de travail

Country Status (1)

Country Link
WO (1) WO2023110993A1 (fr)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8836768B1 (en) 2012-09-04 2014-09-16 Aquifi, Inc. Method and system enabling natural user interface gestures with user wearable glasses
US20190354761A1 (en) * 2018-05-21 2019-11-21 Ptc Inc. Augmented Reality-Based Capture, Processing and Transfer of Occupational Knowledge
US20200004335A1 (en) 2018-03-19 2020-01-02 Kabushiki Kaisha Toshiba Eye movement detecting device, electronic device and system



Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22835413

Country of ref document: EP

Kind code of ref document: A1