WO2021058467A1 - Application for augmented-reality-assisted, guided defect examination of a test object with corresponding visualization in a CAD system - Google Patents

Application for augmented-reality-assisted, guided defect examination of a test object with corresponding visualization in a CAD system

Info

Publication number
WO2021058467A1
WO2021058467A1 (PCT/EP2020/076405)
Authority
WO
WIPO (PCT)
Prior art keywords
test object
finding
computer
data
marker
Prior art date
Application number
PCT/EP2020/076405
Other languages
German (de)
English (en)
Inventor
Johannes FASSBENDER
Sven ILLBERGER
Original Assignee
Siemens Aktiengesellschaft
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Siemens Aktiengesellschaft
Publication of WO2021058467A1

Links

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00Programme-control systems
    • G05B19/02Programme-control systems electric
    • G05B19/418Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS] or computer integrated manufacturing [CIM]
    • G05B19/41875Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS] or computer integrated manufacturing [CIM] characterised by quality surveillance of production
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/32Operator till task planning
    • G05B2219/32014Augmented reality assists operator in maintenance, repair, programming, assembly, use of head mounted display with 2-D 3-D display and voice feedback, voice and gesture command
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02PCLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/02Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Definitions

  • the invention relates to a device and a computer-implemented method for AR-supported, guided examination of findings on a test object, with corresponding visualization in a CAD system.
  • the invention relates to the recording of findings from test objects that have been examined with the aid of visual or non-destructive tests (NDT) or the like.
  • test objects can be cast components or welded components, for example. These can be designed, for example, as a housing and/or as combustion chamber components of a thermal turbomachine.
  • the information recorded in this way is then compiled by hand in a computer system and assigned accordingly to a data record for the component being examined, which includes, among other things, all the findings and their local position.
  • findings are technical defects or flaws such as cracks, voids, inclusions, local corrosion, etc.
  • this type of location description is then comparatively imprecise.
  • alternatively, the position of a finding can be recorded in a 2D drawing, which, however, entails a significant amount of additional effort and is of only limited help in locating the defect on the real component.
  • the object of the invention is therefore to provide a device and a computer-implemented method for the error-free recording and/or documentation of findings on a test object, that is to say on a component or an assembly.
  • a computer-implemented method for recording findings on a test object, comprising the steps: a. determining the position of a reference point P_lokal(0/0/0) on the test object by means of an AR device, with the aid of a reference marker attached to the test object or by recognition of a partial geometry of the test object, b. detecting the position P_lokal(x, y, z) of at least one finding marker by means of the AR device and an associated image of the relevant finding marker attached to the test object, c. saving the image and the associated position of the relevant finding marker as part of a finding data record in a data processing system, d. merging the CAD data of the test object and all finding data records.
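Steps a through d can be sketched as a minimal data model. All class, field, and file names below are illustrative assumptions, not taken from the patent:

```python
from dataclasses import dataclass, field

@dataclass
class FindingRecord:
    """One finding data record (step c): marker image plus local position."""
    marker_id: int          # sequential finding-marker number, e.g. "20" in FIG. 3
    local_position: tuple   # P_lokal(x, y, z) relative to the reference point
    image_path: str         # camera image of the finding marker (hypothetical path)

@dataclass
class Inspection:
    reference_point: tuple = (0.0, 0.0, 0.0)   # P_lokal(0/0/0) from step a
    records: list = field(default_factory=list)

    def add_finding(self, marker_id, local_position, image_path):
        # Step c: save image and position as one finding data record.
        self.records.append(FindingRecord(marker_id, local_position, image_path))

# Steps b and c, repeated once per finding marker attached to the test object:
insp = Inspection()
insp.add_finding(20, (120.5, 33.0, 7.2), "finding_20.png")
print(len(insp.records))
```

Step d (merging with CAD data) would then consume `insp.records` in the CAD system.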
  • AR stands for “augmented reality”, i.e. a computer-aided extension of the perception of reality.
  • AR devices usually have a semi-transparent display, several cameras to capture the environment and several sensors to determine the position and perspective of the AR device in the room.
  • an AR device is understood to mean a device which, on the one hand, can detect its position in space and any subsequent change in position. It can therefore use a camera to record the surroundings and, here, the test object to be examined and markers attached to the test object and save them digitally.
  • the AR device is also able to capture a triangulated surface network of the environment or any representation of the surfaces.
  • a reference point on the test object is determined with the aid of the AR device.
  • This reference point is the origin P_lokal(0/0/0) of a local coordinate system, hereinafter also referred to as the reference system.
  • the position of the reference point is determined with the aid of a reference marker attached to the test object, to which the subsequently recorded positions of so-called finding markers relate later.
  • alternatively, an inherently suitable, unambiguous partial geometry that is already present on the test object could serve as the reference marker. This would have the advantage of capturing the reference point more quickly and always at the same position when testing a series of identical test objects.
  • the finding markers are placed by a user at the position on the real component at which a finding, for example a fault, a defect or any other irregularity in the material, was recognized or discovered using any test method.
  • the local position of the finding marker in relation to the reference position, in particular its local position in relation to the reference point P_lokal(0/0/0), can be recorded and saved. The position of the finding, previously known only in relation to the surface network, is then described in the reference system.
  • test features of the finding can be recorded directly in a data processing system.
  • test features are understood to mean information on the finding itself. These can include, for example: type of finding (cracks, cavities, inclusions, local corrosion, etc.), extent of the finding (crack length, mean size of an inclusion, size/extent of the corrosion, etc.) and/or the date of the first appearance of the finding.
  • an image is generated by a camera of the AR device, which visualizes the position of the finding and also documents it.
  • the position of the finding, the image and the test features of the relevant finding are then summarized as a finding data record. This allows manual entries to be reduced, which, in addition to automation, also leads to a reduction in errors and costs.
  • the perspective from which the user looks at the test object, and in particular at the observed finding, is also recorded. This information can later be useful when integrating the finding data record into the CAD system, in particular to support a well-founded evaluation of the findings.
  • All information recorded with the AR device, i.e. all finding data records, is transferred to a CAD system.
  • the CAD system also reads in the CAD data of the test object.
  • the CAD system then links the CAD data with the finding data records, for example by means of scripts and/or other inputs, so that the virtual test object with the findings can be displayed to a new user according to the real view of the original user. This means that during the subsequent analysis, the test object is displayed to the new user from the same perspective as it appeared to the user at the time of the finding.
  • the display takes place on a screen, i.e. on a commercially available monitor or in the same or a different AR device, for example to look at the position again after an examination.
  • the position of a finding marker is preferably determined as follows: a global coordinate system is defined for the AR device, in which the position of the reference point is indicated by an auxiliary vector. When the finding marker is detected, its position in the global coordinate system is described by a global location vector, so that a local location vector can be determined by calculating the difference between the auxiliary vector and the global location vector, through which the position of the finding marker in relation to the reference point is determined. This makes it possible to determine the position of findings even without the prior provision of CAD data for the test object.
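The difference calculation described above is elementary vector arithmetic and can be sketched in a few lines; the coordinate values are hypothetical:

```python
def to_local(global_vec, auxiliary_vec):
    """Local location vector = global location vector - auxiliary vector,
    i.e. the finding-marker position expressed relative to P_lokal(0/0/0)."""
    return tuple(g - a for g, a in zip(global_vec, auxiliary_vec))

# Reference marker observed at (2.0, 1.0, 0.5) in the AR device's global frame,
# finding marker detected at (3.5, 4.0, 0.5) in the same frame:
local = to_local((3.5, 4.0, 0.5), (2.0, 1.0, 0.5))
print(local)  # (1.5, 3.0, 0.0)
```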
  • either steps b and c are repeated for each finding marker attached to the test object, with step d carried out only after all repetitions of steps b and c are complete, or steps b through d are performed for each finding marker attached to the test object.
  • the latter alternative has the advantage that the intermediate results are not lost in the event that the AR device fails in the meantime, for example due to a flat battery.
  • the invention further comprises a device for data processing which has means for executing the above-described method or one of its variants.
  • the invention also comprises a computer program product which has instructions which, when the program is executed by a computer, cause the computer to carry out the steps of the method described above or one of its variants.
  • the invention comprises a computer-readable medium, comprising instructions which, when executed by a computer, cause the computer to carry out the steps of the above-described method or one of its variants.
  • FIG. 2 shows an image determined by an AR device with a triangulated surface network projected thereon
  • FIG. 3 shows two images, one showing the use of an AR device to record findings from a test object, the other the image of the test object recorded by the AR device at the same time,
  • FIG. 4 the determination of the position of a test object on the basis of several coordinate systems with corresponding references,
  • FIG. 5 shows a test object in a CAD representation with several different findings
  • FIG. 6 shows the visualization of the test object and one of the associated findings as a screenshot of a data processing system
  • FIG. 7 shows a process chain for storing findings data.
  • FIG. 1 shows the concept according to the invention, which can be implemented as a computer-implemented method PR.
  • the computer-implemented method PR can be divided into three blocks B 1, B 2 and B 3.
  • the first block B1 comprises the provision of data in which order data about the test object can be collected and recorded using an independent graphical user interface (GUI).
  • the order data can come from different sources and are recorded via the GUI in order to be saved in a data processing system.
  • All order data for the test object are then structured in a transport file in the data processing system.
  • the transport file can provide the data in XML format. As a rule, however, this transport file does not yet contain any CAD data on the test object to be recorded.
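The patent leaves the XML schema of the transport file open; a minimal sketch of writing such a file with Python's standard library, with all element names assumed purely for illustration:

```python
import xml.etree.ElementTree as ET

# Hypothetical schema: the description only states that the transport file
# provides the order data in XML format, not the element names used here.
root = ET.Element("transport")
order = ET.SubElement(root, "orderData")
ET.SubElement(order, "testObject").text = "upper housing half"
ET.SubElement(order, "orderNumber").text = "4711"

# Serialize; a real system would write this to the data processing system.
xml_text = ET.tostring(root, encoding="unicode")
print(xml_text)
```

The second transport file of block B2 could extend the same tree with the finding data records.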
  • the findings data acquisition and findings data processing according to the invention then take place in a second block B2, in particular according to steps b and c.
  • an AR device, for example a Microsoft HoloLens, and the associated software, for example a HoloLens application built with “Unity 3D” or in “C#”, are used for this purpose.
  • the AR device records properties and / or images of the test object, the finding markers arranged thereon and the local position of the finding markers. This information is combined and saved as a record of findings.
  • the test features, i.e. the finding data of the test object, are linked with their local position on the test object.
  • the data processing system uses an algorithm to determine a local vector that describes the location of the finding in the reference system. All the finding data records collected in this way are then saved as a log in a second, likewise XML-formatted transport file for further use.
  • the second transport file can expediently contain the data of the first transport file, which has been supplemented by the findings data records.
  • the third block B3 of the computer-implemented method PR is the subsequent data visualization in a correspondingly suitable data processing system, according to step d.
  • a suitable data processing system can be, for example, the software “Siemens NX” with appropriate plug-ins such as “NXOpen”, or other tools, for example in “C#”.
  • the data processing system is provided with a 3-D CAD file.
  • after reading in the transport file, the data processing system can process the finding data in order to display to a user, on a suitable output device, a virtual 3-D CAD model of the test object enriched with the findings and test features recorded by the AR device, in a compact way.
  • FIG. 2 shows an example of an image that has been captured by the camera of the AR device.
  • the picture essentially shows the corner of an office table BT and an office chair ST.
  • the AR device is able to capture the spatial structure of its entire camera image, including both objects and their surroundings, and to superimpose a triangulated surface network OFN on it.
  • in FIG. 3 two images are shown, the image on the left showing a user PPR wearing an AR device and a test object PO.
  • the test object PO is the upper housing half of a turbomachine.
  • the user PPR records part of the test object PO using his AR device ARG.
  • the image shown on the right in FIG. 3 shows the image captured by the camera of the AR device at the same time: a finding BD on a flange FL of the housing half, i.e. test object PO.
  • the finding BD is identified by a sequential number, shown in FIG. 3 as the value "20".
  • FIG. 4 shows how the computer-implemented method detects and correctly transforms the position of a test object (here the position of a cube):
  • a global coordinate system 1 is defined when the AR device is switched on, the origin of which is assumed to be P_global(0/0/0).
  • the AR device is then positioned by its carrier, the user PPR, in such a way that it can detect a reference marker RM attached to the test object and, in particular, its position.
  • the reference marker RM thus defines the origin P_lokal(0/0/0) of a local coordinate system 2, which is relevant for the test object PO.
  • This results in an auxiliary vector 3 which points from the origin of the global coordinate system 1 to the origin of the local coordinate system 2.
  • instead of a reference marker, there is of course also the option of capturing a well-defined part of the geometry of the test object and defining its position as the position of the reference marker, i.e. using it as the origin of the local coordinate system.
  • Findings to be recorded on the test object PO are marked with corresponding finding markers BM, with an individual finding marker BM being assigned for each finding determined.
  • the finding marker is usually arranged in the immediate vicinity of the finding on the test object PO.
  • the AR device ARG is able to identify the finding markers BM as such and to determine their position by means of the surface network OFN in relation to the global coordinate system 1. This results in a global position vector 5 for the relevant finding (marker). A local location vector 4 can then be determined by forming the difference between the auxiliary vector 3 and the global location vector 5.
  • the data processing system is thus able to determine and store the local position of a finding, in other words a finding marker, in relation to the reference position of the test object PO in a comprehensible manner.
  • the AR device ARG, either triggered by the user PPR or automatically, creates a local image of the test object and the associated finding marker and stores its position and perspective according to the above vector calculation as part of the relevant finding data record. For the perspective, in addition to the vector calculation, the corresponding rotation matrices must also be computed for the conversion from the global to the local coordinate system.
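The rotation-matrix step mentioned above can be sketched as follows: with the reference frame's orientation in the global frame given as a 3x3 rotation matrix R (columns = local axes expressed in global coordinates), a global viewing direction is expressed in the local frame by multiplying with the transpose of R. The matrix below is a hypothetical 90-degree rotation about the z axis, not a value from the patent:

```python
def mat_t_vec(R, v):
    """Multiply the transpose of 3x3 matrix R with vector v:
    expresses a direction given in the global frame in the local frame."""
    return tuple(sum(R[i][j] * v[i] for i in range(3)) for j in range(3))

# Local frame rotated 90 degrees about z relative to the global frame:
R = [[0.0, -1.0, 0.0],
     [1.0,  0.0, 0.0],
     [0.0,  0.0, 1.0]]

# Camera viewing direction at the moment of capture, in the global frame:
view_global = (1.0, 0.0, 0.0)
view_local = mat_t_vec(R, view_global)
print(view_local)  # (0.0, -1.0, 0.0)
```

Stored this way, the perspective survives any later change of the AR device's global frame.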
  • every finding of the test object PO is thus recorded by a data record in a natural way. All data records are summarized and saved in a transport file.
  • the aforementioned transport file can then be merged with the CAD data of the test object in order to display it to a user in a suitable data processing system. All that is required for this is that the position of the origin of the reference system and its orientation in space be brought into agreement with the corresponding position and orientation of the CAD model of the test object, so that the local vectors 4 can be used to display the recorded findings in the correct position in the CAD model.
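The registration step described above amounts to one rigid transformation per local vector 4; a sketch in which the alignment matrix and the CAD-side position of the reference point are hypothetical inputs, assumed known from the alignment:

```python
def register_finding(local_vec, cad_origin, R_align):
    """Map a local vector into CAD coordinates: rotate the reference frame
    into the CAD frame (R_align) and translate by the CAD-side reference point."""
    rotated = tuple(sum(R_align[r][c] * local_vec[c] for c in range(3))
                    for r in range(3))
    return tuple(o + v for o, v in zip(cad_origin, rotated))

# Identity alignment: reference frame already matches the CAD model's axes,
# and P_lokal(0/0/0) sits at (100.0, 50.0, 0.0) in the CAD model.
I = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
print(register_finding((1.5, 3.0, 0.0), (100.0, 50.0, 0.0), I))  # (101.5, 53.0, 0.0)
```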
  • FIG. 5 shows a photo of a 3-D CAD representation of the test object PO enriched in this way with findings BD shown in different colors (yellow, green, blue) on the turbomachine housing described above.
  • the different colors are used to differentiate between different categories of findings, for example uncritical or critical findings or the like.
  • FIG. 6 shows the graphic representation BI of the test object PO according to FIG. 5 in the data processing system, which at the same time displays further information IN on the selected finding.
  • the additional information IN includes the camera image KB of the AR device associated with the finding and other test features SR shown in tabular form, such as the type of finding, size of the finding, relevance of the finding, etc.
  • FIG. 7 shows, in an alternative, schematic representation, the computer-implemented method PR for recording and storing findings data and linking them with CAD data MO.
  • the location of a finding of a test object PO is recorded with the aid of an AR device ARG and saved locally in a log file PD or in a cloud data storage device CL.
  • the log file can be imported from any location into a data processing system CAS, for example “Siemens NX”, which can then provide its user with an enriched 3-D CAD model DS of the test object and its findings.
  • a separate log file can be created for each embodiment of a component, in which all information about the state of the respective embodiment is stored and documented in a precise manner.
  • the invention thus relates to a computer-implemented method for detecting a test object.
  • the test object is recorded in detail with the help of an AR device.
  • Findings determined in advance are marked locally on it with the aid of a finding marker.
  • the AR device recognizes each of the finding markers and saves its local position in relation to the reference system, together with an image of the finding marker and possibly a further digital photo of the finding (ultrasound image, image of the crack, image of local damage, etc.), as a data record.
  • Each data record therefore represents a finding.
  • All data records are saved in a log file that can be read in, locally or via a cloud connection as shown, by a corresponding CAD system (here “Siemens NX” with the “NXOpen” plug-in, or a CAD viewer).
  • These data can be linked in electronic form with the CAD data of the test object in such a way that an enriched virtual model of the test object results, in which, in addition to the spatial-physical features of the test object, the data records of the findings are shown schematically at the corresponding spatial place of their occurrence, can be selected there, and can then be shown in detail.
  • the virtual model also contains the perspective orientation of the test object according to the perspective of the user who recorded the data set.
  • the system can be scaled as required and can accordingly be used for a large number of different test objects and types of test.
  • Any AR devices can be used as long as a surface network is made available or a similar form of three-dimensional recording of the real 3-D geometry of the test object is possible.
  • the concept can be expanded to include any visual inspection.
  • as-built documentation, e.g. of the location of the parts of an entire technical system, is also possible.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Manufacturing & Machinery (AREA)
  • Quality & Reliability (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention relates to a computer-implemented method for recording a test object by means of an augmented reality (AR) device, in which findings determined in advance on the test object have been marked locally with finding markers. The AR device recognizes each finding marker and records its local position in relation to a reference system, together with an image of the finding marker and possibly additional digital photos of the finding, as a finding data record. All of these data records are saved in a log file which, in turn, can be read in by a corresponding suitable CAD system. These data can be linked in electronic form with the electronic CAD data of the test object in such a way that an enriched virtual model of the test object results, in which, in addition to the three-dimensional physical features of the test object, the finding data records are also shown schematically at the corresponding three-dimensional positions where those findings are located, and said data records can be selected and then presented in detail.
PCT/EP2020/076405 2019-09-23 2020-09-22 Application for augmented-reality-assisted, guided defect examination of a test object with corresponding visualization in a CAD system WO2021058467A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102019214468 2019-09-23
DE102019214468.3 2019-09-23

Publications (1)

Publication Number Publication Date
WO2021058467A1 (fr) 2021-04-01

Family

ID=72826824

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2020/076405 WO2021058467A1 (fr) 2019-09-23 2020-09-22 Application for augmented-reality-assisted, guided defect examination of a test object with corresponding visualization in a CAD system

Country Status (1)

Country Link
WO (1) WO2021058467A1 (fr)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE19953739A1 (de) * 1999-11-09 2001-07-12 Siemens Ag Device and method for object-oriented marking and assignment of information to selected technological components
WO2001096829A1 (fr) * 2000-06-13 2001-12-20 Volkswagen Aktiengesellschaft Method for the quality control of a product and system for carrying out said method
US20190139320A1 (en) * 2017-11-09 2019-05-09 The Boeing Company Systems, methods, and tools for spatially-registering virtual content with physical environment in augmented reality platforms


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
EL AMMARI KHALED ET AL: "Remote interactive collaboration in facilities management using BIM-based mixed reality", AUTOMATION IN CONSTRUCTION, ELSEVIER, AMSTERDAM, NL, vol. 107, 5 September 2019 (2019-09-05), XP085856693, ISSN: 0926-5805, [retrieved on 20190905], DOI: 10.1016/J.AUTCON.2019.102940 *

Similar Documents

Publication Publication Date Title
DE102010015566B4 (de) Method and system for measuring specular surfaces
EP2019283B1 (fr) Method and device for measuring the actual measurement data of a component
DE102005061952A1 (de) Method and system for determining inaccuracy information in an augmented reality system
DE10146834A1 (de) Method and device for computer-aided manufacturing measurement analysis
DE102017116952A1 (de) System and method for improved scoring of 3D poses and removal of spurious points in 3D image data
WO2001096829A1 (fr) Method for the quality control of a product and system for carrying out said method
WO2016015928A1 (fr) Method for creating a measurement protocol and computer for carrying out such a method
DE102014104514B4 (de) Method for measurement data visualization and device for carrying out the method
DE102018214280A1 (de) Inspection system and method for correcting an image for inspection
WO2021058467A1 (fr) Application for augmented-reality-assisted, guided defect examination of a test object with corresponding visualization in a CAD system
DE112023000151T5 (de) Inspection support system, inspection support method and inspection support program
EP2636019A1 (fr) Method and evaluation device for determining the position of a structure located in an object to be examined by means of X-ray computed tomography
DE102019208448A1 (de) Method for checking a device or a machine
EP3911944B1 (fr) Method and device for borescope inspection
DE102018214307A1 (de) System and method for quality inspection in the production of individual parts
DE102013018364B4 (de) Method for detecting and/or measuring surface defects of a component
WO2019229136A1 (fr) Manufacturing method for a drive device, and testing device
WO2021165129A1 (fr) Method and device for generating combined scenarios
EP3174010A2 (fr) Method for creating a 3D representation and corresponding image recording device
DE19822392A1 (de) Method for determining the coordinates of a selected point on the surface of an object, and method for registering defect locations on test objects subjected to quality inspection
DE102019110185A1 (de) Method and system for registering a design data model in a space
EP1988506A2 (fr) Method for automatically determining the inspection region, inspection method and inspection system
DE102006026433A1 (de) Test system and test method for checking a three-dimensional body and for optically displaying dimensional deviations of this body
WO2018172267A1 (fr) Method and device for determining at least two radiography positions
WO2022053095A1 (fr) Method for testing components

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20789456

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20789456

Country of ref document: EP

Kind code of ref document: A1