GB2587794A - Inspection apparatus and method - Google Patents

Inspection apparatus and method

Info

Publication number
GB2587794A
GB2587794A (Application GB1912253.0A)
Authority
GB
United Kingdom
Prior art keywords
inspection
vehicle
asset
module
model
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB1912253.0A
Other versions
GB201912253D0 (en)
Inventor
Hafez Mohamad
Hadid Ahmed
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hybird Ltd
Original Assignee
Hybird Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hybird Ltd filed Critical Hybird Ltd
Priority to GB1912253.0A
Publication of GB201912253D0
Publication of GB2587794A
Legal status: Withdrawn

Classifications

    • G01C 11/02 — Picture taking arrangements specially adapted for photogrammetry or photographic surveying, e.g. controlling overlapping of pictures (G01C 11/00 Photogrammetry or videogrammetry)
    • G06T 7/0004 — Industrial image inspection (G06T 7/00 Image analysis)
    • G06T 7/579 — Depth or shape recovery from multiple images, from motion
    • G06T 2207/30108 — Industrial image inspection (indexing scheme: subject or context of image)
    • G06T 2207/30181 — Earth observation
    • G06T 2207/30184 — Infrastructure
    • G06T 2207/30244 — Camera pose


Abstract

An inspection system 1000 includes an inspection apparatus 1100 for inspection of a target asset 1200, for example a cracked water pipe, where the system generates a virtual three-dimensional model and determines its own position within the model. The inspection apparatus may comprise a tracked vehicle 1110 and a mobile module 1120 having sub-modules including a mapping module 1130, an inspection module 1140, an orientation module 1150 and a processing module 1160. The mapping module 1130 may use a plurality of sensors, such as cameras, to collect data about the target 1200. The collected data is used to develop a three-dimensional virtual model of the target 1200 during the inspection. The mapping module 1130 determines the position and orientation of the vehicle 1110 with respect to the environment, and the orientation module 1150 determines the orientation of the inspection module 1140 with respect to the vehicle 1110. By combining both orientations it is possible to determine the orientation of the mobile module with respect to the real world, which is required to know what the inspection sensor is looking at. The inspection data may be relayed in real time to a base station 1400 via a network 1300 using an on-board communication unit 1170.

Description

Inspection Apparatus and Method

The present invention relates to an inspection apparatus and an inspection method, and is concerned particularly, although not exclusively, with an inspection apparatus and method for inspecting industrial assets.
Industrial assets, such as buildings, machinery and chemical vessels, require periodic inspection to identify possible defects. Defects may be in the form of cracks, corrosion, missing objects and/or dropped objects, and can be found on any part of the asset; it is therefore important that all surfaces of the asset are checked during an inspection. However, due to the awkward geometry of some assets, or their location, and the time it takes to complete a full inspection, assets are rarely fully checked.
A common solution is to inspect sample areas and infer the overall condition of the asset, which can result in missed defects. Inspectors move around the asset looking for defects, but they cannot visualise and confirm whether an area has been inspected, or whether it has been inspected to a suitable level of detail or quality. As a result, some areas of the asset may be checked multiple times whilst others are not checked at all. An inspector might typically have a drawing, map or checklist of the asset and will manually tick or highlight areas they think have been inspected.
In light of this, current methods are seen to have many limitations, not least the requirement for manual verification to identify whether an area has been inspected. The inspector has no way of telling whether they, or the robotic system, have seen an area; therefore, critical points of interest can be missed entirely.
A further requirement is that the asset is inspected to a required level of detail. However, current systems used for inspection, whether operated manually by a human or remotely by a robot (an unmanned aerial vehicle or unmanned ground vehicle), are unable to verify whether the quality of the inspection data meets the agreed inspection requirements. For example, a predetermined maximum distance of the inspection vehicle from the asset during inspection, and hence a minimum level of detail, such as mm/pixel, of any image, may be a requirement.
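As an illustrative sketch only (not part of the claimed method), the mm/pixel requirement mentioned above could be checked from camera geometry under a simple pinhole camera model; all function names and values here are hypothetical:

```python
def detail_mm_per_pixel(distance_mm: float,
                        sensor_width_mm: float,
                        focal_length_mm: float,
                        image_width_px: int) -> float:
    """Approximate image detail (mm per pixel) for a pinhole camera.

    At a given distance, the image covers a scene width proportional to
    distance * sensor_width / focal_length; dividing by the pixel count
    gives the detail per pixel.
    """
    scene_width_mm = distance_mm * sensor_width_mm / focal_length_mm
    return scene_width_mm / image_width_px


def meets_detail_requirement(distance_mm: float,
                             sensor_width_mm: float,
                             focal_length_mm: float,
                             image_width_px: int,
                             required_mm_per_px: float) -> bool:
    """True if the captured image is at least as detailed as required."""
    return detail_mm_per_pixel(distance_mm, sensor_width_mm,
                               focal_length_mm,
                               image_width_px) <= required_mm_per_px
```

For example, a camera with a 36 mm sensor, 50 mm lens and 4000 px image width at 1 m from the surface captures roughly 0.18 mm/pixel, so a 0.2 mm/pixel requirement is met but a 0.1 mm/pixel requirement is not.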
Embodiments of the present invention aim to provide an inspection apparatus and method that at least partly address the above-mentioned problems.
The present invention is defined in the attached independent claims, to which reference should now be made. Further preferred features may be found in the sub-claims appended thereto.
According to one aspect of the present invention, there is provided an inspection apparatus for inspection of an asset, the apparatus comprising at least one mobile module arranged in use to generate a virtual three-dimensional model of the asset and to determine its own position within the model.
The mobile module may be arranged to generate a virtual three-dimensional model of the asset and determine its own position within the model, either in real time, such as but not limited to using simultaneous localisation and mapping (SLAM), or post-processed, such as but not limited to photogrammetry or structure from motion.
The apparatus may be mountable on a mobile vehicle. The vehicle may comprise any of a handheld, body or helmet mount, wheeled vehicle, tracked vehicle, aerial vehicle, floating vehicle, submersible vehicle and/or off-planet vehicle.
The apparatus may be self-contained, that is, the apparatus may not require any external hardware to function and may be remotely operable and/or programmable to perform automatically.
The mobile module preferably includes an inspection module having one or more sensors, scanners and/or cameras, such as but not limited to visual, audio, non-destructive testing and/or thermal sensors.
In a preferred arrangement, the mobile module includes an orientation module for monitoring the orientation of the inspection module with respect to the vehicle.
The orientation module may be hardware-based, such as but not limited to a servo or an IMU, or software-based, such as but not limited to computer vision techniques.
The apparatus may further comprise a base station which may allow a user to visualise the three-dimensional model of the inspection area in real-time and provide a visual indication of the progress of the inspection. Preferably, the base station includes a display.
The base station may also be used to define an inspection criterion, such as a minimum/optimum detail of the data produced by the sensors and/or a maximum/optimum distance each sensor is required to be from the surface for gathering inspection data.
The apparatus may house one or more processing units allowing algorithms to be computed on-board, on the base station or in post-processing, and at least one means of transmitting and receiving data to and from one or more external data modules and/or other inspection apparatus.
The apparatus may be used in conjunction with any robotic system, unmanned vehicle or handheld device.
According to another aspect of the present invention, there is provided an inspection system, for inspecting an asset, the system comprising a mobile inspection vehicle, a mobile module for mounting on the vehicle and a base station, wherein the mobile module is arranged in use to generate a three-dimensional virtual model of the asset, and wherein the system is arranged in use to determine the position of the mobile module within the virtual model.
In a preferred arrangement, the system is arranged in use to determine the position of the mobile module within the model substantially in real time. Alternatively, or in addition, the system is arranged to determine the motion of the mobile module in post-inspection processing.
Preferably the system is arranged in use to determine the orientation of the mobile unit with respect to the vehicle, more preferably substantially in real time.
The mobile unit preferably includes one or more sensors, scanners and/or cameras.
In a preferred arrangement, the vehicle comprises one or more handheld, helmet or body mounts, wheeled vehicle, tracked vehicle, aerial vehicle, floating vehicle, submersible vehicle and/or off-planet vehicle.
The system preferably includes a processing unit arranged in use to process data gathered by the mobile unit.
The system may include a display, preferably located in the base station, for displaying a representation of the virtual model and/or the location and/or orientation of the mobile module within the model.
The system may include an interface for allowing a user to determine operational parameters for the mobile module, including but not limited to: a predetermined minimum/optimum distance for the mobile module from a given portion of the asset and/or a minimum/optimum field of view of a scanner/camera of the mobile module with respect to the asset.
The percentage of the field of view is used to ensure that any surface area of interest of the asset falls within parts of the camera's field of view. In some cameras, the image near the edge is not clear, and therefore it is preferred for the point of interest or defect to fall within the centre of the camera's field of view, as that is where the point of interest or defect will be the clearest.
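The check that a point of interest falls within the central part of the camera's field of view, described above, could be sketched as follows; this is an assumed illustration of the idea, not the patented implementation, and all names are hypothetical:

```python
def in_central_fov(x: float, y: float,
                   width: int, height: int,
                   fov_fraction: float) -> bool:
    """True if pixel (x, y) lies inside the centred box covering
    `fov_fraction` of the image in each dimension.

    fov_fraction = 1.0 is the full image; 0.5 is the inner 50% box,
    where image quality is typically best.
    """
    half_w = width * fov_fraction / 2.0
    half_h = height * fov_fraction / 2.0
    cx, cy = width / 2.0, height / 2.0
    return abs(x - cx) <= half_w and abs(y - cy) <= half_h
```

A defect detected at the image centre passes even a strict 50% box, while one near the image edge, where the picture is least clear, fails it.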
According to a further aspect of the present invention, there is provided a method of performing an inspection of an asset, the method comprising manually and/or remotely operating an inspection apparatus according to any statement herein.
The method may alternatively/additionally comprise programming the apparatus to perform an automatic inspection of an asset.
According to another aspect of the present invention, there is provided a method of inspecting an asset, the method comprising generating a three-dimensional virtual model of the asset using a mobile module mounted on a vehicle, determining the position and/or orientation of the module within the model, inspecting the asset to a predetermined standard using one or more sensors of the module and updating the model with data to reflect the extent to which the asset has been inspected to the standard.
The method preferably includes determining the orientation of the mobile module with respect to the vehicle. Preferably the method includes determining the position and/or orientation of the mobile module and/or updating the model substantially in real-time.
The invention also includes a program for causing the apparatus described herein to perform a method of inspecting an asset, the method comprising capturing inspection data from one or more sensors, scanners and/or cameras and constructing a virtual three-dimensional model of the asset.
The method may further comprise relaying the model to a user to allow real-time checking/monitoring of the inspection as well as data relating to possible areas of interest, such as defects.
The method may also include maintaining one or more inspection requirements set by the user before and/or during the inspection process and modifying the inspection process to conform with new commands from a user.
In a further aspect, the invention provides a computer programme product on a computer readable medium, comprising instructions that, when executed by a computer, cause the computer to perform one or more methods of inspecting an asset according to any statement herein.
The invention may include any combination of the features or limitations referred to herein, except such a combination of features as are mutually exclusive or mutually inconsistent.
A preferred embodiment of the present invention will now be described, by way of example only, with reference to the accompanying diagrammatic drawings, in which:
Figure 1 is a schematic representation of an inspection system according to one embodiment of the present invention;
Figure 2a is a schematic front view depiction of a field of view of a camera of the system of Figure 1;
Figure 2b is a schematic plan view showing the camera and representing the field of view;
Figure 3a is a schematic plan view of a camera and a surface of an asset for inspection;
Figure 3b is a variant of the view shown in Figure 3a with contrasting detail; and
Figure 3c is a further variant.
Turning to Figure 1, this shows, generally at 1000, an inspection system in accordance with an embodiment of the invention. The system 1000 comprises an inspection apparatus 1100 programmed to perform an inspection of a target 1200, in this case a cracked water pipe, a virtual network 1300 and a base station 1400 with an interface 1410 for a user (not shown) to monitor the inspection.
The inspection apparatus 1100 comprises a tracked vehicle 1110 and a mobile module 1120 which comprises a plurality of sub-modules: a mapping module 1130, an inspection module 1140, an orientation module 1150 and a processing module 1160 which allows algorithms to be computed on-board the apparatus. The mapping module 1130 uses a plurality of sensors, such as cameras (not shown), to collect data about the target 1200. This data is used to develop a three-dimensional virtual model of the target 1200 during the inspection. The mapping module 1130 determines the position and orientation of the vehicle 1110 with respect to the environment. The orientation module 1150 determines the orientation of the inspection module 1140 with respect to the vehicle 1110. By combining both orientations it is possible to determine the orientation of the mobile module with respect to the real world. This is required to know what the inspection sensor is looking at.
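The orientation-combination step above can be sketched as a composition of rotations: the vehicle's orientation in the world (from the mapping module) composed with the inspection module's orientation relative to the vehicle (from the orientation module). The 3x3 rotation-matrix representation and the example angles are assumptions for illustration only:

```python
import math

def rot_z(angle_rad: float) -> list:
    """3x3 rotation matrix about the vertical axis."""
    c, s = math.cos(angle_rad), math.sin(angle_rad)
    return [[c, -s, 0.0],
            [s,  c, 0.0],
            [0.0, 0.0, 1.0]]

def compose(a: list, b: list) -> list:
    """Matrix product a @ b for 3x3 matrices."""
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

# Hypothetical readings: vehicle heading 30 degrees in the world,
# inspection module panned 15 degrees relative to the vehicle.
world_from_vehicle = rot_z(math.radians(30.0))
vehicle_from_module = rot_z(math.radians(15.0))

# Composed orientation: where the inspection sensor points in the world.
world_from_module = compose(world_from_vehicle, vehicle_from_module)
```

Because both example rotations are about the same axis, the composition is equivalent to a single 45-degree rotation, which is what makes it possible to say what the sensor is looking at in world coordinates.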
This inspection data is relayed in real time to the base station 1400 via the network 1300 using an on-board communication unit 1170.
The inspection data includes a real-time update of the progress of the inspection, using the three-dimensional model to visually indicate which areas of the target 1200 have been inspected by the apparatus 1100, to the requirements determined by the user, and which areas have not.
The requirements are input by the user and stored by the base station 1400. The base station 1400 communicates with the inspection apparatus 1100 via the network 1300 and the communication unit 1170 to monitor, and if required by the user, modify the quality of the inspection to meet the requirements.
These requirements may include the percentage of the field of view occupied by the target and the sensor-to-target distance, as will now be described with reference to Figures 2a to 3c.
Figures 2a and 2b are front and plan schematic views respectively of a camera 2000, mounted in use on a mobile module of an inspection apparatus (not shown). The concentric boxes of Figure 2a and triangles of Figure 2b respectively outline the percentage of field of view (%FOV) of the camera 2000 with respect to a target, with F1 being the maximum %FOV, i.e. 100%, F2 being 80% and F3 being 50%. This value is important as it contributes to the degree of detail of the inspection data image.
Another determining factor is the sensor-to-target distance D, as depicted in Figures 3a to 3c.
Figure 3a shows generally a camera 3000 looking at a surface of a target S, with a distance Da from the front of the camera 3000 to the surface S and a %FOV depicted by the broken line.
In the example shown in Figure 3b, the inspection of surface S does not meet the inspection requirements, and therefore this part of the target is not yet marked on the three-dimensional model as being adequately inspected.
Figure 3c shows the surface of the target S falling within the %FOV and distance Dc. The area H of the target is captured to a required resolution and is recorded as such in an image that is relayed to and viewed at the base station 1400. This part of the target is now mapped adequately, and the inspection may continue.
The apparatus may utilise multiple sensors including light detection and ranging scanners (LiDAR scanners) as well as thermal and chemical sensors and various types of cameras.
Although the example discussed above relates to the inspection of an asset, it will be understood by those skilled in the art that the systems and methods described herein may also be used in various other operations, such as search and rescue, military, defence and space exploration. The technology is applicable to any infrastructure inspection, and can also be used in the search and rescue and defence sectors to confirm that personnel have searched all areas and that no injured persons or targets have been missed.
Likewise, in alternative embodiments of the present invention (not shown), inspection systems may comprise a plurality of inspection apparatus 1100 that may communicate between themselves and one or more base stations, to inspect an area or target too large to be inspected by one apparatus alone. Furthermore, the vehicle can be any of various types, including but not limited to a handheld mount, wheeled vehicle, tracked vehicle, aerial vehicle, floating vehicle, submersible vehicle and/or off-planet vehicle, or combinations thereof.
The present invention provides an inspection system, an inspection apparatus and an inspection method with wide-ranging uses. The quality of the inspection data allows the client to verify that its resolution is sufficient to identify small details. The process enables inspectors to visualise on a 3D model, in real time, the areas that have been inspected to the specified client requirements.
Depending on the client requirements, the user will select the percentage of field of view (%FOV) of the camera as well as the distance the robotic vehicle needs to be from the surface of the asset (Dx). If the surface of the asset is within the selected %FOV and within the selected distance Dx, the surface is highlighted on a 3D point cloud. This indicates to the inspector which regions of the asset have been inspected, and inspected to the necessary client requirements.
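The highlighting rule just described (within the selected %FOV and within Dx) could be sketched as below. This is an assumed illustration of the coverage test, not the patented implementation; the observation data structure and all names are hypothetical:

```python
def mark_inspected(observations: dict,
                   fov_fraction: float,
                   max_distance: float,
                   width: int,
                   height: int) -> set:
    """Return the indices of point-cloud points meeting both requirements.

    `observations` maps a point index to (pixel_x, pixel_y, distance):
    where the point appeared in the camera image and how far the sensor
    was from the surface when it was seen.
    """
    half_w = width * fov_fraction / 2.0
    half_h = height * fov_fraction / 2.0
    cx, cy = width / 2.0, height / 2.0
    inspected = set()
    for idx, (px, py, dist) in observations.items():
        # Both conditions must hold: inside the central %FOV box
        # AND within the selected sensor-to-surface distance Dx.
        in_fov = abs(px - cx) <= half_w and abs(py - cy) <= half_h
        if in_fov and dist <= max_distance:
            inspected.add(idx)
    return inspected
```

Points returned by this test would be the ones highlighted on the 3D point cloud; a point seen only near the image edge, or from too far away, stays unhighlighted and the region remains marked as not yet adequately inspected.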
Simultaneous Localisation and Mapping (SLAM) is used to generate a 3D model of the asset, as well as locate the position and orientation of the robotic vehicle in the 3D model in real time. SLAM can be conducted with a variety of sensors; for our robotic systems we utilise a LiDAR scanner as well as cameras.
The processing of data need not take place on the mobile module. It may be processed entirely or in part in the base station, either in real time or in post-processing.
The mapping module may produce a virtual three-dimensional base line model of an inspection area, which is relayed to the base station. A user may then highlight areas of the base line model that they would like to be inspected, along with inspection requirements. The vehicle may then compute an inspection path to ensure that the desired regions are fully inspected, whilst continuously verifying that said regions are inspected to the user's requirements.
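The continuous verification step above, checking the user's highlighted regions against what has actually been inspected, could be sketched as a simple per-region progress report. This is a hypothetical illustration; the data structures are assumptions, not the patented method:

```python
def region_progress(region_points: dict, inspected_points: set) -> dict:
    """Fraction of each user-requested region already inspected.

    `region_points` maps a region name to the set of point-cloud indices
    the user highlighted on the baseline model; `inspected_points` is the
    set of indices verified as inspected to the requirements so far.
    An empty region is treated as complete.
    """
    return {name: (len(pts & inspected_points) / len(pts) if pts else 1.0)
            for name, pts in region_points.items()}
```

A vehicle planning an inspection path could poll such a report and keep visiting any region whose fraction is below 1.0.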
Whilst endeavouring in the foregoing specification to draw attention to those features of the invention believed to be of particular importance, it should be understood that the applicant claims protection in respect of any patentable feature or combination of features referred to herein, and/or shown in the drawings, whether or not particular emphasis has been placed thereon.

Claims (23)

  1. An inspection apparatus for inspection of an asset, the apparatus comprising at least one mobile module arranged in use to generate a virtual three-dimensional model of the asset and to determine its own position within the model.
  2. An apparatus according to Claim 1, wherein the mobile module is arranged to generate a virtual three-dimensional model of the asset and determine its own position within the model, either in real time, such as but not limited to using simultaneous localisation and mapping, or post-processed, such as but not limited to photogrammetry or structure from motion.
  3. Apparatus according to Claim 1 or 2, wherein the apparatus is mountable on a mobile vehicle such as any of a handheld, body or helmet mount, wheeled vehicle, tracked vehicle, aerial vehicle, floating vehicle, submersible vehicle and/or off-planet vehicle.
  4. Apparatus according to any of the preceding claims, wherein the mobile module includes an inspection module having one or more sensors, scanners and/or cameras, such as but not limited to visual, audio, non-destructive testing and/or thermal sensors.
  5. Apparatus according to any of the preceding claims, wherein the mobile module includes an orientation module for monitoring the orientation of the inspection module with respect to the vehicle.
  6. Apparatus according to any of the preceding claims, wherein the apparatus further comprises a base station which may allow a user to visualise the three-dimensional model of the inspection area in real-time and provide a visual indication of the progress of the inspection.
  7. Apparatus according to Claim 6, wherein the base station may also be used to define an inspection criterion, such as a minimum/optimum detail of the data produced by the sensors and/or a maximum/optimum distance each sensor is required to be from the surface for gathering inspection data.
  8. Apparatus according to any of the preceding claims, wherein the apparatus includes one or more processing units allowing algorithms to be computed on-board, on the base station or in post-processing, and at least one means of transmitting and receiving data to and from one or more external data modules and/or other inspection apparatus.
  9. An inspection system, for inspecting an asset, the system comprising a mobile inspection vehicle, a mobile module for mounting on the vehicle and a base station, wherein the mobile module is arranged in use to generate a three-dimensional virtual model of the asset, and wherein the system is arranged in use to determine the position of the mobile module within the virtual model.
  10. A system according to Claim 9, wherein the system is arranged in use to determine the position of the mobile module within the model substantially in real time.
  11. A system according to Claim 9 or 10, wherein the system is arranged to determine the motion of the mobile module in post-inspection processing.
  12. A system according to any of Claims 9-11, wherein the system is arranged in use to determine the orientation of the mobile unit with respect to the vehicle.
  13. A system according to any of Claims 9-12, wherein the mobile unit includes one or more sensors, scanners and/or cameras.
  14. A system according to any of Claims 9-13, wherein the vehicle comprises one or more handheld, helmet or body mounts, wheeled vehicle, tracked vehicle, aerial vehicle, floating vehicle, submersible vehicle and/or off-planet vehicle.
  15. A system according to any of Claims 9-14, wherein the system includes a processing unit arranged in use to process data gathered by the mobile unit.
  16. A system according to any of Claims 9-15, wherein the system includes a display, preferably located in the base station, for displaying a representation of the virtual model and/or the location and/or orientation of the mobile module within the model.
  17. A system according to any of Claims 9-16, wherein the system includes an interface for allowing a user to determine operational parameters for the mobile module, including but not limited to: predetermined minimum/optimum distance for the mobile module from a given portion of the asset and/or minimum/optimum field of view of a scanner/camera of the mapping module with respect to the asset.
  18. A method of performing an inspection of an asset, the method comprising manually and/or remotely operating an inspection apparatus according to any of Claims 1-8 or operating a system according to any of Claims 9-17.
  19. A method of inspecting an asset, the method comprising generating a three-dimensional virtual model of the asset using a mobile module mounted on a vehicle, determining the position and/or orientation of the module within the model, inspecting the asset to a predetermined standard using one or more sensors of the module and updating the model with data to reflect the extent to which the asset has been inspected to the standard.
  20. A method according to Claim 19, wherein the method includes determining the orientation of the mobile module with respect to the vehicle.
  21. A method according to Claim 19 or 20, wherein the method includes determining the position and/or orientation of the mobile module and/or updating the model substantially in real-time.
  22. A program for causing inspection apparatus to perform a method of inspecting an asset, the method comprising capturing inspection data from one or more sensors, scanners and/or cameras and constructing a virtual three-dimensional model of the asset.
  23. A computer programme product on a computer readable medium, comprising instructions that, when executed by a computer, cause the computer to perform one or more methods of inspecting an asset according to any of Claims 18-21.
GB1912253.0A 2019-08-27 2019-08-27 Inspection apparatus and method Withdrawn GB2587794A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB1912253.0A GB2587794A (en) 2019-08-27 2019-08-27 Inspection apparatus and method

Publications (2)

Publication Number Publication Date
GB201912253D0 GB201912253D0 (en) 2019-10-09
GB2587794A true GB2587794A (en) 2021-04-14

Family

ID=68108848


Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150379701A1 (en) * 2013-02-04 2015-12-31 Dnv Gl Se Inspection camera unit, method for inspecting interiors, and sensor unit
WO2018140701A1 (en) * 2017-01-27 2018-08-02 Kaarta, Inc. Laser scanner with real-time, online ego-motion estimation
US20190235083A1 (en) * 2016-03-11 2019-08-01 Kaarta, Inc. Laser scanner with real-time, online ego-motion estimation

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP4163871A1 (en) * 2021-10-08 2023-04-12 Pratt & Whitney Canada Corp. Inspecting an interior of a gas turbine engine apparatus
US11933690B2 (en) 2021-10-08 2024-03-19 Pratt & Whitney Canada Corp. Inspecting an interior of a gas turbine engine apparatus



Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)