WO2019230604A1 - Inspection system - Google Patents

Inspection system

Info

Publication number
WO2019230604A1
Authority
WO
WIPO (PCT)
Prior art keywords
unit
dimensional model
image
abnormal
images
Prior art date
Application number
PCT/JP2019/020732
Other languages
English (en)
Japanese (ja)
Inventor
西本 晋也
Original Assignee
株式会社センシンロボティクス
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社センシンロボティクス
Publication of WO2019230604A1

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01M: TESTING STATIC OR DYNAMIC BALANCE OF MACHINES OR STRUCTURES; TESTING OF STRUCTURES OR APPARATUS, NOT OTHERWISE PROVIDED FOR
    • G01M99/00: Subject matter not provided for in other groups of this subclass
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01N: INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00: Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84: Systems specially adapted for particular applications
    • G01N21/88: Investigating the presence of flaws or contamination

Definitions

  • the present invention relates to an inspection system.
  • Patent Document 1 describes that a three-dimensional image is created and displayed from an image captured by a camera mounted on a drone.
  • In Patent Document 1, a point cloud is extracted from a plurality of 2D images to create a 3D model.
  • However, Patent Document 1 does not specify at which coordinate in the three-dimensional image a degradation position is located.
  • The present invention has been made in view of such a background. One object of the present invention is to provide a technique capable of specifying, for a three-dimensional model created from a plurality of two-dimensional images, at which coordinate in the coordinate system of the three-dimensional model an object captured in a two-dimensional image is located.
  • A main invention of the present invention for solving the above problem is an inspection system for inspecting an inspection object, comprising: a three-dimensional model generation unit that generates a three-dimensional model of the inspection object based on a plurality of images of the inspection object captured by a flying device equipped with a camera; a shooting information acquisition unit that acquires, for each of the plurality of images, the shooting position and the viewpoint-axis direction of the camera in a three-dimensional coordinate system; an abnormality detection unit that detects, for each of the plurality of images, an abnormality of the inspection object based on the image; an abnormal position specifying unit that specifies, for the detected abnormality, an abnormal position in the three-dimensional coordinate system according to the shooting position and the viewpoint-axis direction; and a three-dimensional model display unit that displays the three-dimensional model with the abnormal position mapped onto it.
  • According to the present invention, for a three-dimensional model created from a plurality of two-dimensional images, it is possible to specify at which coordinate in the coordinate system of the three-dimensional model an object photographed in a two-dimensional image is located.
  • FIG. 1 is a diagram showing an overall configuration example of an inspection system according to an embodiment of the present invention.
  • FIG. 2 is a diagram illustrating a hardware configuration example of the flying device 10.
  • FIG. 3 is a diagram illustrating a software configuration example of the flight controller 11.
  • FIG. 4 is a diagram illustrating a configuration example of the position and orientation information stored in the position and orientation information storage unit 151.
  • FIG. 5 is a diagram illustrating a configuration example of the shooting information stored in the shooting information storage unit 152.
  • FIG. 6 is a diagram illustrating a hardware configuration example of the inspection server 30.
  • FIG. 7 is a diagram illustrating a software configuration example of the inspection server 30.
  • FIG. 8 is a diagram illustrating a configuration example of the three-dimensional model storage unit 352.
  • FIG. 9 is a diagram illustrating a configuration example of the abnormality information registered in the abnormality information storage unit 353.
  • FIG. 10 is a diagram explaining the flow of the process of photographing the inspection object 1.
  • a flying device according to an embodiment of the present invention has the following configuration.
  • (Item 1) An inspection system for inspecting an inspection object, comprising: a three-dimensional model generation unit that generates a three-dimensional model of the inspection object based on a plurality of images captured by a flying device equipped with a camera; a shooting information acquisition unit that acquires, for each of the plurality of images, the shooting position in a three-dimensional coordinate system at which the image was captured and the viewpoint-axis direction of the camera; an abnormality detection unit that detects, for each of the plurality of images, an abnormality of the inspection object based on the image; an abnormal position specifying unit that specifies, for the detected abnormality, an abnormal position that is a position in the three-dimensional coordinate system according to the shooting position and the viewpoint-axis direction; and a three-dimensional model display unit that displays the three-dimensional model with the abnormal position mapped onto it.
  • (Item 2) The inspection system according to Item 1, wherein the abnormal position specifying unit specifies, as the abnormal position, the coordinates at which a straight line from the shooting position toward the abnormal point intersects the three-dimensional model.
  • FIG. 1 is a diagram showing an overall configuration example of an inspection system according to an embodiment of the present invention.
  • the inspection system of the present embodiment creates a three-dimensional model of the inspection object 1 based on a plurality of images related to the inspection object 1 photographed by the flying device 10. Further, in the inspection system of the present embodiment, a photographed image is analyzed to detect an abnormal part, and the detected abnormal part is mapped on a three-dimensional model.
  • The inspection object 1 may be a concrete structure such as a slope or a dam, a steel structure such as a steel tower or an iron bridge, or a solar panel or a building. In the present embodiment, a building is assumed as the inspection object 1.
  • the inspection object 1 may be anything as long as it can be photographed, and may be, for example, an animal or a car, or may be a disaster area at the time of a disaster.
  • the inspection system includes a flying device 10 that images the inspection object 1 and an inspection server 30 that analyzes an image captured by the flying device 10.
  • the flying device 10 and the inspection server 30 are connected to each other via a communication network 50 so that they can communicate with each other.
  • the communication network 50 is assumed to be the Internet.
  • the communication network 50 is constructed by, for example, a wireless communication path, a cellular phone line network, a satellite communication path, a public telephone line network, a dedicated line network, Ethernet (registered trademark), or the like.
  • FIG. 2 is a diagram illustrating a hardware configuration example of the flying device 10.
  • The flying device 10 includes a propeller 18, a propulsion mechanism that drives the propeller 18 via an ESC (Electronic Speed Controller) 16 (in this embodiment, a motor 17 is assumed), and a flight controller 11 that controls them.
  • the flying device 10 includes a camera 12 that captures a part or all of the inspection object 1.
  • the camera 12 is fixed to the body.
  • the camera 12 includes a vertically downward lens and captures an image directly below in the vertical direction. Therefore, when the attitude of the flying device 10 is fixed, the viewpoint axis (optical axis) 121 of the camera 12 is also fixed.
  • In the present embodiment, the camera 12 captures RGB images of visible light, but it may instead capture thermal images of infrared light, or it may capture both RGB images and thermal images, either simultaneously or alternately.
  • Various sensors such as a human sensor may be provided instead of the camera 12 or in addition to the camera 12.
  • the flight controller 11 can have one or more processors 101 such as a programmable processor (in this embodiment, a central processing unit (CPU) is assumed).
  • the flight controller 11 includes a memory 102 and can access the memory 102.
  • Memory 102 stores logic, code, and / or program instructions that are executable by flight controller 11 to perform one or more steps.
  • the memory 102 may include a separable medium such as an SD card or a random access memory (RAM) or an external storage device. Data acquired from the camera 12, sensor, or the like may be directly transmitted to and stored in the memory 102.
  • the flight controller 11 also includes various sensors 103.
  • The sensors 103 may include, for example, an inertial sensor (acceleration sensor, gyro sensor), a GPS (Global Positioning System) sensor, a proximity sensor (for example, lidar), or a vision/image sensor (for example, a camera).
  • The flight controller 11 can communicate, via a transmission/reception unit 14, with one or more external devices (e.g., a transceiver, a terminal, a display device, or another remote controller) to send and/or receive data.
  • the transmission / reception unit 14 can use any appropriate communication means such as wired communication or wireless communication. In the present embodiment, the transmission / reception unit 14 mainly communicates with the inspection server 30.
  • The transmission/reception unit 14 can use one or more of, for example, a local area network (LAN), a wide area network (WAN), infrared, wireless, WiFi, a point-to-point (P2P) network, a telecommunication network, cloud communication, and the like.
  • The transmission/reception unit 14 can transmit and/or receive one or more of data acquired by the sensors, processing results generated by the flight controller, predetermined control data, user commands from a terminal or a remote controller, and the like.
  • FIG. 3 is a diagram showing a software configuration example of the flight controller 11.
  • the flight controller 11 includes an instruction receiving unit 111, a flight control unit 112, a position / orientation information acquisition unit 113, an imaging processing unit 114, an imaging information transmission unit 115, a position / orientation information storage unit 151, an imaging information storage unit 152, a GPS sensor 104, An atmospheric pressure sensor 105, a temperature sensor 106, and an acceleration sensor 107 are provided.
  • the instruction receiving unit 111, the flight control unit 112, the position / orientation information acquisition unit 113, the imaging processing unit 114, and the imaging information transmission unit 115 are realized by the processor 101 executing a program stored in the memory 102.
  • the position / orientation information storage unit 151 and the shooting information storage unit 152 are realized as storage areas provided by the memory 102.
  • The instruction receiving unit 111 receives various commands for instructing the operation of the flying device 10 (hereinafter referred to as flight operation commands).
  • The flight control unit 112 controls the operation of the flying device 10. For example, the flight control unit 112 adjusts the spatial arrangement, velocity, and/or acceleration of the flying device 10, which has six degrees of freedom (translational motions x, y, and z, and rotational motions θx, θy, and θz).
  • the motor 17 is controlled via the ESC 16.
  • the propeller 18 is rotated by the motor 17 to generate lift of the flying device 10.
  • the flight control unit 112 can control one or more of the states of the mounting unit and sensors.
  • the flight control unit 112 controls the operation of the flying device 10 in accordance with the flight operation command received by the instruction receiving unit 111. Further, the flight control unit 112 can perform various controls so that the flying device 10 continues to fly without depending on a command so as to enable autonomous flight.
  • the position and orientation information acquisition unit 113 acquires information indicating the current position and orientation of the flying device 10 (hereinafter referred to as position and orientation information).
  • In the present embodiment, the position and orientation information is assumed to include the position of the flying device 10 on the map (represented by latitude and longitude), the flight altitude of the flying device 10, and the inclination of the flying device 10 with respect to its x, y, and z axes. Note that the x, y, and z axes of the flying device 10 are mutually orthogonal coordinate axes that can represent the plane coordinates and the vertical direction of the aircraft.
  • the sensors 103 include a GPS sensor 104, and the position and orientation information acquisition unit 113 can calculate the position of the flying device 10 on the map from the radio wave received from the GPS satellite by the GPS sensor 104.
  • In the present embodiment, the sensors 103 include an atmospheric pressure sensor 105 and a temperature sensor 106, and the position/orientation information acquisition unit 113 can calculate the flight altitude of the flying device 10 based on the difference between the atmospheric pressure measured by the atmospheric pressure sensor 105 before the flight (hereinafter referred to as the reference atmospheric pressure) and the atmospheric pressure measured during the flight (hereinafter referred to as the current atmospheric pressure), together with the temperature measured by the temperature sensor 106 during the flight. A sketch of one such computation is given below.
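  • The publication does not give the formula, but a standard way to turn the pressure difference and in-flight temperature into a height difference is the hypsometric equation. The Python sketch below is illustrative only; the function name and constants are assumptions, not taken from the publication.

```python
import math

def flight_altitude(reference_pressure_hpa: float,
                    current_pressure_hpa: float,
                    temperature_c: float) -> float:
    """Estimate height above the takeoff point from barometric data,
    using the hypsometric equation dh = (R * T / g) * ln(P0 / P)."""
    R_DRY_AIR = 287.05   # specific gas constant of dry air, J/(kg*K)
    G = 9.80665          # standard gravity, m/s^2
    t_kelvin = temperature_c + 273.15
    return (R_DRY_AIR * t_kelvin / G) * math.log(
        reference_pressure_hpa / current_pressure_hpa)

# e.g. 1013.25 hPa before takeoff, 1007.0 hPa in flight at 15 degC -> ~52 m
print(round(flight_altitude(1013.25, 1007.0, 15.0), 1))
```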
  • the sensors 103 include a triaxial acceleration sensor 107, and the position / orientation information acquisition unit 113 can obtain the attitude of the flying device 10 based on the output from the acceleration sensor 107. Further, the optical axis 121 (viewpoint axis) of the camera 12 can be determined from the attitude of the flying device 10.
  • The position and orientation information acquisition unit 113 stores, in the position and orientation information storage unit 151 of the memory 102, position and orientation information in which are set the position (latitude and longitude) of the flying device 10 on the map acquired using the GPS sensor 104, the flight altitude of the flying device 10 acquired using the atmospheric pressure sensor 105 and the temperature sensor 106, and the attitude of the flying device 10 acquired using the acceleration sensor 107.
  • FIG. 4 is a diagram illustrating a configuration example of the position / orientation information stored in the position / orientation information storage unit 151.
  • As shown in the figure, the position/orientation information storage unit 151 stores the latitude and longitude indicating the current position of the flying device 10, the flight altitude calculated as described above, the reference atmospheric pressure measured before the flight, the current atmospheric pressure measured during the flight, the temperature measured during the flight, and the attitude of the flying device 10 acquired using the acceleration sensor 107 (the inclination of the optical axis 121 of the camera 12).
  • the photographing processing unit 114 controls the camera 12 to photograph a part or all of the inspection object 1 and acquires a photographed image photographed by the camera 12.
  • The imaging processing unit 114 performs imaging at a preset timing. For example, the imaging processing unit 114 can perform imaging at predetermined intervals (for example, any interval such as 5 seconds or 30 seconds can be designated). Note that the imaging processing unit 114 may also perform imaging in response to an instruction from the inspection server 30.
  • the imaging processing unit 114 stores the acquired captured image in the imaging information storage unit 152.
  • The photographing processing unit 114 also stores in the shooting information storage unit 152, attached to the photographed image, the shooting date and time, the latitude and longitude of the flying device 10 on the map at that time (shooting position), the flight altitude of the flying device 10 at that time (shooting altitude), and the attitude of the flying device 10 (the inclination of the optical axis 121 of the camera 12).
  • An image with such information attached is referred to as shooting information.
  • FIG. 5 is a diagram illustrating a configuration example of the shooting information stored in the shooting information storage unit 152.
  • the shooting information stored in the shooting information storage unit 152 includes shooting date and time, shooting position, shooting altitude, tilt, and image data.
  • the shooting information can be stored as a file on a file system, for example.
  • The shooting date/time, shooting position, shooting altitude, and tilt may be stored, for example, as Exif (Exchangeable image file format) information of a JPEG (Joint Photographic Experts Group) image file; alternatively, the image data may be stored as a file on the file system and a record associating the shooting date/time, shooting position, shooting altitude, tilt, and file name may be registered in a database. A sketch of such a record is given after the next item.
  • the shooting information may include characteristics of the camera 12 such as a focal length and a shutter speed.
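  • As an illustration only, a shooting-information record of this kind might be represented as follows; all field names are assumptions, not taken from the publication.

```python
from dataclasses import dataclass, asdict
from typing import Tuple

@dataclass
class ShootingInfo:
    """One shooting-information record, roughly as in FIG. 5 (field names assumed)."""
    shot_at: str                          # shooting date and time, e.g. ISO 8601
    latitude: float                       # shooting position
    longitude: float
    altitude_m: float                     # shooting altitude
    tilt_deg: Tuple[float, float, float]  # inclination of the optical axis
    image_file: str                       # file name of the JPEG on the file system

record = ShootingInfo("2019-05-24T10:15:00+09:00",
                      35.6812, 139.7671, 48.3, (0.0, -90.0, 12.5),
                      "IMG_0001.jpg")
# a dict such as asdict(record) could be registered as one database row
print(asdict(record))
```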
  • the imaging information transmission unit 115 transmits the image captured by the camera 12 to the inspection server 30.
  • the shooting information transmission unit 115 transmits shooting information in which a shooting date and time, a shooting position, a shooting altitude, and a tilt are added to a shooting image to the inspection server 30.
  • FIG. 6 is a diagram illustrating a hardware configuration example of the inspection server 30.
  • the inspection server 30 includes a CPU 301, a memory 302, a storage device 303, a communication device 304, an input device 305, and an output device 306.
  • the storage device 303 is, for example, a hard disk drive, a solid state drive, or a flash memory that stores various data and programs.
  • the communication device 304 communicates with other devices via the communication network 50.
  • the communication device 304 includes, for example, an adapter for connecting to Ethernet (registered trademark), a modem for connecting to a public telephone network, a wireless communication device for performing wireless communication, a USB connector for serial communication, an RS232C connector, and the like It is comprised including.
  • the input device 305 is, for example, a keyboard, mouse, touch panel, button, or microphone that inputs data.
  • the output device 306 is, for example, a display, a printer, or a speaker that outputs data.
  • FIG. 7 is a diagram illustrating a software configuration example of the inspection server 30.
  • The inspection server 30 includes a flight control unit 311, an imaging information reception unit 312, a 3D model creation unit 313, an abnormality detection unit 314, a 3D model display unit 315, a captured image display unit 316, an imaging information storage unit 351, a 3D model storage unit 352, and an abnormality information storage unit 353.
  • The flight control unit 311, the imaging information reception unit 312, the 3D model creation unit 313, the abnormality detection unit 314, the 3D model display unit 315, and the captured image display unit 316 are realized by the CPU 301 of the inspection server 30 reading a program stored in the storage device 303 into the memory 302 and executing it. The imaging information storage unit 351, the three-dimensional model storage unit 352, and the abnormality information storage unit 353 are realized as parts of the storage areas provided by the memory 302 and the storage device 303 of the inspection server 30.
  • the flight control unit 311 controls the flight device 10 to fly.
  • the flight control unit 311 operates the flying device 10 by transmitting a flight operation command to the flying device 10.
  • The flight control unit 311 can receive, for example, the designation of a flight path on a map from an operator and transmit flight operation commands so that the flying device 10 flies along the accepted flight path. Further, in order to create a three-dimensional model of the inspection object 1 as described later, the flight control unit 311 controls the flying device 10 so that it continuously photographs the periphery of the inspection object 1 with at least a part of the imaging regions overlapping.
  • In the present embodiment, the flying device 10 is controlled by the flight control unit 311 of the inspection server 30, but a method in which the flying device 10 is controlled autonomously by software (data) for automatic flight stored in advance, a method in which it is controlled (operated) manually by an operator, or a combination of these can also be used.
  • the shooting information receiving unit 312 receives shooting information transmitted from the flying device 10.
  • the shooting information receiving unit 312 stores the received shooting information in the shooting information storage unit 351.
  • the configuration of the shooting information storage unit 351 is the same as that of the shooting information storage unit 152 of the flying device 10 shown in FIG.
  • the imaging information storage unit 351 included in the inspection server 30 may store imaging information in association with information specifying the flying device 10 that is the transmission source of imaging information.
  • the 3D model creation unit 313 creates a 3D model that represents a 3D structure from a plurality of captured images.
  • The three-dimensional model creation unit 313 can create a three-dimensional model using a so-called photogrammetry technique.
  • the world coordinate system of the three-dimensional model is represented by latitude, longitude, and altitude.
  • the position and viewpoint direction of the camera 12 in the world coordinate system can be indicated by the shooting position, the shooting altitude, and the tilt of the optical axis 121 included in the shooting information.
  • Specifically, the three-dimensional model creation unit 313 extracts feature points from the image data included in the shooting information, matches the feature points extracted from the plurality of image data, and, based on the shooting position, shooting altitude, and inclination included in the shooting information, acquires a three-dimensional point group (also referred to as a point cloud) in the world coordinate system.
  • In the present embodiment, the three-dimensional model is assumed to be composed of a three-dimensional point group, but the three-dimensional model creation unit 313 can also generate surface data such as polygons based on the point group. Further, the three-dimensional model creation unit 313 can generate an ortho image based on the three-dimensional model and the captured images. A sketch of the feature-matching and triangulation step described above is given below.
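  • As an illustration of that step (not the publication's own code), the following Python sketch matches features between two overlapping shots and triangulates them into world coordinates, assuming the camera intrinsic matrix K and each image's pose (R, t), derived from the shooting position, altitude, and tilt, are known; the use of OpenCV and ORB is an assumption.

```python
import cv2
import numpy as np

def triangulate_pair(img1, img2, K, R1, t1, R2, t2):
    """Match ORB features between two overlapping images and triangulate
    the matches into 3D world coordinates from the known camera poses."""
    orb = cv2.ORB_create(2000)
    k1, d1 = orb.detectAndCompute(img1, None)
    k2, d2 = orb.detectAndCompute(img2, None)
    matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(d1, d2)

    pts1 = np.float32([k1[m.queryIdx].pt for m in matches]).T  # 2 x N
    pts2 = np.float32([k2[m.trainIdx].pt for m in matches]).T

    P1 = K @ np.hstack([R1, t1.reshape(3, 1)])   # 3 x 4 projection matrices
    P2 = K @ np.hstack([R2, t2.reshape(3, 1)])
    pts4d = cv2.triangulatePoints(P1, P2, pts1, pts2)
    return (pts4d[:3] / pts4d[3]).T              # N x 3 point cloud
```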
  • the 3D model creation unit 313 registers the created 3D model (a 3D point cloud in this embodiment) in the 3D model storage unit 352.
  • FIG. 8 is a diagram illustrating a configuration example of the three-dimensional model storage unit 352. As shown in the figure, the coordinates for each point constituting the three-dimensional point group are registered in the three-dimensional model storage unit 352.
  • When the 3D model creation unit 313 creates surface data such as polygons based on the 3D point group, the 3D model storage unit 352 may store the surface data in addition to, or in place of, the point coordinates.
  • The abnormality detection unit 314 detects an abnormality of the inspection object 1 by analyzing the captured images from the flying device 10. For example, the abnormality detection unit 314 can perform learning using machine learning such as a neural network, with images of the inspection object 1 in which abnormalities occurred in the past as training signals, and determine an abnormality with the captured image received from the flying device 10 as the input signal. Alternatively, the abnormality detection unit 314 can store a normal image in the memory 302 in advance, compare the normal image with the captured image, recognize a region constituted by pixels whose difference is greater than or equal to a predetermined value, and determine the region as abnormal when its area is greater than or equal to a predetermined value (a sketch of this difference-based check is given below). Note that the abnormality detection unit 314 can detect an abnormal part from the image using any known technique.
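  • The following Python sketch illustrates the difference-based check just described; the thresholds and function name are assumptions for illustration, not values from the publication.

```python
import cv2
import numpy as np

def detect_abnormal_regions(normal_img, captured_img,
                            pixel_thresh=40, min_area_px=500):
    """Return bounding boxes of regions whose pixel difference from the
    stored normal image exceeds pixel_thresh and whose area is at least
    min_area_px (both thresholds illustrative)."""
    diff = cv2.absdiff(cv2.cvtColor(normal_img, cv2.COLOR_BGR2GRAY),
                       cv2.cvtColor(captured_img, cv2.COLOR_BGR2GRAY))
    _, mask = cv2.threshold(diff, pixel_thresh, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    return [cv2.boundingRect(c) for c in contours
            if cv2.contourArea(c) >= min_area_px]
```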
  • The abnormality detection unit 314 then specifies the position in the world coordinate system of the abnormal part detected on the captured image. For example, for each point included in the three-dimensional point group, the abnormality detection unit 314 determines where that point appears on the image when viewed from a camera placed at the shooting position and shooting altitude included in the shooting information and oriented in the direction indicated by the inclination, determines whether that position on the image falls within the region detected as an abnormal part, that is, whether the point constitutes the abnormal part, and can specify the coordinates of the points constituting the abnormal part as the position of the abnormal part (a projection sketch is given below).
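  • A minimal sketch of this point-projection test, assuming a pinhole camera model with intrinsics K and pose (R, t) and an abnormal region given as a bounding box; none of these names come from the publication.

```python
import numpy as np

def points_in_abnormal_region(points_world, K, R, t, bbox):
    """Project 3D model points into the image with pose (R, t) and keep
    those landing inside a detected abnormal bounding box (x, y, w, h)."""
    x, y, w, h = bbox
    cam = R @ points_world.T + t.reshape(3, 1)   # world -> camera frame
    in_front = cam[2] > 0                        # keep points before the lens
    uvw = K @ cam                                # camera frame -> pixels
    u, v = uvw[0] / uvw[2], uvw[1] / uvw[2]
    inside = (u >= x) & (u < x + w) & (v >= y) & (v < y + h) & in_front
    return points_world[inside]                  # candidate abnormal points
```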
  • the abnormality detection unit 314 registers information related to the detected abnormality (hereinafter referred to as abnormality information) in the abnormality information storage unit 353.
  • FIG. 9 is a diagram illustrating a configuration example of abnormality information registered in the abnormality information storage unit 353.
  • In the present embodiment, the abnormality information includes the coordinate position in the world coordinate system indicating the abnormal part (abnormal position), information specifying the shooting information related to the image in which the abnormal part was captured (for example, the file name of the JPEG file and the Exif information of the image file), and the position at which the abnormality was detected on the image (hereinafter referred to as the image abnormal position).
  • The image abnormal position may include only one coordinate, for example.
  • the image abnormal position may be information representing a point or a geometric figure.
  • a portion showing the abnormality may be directly drawn at an abnormal position in the image.
  • Alternatively, the coordinate position of the image abnormal position may not be used, and, for example, the information related to the captured image may be directly associated with the abnormal position.
  • the 3D model display unit 315 displays an image obtained by projecting the 3D model created by the 3D model creation unit 313 onto a plane (hereinafter referred to as a 3D projection image).
  • the three-dimensional model display unit 315 may display the three-dimensional projection image with a wire frame, or may map the captured image to the three-dimensional model.
  • In the present embodiment, the three-dimensional model display unit 315 displays, for each captured image used to create the three-dimensional model, the position of the camera 12 and the direction of its viewpoint axis superimposed on the three-dimensional model. Whether the position of the camera 12 and the direction of the viewpoint axis are displayed, and how that display is switched, can be changed as appropriate according to the application and needs.
  • FIG. 10 is a diagram for explaining the flow of processing for photographing the inspection object 1. The imaging process shown in FIG. 10 is periodically executed while the flying device 10 is flying.
  • The imaging processing unit 114 controls the camera 12 to acquire a captured image (S201), and the position/orientation information acquisition unit 113 obtains the shooting position, shooting altitude, and attitude based on the signals from the GPS sensor 104, the atmospheric pressure sensor 105, the temperature sensor 106, and the acceleration sensor 107 (S202).
  • The shooting information transmission unit 115 creates shooting information in which the current date and time (shooting date and time), shooting position, shooting altitude, and attitude are attached to the captured image acquired from the camera 12, and transmits the shooting information to the inspection server 30 (S203).
  • the imaging information receiving unit 312 receives the imaging information transmitted from the flying device 10 (S204), and registers the received imaging information in the imaging information storage unit 351 (S205).
  • In this way, the images photographed by the flying device 10 are sequentially registered in the shooting information storage unit 351 together with the shooting date and time, the shooting position, the shooting altitude, and the attitude of the flying device 10 (the inclination of the optical axis 121).
  • FIG. 11 is a diagram showing a flow of inspection processing executed by the inspection server 30. In the present embodiment, it is assumed that the processing shown in FIG. 11 is performed after all shooting by the flying device 10 is completed, but may be performed while the flying device 10 continues shooting.
  • the 3D model creation unit 313 creates a 3D model based on the shooting information registered in the shooting information storage unit 351 (S211).
  • In the present embodiment, the 3D model is created by a general photogrammetry technique.
  • For example, the three-dimensional model creation unit 313 searches for corresponding points in pairs of shooting information that are consecutive in shooting order, and for each corresponding point found, can specify the position of the corresponding point in the world coordinate system according to the flight position, flight altitude, and attitude.
  • The above way of creating the three-dimensional model is only an example; the model can also be created by various other methods.
  • the 3D model creation unit 313 registers the created 3D model in the 3D model storage unit 352 (S212).
  • In the present embodiment, the three-dimensional model is composed of a three-dimensional point group, so the coordinates of each corresponding point in the world coordinate system can be registered in the three-dimensional model storage unit 352.
  • Next, the abnormality detection unit 314 performs the following processing for each piece of shooting information registered in the shooting information storage unit 351. That is, the abnormality detection unit 314 analyzes the image data included in the shooting information and inspects whether there is an abnormality in the inspection object 1 (S213). When an abnormality of the inspection object 1 is detected from the image data (S214: YES), the abnormality detection unit 314 specifies, as the position where the abnormality occurred (abnormal position), the coordinates at which a straight line extending from the coordinates of the shooting position and shooting altitude in the world coordinate system in the direction of the optical axis 121 (viewpoint axis) indicated by the tilt of the shooting information intersects the three-dimensional model (S215).
  • When the straight line along the optical axis 121 from the camera 12 does not pass through any of the points included in the three-dimensional point group, the abnormality detection unit 314 may determine the point closest to that straight line as the abnormal position. A sketch of this nearest-point-to-ray search is given below.
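  • A minimal sketch of that search, assuming the point cloud is an N x 3 NumPy array; the function name and the use of NumPy are illustrative assumptions.

```python
import numpy as np

def abnormal_position(origin, direction, point_cloud):
    """Cast a ray from the shooting position along the viewpoint axis and
    return the point-cloud point nearest to the ray (cf. S215)."""
    d = direction / np.linalg.norm(direction)
    rel = point_cloud - origin            # vectors from camera to each point
    t = rel @ d                           # signed distance along the ray
    ahead = t > 0                         # ignore points behind the camera
    # perpendicular distance from each candidate point to the ray
    perp = np.linalg.norm(rel[ahead] - np.outer(t[ahead], d), axis=1)
    return point_cloud[ahead][np.argmin(perp)]
```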
  • The abnormality detection unit 314 then registers, in the abnormality information storage unit 353, the abnormal position in association with the information specifying the shooting information (for example, the file name) and the location where the abnormality was found on the image data (image abnormal position) (S216).
  • the 3D model display unit 315 displays an image obtained by projecting the 3D model (S217).
  • FIG. 12 is a diagram for explaining a three-dimensional projection image displayed by the three-dimensional model display unit 315.
  • In the example of FIG. 12, a three-dimensional projection image 41, obtained by mapping the photographed images onto the three-dimensional model created by the three-dimensional model creation unit 313 based on a plurality of photographed images, is displayed on the screen 40.
  • the three-dimensional model display unit 315 can create a three-dimensional projection image 41 by projecting a point group registered in the three-dimensional model storage unit 352, for example.
  • the 3D model display unit 315 superimposes and displays a figure indicating the camera 12 that has captured the captured image on the 3D projection image 41.
  • That is, for each piece of shooting information registered in the shooting information storage unit 351, the three-dimensional model display unit 315 displays, superimposed on the three-dimensional model, a graphic representing the camera 12 placed at the shooting position and shooting altitude included in the shooting information and tilted in the direction indicated by the tilt.
  • a cone 42 indicating the position of the camera 12 and the direction of the viewpoint axis is displayed superimposed on the three-dimensional projection image 41.
  • Here the position of the camera 12 and the direction of the viewpoint axis are displayed for ease of explanation, but the display can be changed as appropriate according to the application and needs.
  • The 3D model display unit 315 also superimposes a graphic indicating the abnormal part on the 3D projection image 41 (S219). Specifically, the three-dimensional model display unit 315 displays a predetermined graphic at the coordinates indicated by the abnormal position for each piece of abnormality information registered in the abnormality information storage unit 353. In the example of FIG. 12, a mark 43 is displayed superimposed on the three-dimensional projection image 41. In this example only one mark 43 is displayed, but when a plurality of abnormal locations are registered in the abnormality information storage unit 353, a plurality of marks 43 are displayed.
  • a three-dimensional model can be created on the basis of a plurality of images photographed by the flying device 10, and an abnormal part detected from the photographed image can be mapped and displayed on the three-dimensional model.
  • FIG. 13 is a diagram illustrating a flow of processing for displaying a captured image in which an abnormal portion at a specified position is captured.
  • The captured image display unit 316 receives the designation of a position (S231) and retrieves the abnormality information corresponding to the received position from the abnormality information storage unit 353 (S232).
  • The captured image display unit 316 reads out, from the shooting information storage unit 351, the shooting information indicated by the image information included in the retrieved abnormality information (S233), and outputs the image data included in the read-out shooting information to the screen 40 (S234).
  • the photographed image display unit 316 can display the photographed image obtained by photographing the position designated by the user on the screen 40.
  • the position on the photographed image can be associated with the position on the three-dimensional model based on the photographing position, the photographing altitude, and the posture. Therefore, the abnormal part detected from the image can be captured not only as an image but also as an abnormal part on the three-dimensional model.
  • When a position on the three-dimensional model is designated, the image in which the abnormal part was photographed can be identified easily by looking up the abnormality information corresponding to the designated position, and a detailed two-dimensional image can be provided quickly and accurately.
  • the camera 12 is fixed to the lower part of the machine body, but the present invention is not limited to this, and the camera 12 may be movably mounted via a gimbal.
  • In that case, the shooting information includes the shooting direction of the camera 12 in addition to the tilt of the flying device 10.
  • the photographing altitude is obtained using the atmospheric pressure sensor 105 and the temperature sensor 106.
  • the present invention is not limited to this, and the photographing altitude may be obtained using a known technique.
  • In the present embodiment, the shooting information is transmitted from the flying device 10 to the inspection server 30 every time the camera 12 performs shooting, but the flying device 10 may instead accumulate the shooting information in the shooting information storage unit 152 and transmit the stored shooting information to the inspection server 30 periodically during the flight, or all at once after the flight.
  • In the present embodiment, the user selects the position of an abnormal location on the screen 40, the abnormal location corresponding to the selected position is searched for, and the captured image corresponding to the found abnormal location is displayed; alternatively, the abnormality information may be associated with the mark 43 so that the abnormality information is specified from the selected mark 43.
  • In the present embodiment, the user can select only the position of an abnormal location on the screen 40, but any position (coordinates) other than an abnormal location may also be selectable.
  • In that case, the images (44a, 44b, 44c) corresponding to the selected location are displayed.
  • In this case, for each piece of shooting information registered in the shooting information storage unit 351, the captured image display unit 316 first determines whether the designated coordinates could have been photographed in the image data related to that shooting information, based on whether a straight line from the coordinates of the flight position and flight altitude to the designated coordinates falls within the angle of view of the camera 12 when the camera 12 is pointed along its viewpoint axis from those coordinates.
  • Next, for the shooting information for which the designated coordinates could have been photographed, the captured image display unit 316 determines whether photographing of the designated coordinates was obstructed, based on whether that straight line passes through another point of the three-dimensional model (or comes within a predetermined distance of another point) or passes through a polygon of the three-dimensional model. Finally, the captured image display unit 316 can display the image data of the shooting information for which photographing of the designated coordinates was determined not to be obstructed. A sketch of this visibility test is given below.
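  • A minimal sketch of the field-of-view and occlusion test, assuming each shot is a dict holding the camera position, viewpoint axis, and file name, and that occlusion is approximated against the point cloud; the half angle of view and the occlusion distance are illustrative parameters, not values from the publication.

```python
import numpy as np

def images_showing(target, shots, point_cloud,
                   half_fov_deg=35.0, occlusion_dist=0.2):
    """Return the files of shots whose camera could see `target`: the ray
    to the target lies within the angle of view and no model point lies
    near the ray in front of the target."""
    visible = []
    for shot in shots:  # shot: {"pos": (3,), "axis": (3,), "file": str}
        to_target = target - shot["pos"]
        dist = np.linalg.norm(to_target)
        d = to_target / dist
        axis = shot["axis"] / np.linalg.norm(shot["axis"])
        # outside the camera's angle of view?
        if np.degrees(np.arccos(np.clip(d @ axis, -1.0, 1.0))) > half_fov_deg:
            continue
        # occluded if some model point lies near the ray before the target
        rel = point_cloud - shot["pos"]
        t = rel @ d
        near = (t > 0) & (t < dist - occlusion_dist)
        perp = np.linalg.norm(rel[near] - np.outer(t[near], d), axis=1)
        if perp.size == 0 or perp.min() > occlusion_dist:
            visible.append(shot["file"])
    return visible
```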

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Chemical & Material Sciences (AREA)
  • Analytical Chemistry (AREA)
  • Biochemistry (AREA)
  • General Health & Medical Sciences (AREA)
  • Immunology (AREA)
  • Pathology (AREA)
  • Investigating Materials By The Use Of Optical Means Adapted For Particular Applications (AREA)
  • Testing Of Devices, Machine Parts, Or Other Structures Thereof (AREA)

Abstract

The present invention aims to make it possible to specify, in the coordinate system of a 3D model created from a plurality of two-dimensional images, the coordinates of the position of an object photographed in a two-dimensional image. The invention therefore relates to an inspection system for inspecting an object under inspection, comprising: a 3D model generation unit for generating a 3D model of the object under inspection on the basis of a plurality of images of the inspection object photographed by a flying device comprising a camera; a shooting information acquisition unit for acquiring, for each of the plurality of images, the shooting position, within a 3D coordinate system, at which the image was photographed and the viewpoint-axis direction of the camera within the 3D coordinate system; an abnormality detection unit for detecting, for each of the plurality of images, an abnormality in the object under inspection from the image; an abnormal position specifying unit for specifying, according to the shooting position and the viewpoint-axis direction, an abnormal position that is the position of the detected abnormality in the 3D coordinate system; and a 3D model display unit for displaying the 3D model with the abnormal position mapped onto it.
PCT/JP2019/020732 2018-05-31 2019-05-24 Inspection system WO2019230604A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018105559A JP6583840B1 (ja) 2018-05-31 2018-05-31 Inspection system
JP2018-105559 2018-05-31

Publications (1)

Publication Number Publication Date
WO2019230604A1 true WO2019230604A1 (fr) 2019-12-05

Family

ID=68095283

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/020732 WO2019230604A1 (fr) 2018-05-31 2019-05-24 Inspection system

Country Status (2)

Country Link
JP (1) JP6583840B1 (fr)
WO (1) WO2019230604A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112904437A (zh) * 2021-01-14 2021-06-04 支付宝(杭州)信息技术有限公司 Privacy-protection-based hidden component detection method and hidden component detection device

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2019211486A (ja) * 2019-08-26 2019-12-12 株式会社センシンロボティクス Inspection system
JP6818379B1 (ja) * 2020-09-01 2021-01-20 株式会社センシンロボティクス Flight path creation method for flying object, and management server
JP7003352B1 (ja) 2021-04-12 2022-01-20 株式会社三井E&Sマシナリー Structure inspection data management system

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160307366A1 (en) * 2015-04-14 2016-10-20 ETAK Systems, LLC Virtualized site survey systems and methods for cell sites
JP2018010630A (ja) * 2016-06-30 2018-01-18 株式会社日立システムズ 被写体異常有無調査システム
KR20180036075A (ko) * 2016-09-30 2018-04-09 (주)태성에스엔아이 구조물 손상 정보가 매핑된 3차원 모델 생성 방법 및 이를 실행시키는 프로그램이 기록된 기록 매체
JP2019070631A (ja) * 2017-10-11 2019-05-09 株式会社日立システムズ 飛行体利用劣化診断システム

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS639850A (ja) * 1986-07-01 1988-01-16 Mitsubishi Electric Corp Curved-surface state management device for three-dimensional objects
JPH11183172A (ja) * 1997-12-25 1999-07-09 Mitsubishi Heavy Ind Ltd Photogrammetry support system
JP2006027331A (ja) * 2004-07-13 2006-02-02 Hiroboo Kk Method for collecting aerial video information using an unmanned flying object
CN205808364U (zh) * 2016-02-29 2016-12-14 北方民族大学 Single-beam land area measurement system based on an unmanned aerial vehicle


Also Published As

Publication number Publication date
JP6583840B1 (ja) 2019-10-02
JP2019211257A (ja) 2019-12-12

Similar Documents

Publication Publication Date Title
WO2019230604A1 (fr) Inspection system
JP6878567B2 (ja) Three-dimensional shape estimation method, flying object, mobile platform, program, and recording medium
JP6765512B2 (ja) Flight path generation method, information processing device, flight path generation system, program, and recording medium
JP6802599B1 (ja) Inspection system
JP2023100642A (ja) Inspection system
JP6675537B1 (ja) Flight path generation device, flight path generation method and program therefor, and structure inspection method
US11122209B2 (en) Three-dimensional shape estimation method, three-dimensional shape estimation system, flying object, program and recording medium
WO2020062178A1 (fr) Map-based target object identification method, and control terminal
US11082639B2 (en) Image display method, image display system, flying object, program, and recording medium
WO2020048365A1 (fr) Flight control method and device for an aircraft, terminal device, and flight control system
JP2017201757A (ja) Image acquisition system, image acquisition method, and image processing method
JP6681101B2 (ja) Inspection system
WO2018214401A1 (fr) Mobile platform, flying object, support apparatus, portable terminal, photography assistance method, program, and recording medium
WO2021251441A1 (fr) Method, system, and program
US20210229810A1 (en) Information processing device, flight control method, and flight control system
US20210404840A1 (en) Techniques for mapping using a compact payload in a movable object environment
CN109891188B (zh) Mobile platform, imaging path generation method, program, and recording medium
WO2020225979A1 (fr) Information processing device, information processing method, program, and information processing system
WO2020042980A1 (fr) Information processing apparatus, imaging control method, program, and recording medium
JP6681102B2 (ja) Inspection system
JP6684012B1 (ja) Information processing device and information processing method
KR102181809B1 (ko) Facility inspection apparatus and method
WO2021130980A1 (fr) Method for displaying the flight path of an aircraft, and information processing device
WO2021016867A1 (fr) Terminal device and data processing method therefor, and unmanned aerial vehicle and control method therefor
WO2022113482A1 (fr) Information processing device, method, and program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19810757

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19810757

Country of ref document: EP

Kind code of ref document: A1