WO2022107485A1 - Structure inspection support device, structure inspection support method, and program - Google Patents
Structure inspection support device, structure inspection support method, and program
- Publication number
- WO2022107485A1 (PCT/JP2021/037302)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- damage
- information
- target structure
- comment
- selection
- Prior art date
Classifications
- G06T7/00 — Image analysis
- G06T7/0002 — Inspection of images, e.g. flaw detection
- G06T7/0004 — Industrial image inspection
- G06Q10/00 — Administration; Management
- G06Q50/08 — Construction
- G06T19/00 — Manipulating 3D models or images for computer graphics
- G06T2200/24 — Indexing scheme for image data processing or generation involving graphical user interfaces [GUIs]
- G06T2207/20081 — Training; Learning
- G06T2207/20084 — Artificial neural networks [ANN]
- G06T2207/30108 — Industrial image inspection
- G06T2207/30132 — Masonry; Concrete
- G06T2207/30181 — Earth observation
- G06T2207/30184 — Infrastructure
Definitions
- the present invention relates to a structure inspection support device, a structure inspection support method, and a program.
- The investigator who inspects a structure needs to prepare an inspection record in a prescribed format, following the inspection procedure specified by the structure's manager, as a form showing the inspection results.
- This makes it possible for an expert other than the investigator who actually performed the inspection to grasp the progress of damage to the structure and to formulate a maintenance plan for it.
- Patent Document 1 discloses a system that can shorten the time required for the preparation of the inspection report.
- The present invention has been made in view of such circumstances, and provides a structure inspection support device, a structure inspection support method, and a program capable of improving the efficiency of comment work and leveling (standardizing) comments such as findings.
- The structure inspection support device of the first aspect includes a processor, wherein the processor performs a selection process that accepts selection of information about the target structure, including at least one of a photographed image and damage information of the target structure; a creation process that creates a comment on damage to the target structure based on the selected information; and a display process that displays the comment on a display.
- the creation process creates at least one comment on the damage of the target structure based on at least one of the photographed image and the damage information.
- the creation process creates the comment on damage to the target structure using machine learning.
- a database is provided that stores information on past target structures, including at least one of a photographed image and damage information of a structure, in association with comments on the damage of that structure. Based on the information about the target structure and the information stored in the database, the creation process creates, as the comment on the damage of the target structure, a comment on the damage of a structure similar to the damage of the target structure.
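The similar-damage lookup described in this aspect can be sketched as a nearest-neighbor search over the database. The feature vectors, records, and comment strings below are hypothetical placeholders, not the patent's actual representation:

```python
import math

# Hypothetical past-inspection database: each entry pairs a numeric damage
# feature vector (e.g. crack width, length, encoded degree rank) with the
# expert comment recorded for that damage.
past_records = [
    {"features": [0.2, 1.5, 2.0],
     "comment": "Minor cracking; continue observation."},
    {"features": [0.8, 4.0, 4.0],
     "comment": "Wide cracking; detailed investigation advised."},
]

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b)))

def comment_for(target_features):
    # Return the comment of the most similar past damage record.
    best = max(past_records,
               key=lambda r: cosine_similarity(r["features"], target_features))
    return best["comment"]
```

In practice the similarity measure and feature extraction could be learned, as in the machine-learning aspect above; cosine similarity over hand-picked features is only the simplest stand-in.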
- the selection process accepts the selection of the information about the target structure by selecting the three-dimensional model of the structure associated with the information about the target structure.
- the selection process automatically accepts, from the target structure, selection of the information about the target structure for which a comment is to be created.
- the selection process automatically accepts selection of information about the target structure based on at least one of the captured image and the damage information.
- the processor accepts editing for the comment and executes an editing process for changing the comment.
- the editing process accepts a comment candidate selected from a plurality of comment candidates corresponding to the comment as an edit to the comment.
- the processor executes the related information extraction process for extracting the related information related to the damage of the target structure, and the display process displays the related information on the display.
- The structure inspection support method of the eleventh aspect causes a processor to perform a selection step that accepts selection of information about the target structure, including at least one of a photographed image and damage information of the target structure; a creation step of creating a comment on damage to the target structure based on the selected information; and a display step of displaying the comment on a display.
- The structure inspection support program of the twelfth aspect causes a computer to realize a selection function that accepts selection of information about the target structure, including at least one of a photographed image and damage information of the target structure; a creation function that creates a comment on damage to the target structure based on the selected information; and a display function that displays the comment on a display.
- According to the structure inspection support device, structure inspection support method, and program of the present invention, it is possible to improve the efficiency of comment work and to level comments such as findings.
- FIG. 1 is a block diagram showing an example of a hardware configuration of an inspection support device for a structure.
- FIG. 2 is a block diagram showing a processing function realized by a CPU.
- FIG. 3 is a diagram showing information and the like stored in the storage unit.
- FIG. 4 is a flow chart showing an inspection support method using an inspection support device for a structure.
- FIG. 5 is a diagram showing an example of a captured image.
- FIG. 6 is a diagram showing an example of damage information.
- FIG. 7 is a diagram showing an example of a three-dimensional model.
- FIG. 8 is a diagram showing how information about the target structure is selected from the three-dimensional model.
- FIG. 9 is a block diagram of a creation processing unit that executes processing using the trained model.
- FIG. 10 is a block diagram of a creation processing unit that executes a processing for extracting similar damage.
- FIG. 11 is a diagram showing an example of a method for extracting similar damage.
- FIG. 12 is a diagram showing an example of a screen displayed on the display device in the selection step.
- FIG. 13 is a diagram showing another example of the screen displayed on the display device in the selection step.
- FIG. 14 is a diagram showing an example of displaying an example of a template of inspection record data on a display device.
- FIG. 15 is a diagram showing an example in which inspection record data in which text data other than comments such as findings are input is displayed on a display device.
- FIG. 16 is a diagram showing an example in which the inspection record data in which the text data of the comment is input is displayed on the display device.
- FIG. 17 is a diagram showing an example of an editing process.
- The "structure" includes civil engineering structures such as bridges, tunnels, and dams, and also includes buildings such as houses, as well as building elements such as walls, pillars, and beams.
- FIG. 1 is a block diagram showing an example of a hardware configuration of an inspection support device for a structure according to the present invention.
- As the structure inspection support device 10 shown in FIG. 1, a computer or a workstation can be used.
- The structure inspection support device 10 of this example is mainly composed of an input/output interface 12, a storage unit 16, an operation unit 18, a CPU (Central Processing Unit) 20, a RAM (Random Access Memory) 22, a ROM (Read Only Memory) 24, and a display control unit 26.
- A display device 30 constituting the display is connected to the structure inspection support device 10, and performs display under commands from the CPU 20 via the control of the display control unit 26.
- the display device 30 is composed of, for example, a monitor.
- The input/output interface 12 can input various data (information) to the structure inspection support device 10.
- the data stored in the storage unit 16 is input via the input / output interface 12.
- The CPU (processor) 20 reads out various programs, including the structure inspection support program of the embodiment stored in the storage unit 16, the ROM 24, and the like, expands them in the RAM 22, and performs calculations to control each unit in an integrated manner. Further, the CPU 20 reads out the program stored in the storage unit 16 or the ROM 24, performs calculations using the RAM 22, and performs the various processes of the structure inspection support device 10.
- FIG. 2 is a block diagram showing a processing function realized by the CPU 20.
- the CPU 20 has a selection processing unit 51, a creation processing unit 53, a display processing unit 55, and the like. The specific processing functions of each part will be described later. Since the selection processing unit 51, the creation processing unit 53, and the display processing unit 55 are a part of the CPU 20, the CPU 20 can also be referred to as executing the processing of each unit.
- the storage unit (memory) 16 is a memory composed of a hard disk device, a flash memory, and the like.
- the storage unit 16 stores data and programs for operating the structure inspection support device 10, such as an operating system and a program for executing a structure inspection support method. Further, the storage unit 16 stores information and the like used in the present embodiment described below.
- FIG. 3 is a diagram showing information and the like stored in the storage unit 16.
- the storage unit 16 is composed of a non-temporary recording medium such as a CD (Compact Disk), a DVD (Digital Versatile Disk), a hard disk (Hard Disk), various semiconductor memories, and a control unit thereof.
- the storage unit 16 mainly stores information 101 regarding the target structure, the three-dimensional model 103, and inspection record data 105.
- the information 101 regarding the target structure includes at least one of the photographed image and the damage information of the target structure.
- the photographed image is an image of a structure.
- the damage information includes at least one of the location of damage, the type of damage and the degree of damage of the target structure.
- the information 101 regarding the target structure may include an image (damage image) showing the damage detected from the photographed image of the structure. Damage information may be acquired automatically by image analysis or the like, or manually by the user.
- the information 101 regarding the target structure may include a plurality of types of data, and may include, for example, a panoramic composite image and a two-dimensional drawing.
- The panoramic composite image is a group of images corresponding to a specific member, composited from the captured images. Damage information (damage images) may also be panoramically composited.
- the 3D model 103 is, for example, data of a 3D model of a structure created based on a plurality of captured images.
- the three-dimensional model 103 includes data of member regions and member names constituting the structure. Each member region and member name may be specified on the three-dimensional model 103.
- the member area and the member name may be automatically specified for the three-dimensional model 103 from the information on the shape and dimensions of the member. Further, the member region and the member name may be specified with respect to the three-dimensional model 103 based on the user's operation.
- Information 101 about the target structure and the three-dimensional model 103 may be associated with each other.
- the information 101 regarding the target structure is stored in the storage unit 16 in association with the positions and members on the three-dimensional model 103.
- the information 101 about the target structure may be displayed on the 3D model 103.
- the three-dimensional model 103 may be displayed together with the information 101 regarding the target structure.
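The association between the information 101 and positions or members on the three-dimensional model 103 could be held, for illustration, as a simple lookup keyed by member name. The member names, coordinates, and record fields below are illustrative assumptions, not the patent's data format:

```python
# Hypothetical association between damage records (information 101) and
# members/positions on the three-dimensional model 103.
model_index = {}

def register(member, position, record):
    # Associate a damage record with a member name and a 3D position on the model.
    model_index.setdefault(member, []).append({"position": position, **record})

def damages_on(member):
    # All damage records associated with the given member.
    return model_index.get(member, [])

register("deck", (12.5, 3.0, 4.2), {"type": "crack", "degree": "c"})
register("pier", (2.0, 0.0, 1.5), {"type": "corrosion", "degree": "d"})
```

Selecting a member on the model then reduces to a lookup of its associated records, which is the behavior the selection process below relies on.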
- the inspection record data 105 is, for example, a template for a two-dimensional inspection record (a document file in a designated format).
- The template may be in a format specified by the Ministry of Land, Infrastructure, Transport and Tourism or a local government.
- the operation unit 18 shown in FIG. 1 includes a keyboard and a mouse, and the user can cause the inspection support device 10 to perform necessary processing via these devices.
- the display device 30 can function as an operation unit.
- the display device 30 is, for example, a device such as a liquid crystal display, and can display three-dimensional model data, information 101 about the target structure, inspection record data 105, and comments.
- FIG. 4 is a flow chart showing a structure inspection support method using a structure inspection support device.
- the selection processing unit 51 accepts the selection of the information 101 regarding the target structure (selection step: step S1).
- the information 101 regarding the target structure includes at least one of the captured image and the damage information.
- Information 101 regarding the target structure is acquired from the storage unit 16 in which it is stored.
- the information 101 about the target structure may be acquired from another storage unit through the network via the input / output interface 12.
- FIG. 5 is a diagram showing an example of a photographed image 107 included in the information 101 regarding the target structure.
- The photographed images 107A and 107B are images obtained by photographing the structure.
- A photographed image group 107C is composed of a plurality of photographed images 107A, 107B, and so on, obtained by photographing a plurality of locations on the structure. In this specification, they may be referred to simply as photographed images 107 where appropriate.
- FIG. 6 is a diagram showing an example of damage information 109 included in the information 101 regarding the target structure.
- the damage information 109 shown in FIG. 6 includes a member (damage position), a type of damage, a size of damage, a degree of damage, and a change over time.
- the damage information 109 may include at least one of the location of damage, the type of damage, and the degree of damage of the target structure. Further, the damage information 109 may include damage information other than the damage information 109 shown in FIG. 6, for example, the cause of damage.
- A preferable selection process in the selection step (step S1) will be described.
- the selection process section 51 may accept the selection of the information 101 related to the target structure by selecting the three-dimensional model 103 of the structure associated with the information 101 related to the target structure.
- FIG. 7 is a diagram showing an example of the three-dimensional model 103.
- the three-dimensional model 103 can be displayed as a point cloud, a polygon (mesh), a solid model, or the like.
- The three-dimensional model 103 of FIG. 7 is rendered by texture-mapping photographed images (textures) of the structure onto its polygons.
- the three-dimensional model 103 includes a member name and a member area.
- the three-dimensional model 103 is composed of, for example, a deck 131, a pier 133, an abutment 135, and the like.
- Information 101 about the target structure is associated with positions, members, and the like on the three-dimensional model 103.
- The method for creating the 3D model 103 is not limited. For example, the three-dimensional model 103 can be created using SfM (Structure from Motion), a method for restoring a three-dimensional shape from multi-viewpoint images. First, feature points are detected by an algorithm such as SIFT (Scale-Invariant Feature Transform); using these feature points as clues, the three-dimensional positions of a point cloud are calculated by the principle of triangulation. Specifically, a straight line is drawn from each camera through a feature point, and the intersection of the two straight lines passing through corresponding feature points becomes a restored three-dimensional point. By performing this operation for every detected feature point, the three-dimensional positions of the point cloud are obtained.
- the three-dimensional model 103 may be created by using the captured image 107 (photographed image group 107C) shown in FIG.
- Since absolute scale is not recovered by SfM, the model can be associated with real-world scale by, for example, placing a scaler of known dimensions on the subject when photographing.
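The triangulation step described above, intersecting two rays through corresponding feature points, can be sketched as follows. Because noisy rays rarely intersect exactly, the midpoint of the shortest segment between them is a common practical choice; the camera positions and target point here are illustrative:

```python
def sub(u, v):
    return [a - b for a, b in zip(u, v)]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def triangulate_midpoint(c1, d1, c2, d2):
    # Midpoint of the shortest segment between rays c1 + t*d1 and c2 + s*d2.
    r = sub(c2, c1)
    a, b, e = dot(d1, d1), dot(d1, d2), dot(d2, d2)
    denom = a * e - b * b  # zero only for parallel rays
    t1 = (e * dot(d1, r) - b * dot(d2, r)) / denom
    t2 = (b * dot(d1, r) - a * dot(d2, r)) / denom
    p1 = [c + t1 * d for c, d in zip(c1, d1)]
    p2 = [c + t2 * d for c, d in zip(c2, d2)]
    return [(x + y) / 2 for x, y in zip(p1, p2)]

# Two hypothetical camera centers observing the same feature point (1, 1, 5);
# the rays intersect exactly here, so the midpoint recovers the point itself.
c1, c2 = [0.0, 0.0, 0.0], [2.0, 0.0, 0.0]
target = [1.0, 1.0, 5.0]
point = triangulate_midpoint(c1, sub(target, c1), c2, sub(target, c2))
```

A full SfM pipeline also estimates the camera poses themselves; this sketch assumes they are already known.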
- First, the three-dimensional model 103 showing an overall bird's-eye view is displayed on the display device 30 by the display processing unit 55.
- The user operates on the three-dimensional model 103 via the operation unit 18, and the display processing unit 55 displays a three-dimensional model 103A that has been arbitrarily enlarged or reduced, or whose viewpoint, line of sight, or field of view has been changed.
- Hereinafter, the term three-dimensional model 103 may refer both to the overall bird's-eye view and to modified versions such as the enlarged three-dimensional model 103A.
- The selection processing unit 51 can accept selection of the captured image 107 or the damage information 109, which constitute the information 101 regarding the target structure. Further, the selection processing unit 51 may accept selection of both the captured image 107 and the damage information 109.
- the 3D model 103 to which the damage is mapped can be displayed, and the user can select the damage to be commented on the 3D model 103.
- the selection process unit 51 may automatically accept selection of information 101 about the target structure for which a comment is created from the target structure.
- For example, the user selects a target member (for a bridge) or span (for a tunnel) via the operation unit 18, and the selection processing unit 51 automatically accepts selection of information 101 regarding the target structure for each member.
- For example, the user designates a specific member on the three-dimensional model 103, and the selection processing unit 51 can automatically accept, for the designated member, selection of at least one of the captured image 107 and the damage information 109 constituting the information 101 regarding the target structure.
- the user may select a member from the member list displayed on the display device.
- the selection processing unit 51 can automatically accept the selection of at least one of the photographed image 107, which is the information 101 about the target structure, or the damage information 109, from the members designated from the member list.
- the selection processing unit 51 automatically selects a predetermined number of information 101 regarding the target structure corresponding to the target damage for each damage type among the target members and spans, and accepts the selection. However, if there is no damage in the entire structure, the information 101 about the target structure corresponding to the damage is not selected.
- the selection process section 51 automatically selects a predetermined number of information 101 about the target structure corresponding to the target damage from the entire target structure and accepts the selection. However, if there is no damage in the entire structure, the information 101 about the target structure corresponding to the damage is not selected.
- The selection processing unit 51 may select the damage with the most advanced degree of damage from the information 101 regarding the target structure. For example, the selection processing unit 51 selects the damage having the largest damage degree based on the degree-of-damage results (ranks a, b, c, d, and e) in the damage information 109, and accepts the selection of the corresponding information 101 regarding the target structure.
- the selection processing unit 51 may select the damage having the fastest damage progress rate from the information 101 regarding the target structure. For example, the selection processing unit 51 selects the damage having the fastest progress speed from the result of the time change of the damage information 109, and selects the information 101 regarding the target structure.
- the progress rate may be obtained from the length change / year, the width change / year, the area change / year, and the like.
- the rate of progression may be determined from the amount of change in damage size per year. For example, the amount of change in crack length per year, that is, the amount of change in length / year may be obtained. Further, the width change amount / year may be obtained from the change amount of the damage width such as the change amount of the crack width per year.
- Similarly, the area change amount per year may be obtained from the amount of change in the damaged area per year.
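The progression rate per year mentioned above can be computed, for example, as the size change between two periodic inspections divided by the elapsed time in years. The measurement values and inspection dates below are hypothetical:

```python
from datetime import date

def change_per_year(size_old, size_new, date_old, date_new):
    # Progression rate: damage-size change divided by elapsed time in years.
    years = (date_new - date_old).days / 365.25
    return (size_new - size_old) / years

# Hypothetical crack-width measurements (mm) from two periodic inspections:
# roughly 0.1 mm of width change per year.
rate = change_per_year(0.2, 0.5, date(2018, 4, 1), date(2021, 4, 1))
```

The same function applies unchanged to length change per year or area change per year by passing the corresponding measurements.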
- the selection processing unit 51 may select the damage having the largest damage size from the information 101 regarding the target structure.
- For example, the selection processing unit 51 may select the longest or widest damage, such as cracks and fissures, based on the captured image 107 or the damage information 109, or may select the damage with the largest area, such as peeling, water leakage, free lime, or corrosion.
- The selection processing unit 51 accepts the selection of the information 101 regarding the target structure corresponding to the selected damage.
- For example, from the information 101 regarding the target structure, the selection processing unit 51 may select the damage corresponding to a damage cause specified by the user: in the case of a concrete member, fatigue, salt damage, neutralization, alkali-aggregate reaction, frost damage, poor construction, excessive external force, and the like; in the case of a steel member, fatigue, salt damage, water leakage, material deterioration, coating film deterioration, construction failure, excessive external force, and the like.
- The selection processing unit 51 accepts the selection of the information 101 regarding the target structure corresponding to the selected damage.
- the selection processing unit 51 may select the damage existing at the point of interest.
- the Ministry of Land, Infrastructure, Transport and Tourism's "Bridge Periodic Inspection Guidelines" (March 2019) describes examples of points of interest that need to be focused on when regularly inspecting concrete bridges.
- Examples of the points of interest include (1) end fulcrums, (2) intermediate fulcrums, (3) the span center, (4) span quarter points, (5) joints, (6) segment joints, (7) fixing portions, and (8) notch portions.
- the selection processing unit 51 may automatically accept selection of information 101 regarding the damaged target structure for each point of interest.
- For tunnels, points of interest include (1) lining joints and construction joints, (2) the vicinity of the lining crown, and (3) the vicinity of the middle of the lining span.
- the selection processing unit 51 may automatically accept selection of information 101 regarding the damaged target structure for each point of interest.
- the selection processing unit 51 may automatically accept selection of information 101 regarding a damaged target structure that satisfies the first selection criterion and the second selection criterion, for example. Further, the selection processing unit 51 may automatically accept selection of information 101 regarding the damaged target structure that satisfies the first selection criterion and the third selection criterion, for example. Although the case where two selection criteria are combined has been described, three or more selection criteria may be combined.
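Combining selection criteria as described above can be sketched as ranking or threshold filters over the damage records. The records, rank encoding, and thresholds below are illustrative assumptions in the spirit of FIG. 6:

```python
# Hypothetical damage records (member, damage type, degree rank a-e,
# and change over time); values are illustrative.
damages = [
    {"member": "deck", "type": "crack", "degree": "c", "change_per_year": 0.02},
    {"member": "pier", "type": "crack", "degree": "e", "change_per_year": 0.10},
    {"member": "deck", "type": "corrosion", "degree": "d", "change_per_year": 0.05},
]

# Encode the degree ranks so that "e" (most advanced) sorts highest.
RANK = {"a": 1, "b": 2, "c": 3, "d": 4, "e": 5}

def worst_degree(records):
    # First criterion: the damage with the most advanced degree.
    return max(records, key=lambda d: RANK[d["degree"]])

def fastest_progress(records):
    # Second criterion: the damage with the fastest progression rate.
    return max(records, key=lambda d: d["change_per_year"])

# Combining criteria: keep records that satisfy thresholds on both at once.
combined = [d for d in damages
            if RANK[d["degree"]] >= 4 and d["change_per_year"] >= 0.05]
```

Three or more criteria combine the same way, by adding further conditions to the filter.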
- the CPU 20 functions as the selection processing unit 51.
- the creation processing unit 53 creates a comment for damage to the target structure based on the information 101 regarding the selected target structure (creation step: step S2).
- A preferable comment creation process in the creation step (step S2) will now be described.
- First, the creation processing unit 53 may create at least one comment for the damage of the target structure by artificial intelligence (AI).
- As the AI, a trained model based on a convolutional neural network (CNN) can be used.
- FIG. 9 shows a block diagram of the creation processing unit 53 using the trained model.
- the creation processing unit 53 using the trained model is configured by a CPU or the like.
- the creation processing unit 53 includes a plurality of (three in this example) trained models 53A, 53B, and 53C corresponding to a plurality of types of damage.
- Each trained model 53A, 53B and 53C has an input layer, an intermediate layer and an output layer, and each layer has a structure in which a plurality of "nodes" are connected by "edges".
- Information 101 regarding the selected target structure is input to the input layer of the CNN.
- The information 101 regarding the target structure is a photographed image 107 or damage information 109 (for example, at least one of the damage type, degree of damage, damage progress, damage cause, and the like).
- the intermediate layer has a plurality of sets including a convolution layer and a pooling layer, and is a part for extracting features from a photographed image 107 input from an input layer or damage information 109.
- the convolution layer filters nearby nodes in the previous layer (performs a convolution operation using the filter) and obtains a "feature map".
- the pooling layer reduces the feature map output from the convolution layer to a new feature map.
- The "convolution layer" plays a role of feature extraction, such as edge extraction from the captured image 107, or feature extraction such as natural language processing from the damage information 109.
- the output layer of CNN is a part that outputs a feature map showing the features extracted by the intermediate layer.
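The convolution and pooling operations described above can be illustrated with a minimal sketch; the tiny synthetic "image" and the edge filter are assumptions for demonstration, not the actual filters learned by the trained models:

```python
import numpy as np

def convolve2d(image: np.ndarray, kernel: np.ndarray) -> np.ndarray:
    """Slide the kernel over the image (no padding, stride 1) and
    return the resulting feature map."""
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def max_pool(fmap: np.ndarray, size: int = 2) -> np.ndarray:
    """Reduce the feature map by taking the maximum of each size x size block."""
    h, w = fmap.shape
    h2, w2 = h // size, w // size
    return fmap[:h2 * size, :w2 * size].reshape(h2, size, w2, size).max(axis=(1, 3))

# A vertical-edge filter applied to a tiny synthetic image whose right
# half is bright; the edge shows up in the feature map, and pooling
# condenses it into a smaller map.
image = np.array([[0, 0, 1, 1]] * 4, dtype=float)
edge_kernel = np.array([[-1.0, 1.0]])
fmap = convolve2d(image, edge_kernel)
pooled = max_pool(fmap, 2)
```

A real CNN stacks many such convolution/pooling pairs and learns the kernel values during training, rather than using a hand-written edge filter.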
- the output layer of the trained models 53A, 53B, 53C of this example outputs the inference result as the damage detection results 53D, 53E, 53F.
- the damage detection results 53D, 53E, 53F contain at least one comment for the damage derived from each trained model 53A, 53B, 53C.
- The trained model 53A is a trained model machine-learned to detect damage of water leakage, planar free lime, and rust juice, and outputs the damaged areas of water leakage, planar free lime, and rust juice, together with the damage type and comment for each area, as the damage detection result 53D.
- The trained model 53B is a trained model machine-learned to detect damage of peeling and exposed reinforcing bars, and outputs the damaged areas of peeling and exposed reinforcing bars, together with the damage type and comment for each damaged area, as the damage detection result 53E.
- The trained model 53C is a trained model machine-learned to detect damage of cracks and linear free lime, and outputs the damaged areas of cracks and linear free lime, together with the damage type and comment for each damaged area, as the damage detection result 53F.
- The output damage detection results 53D, 53E, and 53F are created as comments for the damage of the target structure.
- The trained models 53A, 53B, and 53C of the creation processing unit 53 are not limited to the above embodiments. For example, the creation processing unit 53 may have an individual trained model for each damage type, with each trained model configured to output the damaged area and comment corresponding to its own damage type as a damage detection result. In this case, the same number of trained models as the number of damage types to be inspected is provided.
- Alternatively, the creation processing unit 53 may have one trained model that can handle all types of damage, configured to output the damaged areas together with the damage type and comment for each damaged area as a damage detection result.
- In this way, appropriate, that is, accurate and error-free, comments can be generated based on the information 101 regarding the target structure.
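The dispatch of the selected information 101 to a plurality of per-damage-type trained models and the gathering of their detection results can be sketched as follows; the model stubs and the regions and comments they return are illustrative stand-ins for real CNN inference, not the disclosed models 53A to 53C:

```python
from typing import Callable, Dict, List, Tuple

# Each detection result pairs a damaged region with a damage type and
# a comment, mirroring the results 53D-53F described above.
DetectionResult = Tuple[str, str, str]  # (region, damage_type, comment)
Model = Callable[[dict], List[DetectionResult]]

def model_water_leakage(info: dict) -> List[DetectionResult]:
    # Stand-in for a model detecting water leakage / planar free lime / rust juice.
    return [("region-1", "water leakage", "Water leakage is observed.")]

def model_crack(info: dict) -> List[DetectionResult]:
    # Stand-in for a model detecting cracks / linear free lime.
    return [("region-2", "crack", "A crack has occurred.")]

def create_comments(info: dict, models: List[Model]) -> List[DetectionResult]:
    """Run every trained model on the selected information and gather
    the per-region damage types and comments."""
    results: List[DetectionResult] = []
    for m in models:
        results.extend(m(info))
    return results

results = create_comments({"image": "photo.png"},
                          [model_water_leakage, model_crack])
```

Swapping in one all-damage-type model, or one model per damage type, only changes the list of callables handed to `create_comments`.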
- the second comment creation process will be described with reference to FIGS. 10 and 11.
- The creation processing unit 53 may create, based on the information 101 regarding the target structure and the information 140 regarding past structures stored in the database 60, a comment for the damage of a structure whose damage is similar to the damage of the target structure, as the comment for the damage of the target structure.
- the database 60 may be stored in the storage unit 16 or may be stored in another storage unit.
- The database 60 is a part that stores and manages, in association with each other, the information 140 regarding structures inspected in the past and the comments 142 for the damage of those structures created at that time.
- the information 140 about the structure includes at least one of the captured image and the damage information.
- the creation processing unit 53 further includes a similar damage extraction unit 53G.
- Information 101 about the target structure selected in the selection step (step S1) is output to the similar damage extraction unit 53G.
- Based on the information 101 regarding the target structure, the similar damage extraction unit 53G extracts, from the information 140 regarding past structures and the comments 142 for the damage of those structures stored in the database 60, the information 140 and comments 142 about structures similar to the information 101 about the target structure.
- The similar damage extraction unit 53G may make a similarity determination based on the damage information, such as the type of damage, the position of damage, and the degree of damage (the mean value, maximum value, or the like of the length, width, area, density, depth, etc.).
- The similar damage extraction unit 53G may also make a similarity determination based on the change over time in the position and degree of damage, in addition to the damage information.
- It is preferable that the database 60 stores at least one of captured images at a plurality of time points obtained by photographing the same portion of a structure, damage information at the plurality of time points detected from those captured images, and information indicating the change over time of the damage information.
- In the similar damage extraction process by the similar damage extraction unit 53G, the change over time of the damage information can be detected based on the information 101 regarding the target structure at a plurality of time points, and the information indicating that change over time can be used as one piece of information when extracting, from the database 60, similar damage resembling the damage of the target structure.
- Furthermore, the similar damage extraction unit 53G may make a similarity determination in consideration of at least one piece of information other than the damage information, for example, the structural information, environmental information, history information, or inspection information of the structure.
- The similar damage extraction unit 53G extracts the damaged area of the similar damage, together with the damage type and comment for each damaged area, as the similar damage detection result 53H.
- The similar damage extraction unit 53G may also extract the information 140 and comments 142 about structures similar to the information 101 about the target structure based on one or more pieces of the following other information, and output them as the similar damage detection result 53H.
- The other information includes at least one of the following: structural information, environmental information, history information, and inspection information of the structure.
- Structural information: structure type (for bridges: girder bridge, rigid-frame bridge, truss bridge, arch bridge, cable-stayed bridge, suspension bridge), member type (for bridges: floor slab, pier, abutment, girder, ...), materials (steel, reinforced concrete, PC (Prestressed Concrete), ...), etc.
- Environmental information: traffic volume (per day, per month, per year, cumulative, etc.), distance from the sea, climate (average temperature, average humidity, rainfall, snowfall, etc.)
- History information: construction conditions (temperature at the time of construction, etc.), age (completion date, service start date, years elapsed since then), repair history, disaster history (earthquakes, typhoons, floods, etc.)
- Inspection information: monitoring information (deflection of the structure, vibration amplitude, vibration period, etc.), core sampling test information, non-destructive inspection information (ultrasonic, radar, infrared, hammering sound, etc.)
- FIG. 11 is a diagram showing an example of a method for extracting similar damage by the similar damage extraction unit.
- In FIG. 11, the information 140 regarding structures stored in the database 60 is indicated by cross marks, and the information 101 regarding the target structure is indicated by a circle mark.
- The feature A indicates the maximum crack width (mm), and the feature B indicates the number of years since the start of service of the structure.
- The feature space can be a multidimensional space composed of three or more features, but FIG. 11 shows, for simplicity, a two-dimensional space composed of two features.
- The similar damage extraction unit 53G calculates the distance between the feature vector (first feature vector) of the damage information currently under diagnosis, indicated by the circle mark, and the feature vector (second feature vector) of each piece of past damage information, indicated by the cross marks, and extracts the damage information indicated by a cross mark whose distance is equal to or less than a threshold value (inside the circle shown by the dotted line in FIG. 11) as similar damage.
- This threshold can be optimized by statistical methods.
- The distance may be the distance obtained when the plurality of parameters of the first and second feature vectors are not weighted (Euclidean distance), or the distance obtained when they are weighted (Mahalanobis distance). Which weight is assigned to which parameter may be determined by a statistical method such as principal component analysis.
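The distance-based extraction described above can be sketched as follows; the weighted distance here is a simple diagonal-weight variant (with unit weights it reduces to the Euclidean distance, and it only approximates a full Mahalanobis distance), and the feature values and threshold are illustrative assumptions:

```python
import math
from typing import Dict, List, Tuple

def weighted_distance(p: Tuple[float, float], q: Tuple[float, float],
                      weights: Tuple[float, float] = (1.0, 1.0)) -> float:
    """Per-axis weighted Euclidean distance between two feature vectors.
    With unit weights this is the plain Euclidean distance."""
    return math.sqrt(sum(w * (a - b) ** 2 for a, b, w in zip(p, q, weights)))

def extract_similar(target: Tuple[float, float],
                    past: Dict[str, Tuple[float, float]],
                    threshold: float) -> List[str]:
    """Return the past damage records whose feature vector lies within the
    threshold distance of the target (inside the dotted circle of FIG. 11)."""
    return [name for name, vec in past.items()
            if weighted_distance(target, vec) <= threshold]

# Feature axes follow the figure: A = maximum crack width (mm),
# B = years in service. Records and threshold are made up for illustration.
target = (0.4, 20.0)
past = {
    "bridge-1": (0.5, 21.0),
    "bridge-2": (0.3, 19.0),
    "bridge-3": (2.0, 45.0),
}
similar = extract_similar(target, past, threshold=3.0)
```

Tightening or loosening `threshold`, or changing `weights`, directly corresponds to shrinking or growing the dotted circle in the figure.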
- Additional search conditions can also be specified as points or ranges in the feature space. For example, when a bridge whose completion date is on or after January 1, 1990 and whose basic structure is a girder bridge is specified, similar damage can be extracted within the specified range.
- In addition to the above, the damage information, structural information, environmental information, history information, inspection information, and the like included in the other information can be set as axes of the feature space to extract similar damage.
- the method for extracting similar damage may use a method different from the method for determining by the distance in the feature space.
- For example, similar damage may be extracted by an AI (Artificial Intelligence) that determines similarity from an image, or by an AI that determines similarity by combining a plurality of pieces of information among images, damage information, and other information.
- the display processing unit 55 displays the comment created in the creation step (step S2) on the display device 30 constituting the display (display step: step S3).
- The inspection support program causes the CPU 20 or the like to realize a selection function corresponding to the selection step, a creation function corresponding to the creation step, and a display function corresponding to the display step.
- A specific example of the selection step (step S1), the creation step (step S2), and the display step (step S3) will now be described.
- FIG. 12 is a diagram showing an example of a screen displayed on the display device in the selection step (step S1). As shown in FIG. 12, the display device 30 displays the three-dimensional model 103, the enlarged three-dimensional model 103A, and the information 101 regarding the target structure on one screen. The captured image 107 and the damage information 109 are displayed as the information 101 regarding the target structure. Either the captured image 107 or the damage information 109 may be displayed.
- FIG. 13 is a diagram showing another example of the display displayed on the display device 30.
- In FIG. 13(A), only the three-dimensional model 103 is displayed on the display device 30.
- In FIG. 13(B), the enlarged three-dimensional model 103A is displayed on the display device 30.
- In FIG. 13(C), only the information 101 regarding the target structure is displayed on the display device 30; here, the captured image 107 is displayed as the information 101 regarding the target structure.
- The display may be switched from the three-dimensional model 103 (FIG. 13(A)) to the enlarged three-dimensional model 103A (FIG. 13(B)), and further to the information 101 regarding the target structure (FIG. 13(C)).
- the information 101 regarding the target structure is selected by executing the selection processing unit 51 of the inspection support device 10 (selection step: step S1).
- FIG. 14 is a diagram showing an example of a model of the inspection record data 105 displayed on the display device 30.
- the template inspection record data 105 indicates a state in which no text data has been input.
- In the inspection record data 105, text data such as the part name (member name), symbol, member symbol, degree of damage, necessity of repair, necessity of detailed investigation, cause, and findings are input.
- The text data of comments such as findings is input in the rightmost column.
- FIG. 15 is a diagram showing an example in which the inspection record data 105, with text data other than comments such as findings input, is displayed on the display device 30. Based on the information 101 regarding the target structure, the user manually inputs text data into the corresponding fields of the inspection record data 105 other than comments such as findings.
- Alternatively, the inspection support device 10 may automatically input text data into the corresponding fields of the inspection record data 105 other than comments such as findings, based on the information 101 regarding the target structure.
- FIG. 16 is a diagram showing an example in which the inspection record data 105 in which the text data of comments such as findings are input is displayed on the display device 30.
- As shown in FIG. 16, the display device 30 displays the findings column including the automatically created comment.
- In FIGS. 15 and 16, only three rows are shown, but the number of rows is not limited to three.
- the automatically created comment is displayed in the finding column.
- the user may modify the content of the created comment via the operation unit 18 (not shown).
- the CPU 20 accepts editing of a comment and executes an editing process of changing the comment.
- the creation processing unit 53 may create a comment template and the display processing unit 55 may display the comment template on the display device 30 according to the selected damage. The user may reselect an appropriate comment from the comment template.
- FIG. 17 is a diagram showing an example of the editing process. As shown in FIG. 17A, a comment created by the inspection support device 10 is displayed in the column of findings. Comments are created using a template that corresponds to the type of damage selected.
- The comment includes "(Type of damage) is presumed to be due to (cause of damage). (Type of damage) occurs in (position/range), (comment on progress) (comment on response)."
- In FIG. 17(A), "(Crack) is presumed to be due to (fatigue). (Crack) occurs in (main girder) and (progress is fast), so (detailed investigation is carried out)." is displayed. The parts in parentheses are automatically created, and the results are entered.
- Comment candidates may be displayed in a pull-down menu, and a different candidate may be reselected from among the comment candidates.
- The pull-down menu candidates for (cause of damage) may include, for example, "fatigue", "salt damage", "neutralization", "alkali-aggregate reaction", "frost damage", "construction failure", "excessive external force", and the like, and, in the case of steel members, "fatigue", "salt damage", "water leakage", "material deterioration", "coating film deterioration", "construction failure", "excessive external force", and the like.
- Candidates for the (type of damage) pull-down menu may include, for example, "crack", "floor slab crack", "water leakage", "free lime", "peeling", "reinforcing bar exposure", "fissure", "corrosion", and the like.
- Candidates for the (position/range) pull-down menu may include "main girder", "cross girder", "pier", "abutment", "bearing", and the like, as well as "whole", "end", and the like.
- Candidates for the (comment on progress) pull-down menu may include "progress is fast", "progress is slow", "progress is not a concern", and the like.
- Candidates for the (comment on response) pull-down menu may include "implement detailed investigation", "implement follow-up observation", "implement repair", "implement countermeasures", and the like.
- Candidates for the pull-down menu are appropriately selected according to the damage.
- the user may edit the candidate comment in the pull-down menu.
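The template-and-candidate editing scheme described above can be sketched as follows; the template wording and candidate lists follow the examples above, but the function and data-structure names are assumptions, not part of the disclosure:

```python
# Findings template with reselectable placeholders, following the
# pull-down examples above. Names here are illustrative.
TEMPLATE = ("({damage}) is presumed to be due to ({cause}). "
            "({damage}) occurs in ({position}), ({progress}) ({response})")

CANDIDATES = {
    "cause": ["fatigue", "salt damage", "neutralization"],
    "damage": ["crack", "water leakage", "free lime"],
    "position": ["main girder", "pier", "abutment"],
    "progress": ["progress is fast", "progress is slow"],
    "response": ["implement detailed investigation", "implement repair"],
}

def render_comment(selected: dict) -> str:
    """Fill the findings template with the currently selected candidate
    for each placeholder, validating against the pull-down lists."""
    for key, value in selected.items():
        if value not in CANDIDATES[key]:
            raise ValueError(f"{value!r} is not a candidate for {key}")
    return TEMPLATE.format(**selected)

comment = render_comment({
    "damage": "crack",
    "cause": "fatigue",
    "position": "main girder",
    "progress": "progress is fast",
    "response": "implement detailed investigation",
})
```

Reselecting a pull-down candidate simply means re-rendering the template with a different value for that placeholder; user edits could replace the rendered string afterwards.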
- The comment of the similar damage extracted by the similar damage extraction process may be displayed on the display device 30 as a template for the findings. As in the editing process shown in FIG. 17, the comment of the similar damage may be edited with the pull-down menus.
- the CPU 20 may execute a related information extraction process for extracting related information related to damage to the target structure, and the display processing unit 55 of the CPU 20 may display the related information on the display device 30.
- The related information related to the damage of the target structure includes inspection data of the selected damage, past inspection data of the selected damage, inspection data of other closely related damage (damage close to the selected damage, damage existing on the back side, etc.), and similar damage (which may belong to another structure), and these may be displayed on the display device 30. The user may edit the comment while referring to the displayed related information.
- The hardware structure of the processing units that execute the various processes is the following various processors. The various processors include a CPU (Central Processing Unit), which is a general-purpose processor that executes software (programs) to function as various processing units; a programmable logic device (PLD), such as an FPGA (Field Programmable Gate Array), whose circuit configuration can be changed after manufacture; and a dedicated electric circuit, which is a processor having a circuit configuration designed exclusively to execute specific processing, such as an ASIC (Application Specific Integrated Circuit).
- One processing unit may be composed of one of these various processors, or may be composed of two or more processors of the same or different types (for example, a plurality of FPGAs, or a combination of a CPU and an FPGA). A plurality of processing units may also be configured by one processor. As a first example of configuring a plurality of processing units with one processor, there is a form in which one processor is configured by a combination of one or more CPUs and software, as represented by a computer such as a client or a server, and this processor functions as the plurality of processing units. As a second example, there is a form of using a processor that realizes the functions of an entire system including the plurality of processing units with a single IC (Integrated Circuit) chip, as represented by a System on Chip (SoC). In this way, the various processing units are configured using one or more of the above-mentioned various processors as a hardware structure.
- The hardware structure of these various processors is, more specifically, an electric circuit (circuitry) in which circuit elements such as semiconductor elements are combined.
Abstract
Description
FIG. 1 is a block diagram showing an example of the hardware configuration of a structure inspection support apparatus according to the present invention.
The selection processing unit 51 accepts selection of the information 101 regarding the target structure (selection step: step S1). As described above, the information 101 regarding the target structure includes at least one of a captured image and damage information.
The information 101 regarding the target structure is mutually associated with positions, members, and the like on the three-dimensional model 103.
As shown in FIG. 4, the creation processing unit 53 creates a comment for the damage of the target structure based on the information 101 regarding the selected target structure (creation step: step S2).
The other information includes at least one of the following: structural information, environmental information, history information, and inspection information of the structure.
- Structural information: structure type (for bridges: girder bridge, rigid-frame bridge, truss bridge, arch bridge, cable-stayed bridge, suspension bridge), member type (for bridges: floor slab, pier, abutment, girder, ...), materials (steel, reinforced concrete, PC (Prestressed Concrete), ...), etc.
- Environmental information: traffic volume (per day, per month, per year, cumulative, etc.), distance from the sea, climate (average temperature, average humidity, rainfall, snowfall, etc.)
- History information: construction conditions (temperature at the time of construction, etc.), age (completion date, service start date, years elapsed since then), repair history, disaster history (earthquakes, typhoons, floods, etc.)
- Inspection information: monitoring information (deflection of the structure, vibration amplitude, vibration period, etc.), core sampling test information, non-destructive inspection information (ultrasonic, radar, infrared, hammering sound, etc.)
The other information can also include the diagnostic purpose for the target structure, such as determination of the degree of damage, determination of the countermeasure category, determination of soundness, estimation of the damage cause, determination of the necessity of repair, and selection of a repair method.
As shown in FIG. 4, the display processing unit 55 displays the comment created in the creation step (step S2) on the display device 30 constituting the display (display step: step S3).
In the above embodiments, the hardware structure of the processing units that execute the various processes is the following various processors. The various processors include a CPU (Central Processing Unit), which is a general-purpose processor that executes software (programs) to function as various processing units; a programmable logic device (PLD), such as an FPGA (Field Programmable Gate Array), whose circuit configuration can be changed after manufacture; and a dedicated electric circuit, which is a processor having a circuit configuration designed exclusively to execute specific processing, such as an ASIC (Application Specific Integrated Circuit).
12 Input/output interface
16 Storage unit
18 Operation unit
20 CPU
22 RAM
24 ROM
26 Display control unit
30 Display device
51 Selection processing unit
53 Creation processing unit
53A Trained model
53B Trained model
53C Trained model
53D Damage detection result
53E Damage detection result
53F Damage detection result
55 Display processing unit
60 Database
101 Information regarding target structure
103 Three-dimensional model
103A Three-dimensional model
105 Inspection record data
107 Captured image
107A Captured image
107B Captured image
107C Captured image group
109 Damage information
131 Floor slab
133 Pier
135 Abutment
140 Information regarding structure
142 Comment
S1, S2, S3 Steps
Claims (12)
- A structure inspection support apparatus comprising a processor, wherein the processor performs: a selection process of accepting selection of information regarding a target structure, the information including at least one of a captured image of the target structure and damage information; a creation process of creating a comment for damage of the target structure based on the selected information regarding the target structure; and a display process of displaying the comment on a display.
- The structure inspection support apparatus according to claim 1, wherein the creation process creates at least one of the comments for the damage of the target structure based on at least one of the captured image and the damage information.
- The structure inspection support apparatus according to claim 1 or 2, wherein the creation process creates the comment for the damage of the target structure using machine learning.
- The structure inspection support apparatus according to any one of claims 1 to 3, further comprising a database that stores, in association with each other, information regarding past target structures, including at least one of captured images and damage information of structures, and comments for the damage of the structures, wherein the creation process creates, based on the information regarding the target structure and the information regarding the structures stored in the database, a comment for the damage of a structure relating to similar damage that is similar to the damage of the target structure, as the comment for the damage of the structure.
- The structure inspection support apparatus according to any one of claims 1 to 4, wherein the selection process accepts selection of the information regarding the target structure by selection of a three-dimensional model of the structure associated with the information regarding the target structure.
- The structure inspection support apparatus according to any one of claims 1 to 4, wherein the selection process automatically accepts, from the target structure, selection of the information regarding the target structure for which the comment is to be created.
- The structure inspection support apparatus according to claim 6, wherein the selection process automatically accepts selection of the information regarding the target structure based on at least one of the captured image and the damage information.
- The structure inspection support apparatus according to any one of claims 1 to 7, wherein the processor accepts editing of the comment and executes an editing process of changing the comment.
- The structure inspection support apparatus according to claim 8, wherein the editing process accepts, as the editing of the comment, a comment candidate selected from a plurality of comment candidates corresponding to the comment.
- The structure inspection support apparatus according to any one of claims 1 to 9, wherein the processor executes a related information extraction process of extracting related information related to the damage of the target structure, and the display process displays the related information on the display.
- A structure inspection support method comprising: a selection step of accepting selection of information regarding a target structure, the information including at least one of a captured image of the target structure and damage information; a creation step of creating a comment for damage of the target structure based on the selected information regarding the target structure; and a display step of displaying the comment on a display, the steps being performed by a processor.
- A structure inspection support program causing a computer to realize: a selection function of accepting selection of information regarding a target structure, the information including at least one of a captured image of the target structure and damage information; a creation function of creating a comment for damage of the target structure based on the selected information regarding the target structure; and a display function of displaying the comment on a display.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2022563621A JPWO2022107485A1 (ja) | 2020-11-19 | 2021-10-08 | |
CN202180074971.9A CN116508047A (zh) | 2020-11-19 | 2021-10-08 | 结构物的点检辅助装置、结构物的点检辅助方法及程序 |
EP21894359.5A EP4250192A1 (en) | 2020-11-19 | 2021-10-08 | Structure inspection assistance device, structure inspection assistance method, and program |
US18/308,763 US20230260098A1 (en) | 2020-11-19 | 2023-04-28 | Structure inspection assistance apparatus, structure inspection assistance method, and program |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2020-192303 | 2020-11-19 | ||
JP2020192303 | 2020-11-19 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US18/308,763 Continuation US20230260098A1 (en) | 2020-11-19 | 2023-04-28 | Structure inspection assistance apparatus, structure inspection assistance method, and program |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022107485A1 true WO2022107485A1 (ja) | 2022-05-27 |
Family
ID=81708834
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2021/037302 WO2022107485A1 (ja) | 2020-11-19 | 2021-10-08 | 構造物の点検支援装置、構造物の点検支援方法及びプログラム |
Country Status (5)
Country | Link |
---|---|
US (1) | US20230260098A1 (ja) |
EP (1) | EP4250192A1 (ja) |
JP (1) | JPWO2022107485A1 (ja) |
CN (1) | CN116508047A (ja) |
WO (1) | WO2022107485A1 (ja) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP3889356A4 (en) * | 2018-11-29 | 2022-02-23 | FUJIFILM Corporation | STRUCTURAL REPAIR METHOD SELECTION SYSTEM, REPAIR METHOD SELECTION, AND REPAIR METHOD SELECTION SERVER |
US20230215165A1 (en) * | 2022-01-05 | 2023-07-06 | Here Global B.V. | Method, apparatus, and computer program product for identifying fluid leaks based on aerial imagery |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2003067519A (ja) * | 2001-08-24 | 2003-03-07 | Kubota Corp | 機械設備のメンテナンス情報管理システム |
JP2018181235A (ja) * | 2017-04-21 | 2018-11-15 | 古河電気工業株式会社 | 報告書作成装置、風力発電設備点検システム、プログラム、及び風力発電設備の点検報告書の作成方法 |
JP2019082933A (ja) | 2017-10-31 | 2019-05-30 | 株式会社日本ソフト | 報告書作成システム |
WO2020110717A1 (ja) * | 2018-11-29 | 2020-06-04 | 富士フイルム株式会社 | 構造物の損傷原因推定システム、損傷原因推定方法、及び損傷原因推定サーバ |
-
2021
- 2021-10-08 CN CN202180074971.9A patent/CN116508047A/zh active Pending
- 2021-10-08 WO PCT/JP2021/037302 patent/WO2022107485A1/ja active Application Filing
- 2021-10-08 EP EP21894359.5A patent/EP4250192A1/en active Pending
- 2021-10-08 JP JP2022563621A patent/JPWO2022107485A1/ja active Pending
-
2023
- 2023-04-28 US US18/308,763 patent/US20230260098A1/en active Pending
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2003067519A (ja) * | 2001-08-24 | 2003-03-07 | Kubota Corp | 機械設備のメンテナンス情報管理システム |
JP2018181235A (ja) * | 2017-04-21 | 2018-11-15 | 古河電気工業株式会社 | 報告書作成装置、風力発電設備点検システム、プログラム、及び風力発電設備の点検報告書の作成方法 |
JP2019082933A (ja) | 2017-10-31 | 2019-05-30 | 株式会社日本ソフト | 報告書作成システム |
WO2020110717A1 (ja) * | 2018-11-29 | 2020-06-04 | 富士フイルム株式会社 | 構造物の損傷原因推定システム、損傷原因推定方法、及び損傷原因推定サーバ |
Also Published As
Publication number | Publication date |
---|---|
EP4250192A1 (en) | 2023-09-27 |
US20230260098A1 (en) | 2023-08-17 |
CN116508047A (zh) | 2023-07-28 |
JPWO2022107485A1 (ja) | 2022-05-27 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
Shim et al. | Development of a bridge maintenance system for prestressed concrete bridges using 3D digital twin model | |
US20230260098A1 (en) | Structure inspection assistance apparatus, structure inspection assistance method, and program | |
Adhikari et al. | Image-based retrieval of concrete crack properties for bridge inspection | |
Chen et al. | Convolutional neural networks (CNNs)-based multi-category damage detection and recognition of high-speed rail (HSR) reinforced concrete (RC) bridges using test images | |
Kassotakis et al. | Employing non-contact sensing techniques for improving efficiency and automation in numerical modelling of existing masonry structures: A critical literature review | |
Sadhu et al. | A review of data management and visualization techniques for structural health monitoring using BIM and virtual or augmented reality | |
Omer et al. | Inspection of concrete bridge structures: Case study comparing conventional techniques with a virtual reality approach | |
Bień | Modelling of structure geometry in Bridge Management Systems | |
Valero et al. | High level-of-detail BIM and machine learning for automated masonry wall defect surveying | |
Dayan et al. | A scoping review of information-modeling development in bridge management systems | |
Kong et al. | Preserving our heritage: A photogrammetry-based digital twin framework for monitoring deteriorations of historic structures | |
Bai et al. | Image-based reinforced concrete component mechanical damage recognition and structural safety rapid assessment using deep learning with frequency information | |
Saback de Freitas Bello et al. | Framework for bridge management systems (bms) using digital twins | |
JP2024012527A (ja) | 情報表示装置、方法及びプログラム | |
US20230237641A1 (en) | Inspection support device for structure, inspection support method for structure, and program | |
Markova et al. | 3D photogrammetry application for building inspection of cultural heritage objects | |
Samuel | A Human-Centered Infrastructure Asset Management Framework Using BIM and Augmented Reality | |
Mansuri et al. | Artificial intelligence for heritage conservation: a case study of automatic visual inspection system | |
WO2021199830A1 (ja) | 点検支援装置、方法及びプログラム | |
WO2021176891A1 (ja) | 3次元表示装置、方法及びプログラム | |
Koch et al. | Machine vision techniques for condition assessment of civil infrastructure | |
WO2022209304A1 (ja) | モニタリング設計支援装置、モニタリング設計支援方法およびプログラム | |
Neeli | Use of photogrammetry aided damage detection for residual strength estimation of corrosion damaged prestressed concrete bridge girders | |
Ma et al. | Drone aided machine-learning tool for post-earthquake bridge damage reconnaissance | |
JP2022062915A (ja) | 情報処理装置、情報処理方法及びプログラム |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 21894359 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2022563621 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 202180074971.9 Country of ref document: CN |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
ENP | Entry into the national phase |
Ref document number: 2021894359 Country of ref document: EP Effective date: 20230619 |