US10116871B2 - Tunnel lining surface inspection system and vehicle used for tunnel lining surface inspection system - Google Patents

Tunnel lining surface inspection system and vehicle used for tunnel lining surface inspection system

Info

Publication number
US10116871B2
US10116871B2 (application US14/903,623; US201414903623A)
Authority
US
United States
Prior art keywords
tunnel lining
lining surface
photography
line sensors
circumferential direction
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related, expires
Application number
US14/903,623
Other languages
English (en)
Other versions
US20160227126A1 (en)
Inventor
Yukio Akashi
Kazuaki Hashimoto
Shogo Hayashi
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
West Nippon Expressway Engineering Shikoku Co Ltd
Original Assignee
West Nippon Expressway Engineering Shikoku Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by West Nippon Expressway Engineering Shikoku Co Ltd filed Critical West Nippon Expressway Engineering Shikoku Co Ltd
Assigned to WEST NIPPON EXPRESSWAY ENGINEERING SHIKOKU COMPANY LIMITED. Assignment of assignors interest (see document for details). Assignors: AKASHI, YUKIO; HASHIMOTO, KAZUAKI; HAYASHI, SHOGO
Publication of US20160227126A1 publication Critical patent/US20160227126A1/en
Application granted granted Critical
Publication of US10116871B2 publication Critical patent/US10116871B2/en
Expired - Fee Related legal-status Critical Current
Adjusted expiration legal-status Critical

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/69Control of means for changing angle of the field of view, e.g. optical zoom objectives or electronic zooming
    • H04N5/23296
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01MTESTING STATIC OR DYNAMIC BALANCE OF MACHINES OR STRUCTURES; TESTING OF STRUCTURES OR APPARATUS, NOT OTHERWISE PROVIDED FOR
    • G01M3/00Investigating fluid-tightness of structures
    • G01M3/38Investigating fluid-tightness of structures by using light
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01MTESTING STATIC OR DYNAMIC BALANCE OF MACHINES OR STRUCTURES; TESTING OF STRUCTURES OR APPARATUS, NOT OTHERWISE PROVIDED FOR
    • G01M5/00Investigating the elasticity of structures, e.g. deflection of bridges or air-craft wings
    • G01M5/0033Investigating the elasticity of structures, e.g. deflection of bridges or air-craft wings by determining damage, crack or wear
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01MTESTING STATIC OR DYNAMIC BALANCE OF MACHINES OR STRUCTURES; TESTING OF STRUCTURES OR APPARATUS, NOT OTHERWISE PROVIDED FOR
    • G01M5/00Investigating the elasticity of structures, e.g. deflection of bridges or air-craft wings
    • G01M5/0091Investigating the elasticity of structures, e.g. deflection of bridges or air-craft wings by using electromagnetic excitation or detection
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84Systems specially adapted for particular applications
    • G01N21/88Investigating the presence of flaws or contamination
    • G01N21/95Investigating the presence of flaws or contamination characterised by the material or shape of the object to be examined
    • G01N21/954Inspecting the inner surface of hollow bodies, e.g. bores
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • G06T7/0004Industrial image inspection
    • G06T7/0008Industrial image inspection checking presence/absence
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/45Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from two or more image sensors being of different type or operating in different modes, e.g. with a CMOS sensor for moving images in combination with a charge-coupled device [CCD] for still images
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/50Constructional details
    • H04N23/51Housings
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/50Constructional details
    • H04N23/54Mounting of pick-up tubes, electronic image sensors, deviation or focusing coils
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/56Cameras or camera modules comprising electronic image sensors; Control thereof provided with illuminating means
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/58Means for changing the camera field of view without moving the camera body, e.g. nutating or panning of optics or image sensors
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80Camera processing pipelines; Components thereof
    • H04N5/2252
    • H04N5/2253
    • H04N5/2256
    • H04N5/2258
    • H04N5/23229
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N2201/00Features of devices classified in G01N21/00
    • G01N2201/06Illumination; Optics
    • G01N2201/062LED's
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N2201/00Features of devices classified in G01N21/00
    • G01N2201/12Circuits of general importance; Signal processing
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20212Image combination
    • G06T2207/20221Image fusion; Image merging
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30108Industrial image inspection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30181Earth observation
    • G06T2207/30184Infrastructure
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/70SSIS architectures; Circuits associated therewith
    • H04N25/701Line sensors
    • H04N5/3692

Definitions

  • The present invention relates to a tunnel lining surface inspection system and a vehicle used for the tunnel lining surface inspection system, and in particular to a system and a vehicle for inspecting the soundness (degree of deterioration) of a tunnel by image-visualizing faulted conditions such as cracking on the tunnel lining surface.
  • The applicant has already proposed a tunnel lining surface inspection system in which, while a vehicle is travelling in the tunnel, the tunnel lining surface is photographed by the photography means mounted in the vehicle and the image is processed into an image used for inspecting the tunnel lining surface, as shown in Patent document 1.
  • the tunnel lining surface image can be obtained while a vehicle travels, and, by using the image, the soundness (degree of deterioration) of a tunnel can be inspected by image-visualizing the faulted conditions such as cracking on the tunnel lining surface.
  • Patent document 1 Japanese Patent Application Laid-open No. 2014-95627
  • The tunnel lining surface is constructed in units of one span.
  • The joints divide the tunnel lining surface span by span, and each span is given a span number (lining number) for identifying that portion of the tunnel lining surface.
  • The present invention has been made by taking such actual situations into account, and aims at making it possible to inspect the soundness (degree of deterioration) of a tunnel for each span by processing the tunnel lining surface images obtained while a vehicle travels into images showing both side faces of the tunnel lining surface for each span.
  • The first invention is characterized by a tunnel lining surface inspection system wherein, while a vehicle is travelling in a tunnel, the tunnel lining surface is photographed by photography means mounted in the vehicle and the image is processed into an image used for inspecting the tunnel lining surface, the system comprising
  • The second invention is characterized in that, in the tunnel lining surface inspection system of the first invention,
  • The third invention is characterized in that, in the tunnel lining surface inspection system of the first or second invention,
  • Driving the vehicle mounted with the photography means, whose photography range covers one of the two side faces of the tunnel lining surface, enables the images of both side faces of the tunnel lining surface to be obtained.
  • the images showing both side faces of the tunnel lining surface can be obtained according to each span.
  • the soundness (degree of deterioration) of the tunnel lining surface can be inspected, and according to each span, the inspection result can be managed.
  • FIG. 1 shows the left side face of the vehicle used for the tunnel lining surface inspection system in the present invention.
  • FIG. 2 is a cross section plan showing the state where the vehicle is traveling on the traveling lane on the left side in the tunnel and shows how the tunnel lining surface is photographed by using the line sensors and the halogen lamps.
  • FIG. 3 corresponds to FIG. 2 and is a cross section plan showing the state where the vehicle is traveling on the overtaking lane on the right side in the tunnel.
  • FIG. 4 shows, in plan view, how the fixing/reversing means is configured.
  • FIG. 5 shows the state where the photography means is fixed to the second photography position.
  • FIG. 6 shows a cross section of the stage member and the L-shaped member, taken along the line A-A indicated by the arrows in FIG. 4 .
  • FIG. 7 corresponds to FIG. 4 , showing a modification example in which the halogen lamps serving as the illumination means arranged on the L-shaped member are replaced with LED units.
  • FIG. 8 shows an internal configuration of LED unit, FIG. 8 ( a ) being a cross section plan showing the inside in the state shown in FIG. 7 , and FIG. 8 ( b ) being a vertical section plan.
  • FIG. 9 shows a procedure of the processing performed in the tunnel lining surface inspection system in the embodiment.
  • FIG. 10 ( a ), ( b ), ( c ), ( d ), ( e ) explains an image processing performed in the personal computer.
  • FIG. 11 ( a ), ( b ) explains an image processing for distortion correction ( FIG. 10 ( b ) ).
  • FIG. 12 ( a ), ( b ), ( c ) explains an image processing for distortion correction ( FIG. 10 ( b )).
  • FIG. 13 explains the processing ( FIG. 10 ( c ) ) for identifying the portions forming the identical span of the tunnel lining surface by comparing each image.
  • FIG. 14 ( a ) ( b ) ( c ) shows an image processing for inclined image correction (tilt correction) ( FIG. 10 ( d ) ).
  • FIG. 15 ( a ) ( b ) explains an image synthesis processing ( FIG. 10 ( e ) ) for obtaining the images showing both side faces of the tunnel lining surface according to each span of the tunnel lining surface.
  • FIG. 16 shows an inspection example of the soundness (degree of deterioration) of the tunnel lining surface.
  • 1 Vehicle, 10 Photography means, 20 Illumination means, 30 Fixing/reversing means, 32 L-shaped member, 34 Positioning means, 100 Tunnel lining surface, S (S 1, S 2, S 3 . . . Sn) Span, 110-1, 110-2, 110-3 . . . 110-n Synthesized images
  • FIG. 1 shows the left side face of the vehicle 1 used for the tunnel lining surface inspection system in the present invention.
  • The vehicle 1 is a work vehicle based on a work truck used for road maintenance work, for example.
  • the loading space of the vehicle 1 has a container shape, and the door of one side face of the container (the left side face in FIG. 1 ) and the door at the ceiling of the container are openable.
  • FIG. 1 shows the state where the doors are opened.
  • The photography means 10 and the illumination means 20 are arranged in the loading space of the vehicle 1 so that, when the above doors of the vehicle 1 are opened, the tunnel lining surface can be photographed and illuminated.
  • a line sensor (camera) is assumed as the photography means 10 .
  • three line sensors 10 a , 10 b , 10 c are assumed.
  • When the three line sensors are comprehensively described, they are referred to as the line sensors 10 hereinafter.
  • a halogen lamp is assumed as the illumination means 20 .
  • a metal halide lamp (HID) and LED lighting may be used as a light source, as well.
  • twelve halogen lamps 20 a , 20 b , 20 c , 20 d , 20 e , 20 f , 20 g , 20 h , 20 i , 20 j , 20 k , 20 l are assumed.
  • When the twelve halogen lamps 20 a - 20 l are comprehensively described, they are referred to as the halogen lamps 20 hereinafter.
  • The line sensors 10 are fixed to the fixing/reversing means 30 so that the collimation line 11 a is directed perpendicular to the travel direction of the vehicle 1 .
  • The line sensors 10 photograph the photography range 11 with a predetermined picture angle spreading in the depth direction and toward the viewer of FIG. 1 .
  • the halogen lamps 20 are fixed to the fixing/reversing means 30 so as to project light in the direction vertical to the travel direction of the vehicle 1 .
  • the line sensors 10 and the halogen lamps 20 are positioned so that the emission range 21 of the halogen lamps 20 includes the collimation line 11 a of the line sensors 10 .
  • The image processing unit 40 receives the imaging data captured by the line sensors 10 and performs image processing to generate the tunnel lining surface images.
  • FIG. 2 is a cross section plan showing the state where the vehicle 1 is traveling on the traveling lane 150 L on the left side in the tunnel and shows how the tunnel lining surface 100 is photographed by using the line sensors 10 and the halogen lamps 20 .
  • the road surface on the left side of the center line TC in the tunnel in the drawing is defined as the traveling lane on the left 150 L
  • The road surface on the right side of the center line TC in the tunnel in the drawing is defined as the overtaking lane on the right 150 R.
  • the left side of the tunnel lining surface 100 delimited by the center line TC of the tunnel is defined as the left side face 100 L
  • the right side of the tunnel lining surface 100 delimited by the center line TC of the tunnel is defined as the right side face 100 R.
  • The photography means 10 has a photography range covering at least one side face (the left side face 100 L in FIG. 2 ) of the two side faces 100 L, 100 R of the tunnel lining surface 100 , and is configured to comprise a plurality of (three in the embodiment) line sensors 10 a , 10 b , 10 c which are arranged along the circumferential direction of the tunnel lining surface 100 and photograph the areas 100 A, 100 B, 100 C along the circumferential direction of the tunnel lining surface 100 .
  • The line sensors 10 a , 10 b , 10 c each have a picture angle of 61°. Therefore, among the areas 100 A, 100 B, 100 C, neighboring areas partially overlap.
  • The illumination means 20 has an emission range covering at least one side face (the left side face 100 L in FIG. 2 ) of the two side faces 100 L, 100 R of the tunnel lining surface 100 and is configured to comprise a plurality of (twelve in the embodiment) halogen lamps 20 a - 20 l which are arranged along the circumferential direction of the tunnel lining surface 100 and illuminate the areas 100 A, 100 B, 100 C along the circumferential direction of the tunnel lining surface 100 (see FIG. 4 ).
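Since each of the three line sensors covers a 61° picture angle and they point in neighboring directions along the circumference, the photographed areas necessarily overlap. The minimal Python sketch below illustrates this geometry; the pointing directions (20°, 70°, 120°) are assumed purely for illustration and are not values taken from the patent.

```python
import math

PICTURE_ANGLE = 61.0  # degrees per line sensor, from the description above

# Assumed pointing directions of the three line sensors in the tunnel
# cross-section, measured from the horizontal; illustrative values only.
POINTING_DEG = [20.0, 70.0, 120.0]

def covered_interval(center_deg, angle_deg=PICTURE_ANGLE):
    """Approximate angular interval swept by one sensor (angle as seen from
    the sensor, used here only as a rough proxy for coverage on the lining)."""
    half = angle_deg / 2.0
    return (center_deg - half, center_deg + half)

intervals = [covered_interval(c) for c in POINTING_DEG]
for (lo1, hi1), (lo2, hi2) in zip(intervals, intervals[1:]):
    print(f"neighboring areas overlap by about {hi1 - lo2:.1f} degrees")
```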
  • The fixing/reversing means 30 fixes the photography means 10 to a first photography position where the left side face 100 L, namely one side face of the tunnel lining surface 100 , can be photographed, and reverses the photography means 10 to fix it to a second photography position where the right side face 100 R, namely the other side face of the tunnel lining surface 100 , can be photographed. The fixing/reversing means 30 is configured to comprise the L-shaped member 32 , which is rotatable by 90° around the drive axis 31 in the circumferential direction of the tunnel lining surface 100 and in which a plurality of (three in FIG. 2 ) line sensors 10 a , 10 b , 10 c are arranged along the circumferential direction of the tunnel lining surface 100 , and the positioning means (not shown in FIG. 2 and described below with FIG. 4 ) for positioning this L-shaped member 32 in the first photography position and, when the L-shaped member 32 is rotated by 90° in the circumferential direction of the tunnel lining surface 100 , positioning it in the second photography position.
  • When the fixing/reversing means 30 is positioned in the first photography position, the illumination means 20 is similarly positioned in the first photography position and illuminates the photography areas 100 A, 100 B, 100 C corresponding to the first photography position, and, when the fixing/reversing means 30 is positioned in the second photography position, the illumination means 20 is similarly positioned in the second photography position and illuminates the photography areas 100 D, 100 E, 100 F (see FIG. 3 ) corresponding to the second photography position.
  • FIG. 3 corresponds to FIG. 2 and is a cross section plan showing the state where the vehicle 1 is traveling on the overtaking lane on the right side 150 R in the tunnel.
  • FIG. 3 shows the state in which the photography means 10 and the illumination means 20 are positioned in and fixed to the second photography position by being 90° reversed clockwise from the first photography position ( FIG. 2 ) by the fixing/reversing means 30 .
  • The photography means 10 photographs the right side face 100 R of the tunnel lining surface 100 , namely, the line sensors 10 a , 10 b , 10 c respectively photograph the areas 100 D, 100 E, 100 F along the circumferential direction of the tunnel lining surface 100 .
  • the illumination means 20 illuminates the right side face 100 R of the tunnel lining surface 100 , namely, the twelve halogen lamps 20 a - 20 l emit light to each area 100 D, 100 E, 100 F along the circumferential direction of the tunnel lining surface 100 .
  • FIG. 4 shows, in plan view, how the fixing/reversing means 30 is configured.
  • The L-shaped member 32 has the arms 32 L, 32 R, which have the same length and are symmetrical with respect to the symmetry axis m 1 passing through the drive axis 31 as the rotation center, and is integrally formed with the arms 32 L, 32 R crossing each other at right angles; the twelve halogen lamps 20 a - 20 l are symmetrically arranged on the upper face 32 S of the L-shaped member 32 .
  • the triangle-shaped stage member 33 is arranged at the inside corner of the L-shaped member 32 so as to be symmetrical to the symmetry axis m 1 .
  • the three line sensors 10 a , 10 b , 10 c are arranged so as to be symmetrical to the symmetry axis m 1 on the upper face 33 S of the stage member 33 .
  • the positioning means 34 is configured to comprise the T-shaped member 35 .
  • the T-shaped member 35 is fixed to the frame installed in the loading space of the vehicle 1 .
  • the L-shaped member 32 is pivoted around the drive axis 31 relatively to the T-shaped member 35 fixed to the frame.
  • the drive axis 31 is driven by drive means such as a motor not shown.
  • The T-shaped member 35 is symmetrical with respect to the symmetry axis m 2 passing through the drive axis 31 , and is integrally formed with the arms 35 L, 35 R, 35 M, which have the same length from the drive axis 31 to their edges and are arranged at right angles to one another; in the arms 35 L, 35 R, 35 M, the holes 35 a , 35 b , 35 c having the same distance from the drive axis 31 are formed respectively.
  • In the L-shaped member 32 , the holes 32 a , 32 b having the same distance from the drive axis 31 are formed.
  • FIG. 4 shows the state where the photography means 10 is fixed to the first photography position.
  • A pin (not shown) is inserted into the hole 35 c and the hole 32 a , and a pin (not shown) is inserted into the hole 35 b and the hole 32 b , whereby the photography means 10 and the illumination means 20 are fixed to the first photography position.
  • FIG. 5 shows the state where the photography means 10 is fixed to the second photography position.
  • FIG. 6 shows the cross section of the stage member 33 and the L-shaped member 32 , taken along the line A-A indicated by the arrows in FIG. 4 .
  • the upper face 33 S of the stage member 33 is arranged in a position higher than the upper face 32 S of the L-shaped member 32 . Therefore, the line sensors 10 are installed higher than the halogen lamps 20 .
  • The line sensors 10 and the halogen lamps 20 are positioned so that the emission range 21 of the halogen lamps 20 includes the collimation line 11 a of the line sensors 10 and so that the collimation line 11 a does not interfere with the halogen lamps 20 , etc. It is noted that the picture angle 11 of the line sensors 10 spreads in the depth direction and toward the viewer in FIG. 6 .
  • FIG. 7 corresponds to FIG. 4 , showing a modification example in which the halogen lamps 20 serving as the illumination means 20 arranged on the L-shaped member 32 are replaced with LED units 20 .
  • In this modification, the halogen lamps 20 are replaced with the LED units 20 .
  • The four LED units 20 A, 20 B, 20 C, 20 D are arranged so as to be symmetrical.
  • The four LED units 20 A, 20 B, 20 C, 20 D have the same emission range 21 as the halogen lamps 20 a - 20 l.
  • FIG. 8 shows the internal configuration of the LED unit 20 , FIG. 8 ( a ) being a cross section seen from above in the state shown in FIG. 7 , and FIG. 8 ( b ) being a vertical cross section.
  • The LED unit 20 is configured to comprise a line-shaped LED substrate 22 on which a plurality of LEDs 22 a are arranged along the circumferential direction of the tunnel lining surface 100 (the direction perpendicular to the travel direction of the vehicle 1 ), a cylinder-shaped rod lens 23 which refracts the light emitted by the line-shaped LED substrate 22 and whose longitudinal length corresponds to the LED arrangement length of the line-shaped LED substrate 22 , a cover glass 24 which transmits the light 25 refracted by the rod lens 23 and emits it to the outside, and a fan 26 for cooling the line-shaped LED substrate 22 .
  • The photography means 10 is configured with the three line sensors 10 a , 10 b , 10 c , but a configuration with more photography means, or with a single photography means, is also acceptable.
  • For example, an embodiment is also possible in which a single line sensor, having a photography range in which one side face of the tunnel lining surface 100 can be photographed and installed on the fixing/reversing means 30 , photographs the left side face 100 L at the first photography position and is then reversed by 90° to photograph the right side face 100 R at the second photography position, whereby both side faces 100 L, 100 R of the tunnel lining surface 100 are photographed by the single line sensor.
  • FIG. 9 shows the procedure of the processing performed in the tunnel lining surface inspection system in the embodiment.
  • The vehicle 1 drives along the traveling lane 150 L on the left side. While the vehicle 1 is traveling, the three line sensors 10 a , 10 b , 10 c and the halogen lamps 20 a - 20 l are activated. Thereby, each area 100 A, 100 B, 100 C of the left side face 100 L of the tunnel lining surface 100 is sequentially photographed by the three line sensors 10 a , 10 b , 10 c .
  • The image data of each area 100 A, 100 B, 100 C of the left side face 100 L of the tunnel lining surface 100 photographed by each line sensor 10 a , 10 b , 10 c are captured into the image processing unit 40 (see FIG. 2 ; Step 201 ).
  • The fixing/reversing means 30 then reverses the line sensors 10 a , 10 b , 10 c and the halogen lamps 20 a - 20 l by 90° (Step 202 ).
  • the vehicle 1 drives along the overtaking lane 150 R on the right side.
  • Each area 100 D, 100 E, 100 F of the right side face 100 R of the tunnel lining surface 100 is sequentially photographed by the three line sensors 10 a , 10 b , 10 c .
  • The image data of each area 100 D, 100 E, 100 F of the right side face 100 R of the tunnel lining surface 100 photographed by each line sensor 10 a , 10 b , 10 c are captured into the image processing unit 40 (see FIG. 3 ; Step 203 ).
  • The image data of each area 100 A, 100 B, 100 C of the left side face 100 L of the tunnel lining surface 100 and the image data of each area 100 D, 100 E, 100 F of the right side face 100 R of the tunnel lining surface 100 , captured into the image processing unit 40 , are then transferred to an external personal computer, for example, for image processing (Step 204 ).
  • In Step 205 , the image processing for distortion correction described below is performed.
  • In Step 206 , the portions forming the identical span S of the tunnel lining surface 100 are identified by comparing the images of each area 100 A, 100 B, 100 C, 100 D, 100 E, 100 F photographed by the three line sensors 10 a , 10 b , 10 c , as described below.
  • In Step 207 , the image processing for inclined image correction (tilt correction) described below is performed.
  • In Step 208 , the image synthesis processing is performed to obtain the images 110 - 1 , 110 - 2 , . . . 110 - n showing both side faces 100 L, 100 R of the tunnel lining surface 100 for each span S 1 , S 2 . . . S n of the tunnel lining surface 100 .
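The following Python sketch is one way to picture how Steps 204 to 208 chain together. All function bodies are trivial placeholders (identity transforms and a single span) standing in for the processing detailed in the remainder of the description; none of the names come from the patent.

```python
import numpy as np

# Placeholder stage implementations; later sketches flesh these ideas out.
def correct_distortion(img): return img            # Step 205 (distortion correction)
def correct_tilt(img): return img                  # Step 207 (tilt correction)
def group_by_span(images): return {"S1": images}   # Step 206 (span identification)
def synthesize_span(images): return np.hstack(images)  # Step 208 (synthesis)

def run_inspection(left_images, right_images):
    """Illustrative flow for Steps 204-208 on already-captured image arrays."""
    corrected = [correct_distortion(im) for im in left_images + right_images]
    spans = group_by_span(corrected)
    return {sid: synthesize_span([correct_tilt(im) for im in imgs])
            for sid, imgs in spans.items()}

# Minimal usage with synthetic line-scan strips (each 100 rows x 50 columns).
demo = run_inspection([np.zeros((100, 50))] * 3, [np.ones((100, 50))] * 3)
print({k: v.shape for k, v in demo.items()})  # {'S1': (100, 300)}
```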
  • FIG. 10 explains the image processing performed in the personal computer.
  • Each image 110 A, 110 B, 110 C, 110 D, 110 E, 110 F showing each area 100 A, 100 B, 100 C, 100 D, 100 E, 100 F of the left side face 100 L and the right side face 100 R of the tunnel lining surface 100 is captured ( FIG. 10 ( a ) ), the image processing for distortion correction is performed on each image 110 A, 110 B, 110 C, 110 D, 110 E, 110 F ( FIG. 10 ( b )), each image 110 A, 110 B, 110 C, 110 D, 110 E, 110 F is compared to identify the portions forming the identical span S of the tunnel lining surface 100 ( FIG. 10 ( c )), the image processing for inclined image correction (tilt correction) is performed ( FIG. 10 ( d )), and the image synthesis processing is performed to obtain the images showing both side faces of the tunnel lining surface 100 for each span ( FIG. 10 ( e )).
  • FIGS. 11, 12 explain the image processing for distortion correction ( FIG. 10 (b)).
  • When the photography means 10 is arranged so that the collimation line of the photography means 10 crosses the tunnel lining surface 100 at right angles and dew such as water drops is formed on the surface of the tunnel lining surface 100 , it is likely that the positional relation of the illumination means 20 and the photography means 10 causes the dew to specularly reflect the light and partially appear in the image photographed by the photography means 10 , making it difficult to appropriately identify defects such as cracking. Therefore, for the purpose of avoiding this, it is effective to incline the axis of the emission direction of the illumination means 20 and the axis of the photography direction (collimation) of the photography means 10 .
  • The photography means (line sensor) 10 is arranged so that the collimation line 11 a of the photography means (line sensor) 10 is inclined forward by a predetermined angle (8°, for example) to the vertical direction, namely the direction perpendicular to the ceiling face of the tunnel lining surface 100 , seen from the side face of the vehicle 1 , as shown in FIG. 11 ( a ) , and the collimation line 11 a of the photography means (line sensor) 10 is inclined forward by a predetermined angle (8°, for example) to the horizontal direction, namely the direction perpendicular to the side face of the tunnel lining surface 100 , seen from the upper face of the vehicle 1 , as shown in FIG. 11 ( b ) .
  • the emission direction of the illumination means (halogen lamp) 20 is similarly inclined.
  • FIG. 12 ( a ) shows how the collimation line 11 a of the photography means (line sensor) 10 and the joint 111 of the tunnel lining surface 100 are related to each other.
  • The joint perpendicular to the travel direction of the vehicle 1 (the direction of the center line TC in the tunnel) divides the spans S 1 , S 2 . . . of the tunnel lining surface 100 .
  • Image processing is performed to correct the joint 111 , which is distorted in the photographed image, back into its original linear shape ( FIG. 12 ( c )).
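A minimal sketch of the idea behind this correction is given below in Python/NumPy, assuming the row position of the joint 111 has already been detected in every image column. It shifts whole columns rather than properly resampling, so it only illustrates the principle of restoring the joint to a straight line.

```python
import numpy as np

def straighten_joint(image: np.ndarray, joint_rows: np.ndarray) -> np.ndarray:
    """Shift each column so that the detected joint lies on one straight row.

    image:      H x W grayscale line-scan image of one area.
    joint_rows: for every column, the row index where the joint was detected.
    """
    target_row = int(np.round(joint_rows.mean()))   # straight line to map the joint onto
    corrected = np.empty_like(image)
    for col in range(image.shape[1]):
        shift = target_row - int(joint_rows[col])   # how far this column must move
        corrected[:, col] = np.roll(image[:, col], shift)
    return corrected

# Synthetic example: a 100 x 200 image whose joint sags by up to 10 rows.
h, w = 100, 200
img = np.zeros((h, w))
joint = (50 + 10 * np.sin(np.linspace(0, np.pi, w))).astype(int)
img[joint, np.arange(w)] = 1.0                      # draw the curved joint
flat = straighten_joint(img, joint)
print(np.unique(np.argmax(flat, axis=0)))           # joint now lies on a single row
```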
  • FIG. 13 explains the processing ( FIG. 10 ( c )) for identifying the portions forming the identical span S of the tunnel lining surface 100 by comparing each image 110 A- 110 F.
  • This processing is performed on the basis of the joint 111 returned to be linear shaped by the image processing for distortion correction.
  • Each image 110 A, 110 B, 110 C obtained with respect to the left side face 100 L of the tunnel lining surface 100 is placed on the display of the personal computer so as to occupy the same position.
  • the position is matched on the basis of the joint 111 returned to be linear shaped.
  • Each position can also be identified by the KP (kilo post), span number (lining number), illumination number, etc., which are captured in each image, and thus the positions may be recognized based on them.
  • the portions forming the identical span S of the tunnel lining surface 100 are indicated.
  • the image data 110 A- 1 , 110 B- 1 , 110 C- 1 which identify the portions 110 A- 1 , 110 B- 1 , 110 C- 1 which form the identical span S 1 are stored in the folder L 1 of Lane 1 (traveling lane 150 L) while being associated with the address showing the span S 1 .
  • the portions forming the identical span S 2 , S 3 . . . S n are indicated, and the image data which identify the portions which form the identical span S 2 , S 3 . . . S n are stored in the folder L 1 of Lane 1 (traveling lane 150 L).
  • the similar processing is performed for the overtaking lane 150 R.
  • the image data 110 D- 1 , 110 E- 1 , 110 F- 1 which identify the portions 110 D- 1 , 110 E- 1 , 110 F- 1 which form the identical span S 1 are stored in the folder L 2 of Lane 2 (overtaking lane 150 R) while being associated with the address showing the span S 1 .
  • Thereafter, the portions forming the identical spans S 2 , S 3 . . . S n are identified, and the image data which identify the portions forming the identical spans S 2 , S 3 . . . S n are stored in the folder L 2 of Lane 2 (overtaking lane 150 R).
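One possible way to organize this per-lane, per-span storage is sketched below. The folder layout, file naming, and the helper function are hypothetical; they only mirror the description of folders L1 and L2 keyed by span address.

```python
from pathlib import Path
import shutil

def store_span_images(root: Path, lane: str, span_id: str, image_files):
    """Copy the images that form one span into the lane folder (e.g. L1 or L2),
    associating them with the span address through the file name."""
    lane_dir = root / lane
    lane_dir.mkdir(parents=True, exist_ok=True)
    for i, src in enumerate(image_files, start=1):
        dst = lane_dir / f"{span_id}_part{i}{src.suffix}"
        shutil.copy2(src, dst)                      # keep original files untouched
    return sorted(lane_dir.glob(f"{span_id}_*"))

# Usage (paths are placeholders): the three images 110A-1, 110B-1, 110C-1 that
# form span S1 on the traveling lane would go into folder L1 under address "S1".
# store_span_images(Path("survey"), "L1", "S1",
#                   [Path("110A-1.png"), Path("110B-1.png"), Path("110C-1.png")])
```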
  • FIG. 14 explains the image processing for inclined image correction (tilt correction) ( FIG. 10 (d)).
  • FIG. 14 (a) shows the positional relation of the tunnel lining surface 100 and the line sensors 10 a , 10 b , 10 c.
  • The line sensors 10 a , 10 b , 10 c are positioned offset from the center line TC of the tunnel. Therefore, the collimation lines 11 a of the line sensors 10 a , 10 b , 10 c do not face the tunnel lining surface 100 , as the photographing object, squarely but are inclined to it. Accordingly, the line sensors 10 a , 10 b , 10 c photograph the portions of the tunnel lining surface 100 situated near as larger images, and photograph the portions situated far as smaller images.
  • The area photographed by the line sensor 10 is shown as 100 C, for example, and, within this photography area 100 C, specifically among the divided areas 100 C- 1 , 100 C- 2 , 100 C- 3 , 100 C- 4 , 100 C- 5 , the tunnel lining surface 100 is captured at the largest scale in the divided area 100 C- 1 nearest to the line sensor 10 and at the smallest scale in the divided area 100 C- 5 furthest from the line sensor 10 .
  • Image processing is therefore performed so that the tunnel lining surface 100 , as the photographing object, is rendered at its actual size in each area of the images.
  • The distances d 1 , d 2 , d 3 , d 4 , d 5 from the line sensor 10 to the divided areas are obtained on the basis of the angle of the photography direction of the line sensor 10 (collimation line 11 a ) to the horizontal line HL, the position X of the line sensor 10 (lens, sensor) in the two dimensional horizontal direction (the relative position on the basis of the center line TC of the tunnel), the position Y of the line sensor 10 (lens, sensor) in the two dimensional vertical direction (the relative position on the basis of the road surface 150 L, 150 R of the tunnel), and the picture angle of the line sensor 10 .
  • The image processing is performed so that the tunnel lining surface 100 , as the photographing object, is rendered at its actual size in each divided area 100 C- 1 , 100 C- 2 , 100 C- 3 , 100 C- 4 , 100 C- 5 , according to the obtained distances d 1 , d 2 , d 3 , d 4 , d 5 .
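As a simplified illustration of how such distances can be derived from the sensor position, aiming angle, and picture angle, the Python sketch below models the lining as a circular arc centered on the tunnel center line at road level; every numeric value in the example is an assumption for illustration, not a parameter from the patent.

```python
import math

def ray_wall_distances(x, y, aim_deg, picture_angle_deg, radius, n=5):
    """Distances from a line sensor at (x, y) to a circular lining of the given
    radius (center on the tunnel center line TC at road level), for n rays
    spread across the picture angle around the aiming direction.

    A simplified stand-in for obtaining d1..d5 of FIG. 14; the real lining
    profile and sensor calibration would come from the survey set-up.
    """
    half = picture_angle_deg / 2.0
    distances = []
    for k in range(n):
        ang = math.radians(aim_deg - half + k * picture_angle_deg / (n - 1))
        dx, dy = math.cos(ang), math.sin(ang)
        # Solve |(x, y) + t*(dx, dy)|^2 = radius^2 for the positive root t.
        b = 2 * (x * dx + y * dy)
        c = x * x + y * y - radius * radius
        t = (-b + math.sqrt(b * b - 4 * c)) / 2.0
        distances.append(t)
    return distances

# Example with assumed numbers: sensor 2 m from TC, 2.5 m above the road,
# aimed 60 degrees above horizontal, 61 degree picture angle, 5 m lining radius.
d = ray_wall_distances(x=2.0, y=2.5, aim_deg=60.0, picture_angle_deg=61.0, radius=5.0)
print([round(v, 2) for v in d])
# Nearer divided areas give smaller d: they are imaged larger, so they must be
# scaled down relative to the farther areas to appear at actual size.
```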
  • FIG. 15 explains the image synthesis processing ( FIG. 10 ( e ) ) for obtaining the images 110 - 1 , 110 - 2 , . . . 110 - n showing both side faces 100 L, 100 R of the tunnel lining surface 100 according to the each span S 1 , S 2 . . . S n of the tunnel lining surface 100 .
  • the image data 110 A- 1 , 110 B- 1 , 110 C- 1 associated with the address showing the span S 1 are read from the folder L 1 of Lane 1 (traveling lane 150 L).
  • the image data 110 D- 1 , 110 E- 1 , 110 F- 1 associated with the address showing the span S 1 are read from the folder L 2 of Lane 2 (overtaking lane 150 R).
  • In each area 100 A, 100 B, 100 C, 100 D, 100 E, 100 F, the neighboring areas partially overlap.
  • the corresponding points PT are indicated in consideration of the overlapping.
  • the images 110 A- 1 , 110 B- 1 , 110 C- 1 , 110 D- 1 , 110 E- 1 , 110 F- 1 are respectively synthesized on the basis of the corresponding points PT to obtain the image 110 - 1 of the both/left and right sides 100 L and 100 R with respect to the span S 1 of the tunnel lining surface 100 .
  • the similar processing is performed for the span S 2 , S 3 . . . S n, as well, and, as shown in FIG. 15 (b), the images 110 - 2 , 110 - 3 . . . 110 - n showing both side faces 100 L, 100 R of the tunnel lining surface 100 are obtained according to the each span S 2 , S 3 . . . S n of the tunnel lining surface 100 .
  • Each of the images 110 - 2 , 110 - 3 . . . 110 - n is stored in the predetermined file L 3 and used for inspecting the soundness of the tunnel lining surface 100 .
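A reduced Python/NumPy sketch of this synthesis step is given below. It cuts and concatenates neighboring images at a single corresponding column per pair, whereas the actual processing would register and blend the overlapping regions; the example data are synthetic.

```python
import numpy as np

def synthesize_span(images, matches):
    """Stitch the per-area images of one span into a single strip.

    images:  list of H x W arrays ordered along the circumferential direction
             (e.g. 110A-1, 110B-1, 110C-1, 110D-1, 110E-1, 110F-1).
    matches: for each neighboring pair, the column in the left image and the
             column in the right image that show the same corresponding point PT.
    """
    pieces = [images[0][:, :matches[0][0]]]          # left image up to the match
    for i, (col_left, col_right) in enumerate(matches):
        nxt = images[i + 1]
        end = matches[i + 1][0] if i + 1 < len(matches) else nxt.shape[1]
        pieces.append(nxt[:, col_right:end])         # continue from the match column
    return np.hstack(pieces)

# Synthetic usage: three 100 x 80 areas whose neighbors overlap by 20 columns
# (column 70 of each left image matches column 10 of its right neighbor).
areas = [np.full((100, 80), v, dtype=float) for v in (0.2, 0.5, 0.8)]
span_image = synthesize_span(areas, matches=[(70, 10), (70, 10)])
print(span_image.shape)  # (100, 200): 3 x 80 columns minus two 20-column overlaps
```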
  • As described above, driving the vehicle 1 mounted with the photography means 10 , whose photography range covers one of the two side faces 100 L, 100 R of the tunnel lining surface 100 , enables the images of both side faces 100 L, 100 R of the tunnel lining surface 100 to be obtained.
  • The images 110 - 1 , 110 - 2 . . . 110 - n showing both side faces 100 L, 100 R of the tunnel lining surface 100 can be obtained for each span S 1 , S 2 . . . Sn.
  • The soundness (degree of deterioration) of the tunnel lining surface 100 can thus be inspected, and the inspection result can be managed for each span S 1 , S 2 . . . Sn.
  • FIG. 16 shows the inspection example of the soundness (degree of deterioration) of the tunnel lining surface 100 .
  • FIG. 16 shows the state in which the images 110 - 1 , 110 - 2 , 110 - 3 , 110 - 4 , 110 - 5 , 110 - 6 of the spans S 1 , S 2 , S 3 , S 4 , S 5 , S 6 are read from the file L 3 and sequentially displayed on the display of the personal computer.
  • On the display, the operator identifies, in the images of the tunnel lining surface 100 , the cracking spots that may lead to peeling and traces the cracking 300 .
  • The width, length, direction, shape and density of the cracking 300 are extracted for each span S 1 , S 2 , S 3 , S 4 , S 5 , S 6 and stored while being associated with the image data 110 - 1 , 110 - 2 , 110 - 3 , 110 - 4 , 110 - 5 , 110 - 6 of each span S 1 , S 2 , S 3 , S 4 , S 5 , S 6 .
  • The occurrence of abnormalities such as the presence of efflorescence may also be extracted.
  • Thereby, measures against abnormalities such as cracking can be planned for each span S 1 , S 2 , S 3 , S 4 , S 5 , S 6 .
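As a rough illustration of how such per-span crack attributes could be reduced from an operator's trace, the sketch below computes length, overall direction, and a simple density figure; the crack width measurement and the full data model are omitted, and all numbers are illustrative.

```python
import math

def crack_metrics(polyline, span_area_m2):
    """Basic geometric metrics for one traced cracking.

    polyline:     list of (x, y) points in meters along the traced crack,
                  as drawn by the operator on the span image.
    span_area_m2: lining area of the span, used for a simple density figure.
    Width would in practice be measured from the image around the trace.
    """
    segs = list(zip(polyline, polyline[1:]))
    length = sum(math.hypot(x2 - x1, y2 - y1) for (x1, y1), (x2, y2) in segs)
    (x0, y0), (xn, yn) = polyline[0], polyline[-1]
    direction_deg = math.degrees(math.atan2(yn - y0, xn - x0))  # overall orientation
    density = length / span_area_m2                              # m of crack per m^2
    return {"length_m": round(length, 2),
            "direction_deg": round(direction_deg, 1),
            "density_m_per_m2": round(density, 4)}

# Example: a roughly diagonal crack traced on a span of about 90 m^2 of lining.
print(crack_metrics([(0.0, 0.0), (1.5, 0.4), (3.2, 1.1)], span_area_m2=90.0))
```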

Landscapes

  • Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Analytical Chemistry (AREA)
  • Biochemistry (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Immunology (AREA)
  • Pathology (AREA)
  • Chemical & Material Sciences (AREA)
  • Human Computer Interaction (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Quality & Reliability (AREA)
  • Electromagnetism (AREA)
  • Investigating Materials By The Use Of Optical Means Adapted For Particular Applications (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Studio Devices (AREA)
US14/903,623 2014-07-25 2014-12-03 Tunnel lining surface inspection system and vehicle used for tunnel lining surface inspection system Expired - Fee Related US10116871B2 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2014-152322 2014-07-25
JP2014152322A JP6444086B2 (ja) 2014-07-25 2014-07-25 Tunnel lining surface inspection system and vehicle used for tunnel lining surface inspection system
PCT/JP2014/082021 WO2016013132A1 (ja) 2014-07-25 2014-12-03 Tunnel lining surface inspection system and vehicle used for tunnel lining surface inspection system

Publications (2)

Publication Number Publication Date
US20160227126A1 US20160227126A1 (en) 2016-08-04
US10116871B2 true US10116871B2 (en) 2018-10-30

Family

ID=55162683

Family Applications (1)

Application Number Title Priority Date Filing Date
US14/903,623 Expired - Fee Related US10116871B2 (en) 2014-07-25 2014-12-03 Tunnel lining surface inspection system and vehicle used for tunnel lining surface inspection system

Country Status (6)

Country Link
US (1) US10116871B2 (ja)
JP (1) JP6444086B2 (ja)
KR (1) KR20170034750A (ja)
CN (1) CN105765374B (ja)
SG (1) SG11201600833YA (ja)
WO (1) WO2016013132A1 (ja)

Families Citing this family (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6373111B2 (ja) * 2014-07-25 2018-08-15 West Nippon Expressway Engineering Shikoku Co., Ltd. Tunnel lining surface inspection system and vehicle used for tunnel lining surface inspection system
EP3396309B1 (en) * 2015-12-22 2019-09-25 Mitsubishi Electric Corporation Wobble detection device
WO2018083858A1 (ja) * 2016-11-01 2018-05-11 Mitsubishi Electric Corporation Mobile imaging system and imaging method
JP6927694B2 (ja) * 2016-12-15 2021-09-01 West Nippon Expressway Engineering Shikoku Co., Ltd. Tunnel lining image creation system and tunnel lining image creation method
JP2019033478A (ja) * 2017-08-09 2019-02-28 Ricoh Company, Ltd. Imaging device for structure wall surface, vehicle, imaging method for structure wall surface, and imaging method for tunnel wall surface
EP3442213B1 (en) * 2017-08-09 2020-12-09 Ricoh Company, Ltd. Structure wall imaging device, vehicle, and structure wall imaging method
JP2019109136A (ja) * 2017-12-19 2019-07-04 Panasonic Intellectual Property Management Co., Ltd. Lighting device, lighting device installation method, and road management system
CN109696441A (zh) * 2019-02-18 2019-04-30 Scientific and Technological Research Institute of China Railway Shenyang Group Co., Ltd. Railway tunnel lining surface state inspection vehicle
JP7279438B2 (ja) 2019-03-19 2023-05-23 Ricoh Company, Ltd. Imaging device, vehicle, and imaging method
JP6732082B1 (ja) * 2019-09-03 2020-07-29 Mitsubishi Electric Corporation Image generation device, image generation method, and image generation program
JP7329435B2 (ja) * 2019-12-25 2023-08-18 Mitsubishi Electric Corporation Vehicle and photographing method
JP7458876B2 (ja) 2019-12-25 2024-04-01 Mitsubishi Electric Corporation Camera unit
CN111539286B (zh) * 2020-04-15 2022-11-22 AInnovation (Hefei) Technology Co., Ltd. Lining line identification method and apparatus, and readable storage medium
CN112326552B (zh) * 2020-10-21 2021-09-07 Shandong University Tunnel falling-block defect detection method and system based on visual and force perception
CN113256714B (zh) * 2021-07-13 2021-09-24 Hunan University Tunnel surface image processing method and system
JP2023106862A (ja) * 2022-01-21 2023-08-02 Ricoh Company, Ltd. Imaging device, imaging method, program, and imaging system

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4888720A (en) * 1987-12-07 1989-12-19 Fryer Glenn E Tunnel measuring apparatus and method
WO2005033475A1 (ja) * 2003-10-01 2005-04-14 Hitachi, Ltd. Tunnel collapse monitoring system, tunnel collapse monitoring method, and civil structure damage monitoring system
CN101957178B (zh) * 2009-07-17 2012-05-23 Shanghai Tongyan Civil Engineering Technology Co., Ltd. Tunnel lining crack measuring method and measuring device thereof
CN102346013A (zh) * 2010-07-29 2012-02-08 Tongji University Method and device for measuring width of tunnel lining crack
CN103674963A (zh) * 2013-11-15 2014-03-26 Shanghai Jiajue Industrial Co., Ltd. Tunnel detection device based on digital panoramic photography and detection method thereof

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH09284749A (ja) 1996-04-12 1997-10-31 Furukawa Electric Co Ltd:The Method for photographing inner wall surface of tunnel and photographing apparatus using the same
JP2003185589A (ja) 2001-12-20 2003-07-03 Nishimatsu Constr Co Ltd System and method for inspecting deformation of concrete surface
JP2012220471A (ja) 2011-04-14 2012-11-12 Mitsubishi Electric Corp Development view generation apparatus, development view generation method, and development view display method
JP2014095627A (ja) 2012-11-09 2014-05-22 West Nippon Expressway Engineering Shikoku Co Ltd Apparatus for inspecting surface of road structure

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
International Search Report for PCT/JP2014/082021 dated Jan. 20, 2015.

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11193374B2 (en) * 2019-07-19 2021-12-07 Tongji University Method for inspecting service performance of tunnel lining based on defect characteristics thereof

Also Published As

Publication number Publication date
US20160227126A1 (en) 2016-08-04
SG11201600833YA (en) 2016-04-28
CN105765374A (zh) 2016-07-13
WO2016013132A1 (ja) 2016-01-28
JP2016031248A (ja) 2016-03-07
KR20170034750A (ko) 2017-03-29
CN105765374B (zh) 2020-04-10
JP6444086B2 (ja) 2018-12-26

Similar Documents

Publication Publication Date Title
US10116871B2 (en) Tunnel lining surface inspection system and vehicle used for tunnel lining surface inspection system
US9810642B2 (en) Tunnel lining surface inspection system and vehicle used in tunnel lining surface inspection system
JP6068099B2 (ja) Apparatus for inspecting surface of road structure
AU2005242076B2 (en) Digital camera with non-uniform image resolution
KR20090004636A (ko) Substrate appearance inspection device
JP2010223621A (ja) Method for inspecting inner surface of tubular product
TWI661191B (zh) Display panel inspection device and display panel inspection method
RU2604168C2 (ru) Machine vision system enabling determination of depth non-uniformities of image objects
WO2015093147A1 (ja) Multi-camera imaging system and method for synthesizing multi-camera captured images
JP2015049765A (ja) Method for correcting distortion of tunnel lining surface image
JP2016045194A (ja) Optical film inspection device
JP2009133797A (ja) Substrate surface inspection device and substrate surface inspection method
JP2015219014A (ja) Object diagnosis device
KR102270768B1 (ko) System and method for detecting cracks on surfaces in a tunnel
JP6933887B2 (ja) Inspection device and inspection method
US20210025834A1 (en) Image Capturing Devices and Associated Methods
JP2005249946A (ja) Defect inspection device for display device
JP2003344908A (ja) Camera head for inspection device
JP6739325B2 (ja) Method for creating appearance image and jig for creating lookup table
JPH08186808A (ja) Image processing method and device therefor
ES2538080T3 (es) An inspection device for mechanical elements and the like
JP2015137977A (ja) Piping imaging device and image processing device for image data obtained thereby
JP2009198397A (ja) Substrate inspection device and substrate inspection method
JP7332407B2 (ja) Imaging device, image acquisition method, and inspection device
JP5490855B2 (ja) Edge inspection device for plate-like substrate

Legal Events

Date Code Title Description
AS Assignment

Owner name: WEST NIPPON EXPRESSWAY ENGINEERING SHIKOKU COMPANY

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:AKASHI, YUKIO;HASHIMOTO, KAZUAKI;HAYASHI, SHOGO;REEL/FRAME:037470/0603

Effective date: 20151210

STCF Information on status: patent grant

Free format text: PATENTED CASE

FEPP Fee payment procedure

Free format text: MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY

LAPS Lapse for failure to pay maintenance fees

Free format text: PATENT EXPIRED FOR FAILURE TO PAY MAINTENANCE FEES (ORIGINAL EVENT CODE: EXP.); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY

STCH Information on status: patent discontinuation

Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362

FP Lapsed due to failure to pay maintenance fee

Effective date: 20221030