WO2013038550A1 - Underwater inspection device - Google Patents

Underwater inspection device

Info

Publication number
WO2013038550A1
WO2013038550A1 (PCT/JP2011/071155; JP 2011071155 W)
Authority
WO
WIPO (PCT)
Prior art keywords
illumination
image
inspection apparatus
underwater inspection
unit
Prior art date
Application number
PCT/JP2011/071155
Other languages
English (en)
Japanese (ja)
Inventor
岡田 聡
亮介 小林
小池 正浩
貴史 渡邊
Original Assignee
日立Geニュークリア・エナジー株式会社 (Hitachi-GE Nuclear Energy, Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hitachi-GE Nuclear Energy, Ltd. (日立Geニュークリア・エナジー株式会社)
Priority to PCT/JP2011/071155 (WO2013038550A1)
Priority to JP2013533421A (JP5696221B2)
Publication of WO2013038550A1

Classifications

    • G PHYSICS
    • G21 NUCLEAR PHYSICS; NUCLEAR ENGINEERING
    • G21C NUCLEAR REACTORS
    • G21C 17/00 Monitoring; Testing; Maintaining
    • G21C 17/003 Remote inspection of vessels, e.g. pressure vessels
    • G21C 17/013 Inspection vehicles
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N 21/00 Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N 21/84 Systems specially adapted for particular applications
    • G01N 21/88 Investigating the presence of flaws or contamination
    • G01N 21/95 Investigating the presence of flaws or contamination characterised by the material or shape of the object to be examined
    • G01N 21/954 Inspecting the inner surface of hollow bodies, e.g. bores
    • G01N 2021/9542 Inspecting the inner surface of hollow bodies, e.g. bores, using a probe
    • G01N 2021/9544 Inspecting the inner surface of hollow bodies, e.g. bores, using a probe with emitter and receiver on the probe
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02E REDUCTION OF GREENHOUSE GAS [GHG] EMISSIONS, RELATED TO ENERGY GENERATION, TRANSMISSION OR DISTRIBUTION
    • Y02E 30/00 Energy generation of nuclear origin
    • Y02E 30/30 Nuclear fission reactors

Definitions

  • the present invention relates to an underwater inspection apparatus, and more particularly to an underwater inspection apparatus suitable for use in inspecting in-reactor equipment such as a jet pump in addition to a shroud and a pressure vessel in a nuclear reactor.
  • As described in Patent Document 1, an apparatus that moves an illumination apparatus in synchronization with the imaging apparatus is known.
  • As an imaging device for inspection in a nuclear reactor, one that operates illumination and a camera stably is known (see Patent Document 3).
  • The apparatus of Patent Document 1 allows the positional relationship between the imaging apparatus and the illumination to be changed manually.
  • The underwater inspection apparatus of the present invention, however, needs to determine the optimal illumination angle according to the state of the target location, independently of the inspector's subjective judgment.
  • For a defect opening on the inner wall surface of the reactor vessel, such as a crack,
  • the appearance of the defect differs depending on the angle at which it is irradiated, so the light from the illumination must be applied at an optimal irradiation angle.
  • Since the devices described in Patent Document 2 and Patent Document 3 fix the positional relationship between the imaging device and the illumination, they cannot be used to determine the optimum illumination angle.
  • An object of the present invention is to solve the above problems by providing an underwater inspection apparatus that can optimally set the illumination angle on an object using only image information.
  • The present invention is an underwater inspection apparatus that displays, on display means, an image of the surface of an inspection object photographed by imaging means.
  • It comprises: an illumination unit having joints for adjusting, in the vertical and horizontal directions, the angle of the light irradiating the inspection object; image processing means that calculates, from the image photographed by the imaging means, an image parameter indicating the color tone of the surface of the inspection object; illumination determination means that evaluates the illumination state using the image parameter; and illumination control means that controls the illumination angle of the illumination unit using the determination result of the illumination determination means.
  • The image parameter is a quantity indicating the contrast of a color image: the luminance signal (Y) after YUV conversion, the saturation signal (S) or intensity signal (I) after HSI conversion, the brightness signal (V) after HSV conversion, the luminance signal (Y) after YIQ conversion, or the luminance signal (L) after HLS conversion.
  • Defect determination means for determining the presence or absence of a defect from the image taken by the imaging means is provided.
  • The defect determination means compares the value of the image parameter with a preset threshold value to determine the presence or absence of a defect.
  • Focus control means for focusing the imaging means using the image parameter is provided.
  • The underwater inspection apparatus may include an inspection vehicle capable of three-dimensional movement and equipped with the imaging means, together with vibration suppression means that uses the image parameter to suppress vibration of the imaging means.
  • The underwater inspection apparatus of this embodiment is used for defect inspection in a nuclear reactor, in particular visual inspection of structures.
  • FIG. 1 is an equipment layout diagram for underwater inspection work using the underwater inspection apparatus according to the first embodiment of the present invention.
  • Inside the nuclear reactor 1 there are structures such as a shroud 2, an upper lattice plate 3, a core support plate 4, and a shroud support 5; these structures are located in the water.
  • an operation floor 6 that is a work space is provided in the upper part of the nuclear reactor 1, and a fuel changer 7 is also provided in the upper part.
  • the camera unit 8 that has entered the reactor 1 is fixed using a camera fixing jig 9 and connected to the control device 10.
  • the control device 10 supplies power for operating the camera unit 8 and performs video communication in order to perform a visual inspection at the inspection target location.
  • a display device 11 is connected to the control device 10 and displays an image from an imaging means mounted on the camera unit 8.
  • a controller 12 is connected to the control device 10 and is operated by an inspector 13a.
  • the camera operator 13b uses the camera fixing jig 9 to adjust the position of the camera unit.
  • The control device 10 includes image display processing means for generating a display image from the image picked up by the imaging means mounted on the camera unit 8, and control means for automatically controlling the camera unit 8.
  • FIG. 2 is a bird's-eye view showing the configuration of the camera unit used in the underwater inspection apparatus according to the first embodiment of the present invention.
  • FIG. 3 is a block diagram showing the configuration of the camera unit used in the underwater inspection apparatus according to the first embodiment of the present invention.
  • the camera unit 8 includes a camera 22, a lens 23, and a lens driving stage 24 that drives the lens 23.
  • the lens 23 is moved by the lens driving stage 24 and has a structure for focusing.
  • the lens drive stage 24 is preferably a linear drive ultrasonic motor, but may be any actuator capable of linear drive.
  • the front surface of the lens 23 is protected by a lens cover 24.
  • the camera unit 8 is provided with an illumination unit 20 at the top, and an illumination 21 is attached to the tip of the illumination unit 20.
  • Since the illumination unit 20 is attached directly to the camera fixing jig 9, the connection portion is joined by an illumination unit vertical movement joint 26 for changing its direction in the vertical direction.
  • A plurality of illumination unit links 27A, 27b, and 27c are connected by illumination unit left and right moving joints 28A and 28b.
  • The direction in which the camera fixing jig 9 extends, that is, the vertical direction of the nuclear reactor 1 in FIG. 1, is defined as the Z-axis direction.
  • Two directions orthogonal to the Z-axis are defined as an X-axis direction and a Y-axis direction.
  • the X-axis direction is the optical axis direction of the camera 22.
  • the Y-axis direction is a direction orthogonal to each of the Z-axis and the X-axis.
  • The illumination unit vertical movement joint 26 can change the angle φ in the direction indicated by the broken line in the drawing;
  • that is, it can change the angle of the illumination 21 around the Y axis in the XZ plane.
  • The angle φ is, for example, about 0 to 45 degrees with respect to the optical axis of the camera 22.
  • The illumination unit left and right moving joints 28A and 28b can change the angle θ in the direction indicated by the one-dot chain line in the drawing;
  • that is, they can change the angle of the illumination 21 around the Z axis in the XY plane.
  • The angle θ can be varied in the range of -45 to +45 degrees around the Z axis, taking the X-axis direction as 0 degrees.
  • the camera 22 images the subject (inner wall surface of the reactor 1) in the imaging area IA on the optical axis.
  • the illumination 21 irradiates the imaging area IA with light.
  • The angle of the light hitting the imaging area IA (the angle formed by the optical axis (dashed line) from the illumination 21 with respect to the imaging area IA) can be changed;
  • that is, the irradiation angle of the light from the illumination 21 with respect to the imaging area IA is changed by driving the illumination unit vertical movement joint 26 and the illumination unit left and right moving joints 28A and 28b.
  • FIG. 3 shows a system connection state of the camera unit 8 and the control device 10.
  • A CCD camera 22 is provided inside the camera unit 8 of FIG. 3.
  • The GUI 32 includes the display device 11 and the controller 12 shown in FIG. 1.
  • The control device 10 of FIG. 1 includes system control means 30, image processing means 31, an illumination driver 32, illumination joint motor drivers 33a, 33b, and 33c, a lens moving stage driver 34, and defect determination means 35.
  • The system control means 30 includes illumination determination means 30A and illumination control means 30B.
  • the system control means 30 is connected to the illumination driver 32, the illumination joint motor drivers 33a to 33c, and the lens moving stage driver 34, respectively, and is controlled by the GUI 32 including the display device 11 and the controller 12.
  • the illumination driver 32 controls the illumination 21.
  • the lighting joint motor drivers 33a to 33c control the lighting unit up / down moving joint 26 and the lighting unit left / right moving joints 28A, 28b.
  • the lens moving stage driver 34 controls the lens driving stage 24.
  • the image processing means 31 is connected to the camera 22 and performs image processing for giving a command to the system control means 30.
  • the illumination determination unit 30A determines whether the illumination is at an optimal angle.
  • the illumination control unit 30B controls the illumination joint motor drivers 33a to 33c based on the determination of the illumination determination unit 30A to control the irradiation angle of the illumination 22 to be optimum.
  • the defect determination unit 35 automatically determines a defect based on the image information obtained from the image processing unit 31.
  • FIGS. 4 and 5 are flowcharts showing the contents of the image processing method in the underwater inspection apparatus according to the first embodiment of the present invention.
  • FIGS. 6 and 7 are explanatory diagrams of the image processing method in the underwater inspection apparatus according to the first embodiment of the present invention.
  • The image processing means 31 shown in FIG. 3 executes the image processing shown in FIG. 4. After the processing starts, the illumination determination means 30A and illumination control means 30B perform optimum illumination control (step S01) to set an optimum illumination state, the defect determination means 35 performs defect determination (step S02), and termination processing is then performed. Details of the optimum illumination control in step S01 are described later with reference to FIG. 5, and details of the defect determination in step S02 with reference to FIG. 7.
  • the illumination determination unit 30A sets image parameters (described later) used for determining the illumination state (step S10).
  • the image parameter indicates the color tone of the surface of the inner wall of the reactor that is the object. For example, when YUV is used as the color space, it is a luminance signal (Y) after YUV conversion.
  • the illumination control means 30B moves the illumination to the initial position (step S11).
  • The illumination position can be moved in the vertical and horizontal directions, and the optimum angle is found by a two-dimensional scan. For example, if the angle can be changed from 0 to 45 degrees in the vertical direction and from -45 to +45 degrees in the horizontal direction, the initial position is, for example, the angular position of 45 degrees vertically and -45 degrees horizontally; scanning is started from this initial position (step S11).
  • the illumination determination means 30A captures an image at this initial position (step S12), calculates and records an image parameter (step S13).
  • The image parameter is calculated and recorded as the average value of the parameter (for example, the luminance signal (Y) after YUV conversion) over a predetermined region (for example, 50 × 50 pixels) near the center of the imaging region.
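The averaging step above can be sketched in a few lines. The 50 × 50 window follows the example in the text, while the BT.601 luma weights are an assumption (the text specifies only "the luminance signal (Y) after YUV conversion"):

```python
import numpy as np

def luminance_parameter(rgb_image, region=50):
    """Image parameter: mean luminance (Y of YUV) over a central patch.

    rgb_image: H x W x 3 array with channel values in [0, 255].
    The BT.601 luma weights below are an assumed YUV variant.
    """
    h, w, _ = rgb_image.shape
    r0, c0 = (h - region) // 2, (w - region) // 2
    patch = rgb_image[r0:r0 + region, c0:c0 + region]
    # Y = 0.299 R + 0.587 G + 0.114 B (BT.601 luma)
    y = 0.299 * patch[..., 0] + 0.587 * patch[..., 1] + 0.114 * patch[..., 2]
    return float(y.mean())
```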
  • The illumination determination means 30A then determines whether the illumination is at the rightmost position (step S14). If not, the illumination control means 30B drives the illumination unit left and right moving joints 28A and 28b by a predetermined amount (step S15), the process returns (via steps S16 and S17), and image capture is repeated (step S12). By repeating steps S12 to S15 from the initial position (45 degrees vertical, -45 degrees horizontal), the image parameter is calculated at each predetermined angular step from -45 to +45 degrees horizontally without changing the vertical angle. Sweeping the horizontal direction at a fixed vertical angle in this way yields a characteristic such as that shown in FIG. 6.
  • When the illumination determination means 30A determines in step S14 that the illumination is at the rightmost position (that is, the horizontal sweep at the current vertical angle is complete) and the vertical sweep is not yet finished, the illumination control means 30B drives the vertical movement motor of the illumination unit vertical movement joint 26 (step S19) to change the irradiation angle: for example, from 45 degrees to 40 degrees in the vertical direction. The process then returns (via steps S20 and S17) to image capture (step S12).
  • By repeating steps S12 to S15, the image parameter is calculated at each predetermined angular step from -45 to +45 degrees horizontally, now with the vertical direction at 40 degrees.
  • Plotting the horizontal angle on the horizontal axis and the image parameter on the vertical axis again yields a characteristic such as that shown in FIG. 6.
  • When the sweep is complete (step S18), the illumination determination means 30A compares all recorded image parameters and determines the illumination position that gives the maximum value (step S21), and the illumination control means 30B moves the illumination to that position (step S22). This maximum value is described below with reference to FIG. 7.
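The sweep of steps S12 to S22 amounts to an exhaustive search over the angle grid for the maximum image parameter. In this sketch, `measure(v, h)` is a hypothetical callback standing in for "drive the joints to (v, h), capture an image, compute the image parameter":

```python
def scan_illumination(measure, v_angles, h_angles):
    """Return the (vertical, horizontal) angle pair whose captured image
    yields the maximum image parameter, by scanning the full grid."""
    best = None
    for v in v_angles:        # e.g. 45, 40, ... degrees (vertical joint 26)
        for h in h_angles:    # e.g. -45 ... +45 degrees (joints 28A, 28b)
            p = measure(v, h)
            if best is None or p > best[0]:
                best = (p, v, h)
    _, v_opt, h_opt = best
    return v_opt, h_opt
```

In practice the real measurement would use the per-image parameter computed from the captured frame; here a synthetic `measure` suffices to exercise the search.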
  • The image parameter used to determine the optimal illumination state varies with the color space.
  • When HSV is used, the brightness signal (V) is selected;
  • when HSI is used, the intensity signal (I) is selected;
  • when a space separated into luminance and color difference, such as YIQ, YUV, HLS, YCbCr, or YPbPr, is used, the luminance signal (Y or L) is used to determine the illumination state.
  • In short, the image parameter is a quantity indicating the contrast of the color image, such as the luminance signal (Y) after YUV conversion, the saturation signal (S) or intensity signal (I) after HSI conversion, or the brightness signal (V) after HSV conversion.
  • The optimal image parameter differs depending on the nature of the defect. For example, even when the defect is a crack, the luminance signal (Y) after YUV conversion may be optimal, or the saturation signal (S) after HSI conversion may be optimal, depending on the direction and shape of the opening.
  • For other defect types the optimum image parameter differs again. Therefore, a plurality of image parameters are used, the relationship between angle and parameter is obtained for each, and when determining the optimum illumination position (angle), the parameter that yields the largest maximum value is used.
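The multi-parameter selection just described might look like the following sketch: each candidate extractor (e.g. Y of YUV, S of HSI) is evaluated over the images captured during the sweep, and the one attaining the largest maximum wins. The dictionary interface and names are assumptions:

```python
def best_parameter(sweep_images, extractors):
    """Pick the image-parameter extractor whose sweep attains the largest
    maximum, and the sweep index (angle position) where it occurs.

    extractors: dict mapping a name to a function image -> scalar.
    """
    best = None
    for name, extract in extractors.items():
        values = [extract(img) for img in sweep_images]
        m = max(values)
        if best is None or m > best[0]:
            best = (m, name, values.index(m))
    _, name, idx = best
    return name, idx
```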
  • The image parameter 40 shown in FIG. 7A is obtained when the illumination is swung in the horizontal direction through the angle θ from -θ1 to +θ1 at a vertical angle of φ1 with respect to the inspection target surface.
  • The maximum value of the image parameter 40 is M1.
  • The image parameter 42 in FIG. 7B is obtained with the vertical angle at φ2;
  • its maximum value is M2.
  • The image parameter 44 in FIG. 7C is obtained with the vertical angle at φ3;
  • its maximum value is M3, the largest of the three.
  • The horizontal angle at which the maximum value M3 is obtained is θ3.
  • In step S21, therefore, the illumination unit vertical movement joint 26 and the illumination unit left and right moving joints 28A and 28b are driven so that the angle of the illumination is controlled to φ3 in the vertical direction and θ3 in the horizontal direction.
  • Next, the defect determination method in step S02 of FIG. 4 will be described.
  • A threshold value 46 for the image parameter, above which a defect is determined, is set in advance. If the threshold is exceeded even once while the illumination angle is being varied, a defect is determined. In the example of FIG. 7C, the threshold value 46 is exceeded at vertical angle φ3 and horizontal angle θ3, so a defect is determined.
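The threshold rule reduces to a one-liner over the parameter values recorded during the sweep; a minimal sketch:

```python
def defect_detected(parameter_trace, threshold):
    """True if the image parameter exceeded the preset threshold (cf.
    threshold value 46) even once while the illumination angle varied."""
    return any(p > threshold for p in parameter_trace)
```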
  • In the present embodiment, an optimal illumination state can thus be set using any one of the feature values (lightness, intensity, or luminance) representing the color tone change in the acquired image, and the presence or absence of defects can be determined.
  • Work using the underwater inspection apparatus according to this embodiment is the same as that shown in FIG. 1.
  • The configuration of the camera unit used in the underwater inspection apparatus of this embodiment is the same as that shown in FIGS. 2 and 3.
  • FIGS. 8 and 10 are flowcharts showing the contents of the image processing method in the underwater inspection apparatus according to the second embodiment of the present invention.
  • FIG. 9 is an explanatory diagram of an image processing method in the underwater inspection apparatus according to the second embodiment of the present invention.
  • This embodiment differs in that the system control means 30 in FIG. 3 is provided with means for automatically driving the lens moving stage driver 34,
  • and a procedure for determining whether the lens 23 brings the camera 22 into focus is added to the image processing means 31. Only the differences from the first embodiment are described below.
  • In the flow of FIG. 8, focus control (step S03) is added to that of FIG. 4.
  • The system control means 30 performs focus control (step S03), the illumination determination means 30A and illumination control means 30B perform optimum illumination control (step S01) to set an optimum illumination state, and after the defect determination means 35 performs defect determination (step S02), termination processing is performed.
  • FIG. 9 schematically shows the photographed image:
  • FIG. 9A shows an out-of-focus state,
  • and FIG. 9B shows an in-focus state. In the out-of-focus image 50, the inspection object 51 appears blurred.
  • The video line 52 used for focus determination is scanned from the top to the bottom of the image, and during scanning the same evaluation as with the image parameter of FIG. 7 in the first embodiment is performed.
  • In the first embodiment, the image parameter was used as the criterion for judging the illumination state;
  • here it is also used to determine whether the image is in focus. Focus can be judged from the contrast between two areas, so the difference between the image parameters of two areas is extracted.
  • the system control means 30 first captures an image (step S30). Processing for extracting the target region is performed on the image (step S31).
  • The target area consists of two adjacent areas near the center of the imaging region; for example, two adjacent 50 × 50 pixel areas are set near the center.
  • The system control means 30 obtains the average image parameter of each of the two extracted areas, extracts the difference between the two averages as a focus determination signal (step S32), and evaluates it (step S33); if the value is smaller than the previous focus determination signal, the lens moving stage driver is driven (step S34) and the process returns to image capture.
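The focus determination signal of step S32 can be sketched as the difference between the mean image parameters of two adjacent central patches. The 50 × 50 size follows the example above, while the exact patch placement is an assumption:

```python
import numpy as np

def focus_signal(y_image, size=50):
    """Focus determination signal: |mean(A) - mean(B)| for two adjacent
    patches A, B near the image center (higher contrast = sharper focus).

    y_image: H x W array of the image parameter (e.g. luminance Y).
    """
    h, w = y_image.shape
    r0 = (h - size) // 2
    c0 = w // 2 - size  # patch A ends at the center column, B starts there
    a = y_image[r0:r0 + size, c0:c0 + size].mean()
    b = y_image[r0:r0 + size, c0 + size:c0 + 2 * size].mean()
    return abs(float(a) - float(b))
```

Driving the lens stage while this signal keeps increasing, and stopping when it starts to fall, reproduces the hill-climbing loop of steps S33 and S34.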
  • The processing from the optimum illumination control (step S01) onward is the same as in the first embodiment.
  • In this way, by using the image parameter representing the color tone of the acquired image, the focus and the illumination are optimized for an out-of-focus inspection image, and the presence or absence of a defect can be determined.
  • In the third embodiment, an inspection vehicle 60 is used instead of the camera unit 8 shown in FIG. 1,
  • and a vehicle cable 61 is used instead of the camera fixing jig 9.
  • This embodiment serves the same purpose as the first, but the device carrying the camera is not fixed: it is an inspection vehicle 60 that swims freely in the water.
  • Another difference is that a function for suppressing the vibration of the acquired image is added to suppress the shake.
  • Other configurations, image processing methods, and display methods are the same.
  • FIG. 11 is an equipment layout diagram during underwater inspection work using the underwater inspection apparatus according to the third embodiment of the present invention.
  • the inspection vehicle 60 is put into the nuclear reactor 1, and the control device 10A, the display device 11, and the controller 12A are connected via the vehicle cable 61.
  • The controller 12A is operated by the vehicle operator 13a, and the vehicle cable 61 is handled by the operation assistant 13b, who adjusts its position.
  • The system control means 30 in FIG. 3 includes means for suppressing vibration of the acquired image in order to suppress vertical and horizontal shake of the swimming inspection vehicle 60.
  • FIG. 12 is a bird's-eye view showing the configuration of an inspection vehicle used in the underwater inspection apparatus according to the third embodiment of the present invention.
  • FIG. 12 shows the structure of the inspection vehicle 60.
  • A camera unit 70 for visual inspection is attached to the inspection vehicle 60. In order to enable three-dimensional movement, a lifting thruster 71, a propulsion thruster 72, and translational turning thrusters 73A and 73b are mounted.
  • the illumination 22 of the illumination unit 20 can change the angle in the vertical direction and the horizontal direction, and the irradiation angle of light from the illumination 22 in the imaging region can be changed.
  • FIGS. 13, 15 and 16 are flowcharts showing the contents of the image processing method in the underwater inspection apparatus according to the third embodiment of the present invention.
  • FIG. 14 is an explanatory diagram of an image processing method in the underwater inspection apparatus according to the third embodiment of the present invention.
  • the system control means 30 shown in FIG. 3 performs image vibration suppression (step S04), performs focus control (step S03), performs optimum illumination control (step S01), and sets an optimum illumination state.
  • the defect determination means 35 performs a termination process after performing the defect determination (step S02).
  • The focus control (step S03) is as described with reference to FIG. 10,
  • the optimum illumination control (step S01) is as described with reference to FIG. 5,
  • and the defect determination (step S02) is as described with reference to FIG. 7. Image vibration suppression (step S04) is unique to this embodiment and is described below.
  • FIG. 14 shows an image of the display image range.
  • The display image 81 at time Ti is shifted from the display image 80 at time Ti-1 by the image shift amount 82 in the X direction and the image shift amount 83 in the Y direction. Since the display size is the same at times Ti-1 and Ti, there is no image information corresponding to the shift, so a blank area 84 is inserted at time Ti.
  • The image vibration suppression processing of step S04 is executed by the system control means 30 in FIG. 3.
  • In step S60, a time constant N is input.
  • In step S61, initial image processing at time T0 is performed:
  • an initial photographed image is read (step S62),
  • distortion caused by the camera lens is corrected (step S63),
  • and the initial image is stored (step S64).
  • In step S65, the initial image correlation within the range determined by the time constant N is calculated, and in step S66 the image shake amount is calculated.
  • The method of calculating the image shake amount is described later with reference to FIG. 16.
  • In the image correction processing (step S68), the image shake amount is calculated (step S69),
  • the past shake amounts are read (step S70),
  • and their average value is calculated (step S71).
  • In step S72 the image is shifted, and in step S73 the result is displayed;
  • at that time, a blank area is inserted at the edge of the screen.
  • Loop determination is performed in step S74; if there is no end input from the operator (step S75), the process returns (via steps S76 and S77) and steps S67 to S74 are repeated.
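Steps S70 to S73 (read the past shake amounts, average them, shift the image, insert a blank area) might be sketched as follows; integer shifts and a zero-valued blank fill are assumptions not stated in the text:

```python
import numpy as np

def stabilise(frame, shake_history, fill=0):
    """Shift `frame` back by the average of the recorded (dx, dy) shake
    amounts; the uncovered edge becomes a blank area as in Fig. 14."""
    n = len(shake_history)
    dx = int(round(sum(s[0] for s in shake_history) / n))
    dy = int(round(sum(s[1] for s in shake_history) / n))
    h, w = frame.shape
    out = np.full_like(frame, fill)
    # out[y, x] = frame[y + dy, x + dx] on the overlapping region
    out[max(0, -dy):min(h, h - dy), max(0, -dx):min(w, w - dx)] = \
        frame[max(0, dy):min(h, h + dy), max(0, dx):min(w, w + dx)]
    return out
```

Averaging over the last N shake amounts (the time constant) smooths the correction rather than chasing every frame-to-frame jitter.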
  • A specific procedure for calculating the image shake amount in step S66 is described with reference to FIG. 16.
  • In step S80, the image at the current time Ti is acquired: the current image is captured (step S81), distortion is corrected (step S82), and the original image is stored (step S83).
  • In step S84, the image at the previous time Ti-1 is read, and in step S85 the image correlation is calculated.
  • In step S86, the image shake amount (A, B) shown in FIG. 14 is determined and finally stored in step S87.
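The image correlation of steps S84 to S86 is not specified in detail; one assumed realization is a brute-force search over integer shifts that minimizes the mean squared difference on the overlapping region of the two frames (FFT-based or sub-pixel correlation would be faster alternatives):

```python
import numpy as np

def shake_amount(prev, curr, max_shift=5):
    """Estimate the inter-frame shake (A, B) of Fig. 14 as the integer
    shift (dx, dy) that best aligns `curr` with `prev` (minimum mean
    squared difference over the overlap)."""
    h, w = prev.shape
    best, best_err = (0, 0), float('inf')
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            a = prev[max(0, -dy):min(h, h - dy), max(0, -dx):min(w, w - dx)]
            b = curr[max(0, dy):min(h, h + dy), max(0, dx):min(w, w + dx)]
            err = float(((a - b) ** 2).mean())
            if err < best_err:
                best_err, best = err, (dx, dy)
    return best  # (horizontal shift A, vertical shift B)
```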
  • Reference numerals: 30B: illumination control means; 31: image processing means; 32: illumination driver; 33a, 33b, 33c: illumination joint motor drivers; 34: lens moving stage driver; 35: defect determination means; 60: inspection vehicle; 61: vehicle cable; 70: camera unit; 71: lifting thruster; 72: propulsion thruster; 73A, 73b: translational turning thrusters.

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • High Energy & Nuclear Physics (AREA)
  • General Engineering & Computer Science (AREA)
  • Plasma & Fusion (AREA)
  • Analytical Chemistry (AREA)
  • General Health & Medical Sciences (AREA)
  • General Physics & Mathematics (AREA)
  • Immunology (AREA)
  • Pathology (AREA)
  • Biochemistry (AREA)
  • Chemical & Material Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Investigating Materials By The Use Of Optical Means Adapted For Particular Applications (AREA)
  • Monitoring And Testing Of Nuclear Reactors (AREA)

Abstract

The underwater inspection device described herein makes it possible to optimally set the angle at which illumination is projected onto an object using only image information. The illumination unit (20) has joints that adjust the angle at which light is projected, vertically and horizontally, onto the inner wall surface of a nuclear reactor (1), which is the object to be inspected. Image processing means (31) calculates, from an image photographed with a camera (22), an image parameter indicating the color tone of the surface of the object to be inspected. Illumination determination means (30A) uses the image parameter to evaluate the illumination state. Illumination control means (30B) uses the determination result from the illumination determination means (30A) to control the illumination angle of the illumination unit.
PCT/JP2011/071155 2011-09-15 2011-09-15 Underwater inspection apparatus WO2013038550A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/JP2011/071155 WO2013038550A1 (fr) 2011-09-15 2011-09-15 Underwater inspection apparatus
JP2013533421A JP5696221B2 (ja) 2011-09-15 2011-09-15 Underwater inspection apparatus

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2011/071155 WO2013038550A1 (fr) 2011-09-15 2011-09-15 Underwater inspection apparatus

Publications (1)

Publication Number Publication Date
WO2013038550A1 true WO2013038550A1 (fr) 2013-03-21

Family

ID=47882809

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2011/071155 WO2013038550A1 (fr) 2011-09-15 2011-09-15 Underwater inspection apparatus

Country Status (2)

Country Link
JP (1) JP5696221B2 (fr)
WO (1) WO2013038550A1 (fr)


Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH05164880A (ja) * 1991-12-11 1993-06-29 Hitachi Ltd Inspection apparatus for the interior of a reactor containment vessel
JPH09304576A (ja) * 1996-05-17 1997-11-28 Hitachi Ltd Fuel assembly loading confirmation apparatus

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0419787U (fr) * 1990-06-05 1992-02-19
JPH08285984A (ja) * 1995-04-19 1996-11-01 Mitsubishi Heavy Ind Ltd Floating-type remote visual inspection apparatus
JPH10238699A (ja) * 1997-02-28 1998-09-08 The Kansai Electric Power Co Inc Tank interior inspection apparatus
JP2004333356A (ja) * 2003-05-09 2004-11-25 Kanto Auto Works Ltd Painted-surface inspection apparatus for vehicles
JP2008046103A (ja) * 2006-07-19 2008-02-28 Shimatec KK Surface inspection apparatus
JP2010066963A (ja) * 2008-09-10 2010-03-25 Hitachi-Ge Nuclear Energy Ltd Image processing method, image processing apparatus, and underwater inspection apparatus equipped therewith
JP4675436B1 (ja) * 2009-09-03 2011-04-20 CCS Inc. Illumination and imaging system for surface inspection, and data structure
JP2011181984A (ja) * 2010-02-26 2011-09-15 Hitachi-Ge Nuclear Energy Ltd Image processing method, image processing apparatus, and underwater inspection apparatus equipped therewith

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017033498A1 (fr) * 2015-08-24 2017-03-02 JVC Kenwood Corporation Underwater imaging apparatus, method for controlling an underwater imaging apparatus, and program for controlling an underwater imaging apparatus
JP2017046021A (ja) * 2015-08-24 2017-03-02 JVC Kenwood Corporation Underwater imaging apparatus, method for controlling an underwater imaging apparatus, and program for controlling an underwater imaging apparatus
US10594947B2 (en) 2015-08-24 2020-03-17 JVC Kenwood Corporation Underwater imaging apparatus, method for controlling an underwater imaging apparatus, and program for controlling an underwater imaging apparatus
JP2019120491A (ja) * 2017-12-28 2019-07-22 Hitachi-GE Nuclear Energy, Ltd. Defect inspection method and defect inspection system
JP7178171B2 (ja) 2017-12-28 2022-11-25 Hitachi-GE Nuclear Energy, Ltd. Defect inspection method and defect inspection system
WO2020174042A1 (fr) * 2019-02-27 2020-09-03 Alfa Laval Corporate Ab Inspection camera

Also Published As

Publication number Publication date
JP5696221B2 (ja) 2015-04-08
JPWO2013038550A1 (ja) 2015-03-23

Similar Documents

Publication Publication Date Title
JP5801858B2 (ja) Welding equipment and welding method
JP6299111B2 (ja) Laser processing apparatus
US8363099B2 (en) Microscope system and method of operation thereof
JP5128920B2 (ja) Substrate surface inspection apparatus and substrate surface inspection method
WO2018066576A1 (fr) Appearance inspection method
JPH10290389A (ja) Multi-focus image creation method and image creation apparatus
JP4709762B2 (ja) Image processing apparatus and method
JP6233824B1 (ja) Image inspection apparatus, production system, image inspection method, program, and storage medium
JP2019058942A (ja) Weld appearance defect detection apparatus, laser welding apparatus, and weld appearance defect detection method
JP5696221B2 (ja) Underwater inspection apparatus
JP4490154B2 (ja) Cell culture apparatus
JP2005049221A (ja) Inspection apparatus and inspection method
JP2009250844A (ja) Three-dimensional shape measurement method and three-dimensional shape measurement apparatus
JP5417197B2 (ja) Inspection apparatus and inspection method
JP3741287B1 (ja) Inspection method and inspection apparatus for mounted boards
JP2011191170A (ja) Image processing apparatus
JP2007334423A (ja) Automatic imaging apparatus
JP2012137706A (ja) Image processing method, image processing apparatus, and underwater inspection apparatus equipped therewith
CN110849285A Monocular-camera-based solder joint depth measurement method, system, and medium
JP2016065875A (ja) Image processing apparatus and image processing method
JP5615252B2 (ja) Appearance inspection apparatus
JP2007285868A (ja) Luminance gradient detection method, defect detection method, luminance gradient detection apparatus, and defect detection apparatus
JP2004042118A (ja) Automatic welding position tracking control apparatus and automatic tracking welding method
KR101665764B1 (ko) Drawing apparatus, substrate processing system, and drawing method
JP6184746B2 (ja) Defect detection apparatus, defect correction apparatus, and defect detection method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 11872207

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2013533421

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 11872207

Country of ref document: EP

Kind code of ref document: A1