WO2022244752A1 - Monitoring System and Monitoring Method - Google Patents

Monitoring system and monitoring method

Info

Publication number
WO2022244752A1
WO2022244752A1 (PCT/JP2022/020456)
Authority
WO
WIPO (PCT)
Prior art keywords
photographing
blade
image information
imaging
control device
Prior art date
Application number
PCT/JP2022/020456
Other languages
English (en)
French (fr)
Japanese (ja)
Inventor
昭義 小村
幸弘 白濱
誠 恒川
浩一郎 高橋
昇三 宮部
Original Assignee
株式会社日立パワーソリューションズ
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社日立パワーソリューションズ
Publication of WO2022244752A1 publication Critical patent/WO2022244752A1/ja

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B64: AIRCRAFT; AVIATION; COSMONAUTICS
    • B64C: AEROPLANES; HELICOPTERS
    • B64C13/00: Control systems or transmitting systems for actuating flying-control surfaces, lift-increasing flaps, air brakes, or spoilers
    • B64C13/02: Initiating means
    • B64C13/16: Initiating means actuated automatically, e.g. responsive to gust detectors
    • B64C13/18: Initiating means actuated automatically, using automatic pilot
    • B64C27/00: Rotorcraft; Rotors peculiar thereto
    • B64C27/04: Helicopters
    • F: MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F03: MACHINES OR ENGINES FOR LIQUIDS; WIND, SPRING, OR WEIGHT MOTORS; PRODUCING MECHANICAL POWER OR A REACTIVE PROPULSIVE THRUST, NOT OTHERWISE PROVIDED FOR
    • F03D: WIND MOTORS
    • F03D1/00: Wind motors with rotation axis substantially parallel to the air flow entering the rotor
    • F03D1/04: Wind motors with rotation axis substantially parallel to the air flow entering the rotor, having stationary wind-guiding means, e.g. with shrouds or channels
    • F03D17/00: Monitoring or testing of wind motors, e.g. diagnostics
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00: Television systems
    • H04N7/18: Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02E: REDUCTION OF GREENHOUSE GAS [GHG] EMISSIONS, RELATED TO ENERGY GENERATION, TRANSMISSION OR DISTRIBUTION
    • Y02E10/00: Energy generation through renewable energy sources
    • Y02E10/70: Wind energy
    • Y02E10/72: Wind turbines with rotation axis in wind direction

Definitions

  • The present invention relates to a monitoring system and a monitoring method.
  • Patent Document 1 discloses a report creation device capable of efficiently creating an inspection report for wind power generation equipment. Paragraph 0027 of Patent Document 1 states, in essence: "As described above, the unmanned flying object 2 is flight-controlled by remote control. It is preferable that the flight control system calculate the distance between the unmanned flying object 2 and the structure detected by the collision avoidance sensor, and control the flight, even in a strong wind environment, so that the vehicle comes close enough to the inspection object to observe its damage while keeping the separation distance needed to avoid colliding with it."
  • In such an inspection, a drone approaches the blades of the wind turbine and photographs them, and damaged portions are identified from the captured images. Since many wind turbines operate at each site, it is desirable to shorten the time required for inspection by making the photography of wind turbine blades more efficient.
  • The blades of the wind turbine are located high on the tower, so the drone must fly close to the blades to photograph them. The essential problem here is collision between the drone and the blade.
  • Accordingly, the main object of the present invention is to provide a monitoring system that can inspect wind power generation equipment while avoiding collision with its blades.
  • To that end, the present invention is a monitoring system for inspecting wind power generation equipment based on image information obtained by photographing the equipment, comprising: a photographing device that captures image information of the blades constituting the wind power generation facility; and a control device that controls the photographing device to capture the image information and stores the captured image information in a storage unit.
  • The control device instructs the photographing device with photographing parameter information for capturing the image information of the blade, so that the photographing device is moved to a photographing position matching the photographing parameter information before the image information is captured.
  • The photographing parameter information includes a separation distance, measured on the R cross section (a plane perpendicular to the R axis, the longitudinal axis of the blade), at which the photographing device achieves focus on the blade lying on the same R cross section. A sketch of this parameter set follows. Other means are described later.
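To make the structure of this instruction concrete, here is a minimal sketch, in Python, of the photographing parameter information as a record type. The field names are illustrative assumptions; the patent itself names the separation distance, the imaging plane, the angle of view, and the f-number as elements of this information (the latter two are introduced later in the description).

```python
from dataclasses import dataclass

@dataclass
class PhotoParams:
    """Illustrative sketch of the photographing parameter information that
    the control device 21 sends to the photographing device 11. Field
    names are assumptions, not identifiers from the patent."""
    separation_d_m: float     # separation distance D on the R cross section (m)
    plane: str                # imaging plane: "a", "b", "c", or "d"
    angle_of_view_deg: float  # angle of view 101K
    f_number: float           # aperture value controlling depth of field
```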
  • FIG. 1 is a configuration diagram showing a monitoring system according to Example 1.
  • FIG. 2 is a three-dimensional view showing a wind turbine to be inspected according to Example 1.
  • FIG. 3 is a three-dimensional view showing a coordinate system defined around a blade according to Example 1.
  • FIG. 4 is a side view showing an installation location of an unmanned aerial vehicle according to Example 1.
  • FIG. 5 is a three-dimensional view showing movement of the flight route along the R axis of the blade according to Example 1.
  • FIG. 6 is a side view of FIG. 5 according to Example 1.
  • FIG. 7 is an R cross-sectional view of FIG. 3 according to Example 1.
  • FIG. 8 is a modified example of the R cross-sectional view of FIG. 7 according to Example 1.
  • FIG. 9 is a three-dimensional view showing a spiral flight route F according to Example 1.
  • FIG. 10 is a side view showing an imaging method according to Example 1.
  • FIG. 11 is a side view of the unmanned aerial vehicle of FIG. 10 while moving, according to Example 1.
  • FIG. 12 is a table showing an example of management information corresponding to identification information assigned to image information, according to Example 1.
  • FIG. 13 is a configuration diagram showing a monitoring system according to Example 2.
  • FIG. 14 is a display screen diagram when the first screen display rule is specified, according to Example 2.
  • FIG. 15 is a display screen diagram when the second screen display rule is specified, according to Example 2.
  • FIG. 16 is a display screen diagram when the third screen display rule is specified, according to Example 2.
  • FIG. 17 is a display screen diagram distinguishing between normal and abnormal inspection results, according to Example 2.
  • FIG. 1 is a configuration diagram showing the monitoring system 100 of Example 1.
  • The monitoring system 100 includes an unmanned aerial vehicle 10 (drone) that photographs the blades 73 of a wind turbine (wind power generation facility) 70 shown in FIG. 2, a control device 21, and a display unit 40 that displays output information from the control device 21.
  • The unmanned aerial vehicle 10 includes a photographing device 11 for photographing a blade 73, a motor drive unit 13 that drives and controls the motors mounted on the unmanned aerial vehicle 10, and a transmission device 12 that transmits the image information 31 (the captured images output from the photographing device 11) to the control device 21.
  • Although FIG. 1 shows an example in which the photographing device 11 is mounted on the unmanned aerial vehicle 10, the photographing device 11 may instead be configured with its own movement function.
  • The control device 21 performs flight control (motor drive control) and photographing control of the unmanned aerial vehicle 10 by transmitting the following information to it.
  • The flight of the unmanned aerial vehicle 10 is controlled so as to maintain the transmitted separation distance.
  • The photographing device 11 can thus be made to photograph at the designated acquisition position of the image information 31.
  • The unmanned aerial vehicle 10 assists the inspection work by photographing the blade 73 while flying through (Step 1) to (Step 4) below, in order (a minimal sketch of the sequence follows the list).
  • (Step 1) Take off and move to the blade 73.
  • (Step 2) Move to the shooting start point.
  • (Step 3) Photograph while approaching and following the blade 73.
  • (Step 4) Land at the target position.
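A minimal sketch of this four-step sequence, assuming a hypothetical drone-control interface (every method name below is an illustrative assumption, not an API from the patent):

```python
def run_inspection(drone):
    """Fly the four inspection steps in order. `drone` is a hypothetical
    controller object; the method names are illustrative assumptions."""
    drone.take_off_and_approach_blade()   # (Step 1) take off and move to blade 73
    drone.move_to_shooting_start()        # (Step 2) move to the shooting start point
    while not drone.route_complete():     # (Step 3) photograph while approaching
        drone.capture_and_advance()       #          and following the blade 73
    drone.land_at_target()                # (Step 4) land at the target position
```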
  • The control device 21 stores the following information in the storage unit 30:
  • Image information 31 (see FIG. 14 for details): the captured images output from the photographing device 11 and received via the transmission device 12.
  • Identification information 32 (see FIG. 5 for details): unique identifiers assigned to the image information 31 by the control device 21.
  • Normal image information 33 (see FIG. 16 for details): image information 31 obtained by photographing the normal state of the blade 73 in advance.
  • FIG. 2 is a three-dimensional view showing the wind turbine 70 to be inspected.
  • The wind turbine 70 is composed of a nacelle 72 mounted on a tower 71 and blades 73 fixed to the nacelle 72.
  • Each blade 73 receives the wind force and rotates along the rotation direction θ, and this rotational force is used to generate power.
  • An R axis is defined along the longitudinal direction of the blade 73, from the blade root at the nacelle 72 to the tip of the blade 73.
  • FIG. 3 is a three-dimensional view showing the coordinate system defined around the blade 73.
  • The "R axis" is the axis along the longitudinal direction of the blade 73, as explained for FIG. 2.
  • The R axis is defined so as to follow the blade wherever it points in the rotation direction θ (upward, horizontal, or toward the ground).
  • The "R cross section" is a plane perpendicular to the R axis; for convenience, one of its axes is called the X axis and the other the Y axis.
  • The "imaging plane" defines the direction from which the blade 73 is photographed on the R cross section. For example, FIG. 3 shows four imaging planes: imaging plane a, photographed with the angle of view directed in the positive Y direction; imaging plane b, directed in the negative X direction; imaging plane c (not shown, hidden behind planes a and b); and imaging plane d, directed in the negative Y direction (likewise hidden). The imaging planes around the blade are not limited to the four shown in FIG. 3; five or more may be provided. In any case, photographing the blade 73 from a plurality of directions prevents unphotographed portions from occurring.
  • FIG. 4 is a side view showing the installation location of the unmanned aerial vehicle 10. Hereinafter, the process of taking off and moving to the blade 73 is described as (Step 1).
  • An observer of the wind turbine 70 places the unmanned aerial vehicle 10 at a certain distance W in front of the wind turbine 70.
  • The height H from the ground surface to the nacelle 72 is fixed for a given model of wind turbine 70. Therefore, by determining the distance W in advance, the unmanned aerial vehicle 10 can move automatically to the vicinity of the nacelle 72.
  • After placing the unmanned aerial vehicle 10 in front of the wind turbine 70, the observer instructs the control device 21 to inspect the blades 73.
  • The control device 21 then transmits flight route information to the motor drive unit 13 of the unmanned aerial vehicle 10, and the unmanned aerial vehicle 10 moves to the vicinity of the nacelle 72 where the blades 73 are mounted.
  • The flight route information is information for taking off the unmanned aerial vehicle 10 and bringing it efficiently and quickly to the imaging position of the imaging target (blade 73). An existing technique may be used to create this flight route information.
  • FIG. 5 is a three-dimensional view showing movement along the flight route (F1 on the outbound pass, F2 on the return pass) along the R axis of the blade 73.
  • A reference position is set at the blade root near the nacelle 72; this reference position is the starting point of the flight route F1.
  • The first piece of additional information (No. 1, No. 2, ..., No. 100) is the identification information 32, unique to each image shot at each shooting position. In other words, the identification information 32 is numbered according to the order in which the images are captured.
  • The second piece of additional information, the R-axis coordinate r, is position-specific information: a smaller value of r indicates a position closer to the nacelle 72.
  • With this numbering, the entire blade can be correctly displayed on the screen even though the drone reciprocates.
  • FIG. 6 is a side view of FIG. 5.
  • On the outbound pass, the photographing device 11 photographs a predetermined portion of the blade 73 (imaging plane a) at predetermined equal intervals (intervals yielding 50 images, No. 1, 2, ..., 50).
  • Because the control device 21 designates a predetermined separation distance D to the motor drive unit 13, the unmanned aerial vehicle 10 is prevented both from coming too close to the blade 73 and colliding with it, and from drifting too far from the blade 73 and failing to acquire a clear photographed image.
  • Having reached the tip of the blade 73, the unmanned aerial vehicle 10 likewise follows the return flight route F2 and starts photographing imaging plane b, which differs from the imaging plane a photographed so far. That is, it moves from the second end far from the reference position (No. 51 at the blade tip) back to the first end near the reference position (No. 100), photographing at the same predetermined equal intervals. A sketch of this numbering scheme follows.
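A minimal sketch of the out-and-back numbering, assuming an illustrative blade length and 50 shots per pass as in FIGS. 5 and 6:

```python
def reciprocating_route(num_per_pass=50, blade_length_m=40.0):
    """Yield (photo_no, imaging_plane, r) for the reciprocating route:
    outbound pass F1 photographs plane "a" from the reference position at
    the root (No. 1) to the tip (No. 50); return pass F2 photographs
    plane "b" from the tip (No. 51) back to the root (No. 100).
    blade_length_m is an illustrative assumption."""
    step = blade_length_m / (num_per_pass - 1)
    for i in range(num_per_pass):            # F1: root -> tip, plane a
        yield i + 1, "a", round(i * step, 2)
    for i in range(num_per_pass):            # F2: tip -> root, plane b
        yield num_per_pass + i + 1, "b", round((num_per_pass - 1 - i) * step, 2)
```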
  • FIG. 7 is an R cross-sectional view of FIG. 3, showing a spiral flight route F (a three-dimensional view is described later with FIG. 9).
  • Photographing proceeds in the order: position P1 for imaging plane a, position P2 for imaging plane b, position P3 for imaging plane c, and position P4 for imaging plane d. Since the flight route F moves while maintaining the predetermined separation distance D from the blade 73, it appears as a circular motion in the R cross-sectional view.
  • Here the blade 73 is approximated as a streamlined plate having a front surface and a back surface. The unmanned aerial vehicle 10 can therefore cover both surfaces by photographing the front of the blade 73 at positions P1 and P2 and the back at positions P3 and P4. Alternatively, the front may be photographed from position Pu instead of positions P1 and P2, and the back from position Pd instead of positions P3 and P4, reducing the number of shots per section from four to two.
  • However, if the unmanned aerial vehicle 10 is moved to position Pu, that position lies on the orbit of the blade 73 (along the circumferential direction θ); should the blade 73 rotate from its current position P0 toward Pu, it could collide with the unmanned aerial vehicle 10.
  • The blade 73 can be kept from rotating by locking the rotation direction θ with a hydraulic brake during inspection work.
  • However, since power supply to the hydraulic brake is cut when power is lost, unintended rotation of the blade 73 might not be suppressed.
  • The circumferential direction θ of the blade 73 (the line passing through points Pu, P0, and Pd) and the longitudinal direction of the blade surface in the R cross-sectional view (the line passing through points Px, P0, and Py) are perpendicular, at 90°.
  • The imaging positions P1, P2, P3, and P4 are therefore set at about 45° away from this perpendicular.
  • In other words, the control device 21 sets the movement range (the photographing positions) of the unmanned aerial vehicle 10 outside the swept range of the blade's rotation direction θ, as sketched below.
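A sketch of this placement, modeling the R cross section as a plane with the blade section centre at P0; the four 45° offsets follow the description above, while the function signature is an illustrative assumption:

```python
import math

def shooting_positions(p0, d, rotation_dir_deg):
    """Place P1..P4 on a circle of radius D around the blade section centre
    P0 (FIG. 7), offset about 45 degrees from both the circumferential line
    Pu-P0-Pd and the blade-surface line Px-P0-Py, so that no shooting
    position lies on the rotor's orbit."""
    x0, y0 = p0
    positions = []
    for off in (45.0, 135.0, 225.0, 315.0):   # P1, P2, P3, P4
        ang = math.radians(rotation_dir_deg + off)
        positions.append((x0 + d * math.cos(ang), y0 + d * math.sin(ang)))
    return positions
```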
  • FIG. 8 is a modified example of the R cross-sectional view of FIG. 7.
  • Before inspection, the observer of the wind turbine 70 shifts the fixed angle of the blade 73 relative to the nacelle 72 (the line passing through points Px, P0, and Py) away from the normal operating orientation (perpendicular to the circumferential direction θ), so that in FIG. 8 the line is lowered slightly to the right.
  • The unmanned aerial vehicle 10 then photographs the front surface of the blade 73 at position P5 and the back surface at position P6.
  • In this way, collision between the unmanned aerial vehicle 10 and the blade 73 at the photographing positions is avoided, and the number of photographs is appropriately reduced.
  • FIG. 9 is a three-dimensional view showing the spiral flight route F.
  • The unmanned aerial vehicle 10 may move along a flight route F that turns spirally around the blade 73 while gradually advancing along the R axis from the blade root at the nacelle 72 toward the tip of the blade 73.
  • On each turn, images of the imaging planes in four directions are obtained by photographing a total of four times, at positions P1, P2, P3, and P4. A waypoint sketch follows.
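A waypoint sketch of the spiral route, assuming an illustrative number of turns; the four shots per turn correspond to positions P1 to P4:

```python
import math

def spiral_route(blade_length_m, d, turns, shots_per_turn=4):
    """Yield (r, x, y) waypoints for the spiral flight route F (FIG. 9):
    the vehicle circles the blade at separation distance D while advancing
    along the R axis from the root (r = 0) to the tip. blade_length_m and
    turns are illustrative inputs, not values from the patent."""
    total_shots = turns * shots_per_turn
    for i in range(total_shots + 1):
        ang = 2 * math.pi * i / shots_per_turn + math.pi / 4  # P1..P4 per turn
        yield (blade_length_m * i / total_shots,
               d * math.cos(ang), d * math.sin(ang))
```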
  • The flight route that reciprocates along the R axis and the flight route that spirals around the R axis have been described above with reference to FIGS. 5 through 9. Common to both is that every position on the flight route maintains the predetermined separation distance D from the blade 73. The control device 21 therefore instructs the unmanned aerial vehicle 10 with photographing parameter information such as the predetermined separation distance D and the imaging plane.
  • The predetermined separation distance D is the distance, measured on the R cross section (the plane perpendicular to the R axis, the longitudinal axis of the blade 73), at which the photographing device 11 achieves focus on the blade 73 lying on the same R cross section.
  • The usable range of the separation distance D is the range over which focus can be achieved, so it depends on the f-number (aperture value) in the photographing parameter information. To widen the range over which focus is achievable (deepen the depth of field), increase the f-number; to narrow it (shallow depth of field), decrease the f-number. The sketch below illustrates this relationship.
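The dependence of the in-focus band on the f-number can be made concrete with the standard photographic depth-of-field approximation; this is a generic optics sketch, not a formula from the patent, and the 0.03 mm circle of confusion is a common full-frame assumption:

```python
def depth_of_field(focal_mm, f_number, focus_mm, coc_mm=0.03):
    """Return (near, far) limits (mm) of acceptable focus around the focus
    distance, using the standard hyperfocal approximation. A larger
    f-number widens the in-focus band around the separation distance D."""
    h = focal_mm ** 2 / (f_number * coc_mm) + focal_mm   # hyperfocal distance
    near = focus_mm * (h - focal_mm) / (h + focus_mm - 2 * focal_mm)
    far = (float("inf") if focus_mm >= h
           else focus_mm * (h - focal_mm) / (h - focus_mm))
    return near, far

# A 50 mm lens focused at D = 5 m:
#   depth_of_field(50, 2.8, 5000) -> roughly 4.3 m to 6.0 m (shallow)
#   depth_of_field(50, 11,  5000) -> roughly 3.0 m to 14.4 m (deep)
```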
  • In FIG. 10, the horizontal axis is the R cross section (the X and Y axes) and the vertical axis is the R axis.
  • At the reference position, the unmanned aerial vehicle 10 maintains the separation distance D from the blade 73. The unmanned aerial vehicle 10 (photographing device 11) at photographing position 101Q therefore obtains the current separation distance 111 to the blade 73 by adjusting focus with its autofocus function, with the blade 73 as the subject.
  • If the current separation distance 111 is smaller than the separation distance D, the unmanned aerial vehicle 10 moves in the direction that widens it (rightward in FIG. 10); if it is greater than D, the vehicle moves in the direction that narrows it (leftward in FIG. 10). When the current separation distance 111 substantially matches D, position adjustment along the horizontal axis of FIG. 10 is complete. The camera is then focused on the blade 73 at separation distance D, so collision between the two is avoided and a clear image of the blade surface is obtained. A control-loop sketch follows.
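A minimal sketch of this adjustment loop, assuming hypothetical callables for the autofocus distance readout and the lateral motion command:

```python
def hold_separation(read_af_distance, move_lateral, d_target, tol_m=0.05):
    """Step the vehicle toward or away from the blade until the autofocus-
    derived separation distance matches the commanded distance D.
    read_af_distance() returns the current separation distance 111 (m);
    move_lateral(step_m) moves the vehicle (positive = away from blade).
    Both callables are illustrative assumptions."""
    while True:
        current = read_af_distance()
        error = d_target - current      # > 0: too close, move away
        if abs(error) <= tol_m:         # substantially matches D: done
            return current
        move_lateral(0.5 * error)       # proportional step toward D
```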
  • The photographing device 11 is fitted with a lens capable of photographing the desired angle of view 101K, selected from lenses of different focal lengths such as a 24 mm wide-angle lens, a 50 mm standard lens, and a 200 mm telephoto lens.
  • The angle of view 101K is also designated to the unmanned aerial vehicle 10 by the control device 21 as one element of the photographing parameter information.
  • From the angle of view, an imaging range 101D in the R-axis direction, containing the imaging centre 101P, is determined.
  • Lenses include zoom lenses, whose focal length can be changed within a predetermined range, and single-focal-length lenses, whose focal length is fixed.
  • By adopting a 24 mm wide-angle lens or the like in the photographing device 11 to widen the imaging range 101D of each image, the number of captured images, and hence the data volume of the image information 31, can be reduced.
  • Conversely, by using a 200 mm telephoto lens or the like in the photographing device 11 to narrow the imaging range 101D of each image, the number of captured images increases, but sharper images are captured as the image information 31, improving inspection accuracy. The sketch below relates focal length, separation distance, and imaging range.
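The pinhole-camera relation below makes this trade-off concrete. It is a generic optics sketch, assuming the long side of a full-frame sensor (the 24 mm here is illustrative) oriented along the R axis:

```python
import math

def imaging_range_m(d_m, focal_mm, sensor_mm=24.0):
    """Width of the imaging range 101D along the R axis for one shot:
    angle of view 101K = 2*atan(sensor / (2*focal)), and the range at
    separation distance D is 2*D*tan(101K / 2) = D*sensor/focal."""
    fov = 2 * math.atan(sensor_mm / (2 * focal_mm))   # angle of view 101K
    return 2 * d_m * math.tan(fov / 2)

# At D = 5 m: a 24 mm lens covers 5.0 m of blade per shot,
# while a 200 mm lens covers only 0.6 m (more shots, sharper detail).
```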
  • The control device 21 obtains appropriate image information 31 (a set of captured images) by designating photographing parameter information to the unmanned aerial vehicle 10 according to (Photographing policy 1) or (Photographing policy 2) below and having it photograph accordingly.
  • (Photographing policy 1) Overlap is permitted between the first imaging range 101D and the second imaging range 102D, but no unphotographed portion may occur. Because the imaging ranges then fully cover the blade 73, inspection omissions are prevented.
  • (Photographing policy 2) No overlap is permitted between the first imaging range 101D and the second imaging range 102D, and no unphotographed portion may occur. That is, photographs are taken so that the edge of imaging range 101D (the upper end in FIG. 11) and the edge of imaging range 102D (the lower end in FIG. 11) touch. This reduces the number of shots while still preventing inspection omissions.
  • The photographing parameter information thus includes an indicator of (Photographing policy 1) or (Photographing policy 2) and the imaging ranges 101D and 102D (angles of view 101K and 102K).
  • In FIG. 11, the angle of view 101K equals the angle of view 102K, but they may be made unequal using a zoom lens. A spacing sketch follows.
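A sketch of how the two policies translate into the advance between consecutive shots along the R axis; the 20% overlap for policy 1 is an illustrative choice, since the patent only requires that no unphotographed portion occur:

```python
def shot_spacing_m(range_101d_m, policy, overlap_frac=0.2):
    """Distance to advance along the R axis between consecutive shots
    (FIG. 11). Policy 1 allows overlap but forbids gaps; policy 2 makes
    the edges of ranges 101D and 102D touch: no overlap, no gaps."""
    if policy == 1:
        return range_101d_m * (1.0 - overlap_frac)
    if policy == 2:
        return range_101d_m
    raise ValueError("policy must be 1 or 2")
```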
  • In this manner, the control device 21 moves the unmanned aerial vehicle 10 autonomously so as to match the photographing parameter information of the photographing device 11 specified in advance (the relationship between focal length and angle of view, and so on). This makes it possible to determine the position of the unmanned aerial vehicle 10 relative to the blades 73 easily, without using GPS (Global Positioning System) signals or collision avoidance sensors.
  • Here, autonomous movement means that the unmanned aerial vehicle 10, having received an instruction (photographing parameter information) from the control device 21, moves in accordance with that instruction. The movement can be called autonomous because the vehicle is not piloted by a human with a remote controller or the like.
  • Finally, the process of landing at the target position is described as (Step 4).
  • The unmanned aerial vehicle 10 lands at a designated point when it completes photographing, when the remaining flight time is short, when autonomous photographing flight becomes difficult due to strong winds, or when a return command is received from the observer through a communication unit (not shown).
  • A plurality of candidate landing points, different from the takeoff point, may be set in advance. This makes it possible to choose a point suited to the remaining flight time given the battery state or adverse environmental conditions, and to change the landing point as appropriate, improving safety.
  • The image information 31 photographed in this way is transmitted from the photographing device 11 to the control device 21 via the transmission device 12.
  • On the outbound pass, the control device 21 assigns unique identification information 32 to the image information 31 in order from the position closest to the reference position (No. 1, 2, ... in FIG. 5); on the return pass, it continues the numbering from the position farthest from the reference position (No. 51, ..., 99, 100 in FIG. 5).
  • As the damage state obtained from the inspection of a given photographed image, a wear state of level 2, for example, is added to the management information.
  • The control device 21 also assigns, to each piece of image information 31, identifying information (a blade No.) distinguishing each of the wind turbine 70's blades, and stores it in the storage unit 30.
  • The stored blade No. is referred to in the diagnostic processing of the control device 21 when comparing pieces of image information 31 tagged with per-blade identifying information. Acquiring the image information 31 per blade in this way makes it easy to compare, say, a blade of one machine (for example, blade No. 1 of machine 1) with the corresponding blade of another machine (blade No. 1 of machine 2) in diagnostic processing.
  • To obtain inspection results, normal image information 33 indicating the normal state of the blade 73 is stored in the storage unit 30 in advance. The control device 21 compares the image information 31 captured in the current inspection, acquired from the photographing device 11, with the prepared normal image information 33, matching items that share the same identification information 32. When the two differ, the control device 21 diagnoses the image information 31 from the current inspection as abnormal, and adds the damage state to the table of FIG. 12 as the diagnosis result. A comparison sketch follows.
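The patent states only that differing images are diagnosed as abnormal; a minimal sketch of such a comparison, with an illustrative pixel-difference metric and threshold, might look like this:

```python
import numpy as np

def diagnose(image_31, normal_33, threshold=0.1):
    """Compare a captured image with the normal image sharing the same
    identification information 32. Both inputs are greyscale arrays
    scaled to [0, 1]; metric and threshold are illustrative assumptions."""
    diff = np.abs(image_31.astype(float) - normal_33.astype(float))
    score = float(diff.mean())
    return ("abnormal" if score > threshold else "normal", score)
```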
  • The first embodiment described above may be modified as follows.
  • For example, the directions of the arrows indicating flight routes F1 and F2 in FIG. 5 may be reversed. That is, the outbound route F1 moves from the tip of the blade 73 to the vicinity of the nacelle 72, and the return route F2 moves from the vicinity of the nacelle 72 back to the blade tip.
  • Also, the control device 21 may generate the normal image information 33 used to evaluate the soundness of the wind turbine 70 by acquiring image information 31 whose inspection results were normal over a predetermined past period and learning from it with a statistical method. Such learning is, for example, a process that takes a set of images of the same object over the same imaging range, captured at different dates and times, and yields their average image as the learning result. The user then need not supply the normal image information 33 to the control device 21; it can be generated from image information 31 acquired over the past period. A sketch of this averaging follows.
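A sketch of the averaging example the patent gives, assuming equally sized greyscale images of the same shooting range:

```python
import numpy as np

def learn_normal_image(past_normal_images):
    """Generate normal image information 33 as the pixel-wise average of
    past images of the same object and shooting range whose inspection
    results were normal."""
    stack = np.stack([img.astype(float) for img in past_normal_images])
    return stack.mean(axis=0)
```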
  • The second embodiment adds a configuration for displaying the image information 31 captured by the unmanned aerial vehicle 10 on the display unit 40.
  • FIG. 13 is a configuration diagram showing the monitoring system 100 of the second embodiment. Description of the configuration shared with the monitoring system 100 of FIG. 1 is omitted.
  • The monitoring system 100 of FIG. 13 adds an input unit 22, a rule setting unit 23, and a display control unit 24 to the configuration of FIG. 1.
  • The input unit 22 is the input means by which the inspector enters various information, such as the screen display rules for displaying the image information 31 of the blade 73 on the display unit 40.
  • The input means is, for example, a keyboard, a mouse, or a touch panel.
  • The rule setting unit 23 passes the screen display rule received from the input unit 22 to the display control unit 24.
  • The display control unit 24 causes the display unit 40 to display the image information 31 of the blade 73 according to the set screen display rule (see FIGS. 14 to 16 for details).
  • FIG. 14 is a display screen diagram for the case where the first screen display rule is specified.
  • The first screen display rule 201 designates for display the set of images of blade No. 1 of machine No. 1 at site A photographed on imaging plane a, and also designates a comparative display against the normal image information 33.
  • Accordingly, the display control unit 24 shows the image information 31 matching the first screen display rule 201 as image set 202 and the normal image information 33 matching the rule as image set 203, dividing the single screen 200 into left and right halves.
  • FIG. 15 is a display screen diagram for the case where the second screen display rule is specified.
  • The second screen display rule 211 specifies image set No. 1 as the display target, using part of the management information in FIG. 12.
  • A comparative display across each of the imaging planes a to d is also specified.
  • FIG. 16 is a display screen diagram for the case where the third screen display rule is specified.
  • The third screen display rule 221 designates an image set for display using part of the management information in FIG. 12 (site name, unit number, blade No.).
  • A comparative display for each shooting date and time is also specified. A filtering sketch follows.
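A sketch of how a screen display rule can be applied as a filter over the management information of FIG. 12; the record fields are illustrative stand-ins for the site name, unit number, blade No., imaging plane, shooting date, and identification information:

```python
from dataclasses import dataclass

@dataclass
class ImageRecord:
    """One row of the management information (FIG. 12); field names are
    illustrative assumptions."""
    site: str
    unit_no: int
    blade_no: int
    plane: str
    shot_date: str
    id_no: int       # identification information 32

def select_for_display(records, **criteria):
    """Keep records matching every specified criterion, ordered by
    identification information. The first screen display rule of FIG. 14
    would be select_for_display(records, site="A", unit_no=1,
    blade_no=1, plane="a")."""
    hits = [r for r in records
            if all(getattr(r, k) == v for k, v in criteria.items())]
    return sorted(hits, key=lambda r: r.id_no)
```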
  • FIGS. 17A and 17B are display screen diagrams distinguishing whether the inspection result is normal or abnormal.
  • While FIG. 15 illustrates a case where all displayed images are normal, damage may appear in only some of the displayed images.
  • In that case, the display control unit 24 may display damage description information together with the image information 31 for each piece of identification information 32.
  • As the first display 231 within screen 230, the display control unit 24 shows side by side the image information 31 obtained by photographing the same imaging position of the same blade 73 at different times. Under each displayed image appear the shooting date (such as 2018/4/1), the diagnosis result (normal or abnormal), and, for abnormal results, the degree of damage (1% to 100%; the higher the number, the more serious the damage).
  • In this example, the blade 73, which was normal on April 1, 2018, suffered mild damage 231a by October 1, 2018, which progressed to severe damage 231b by April 1, 2019.
  • The control device 21 stores, in time series in the storage unit 30, the image information 31 of the same photographing position for images diagnosed as abnormal, and calculates the degree of progression of the damage state by comparing the stored time-series images of the same photographing position, as sketched below.
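A sketch of that time-series comparison, reusing the illustrative difference score from the diagnosis sketch above as a stand-in for the patent's degree of damage (1% to 100%):

```python
import numpy as np

def damage_progression(series, normal_33):
    """Score each image of the same photographing position against the
    normal image, in date order, and report the change between dates.
    series: list of (shot_date, image) in chronological order; images are
    greyscale arrays in [0, 1]. The percentage score is illustrative."""
    scores = []
    for shot_date, img in series:
        diff = float(np.abs(img.astype(float) - normal_33.astype(float)).mean())
        scores.append((shot_date, round(100.0 * min(diff, 1.0), 1)))
    deltas = [(later[0], round(later[1] - earlier[1], 1))
              for earlier, later in zip(scores, scores[1:])]
    return scores, deltas
```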
  • In the second display 232, similar damage is found in the damage 232a of Unit 1 at site A and the damage 232b of Unit N at site X.
  • The first display 231 or the second display 232 makes it easy to grasp the damaged portions across the entire blade.

Landscapes

  • Engineering & Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Sustainable Energy (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Sustainable Development (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • General Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Wind Motors (AREA)
  • Closed-Circuit Television Systems (AREA)
  • Alarm Systems (AREA)
PCT/JP2022/020456 2021-05-17 2022-05-17 Monitoring system and monitoring method WO2022244752A1 (ja)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021-083458 2021-05-17
JP2021083458A JP7369735B2 (ja) 2021-05-17 2021-05-17 Monitoring system and monitoring method

Publications (1)

Publication Number Publication Date
WO2022244752A1 true WO2022244752A1 (ja) 2022-11-24

Family

ID=84141409

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/020456 WO2022244752A1 (ja) 2021-05-17 2022-05-17 監視システム、および、監視方法

Country Status (3)

Country Link
JP (2) JP7369735B2 (zh)
TW (1) TWI818539B (zh)
WO (1) WO2022244752A1 (zh)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115962101B * 2022-12-05 2024-03-22 中材科技风电叶片股份有限公司 Stall state monitoring method and system

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH09305764A * 1996-05-14 1997-11-28 Kawatetsu Joho Syst Kk Method and apparatus for detecting pattern defects in images
US20160017866A1 * 2014-07-18 2016-01-21 General Electric Company Wind tower and wind farm inspections via unmanned aircraft systems
JP2016136105A * 2015-01-23 2016-07-28 新日鐵住金株式会社 Abnormality detection method and abnormality detection device for mechanical equipment
JP2018021491A * 2016-08-02 2018-02-08 株式会社日立製作所 System and flight route generation method
US20180149138A1 * 2016-11-30 2018-05-31 Dji Technology, Inc. Aerial inspection in a movable object environment
JP2018181235A * 2017-04-21 2018-11-15 古河電気工業株式会社 Report creation device, wind power generation equipment inspection system, program, and method for creating an inspection report for wind power generation equipment
JP2019070631A * 2017-10-11 2019-05-09 株式会社日立システムズ Deterioration diagnosis system using a flying vehicle
JP2019133306A * 2018-01-30 2019-08-08 株式会社日立製作所 Image processing apparatus and image processing method

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI571720B (zh) * 2015-12-09 2017-02-21 財團法人金屬工業研究發展中心 風力發電機之葉片檢查系統及其檢查方法
TWI627351B (zh) * 2016-12-13 2018-06-21 財團法人金屬工業研究發展中心 利用無人飛行載具對風機葉面攝影之行徑產生方法、內儲程式之電腦程式產品及內儲程式之電腦可讀取記錄媒體


Also Published As

Publication number Publication date
JP2022176821A (ja) 2022-11-30
JP7369735B2 (ja) 2023-10-26
TWI818539B (zh) 2023-10-11
TW202308905A (zh) 2023-03-01
JP2022176953A (ja) 2022-11-30

Similar Documents

Publication Publication Date Title
US11181935B2 (en) System and method for detecting obstacles in aerial systems
JP7175652B2 (ja) 航空機用レーザスペックルのシステム及び方法
CN108803668B Intelligent inspection UAV pod system for static target monitoring
EP2697700B1 System and method for controlling an unmanned aircraft
US9310189B2 Method and system for the remote inspection of a structure
WO2017150433A1 Unmanned aerial vehicle, unmanned aerial vehicle control system, flight control method, and program storage medium
WO2018056498A1 Apparatus and method for acquiring electromagnetic field and patrol inspection images of power transmission lines
US20170142309A1 Imaging apparatus and imaging method
EP3596570B1 Method for determining a path along an object, and system and method for automatically inspecting an object
CN109073498A Mobile aircraft for scanning an object and system for damage analysis of the object
WO2022244752A1 Monitoring system and monitoring method
US11430101B2 Unmanned aerial vehicle, inspection method, and storage medium
JP7011908B2 Optical information processing apparatus, optical information processing method, and optical information processing program
JP2024027907A Unmanned aerial vehicle, wind power generation equipment inspection system, and wind power generation equipment inspection method
KR20160123551A System and method for automatic control of a phase-information-based drone system for power facility inspection
CN112327913A UAV flight control method and system for power line inspection
US20230366775A1 Method, aerial vehicle and system for detecting a feature of an object with a first and a second resolution
CN118062282A UAV payload platform for wind turbine blade inspection
JP7022858B1 Structure display device and structure display method
EP4223648A1 Automated method and system for aircraft inspection with data validation
Hidaka et al. Autonomous Flight Control of small UAV within the view area based on multi–camera coupling
Nabandit et al. Obstacle Detection and Avoidance based on 3D Stereoscopic Technique for Autonomous Drone
CN113273173A Inspection method and device for a movable platform, movable platform, and storage medium
CN116480536A Method for autonomous UAV tracking and inspection of wind turbine blades
KR20230115042A Collision-avoidance drone and control method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22804663

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 22804663

Country of ref document: EP

Kind code of ref document: A1