WO2022003213A1 - System and method for detecting avifauna in wind farms - Google Patents


Info

Publication number
WO2022003213A1
Authority
WO
WIPO (PCT)
Prior art keywords
equipment
module
birds
wind
bird
Prior art date
Application number
PCT/ES2020/070415
Other languages
English (en)
Spanish (es)
Inventor
Ramón DOLZ GARCIA
Roberto ANTON AGIRRE
Vicente CAMPOS TENA
Original Assignee
3D Observer Project, S.L.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 3D Observer Project, S.L. filed Critical 3D Observer Project, S.L.
Priority to PCT/ES2020/070415 priority Critical patent/WO2022003213A1/fr
Priority to ES202290033U priority patent/ES1303416Y/es
Publication of WO2022003213A1 publication Critical patent/WO2022003213A1/fr


Classifications

    • A HUMAN NECESSITIES
    • A01 AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01M CATCHING, TRAPPING OR SCARING OF ANIMALS; APPARATUS FOR THE DESTRUCTION OF NOXIOUS ANIMALS OR NOXIOUS PLANTS
    • A01M29/00 Scaring or repelling devices, e.g. bird-scaring apparatus
    • F MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F03 MACHINES OR ENGINES FOR LIQUIDS; WIND, SPRING, OR WEIGHT MOTORS; PRODUCING MECHANICAL POWER OR A REACTIVE PROPULSIVE THRUST, NOT OTHERWISE PROVIDED FOR
    • F03D WIND MOTORS
    • F03D17/00 Monitoring or testing of wind motors, e.g. diagnostics
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20 Image signal generators
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • F MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F03 MACHINES OR ENGINES FOR LIQUIDS; WIND, SPRING, OR WEIGHT MOTORS; PRODUCING MECHANICAL POWER OR A REACTIVE PROPULSIVE THRUST, NOT OTHERWISE PROVIDED FOR
    • F03D WIND MOTORS
    • F03D7/00 Controlling wind motors
    • F MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F03 MACHINES OR ENGINES FOR LIQUIDS; WIND, SPRING, OR WEIGHT MOTORS; PRODUCING MECHANICAL POWER OR A REACTIVE PROPULSIVE THRUST, NOT OTHERWISE PROVIDED FOR
    • F03D WIND MOTORS
    • F03D80/00 Details, components or accessories not provided for in groups F03D1/00 - F03D17/00
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast

Definitions

  • the invention consists of a system for detecting birds in flight and for their positioning and three-dimensional monitoring in real time which, by integrating the analysis of the captured images with the data collected at the time, such as the meteorological variables and the rotation speed and position of the wind turbine blades with respect to the direction of the bird, defines an action strategy based on a probabilistic prediction of the path that the detected bird will follow at a future time.
  • This prediction is generated by an artificial intelligence system or equipment by combining all the variables captured by the system in real time with those previously captured, stored in a record of detections and classified through statistical analysis.
  • the system, based on the analysis of the movement of the three-dimensionally positioned birds in real time, acts with sufficient anticipation to make decisions ranging from the shutdown of wind turbines to the activation of any deterrent method.
  • the present invention aims to considerably reduce the mortality of these species in wind farms by means of a system, based on Stereoscopic Artificial Vision and Artificial Intelligence, and a series of measures activated by said system.
  • the system object of this patent aims to address the problem from a diametrically different angle than the systems currently in use, which normally base all their functionality on the premise of the precise detection of the medium-to-large birds present in the working area, species that tend to coincide with threatened species.
  • IdentiFlight bases its operation on computer vision technology, and comprises two sets of cameras.
  • the first set is the lower one and consists of a circular distribution of cameras covering 360° horizontally in two dimensions around the site of the system (the vertical coverage depends on the optics used). These cameras are in charge of detecting possible birds in the immediate environment, but it is important to note that they do not capture the environment stereoscopically, i.e. they do not obtain a 3D image, but a 360° two-dimensional (2D) image.
  • This first 2D subsystem of IdentiFlight provides information on the orientation of the detected birds (azimuth and elevation directions), not being able to provide information on the distance at which the bird is or its size, since due to the distribution of the cameras a single point of view is obtained for every point in space and therefore cannot provide 3D information.
  • the other set of IdentiFlight cameras is located at the top and is made up of two cameras mounted on a PAN/TILT system, a system capable of orienting these cameras through horizontal and vertical movements. Together, these two cameras can obtain a 3D image from which the position and size of the captured object can be determined, but they do not cover 360°; they are limited to the degrees covered by the optics of each of the cameras.
  • the direction in which the detected bird is located (since its (X, Y, Z) position is not known) is sent to the upper 3D PAN/TILT system, which points at the object and only then determines its position in three dimensions, initiating a follow-up of the object through movements of the PAN/TILT.
  • the IdentiFlight 3D system can only point at one of the objects, chosen from among those being detected by the lower 2D system that covers 360°, and follows it in space through the movements of the PAN/TILT system. In this way it can only track one target, and if the IdentiFlight 360° system detects several birds in flight, it must choose which one to target. Furthermore, the precision with which a stereoscopic system can locate an object in space is directly determined by the separation between the cameras and the resolution of the optical system.
  • the stereoscopic pair of the IdentiFlight system is made up of two cameras horizontally separated by about 50 cm, while the stereoscopic pairs of the present invention have different specifications, since they are arranged vertically and much further apart (several meters apart).
  • this IdentiFlight system is not capable of simultaneously tracking multiple targets in 3D, since the 3D positioning and calculation is carried out from a stereoscopic PAN-TILT system that has to "aim" at the target, in such a way that it can only follow one target at a time. This is an important limitation in areas where there may be multiple birds flying in the vicinity of the wind turbine. In addition, the accuracy of its three-dimensional measurements is limited by the reduced separation between the cameras that make up the stereoscopic pair.
  • this system has a response time too long to be operational for the purpose sought in the present invention, since the initial detection is carried out in 2D and determines only the direction from which the bird approaches; the system therefore does not operate in 3D until the stereoscopic PAN/TILT has been positioned pointing at the bird and tracking can begin, by which time the bird may already have impacted the wind turbine. This is coupled with the fact that, if several birds were on potential risk trajectories, the system would be unable to follow them all, having to choose only one of them.
  • DTBird The system known as DTBird is also known and can be seen and consulted at: https://dtbird.com/index.php/es/. Like the previous system (IdentiFlight), this system is also based on machine vision technology. This system is mounted on the mast of a wind turbine itself, the cameras being arranged as a belt looking from the lower part of the pole towards the upper part where the rotor and blades are located.
  • the DTBird System is not a stereoscopic system, it is a two-dimensional (2D) system and this implies that, although it can detect objects in flight, it cannot locate them in three-dimensional space. Due to this limitation, it is not able to determine real distances, sizes, 3D positions or trajectories.
  • the generation of the alarms is carried out under the assumption (without certainty) that the trajectory of the bird implies a risk, from the projection of its 2D image onto some supposed "real" positions, and since it can determine neither the distance nor the size of the detected object, it confuses small nearby objects with larger distant objects, generating numerous false positives, as cited in the article: "Norwegian Institute for Nature Research (NINA). 2012. Evaluation of the DTBird video-system at the Smøla wind-power plant. Detection capabilities for capturing near-turbine avian behaviour."
  • an electronic artificial intelligence equipment, trained with the aforementioned database, makes, in real time, the decision to stop the wind turbines or to activate other deterrence methods, based on the variables recorded at the time of detection: the position and trajectory of the bird in space, its size, the meteorological variables and the state variables of the wind turbines.
  • the present invention solves the previous technical problems related to being able to calculate the position, size and real trajectories of the birds; it does not generate coverage problems for the study area; and the response time is reduced.
  • Figure 1 shows a representation of the system's operating mode, including the essential components of the system.
  • the present invention consists of a system for the observation and determination, in real time, of the three-dimensional position and the flight path of the birds that occupy the space of a wind farm, using stereoscopic artificial vision techniques complemented with two-dimensional images ( 2D) of high resolution, meteorological variables, data on the status of the different wind turbines that make up said Park and an analysis by means of artificial intelligence techniques of a Historical Register of Detections.
  • the system makes it possible to predict possible collisions of birds with wind turbines and act before they occur, reducing the probability that birds will hit the blades of the wind turbines.
  • the present invention proposes a system based on artificial vision, which consists of a stereoscopic equipment with a set of 3D cameras and a high-resolution equipment, equipped with high-resolution cameras and a high-precision Pan Tilt system.
  • the 3D camera set is mounted on a column several meters high.
  • High resolution equipment is also mounted on this column.
  • Several columns, equipped with stereoscopic cameras and high-resolution equipment, strategically distributed, can cover the entire space occupied by the Wind Farm in which the death of birds is to be avoided.
  • the set of columns, thanks to the high resolution of the cameras and the great separation between the sensors that make up each stereoscopic pair, obtains with high precision the three-dimensional position that the birds occupy in the airspace of the Wind Farm.
  • the identification of which species or group the bird belongs to is improved with the help of close-up images captured by a high-resolution equipment with a 2D camera with a horizontal (PAN) and vertical (TILT) axis movement control system.
  • the 2D camera in addition to having a high-resolution sensor, has a long-focal zoom lens, that is, a telephoto lens that, guided by the stereoscopic system, obtains close-ups of the birds in flight.
  • although this high-resolution equipment is only capable of following one bird, or one group of birds, at a time, this does not impair the ability of the system to prevent bird mortality, since these images, together with the flight sequences of the detected birds and the various parameters, have as their main purpose improving the identification of the species or group to which the detected birds belong; they are not used to determine the position and trajectory of the birds.
  • the invention is based on the capture of images using at least one stereoscopic equipment with 3D cameras and at least one high-resolution equipment with high-resolution 2D cameras, together with an image processing and analysis equipment that, from the images of the 3D cameras and by means of artificial vision algorithms, detects the birds in flight and determines their spatial position at certain times, assigning a set of coordinates (X, Y, Z, t) to each of the birds detected.
  • the entire system is referenced and topographically calibrated with respect to the space to be monitored.
  • the three-dimensional position of each detected bird is obtained in real time, several times per second, making it possible to determine its flight path and to act accordingly in the event that a bird is observed flying on a collision course with a wind turbine.
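The patent does not disclose its prediction algorithm; as a minimal sketch of the idea in the bullet above, a track of (X, Y, Z, t) fixes can be linearly extrapolated to flag a collision course. The function name, the sampled positions, the turbine coordinates and the risk radius below are all illustrative assumptions:

```python
import math

def collision_risk(track, turbine_pos, risk_radius_m, horizon_s, steps=50):
    """Linearly extrapolate the last two (x, y, z, t) fixes of a track and
    report whether the bird comes within risk_radius_m of the turbine
    within horizon_s seconds. A deliberately simple stand-in for the
    probabilistic prediction the patent describes."""
    (x0, y0, z0, t0), (x1, y1, z1, t1) = track[-2], track[-1]
    dt = t1 - t0
    vx, vy, vz = (x1 - x0) / dt, (y1 - y0) / dt, (z1 - z0) / dt
    for i in range(steps + 1):
        t = horizon_s * i / steps
        fx, fy, fz = x1 + vx * t, y1 + vy * t, z1 + vz * t
        if math.dist((fx, fy, fz), turbine_pos) < risk_radius_m:
            return True
    return False

# Hypothetical fixes: a bird flying straight toward a turbine at (200, 0, 80)
track = [(0.0, 0.0, 80.0, 0.0), (20.0, 0.0, 80.0, 1.0)]
print(collision_risk(track, (200.0, 0.0, 80.0), 50.0, 10.0))  # True
```

A real implementation would use many fixes per second and the statistical flight-route model described below, rather than a straight-line extrapolation.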
  • the system also receives the data corresponding to the meteorological variables (wind speed and direction, temperature and relative humidity) from an electronic module for detecting meteorological variables, as well as from an electronic module for detecting the state variables of the wind turbines (orientation of the rotor, i.e. of the axis of rotation, with respect to the geographical coordinate axes, and the speed of rotation).
  • the system that is the object of this patent not only receives the data to process it in real time, but also has the capacity to store all the data that is collected in each of the detections it carries out.
  • an electronic detection recording module stores the coordinates of each detected bird, its size, the images corresponding to each detection (both from the stereoscopic equipment and the high-resolution equipment), as well as the various meteorological and status variables of the wind farm turbines, forming a database in the electronic detection record module, which is a historical record of detections.
  • the 3D and 2D images, the coordinates (X, Y, Z, t), the meteorological data of the moment (wind speed and direction, temperature, etc.) and the status data of the wind turbines (rotor orientation and turning speed) form a database that, over time and processed through statistical analysis and artificial intelligence techniques, allows modeling and describing in detail the most common birds in the wind farm, as well as their behavior according to the species, the time of year, the meteorological variables or the orientations and speeds of the wind turbines.
  • the analysis of the stored data is used to determine the species, or the group to which the bird belongs, as well as to model its behavior (type of flight, trajectories, possible maneuvers to avoid collisions, etc.), linking its behavior to the other variables that were recorded at the time of detection (size, position in space, speed, acceleration, meteorological parameters, time of day, parameters of the wind turbines, etc.).
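The detection record described in the bullets above can be sketched as a data structure; the field names and values below are illustrative assumptions, not the patent's actual schema:

```python
from dataclasses import dataclass, asdict

@dataclass
class DetectionRecord:
    """One hypothetical entry of the historical record of detections:
    bird position and size plus the meteorological and turbine-state
    variables captured at the moment of detection."""
    timestamp: float            # epoch seconds of the detection
    position: tuple             # (X, Y, Z) in metres
    estimated_size_m: float     # estimated wingspan
    wind_speed_ms: float
    wind_direction_deg: float
    temperature_c: float
    rotor_orientation_deg: float  # rotor axis vs. geographic north
    rotor_speed_rpm: float
    species_label: str = "unknown"  # filled in later from the 2D close-ups

record = DetectionRecord(
    timestamp=1_593_561_600.0,
    position=(412.0, -105.0, 96.5),
    estimated_size_m=1.8,
    wind_speed_ms=7.2,
    wind_direction_deg=310.0,
    temperature_c=18.5,
    rotor_orientation_deg=295.0,
    rotor_speed_rpm=12.0,
)
print(asdict(record)["position"])  # (412.0, -105.0, 96.5)
```

In practice such records would also reference the stored stereoscopic and high-resolution images mentioned in the text.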
  • the uniqueness of this system lies in the fact that, some time after its installation in a wind farm, an exhaustive knowledge is achieved of the behavior of the birds that populate the place where it is installed; behavior that is, obviously, linked to the rest of the environment variables (meteorological parameters and position and movements of the wind turbines).
  • the analysis of the data stored in the registration system makes it possible to determine the most probable routes that a type of bird will take, taking into account the meteorological variables and the existing wind turbines at that time.
  • linked to the high-resolution equipment and the electronic detection recording module there is an electronic artificial intelligence equipment.
  • This artificial intelligence equipment has been previously fed with data from the statistical analysis of the historical record of detections, data that is periodically updated as more data accumulates over time in the registration module.
  • the artificial intelligence equipment receives all the data being generated at the moment, that is, the three-dimensional coordinates, the estimated size and the images generated by the stereoscopic system, the high-resolution two-dimensional images coming from the high-resolution equipment, and the rest of the environment variables (meteorological variables and the position and status of the wind turbines).
  • the artificial intelligence equipment makes the decision to activate or not an alarm signal based on the data that reaches it in real time, comparing it with the statistical analysis of the historical record of detections, which provides it with the most probable flight routes that the birds can follow under the conditions existing at the time.
  • This alarm signal can be translated into the shutdown of the wind turbines or the activation of other dissuasive methods.
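The patent describes a trained artificial intelligence model for this decision; as a hedged illustration only, the mapping from a predicted risk to the actions listed above could be caricatured as a threshold rule. The function, thresholds and numbers below are invented for the example:

```python
def alarm_decision(risk_probability, distance_m, speed_ms,
                   stop_threshold=0.7, deter_threshold=0.4):
    """Toy rule-based stand-in for the AI equipment's decision: map a
    predicted collision probability and the bird's distance/speed to one
    of the actions described in the text (turbine stop, deterrents, none)."""
    time_to_reach_s = distance_m / max(speed_ms, 0.1)  # avoid divide-by-zero
    if risk_probability >= stop_threshold and time_to_reach_s < 60:
        return "stop_turbine"
    if risk_probability >= deter_threshold:
        return "activate_deterrents"
    return "no_action"

print(alarm_decision(0.85, 300.0, 12.0))  # stop_turbine
```

The real system would derive the risk probability from the historical record of detections rather than receive it as an input.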
  • the method by which the detection of birds in the environment of a wind farm is achieved, and the necessary measurements are taken, comprises: the taking of stereoscopic images with at least one stereoscopic equipment with 3D cameras arranged in the vicinity of the wind turbines, covering an area sufficient to detect in advance the birds on possible impact trajectories.
  • the 3D cameras are grouped in columns. Each column (c1, ..., cn) can have multiple stereoscopic cameras, and multiple columns, strategically distributed, can cover the entire space occupied by the wind farm.
  • obtaining the meteorological variables (wind speed and direction, temperature, relative humidity, precipitation) from an electronic meteorological-variables detector module located in the control center of the wind farm (SCADA).
  • obtaining the state variables of the wind turbines of the wind farm, that is, the orientation of the axis of rotation of the blades with respect to geographic north and their speed in revolutions per minute, from an electronic module detecting the state variables of the wind turbines located in the control center of the wind farm (SCADA); and the registration of all the aforementioned variables in an electronic detection recording module, generating a database also called the historical record of detections.
  • the system includes several possibilities of action that can act individually or in a complementary way, and that can consist of:
  • stopping the wind turbine (or decreasing its rotation speed) in such a way that the risk of impact is minimized, while also minimizing the loss of power-generation efficiency and the possible negative effects on the mechanical elements of the wind turbine.
  • deterrent systems in the wind turbine blades, for example lights or any mechanism that changes their morphological appearance; or the activation of other deterrent systems using sounds, light, water, etc.
  • a stereoscopic camera is considered to consist of two conventional two-dimensional cameras, or 2D cameras, rigidly joined and separated from each other by a distance such that, for the purposes of this patent, it allows the three-dimensional position of the birds to be determined with adequate precision.
  • This precision is determined by the separation between the 2D cameras, by the resolution of the sensor and by the optics used, but to simplify the explanation we will not refer to these technical specifications unless essential.
  • a set of two 2D cameras constitutes a stereoscopic pair and forms a 3D camera (hereinafter 3D camera).
  • the present invention comprises at least one stereoscopic equipment consisting of several 3D cameras that together cover the airspace under surveillance.
  • the field of vision and the three-dimensional precision of the equipment is sufficient to be able to detect and position the birds at distances that allow, in advance, to activate any method that prevents the impact of these with the wind turbines.
  • Each piece of equipment, made up of a set of 3D cameras, is installed in the vicinity of the wind turbine on a support: a column.
  • the stereoscopic equipment is comprised of at least one support pole (1), which is the component in which all the rest of the elements of the stereoscopic equipment are located.
  • the support mast (1) consists of a column whose dimensions vary depending on the project and which, in general, can be between 10 and 15 m in height and 35-100 cm in diameter, and which in one embodiment is preferably 12 m in height and 40 cm in diameter. This column is located in the vicinity of the wind turbines on which the impact of birds is to be avoided.
  • a first section that goes from the zero level to the 4 m height.
  • This section is left bare in order to deter vandalism.
  • There is only an access hatch that allows the passage of the electrical supply and communication cables, and access to them.
  • a second section goes from 4 meters high to 12 meters.
  • in this second section there is a ladder (11).
  • in this section not only the 3D camera equipment is mounted, but also the high-resolution equipment, the electrical panel, computers and communication antennas.
  • the support pole is conceived as a modular system in which different devices are mounted on the TB according to needs.
  • the support mast (1) is installed at a certain distance from the wind turbine (A), as seen in Figure 5. This distance depends on the resolution of the cameras, the optics that are installed and the needs regarding the distance to the one that the birds have to detect, as well as the range of their sizes. For descriptive purposes only, this distance is considered to be 250 meters.
  • the position that the support mast (1) occupies with respect to the wind turbine (A), if the orography allows it, is optimized to avoid, as much as possible, glare from the Sun, that is, in such a way that the wind turbine is to the north of the support.
  • the column is located to the south of the wind turbine to be monitored and aligned with geographic north. The cameras are therefore protected as much as possible from glare, since the Sun almost always travels to the side of or behind them. This ensures maximum efficiency for as long as possible, regardless of the season of the year.
  • Each 3D camera is a stereoscopic pair composed of two cameras mounted vertically on the TB of the column, that is to say, one camera is in the upper part (2S) of this section and the other in the lower part (2I), separated a certain distance from the upper one. In this way the stereoscopic pair is mounted vertically, as can be seen in Figure 7.
  • each 3D camera is made up of two 2D cameras aligned with each other and vertically separated by a distance which, for descriptive purposes, is 8 meters.
  • the 2D cameras that make up the stereoscopic pair would be located at a height of 4 m and 12 m respectively.
  • Each of the 2D cameras that make up a stereoscopic pair are equipped with housings that protect them from meteorological factors.
  • in the event that it is desired to cover a horizontal angle greater than that covered by one 3D camera, several 3D cameras are mounted on the support pole. The number of 3D cameras installed depends on the angle each camera covers and the desired overlap between them.
  • the cameras can be mounted in conventional mode (in the example, the widest angle, 57.37°, horizontally) or rotated 90° in order to obtain the maximum possible vertical field.
  • since each 3D camera is made up of two 2D cameras (an upper and a lower one), both the upper and the lower 2D cameras are distributed around the perimeter of the column, vertically aligned in pairs, in such a way that the angles of their optics cover the desired total angle.
  • each column covers a certain airspace, determined by the angular coverage (in this example 220.89° horizontally and 40.08° vertically) and by the distance at which it is able to detect the birds.
  • the distance at which the birds are detected is determined by the resolution of the cameras' sensors, by the focal length of the optics mounted and by the minimum area (in pixels) that a bird must occupy in order to be detected.
  • this distance is 750 m.
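The relation between sensor resolution, focal length and detection distance stated above can be sketched with the pinhole-camera model; all numeric values below (lens, pixel pitch, wingspan, pixel threshold) are illustrative assumptions, not the patent's specification:

```python
def max_detection_distance(focal_mm, pixel_pitch_um, bird_size_m, min_pixels):
    """Pinhole-camera estimate of the farthest distance at which a bird of
    wingspan bird_size_m still spans at least min_pixels on the sensor:
    the image of the bird measures focal * size / distance on the sensor."""
    focal_m = focal_mm / 1000.0
    pitch_m = pixel_pitch_um / 1e6
    return focal_m * bird_size_m / (min_pixels * pitch_m)

# e.g. a 25 mm lens, 3.45 um pixels, 1 m wingspan, 10-pixel minimum
print(round(max_detection_distance(25.0, 3.45, 1.0, 10)))  # about 725 m
```

With these assumed parameters the estimate lands near the 750 m figure cited in the text, which is why they were chosen for the example.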
  • the coverage of each column is limited, and therefore, to cover a complete wind farm, it may be necessary to install several columns so that together they cover the desired area.
  • Several columns, strategically distributed, can cover the totality of the space occupied by the Wind Farm in which the death of birds is to be avoided. In Figure 1 this is indicated by the existence of “n” possible columns.
  • each column (C) covers a horizontal area spanning 220.89° at a maximum distance of 750 meters.
  • inside this area, to the north and at a distance of 375 meters from the column, is the wind turbine to monitor (A1).
  • next to it, located to the east and at a distance of 800 meters, is another wind turbine (A2), which this column cannot cover since it is out of reach.
  • each 3D camera is made up of two cameras located vertically and separated by a distance of 8 meters.
  • the resolution of the cameras and their optics determine the precision in determining the coordinates (X, Y, Z) of the birds detected.
  • This vertical arrangement allows a great separation between the two cameras that make up the stereoscopic pair and, therefore, for the same cameras and optics, this separation allows achieving high precision in three-dimensional measurements.
  • on the TB there is also a high-resolution equipment with a high-resolution 2D camera (5) with a movement control system for the horizontal (Pan) and vertical (Tilt) axes, and with zoom and focus controls, in order to obtain high-resolution close-up images of birds in flight.
  • stereoscopic vision is based on the fact that the depth of any point in space is calculated from the difference in position of its projection on the sensors of a pair of cameras that capture said point from different angles.
  • the coordinates in three-dimensional space of an object (X, Y, Z) can be calculated and, knowing the area that the object occupies on the image sensor, its size can be deduced. Furthermore, if we add the time coordinate (X, Y, Z, t) to each pair of stereoscopic images, we can easily obtain motion variables such as speed or acceleration.
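The derivation of speed and acceleration from the (X, Y, Z, t) samples mentioned above can be sketched with finite differences; the sample fixes below are invented for the example:

```python
import math

def motion_from_fixes(fixes):
    """Finite-difference speeds and accelerations from a sequence of
    (x, y, z, t) fixes, as described for the stereo track of a bird."""
    velocities, times = [], []
    for (x0, y0, z0, t0), (x1, y1, z1, t1) in zip(fixes, fixes[1:]):
        dt = t1 - t0
        velocities.append(((x1 - x0) / dt, (y1 - y0) / dt, (z1 - z0) / dt))
        times.append(t1)
    speeds = [math.hypot(*v) for v in velocities]
    accels = []
    for (v0, t0), (v1, t1) in zip(zip(velocities, times),
                                  zip(velocities[1:], times[1:])):
        dt = t1 - t0
        accels.append(tuple((b - a) / dt for a, b in zip(v0, v1)))
    return speeds, accels

# Hypothetical fixes two per second: a bird accelerating along the X axis
fixes = [(0, 0, 100, 0.0), (5, 0, 100, 0.5), (12, 0, 100, 1.0)]
speeds, accels = motion_from_fixes(fixes)
print(speeds)   # [10.0, 14.0]
print(accels)   # [(8.0, 0.0, 0.0)]
```

At the "several times per second" rate cited in the text, such differences give a usable estimate of the instantaneous flight vector.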
  • the general methodology for approaching the problem of stereoscopic calculation is to establish, first, the correspondence of individual points in the pair of images and then, by means of the difference in their positions, calculate the depth at which each of the points is found.
  • Errors in the data extracted from the images originate because the images formed in the camera suffer optical distortions, called aberrations, due to constructional imperfections or misalignment of the lenses or the sensor. Therefore, as a preliminary step, a calibration of its intrinsic parameters must be carried out for each camera; these model the internal geometry of the camera and the optical characteristics of the sensor.
  • each stereoscopic pair is mounted vertically, that is, the two cameras are in the same vertical plane separated by a distance (which is 8 meters for illustration).
  • FIG. 10 shows a stereoscopic equipment formed by two vertically aligned cameras, whose optical axes are horizontal and parallel to each other and are separated by a distance b, called the baseline.
  • the cameras are shown schematized by means of a lens (0) of a focal length (F) that represents the optics and a plane (S) that represents the sensor.
  • the baseline is perpendicular to the optical axes and coincident with the Y axis of the real world coordinate system.
  • the origin (0x, 0y, 0z) is located at the point of intersection of the Y axis with the horizontal plane (the plane formed by the X and Z axes).
  • the stereoscopic pair is formed by two cameras located vertically; suppose that camera 1 is at a height H1 above the horizontal plane and that camera 2 is at a height H2 above the same plane.
  • the pixel that corresponds to the optical axis of the camera is in the center of the image and therefore is at the point (horizontal pixels of the sensor / 2, vertical pixels of the sensor / 2). If we make the origin of the Xs and Ys axes coincide with this point, the optical axis corresponds to the pixel whose (Xs, Ys) coordinates are (0, 0).
  • the pixel that captures the point in the upper camera has a different Ys-axis value from that of the lower camera (remember that the Ys axis is the vertical axis). They differ on the Ys axis because the two 2D cameras that make up the 3D camera are vertically separated.
  • the Xs-axis values at which the point is captured are the same in both cameras, since the cameras are perfectly aligned and calibrated, separated only vertically and not horizontally. If they were separated horizontally, the coordinate that would differ would be Xs.
  • the depth Z is inversely proportional to the disparity.
  • the distance to nearby objects can be determined more precisely than to distant objects, or in other words, the precision in the measuring range is not uniform.
  • the loss of precision with distance can be reduced by increasing the distance b between the cameras, since the disparity is directly proportional to the separation between them.
  • a similar effect is also obtained by increasing the focal length f of the lens (the disparity is directly proportional to the focal length) or the resolution of the cameras, since increasing this improves the precision of the measurement. Since the system object of this patent must cover a considerable volume, mounting cameras with long focal lengths is ruled out, for this reason it has been decided to increase the separation between the cameras that make up each stereoscopic pair.
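The inverse relation between depth and disparity described in the preceding bullets is the classic Z = f·b/d; the focal length and baseline values below are illustrative, with the 8 m figure taken from the vertical separation cited in the text:

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Classic stereo relation Z = f*b/d: depth is inversely proportional
    to disparity, so a larger baseline b (here the several-metre vertical
    separation) yields larger disparities and better far-range precision."""
    return focal_px * baseline_m / disparity_px

f_px, b_m = 7000.0, 8.0          # assumed focal length in pixels, 8 m baseline
for d_px in (200.0, 100.0, 50.0):
    print(d_px, depth_from_disparity(f_px, b_m, d_px))
# halving the disparity doubles the computed depth: 280 m, 560 m, 1120 m
```

This also shows why a one-pixel disparity error costs more accuracy at long range: the same pixel step maps to a larger depth interval as disparity shrinks.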
  • the general process to obtain the coordinates (X, Y, Z) of a point captured by both cameras is: from the coordinates that the two cameras occupy in space, and the Ys coordinates of their sensors, the Z coordinate (distance) and the Y coordinate (height) are obtained by solving the system of equations based on similar triangles.
  • the Azimuth (angle with respect to north) is obtained from the Xs coordinates of the sensors.
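The similar-triangles calculation described in the bullets above can be sketched as follows. This is an illustrative reconstruction, not the patent's code: the function names, units and sign conventions (sensor coordinates and focal length in the same units, camera heights h1 and h2 with vertical baseline b = h1 - h2) are assumptions for the example.

```python
import math

def stereo_point(x_s, y_s1, y_s2, f, h1, h2):
    """Given the sensor coordinates of the same point in the upper (1) and
    lower (2) cameras, return (distance Z, height Y, azimuth in degrees
    relative to the optical axis). x_s, y_s1, y_s2 and f share one unit
    (e.g. mm on the sensor); h1, h2 are the camera heights."""
    disparity = y_s2 - y_s1            # the images differ only on the Ys axis
    b = h1 - h2                        # vertical separation between the cameras
    z = f * b / disparity              # depth is inversely proportional to disparity
    y = h1 + z * y_s1 / f              # height, from similar triangles on camera 1
    azimuth = math.degrees(math.atan2(x_s, f))  # horizontal angle from the Xs coordinate
    return z, y, azimuth
```

Note how the formula z = f * b / disparity also reflects the two design remarks above: increasing the baseline b or the focal length f increases the disparity measured for a given depth, and hence the precision.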
  • the stereoscopic system, once it detects a bird in space, obtains the following images and variables, as long as it continues to detect it and at a rate of several times per second:
  • the system object of the present invention also comprises at least one high resolution equipment with a 2D camera with a movement control system of the horizontal (PAN) and vertical (TILT) axes, known as Pan-Tilt.
  • the system, in addition to comprising the 3D cameras (2), comprises at least one high-resolution 2D camera (5) mounted on a Pan-Tilt system with high mechanical precision, Figure 11, in order to capture two-dimensional images of the bird that has been detected by the stereoscopic system, but at a much higher resolution than the stereoscopic system can capture.
  • when the artificial intelligence equipment requires it, it sends an order to the 3D processing and analysis computer of the stereoscopic system so that it supplies the position coordinates of the detected bird, the coordinates (X, Y, Z), to the high-resolution equipment, so that it can point its camera precisely at the bird and thus capture higher-resolution close-up images, which facilitate the identification of the species to which it belongs.
  • This high-resolution 2D camera (5) is equipped with long-focal-length optics, and it is mounted on an electromechanical device, the Pan-Tilt, whose movements allow it to correctly aim at and follow the birds in flight. This is possible because this system can carry out, with high precision and sufficient speed, horizontal movements (PAN), vertical movements (TILT) and optics adjustments (zoom and focus).
  • this system seeks to capture close-up shots of the birds; when tracking, it can only follow one bird, or one group of birds, of those detected by the 3D camera system.
  • the lens of the camera is usually a long-focal-length lens with motorized focus and zoom.
  • This motorization is calibrated in such a way that a table has been generated in which each focus distance corresponds to a position of the motor that acts on the focus. The same applies to the zoom, in which each focal length of the lens corresponds to a position of the motor that acts on it.
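A minimal sketch of how such a calibration table could be used at run time: for a focus distance not present in the table, interpolate the motor position between the nearest calibrated entries. The table values, step units and function name are invented for the example; they do not come from the patent.

```python
import bisect

# Hypothetical calibration table: (focus distance in m, focus-motor position in steps)
FOCUS_TABLE = [
    (10.0, 120), (50.0, 480), (100.0, 700), (300.0, 910), (1000.0, 990),
]

def focus_motor_position(distance_m):
    """Linear interpolation in the calibrated distance -> motor-step table."""
    dists = [d for d, _ in FOCUS_TABLE]
    i = bisect.bisect_left(dists, distance_m)
    if i == 0:
        return FOCUS_TABLE[0][1]        # closer than the nearest calibrated point
    if i == len(FOCUS_TABLE):
        return FOCUS_TABLE[-1][1]       # beyond the farthest calibrated point
    (d0, p0), (d1, p1) = FOCUS_TABLE[i - 1], FOCUS_TABLE[i]
    return p0 + (p1 - p0) * (distance_m - d0) / (d1 - d0)
```

An identical table, indexed by desired field of view instead of focus distance, would serve the zoom motor.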
  • This high-resolution 2D camera (5) can be mounted at any point on the TB of the support mast (1).
  • on this platform there is, in addition to the 3D analysis and processing computer of the stereoscopic equipment, the computer that controls the high-resolution equipment, referred to as the computer of the high-resolution equipment.
  • the high-resolution camera (5), thanks to the Pan-Tilt system, is capable of performing precision movements both vertically and horizontally. It is also capable of carrying out these movements at a sufficient speed to follow the birds in flight, in such a way that, following in real time the coordinates provided by the 3D processing and analysis computer of the stereoscopic equipment, the birds remain centered within the visual field captured by the high-resolution camera.
  • the mechanism that moves the high-resolution camera is able to rotate on both axes with an angular velocity of at least 100°/second.
  • Each axis is equipped with an absolute encoder that determines its position with the appropriate precision.
  • the axes of the camera movements have a common center of rotation. This center of rotation corresponds to the nodal point of the camera. These axes, and their center of rotation, are calibrated with respect to the same three-dimensional reference system as the 3D cameras. As we have mentioned, the focus and zoom parameters of the optics are also calibrated.
  • any point that a bird occupies in the three-dimensional space controlled by the 3D cameras can be joined by means of a line to the center of rotation of the mechanism of the high-resolution camera, as seen in Figure 12.
  • This line corresponds to the position that the optical axis of the high-resolution camera must occupy in order to capture an image in which the bird is centered.
  • when the 3D processing and analysis computer of the stereoscopic equipment locates and follows a bird, it sends the three-dimensional coordinates (X, Y, Z) in which the bird is located, in real time, to the high-resolution equipment.
  • the movements of the high-resolution camera are controlled by the computer of the high-resolution equipment, which, equipped with the appropriate software, calculates in real time, from the coordinates supplied by the 3D processing and analysis computer, the position that each of the horizontal and vertical axes must occupy so that the optical axis of the high-resolution 2D camera coincides with the point where the bird is.
  • the computer of the high-resolution equipment sends to each of the "drivers" that control the horizontal and vertical motors (Pan-Tilt) the appropriate orders for each axis to move to the calculated position.
  • the optical axis is thus oriented, approximately and at all times, towards the point where the bird is located, regardless of the distance at which it is.
  • the camera must adjust its zoom and focus to the distance at which the bird is located.
  • the computer of the high-resolution equipment also calculates the focus distance from the distance between the coordinates where the camera is mounted and the coordinates where the bird is.
  • the computer of the high-resolution equipment only has to send the order to these motors so that they move to the appropriate positions, ensuring that the optics focus correctly on the bird (focus) and cover the appropriate visual field (zoom).
  • the parameters of the motors of the vertical (Tilt) and horizontal (Pan) axes, zoom and focus are adjusted, ensuring that the bird is in the center of the image captured by the camera and, in addition, perfectly focused, as seen in Figure 13.
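The aiming calculation described in the bullets above can be sketched as a small geometric routine: given the bird's (X, Y, Z) coordinates and the position of the Pan-Tilt's center of rotation, derive the pan angle, the tilt angle and the focus distance. The axis conventions (X east, Y height, Z north; pan measured from the Z direction, tilt from the horizontal) and all names are assumptions for this illustration.

```python
import math

def aim(bird, camera):
    """Return (pan_deg, tilt_deg, focus_distance) for pointing the
    high-resolution camera, whose center of rotation is at `camera`,
    at a bird located at `bird`; both are (x, y, z) tuples."""
    bx, by, bz = bird
    cx, cy, cz = camera
    dx, dy, dz = bx - cx, by - cy, bz - cz
    horizontal = math.hypot(dx, dz)                   # ground-plane distance
    pan = math.degrees(math.atan2(dx, dz))            # horizontal axis (PAN)
    tilt = math.degrees(math.atan2(dy, horizontal))   # vertical axis (TILT)
    focus_distance = math.sqrt(dx * dx + dy * dy + dz * dz)  # drives zoom/focus
    return pan, tilt, focus_distance
```

The returned focus distance is exactly the quantity the text says the computer derives for the focus motor: the distance between the camera mount and the bird.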
  • Figure 13 shows the capture of images by the 3D cameras (13A), their processing in the 3D analysis and processing computer (13B) taking into account the data of the artificial intelligence equipment (13C), how everything is processed in the computer of the high-resolution equipment (13D), and how the parameters of the vertical (Tilt), horizontal (Pan), zoom and focus motors are adjusted.
  • These images are sent, in real time, to the electronic artificial intelligence equipment, so that a species identification module can improve its classification.
  • these images together with the stereoscopic 3D images, the coordinates (X, Y, Z, t), the current meteorological data and the status data of the wind turbines are stored in the electronic detection log module.
  • this information can be analyzed using various methods so that the artificial intelligence equipment can identify the species of the bird and its foreseeable trajectory.
  • the system object of the present invention also comprises an electronic detection recording module, with which a historical database, or historical record of detections, is obtained, which is necessary to determine the causes of accidents and risk situations for birds.
  • this system records all the aforementioned data in an electronic detection record module, also called the historical record of detections, Figure 14.
  • This figure shows how the data (14A) arrive at the historical record of detections: on the one hand, the data from the wind farm control center (14B), together with other variables (14C), both meteorological and of the state of the wind turbines; the data from the 3D analysis and processing computers (14D), together with other data such as images, coordinates (X, Y, Z, t), speed, acceleration and size of the bird (14E); and the data from the high-resolution equipment (14F), together with the high-resolution 2D images (14G). From the historical record, a statistical analysis (14H) of these received data is carried out and the most probable route (14I) of each species is calculated according to the meteorological variables and the state of the wind turbines.
  • the stop order must be given sufficiently in advance, that is, at a distance that cannot be less than the Lead Distance; therefore, making decisions based only on the trajectory estimated at the moment the bird is at that distance would mean generating numerous alarms that are not actually correct.
  • This analysis has several phases, firstly, a classification of all the stored detections is made according to the species or group of bird detected.
  • the decision to launch an alarm signal is taken by the electronic artificial intelligence equipment after determining, in real time, the species or group to which the bird belongs and comparing its trajectory and the variables of the moment with the most probable trajectory that the statistical model gives for that species under those variables.
  • the system proceeds to follow it as long as it does not leave the visual field or move beyond the resolution of the system.
  • the system captures images synchronized by the two cameras that make up the stereoscopic pair, at a rate of several images per second (for example, at 15 frames per second). At this rate, images are captured in the upper and lower cameras of the stereoscopic pair that are detecting the bird in its visual field, and from these images the position coordinates of the bird are obtained in three-dimensional space.
  • for each capture of each detected bird we have: the moment T at which the capture is made; the images of the upper and lower cameras; the three-dimensional coordinate (X, Y, Z) referenced to time T and to the geographic coordinates of the place; the area, in pixels, that the bird occupies in said images; and, deduced from the above, the size of the bird (maximum, minimum and average) and its velocity and acceleration in the three axes of space (Vx, Vy, Vz) and (Ax, Ay, Az).
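A plausible way to deduce (Vx, Vy, Vz) and (Ax, Ay, Az) from successive captures, as described above, is by finite differences at the known capture rate (e.g. 15 frames per second). This is a sketch under that assumption, not the patent's exact method.

```python
def kinematics(positions, dt=1.0 / 15.0):
    """positions: list of (x, y, z) tuples, one per capture, oldest first.
    Returns (velocity, acceleration) estimated at the latest capture from
    the last three positions, using first differences at interval dt."""
    (x0, y0, z0), (x1, y1, z1), (x2, y2, z2) = positions[-3:]
    v_prev = ((x1 - x0) / dt, (y1 - y0) / dt, (z1 - z0) / dt)
    v = ((x2 - x1) / dt, (y2 - y1) / dt, (z2 - z1) / dt)
    a = tuple((vc - vp) / dt for vc, vp in zip(v, v_prev))
    return v, a
```

In practice the raw differences would likely be smoothed over several frames, since pixel-level noise in the 3D positions is amplified by the division by dt.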
  • the mechanism orients the optical axis of the camera, adjusting the zoom and focus to the position of the bird detected by the stereoscopic equipment, in order to capture focused and high-resolution images of it.
  • the electronic artificial intelligence equipment, as anticipated in the first figure, is based on computer tools capable of performing machine learning using Deep Learning algorithms.
  • This “Deep Learning” system is trained with the data from the statistical analysis of the historical record of detections, where this statistical analysis has previously made it possible to classify the most probable flight behavior of bird species according to the meteorological characteristics and the state of the wind turbines that existed at each moment.
  • the result of the statistical analysis allows the electronic artificial intelligence equipment to determine the most likely trajectory that the bird is going to follow according to the meteorological conditions and the operating conditions of the wind turbines at that moment, in order to generate an alarm signal that activates any mechanism that prevents the bird from impacting the wind turbines.
  • the electronic artificial intelligence equipment receives the following data in real time:
  • the electronic artificial intelligence equipment, based on the previous data and on the Deep Learning system trained with the statistical analysis of the historical record of detections, is able to identify the species or group to which the bird belongs and determine the most probable trajectory that it will follow, in order to generate an alarm signal that activates any mechanism that prevents the bird from impacting the wind turbines.
  • This system allows simultaneous monitoring of multiple birds, independently managing the identification and trajectory of each one of them.
  • the electronic artificial intelligence module makes the decision whether or not to activate an alarm signal based on the data that reach it in real time, comparing them with the statistical analysis of the historical record of detections with which the equipment has been trained.
  • it is this electronic artificial intelligence analysis that provides the most probable flight routes that the bird will follow, considering the meteorological variables and the state of the wind turbines at that moment. This alarm signal can be translated into the shutdown of the wind turbines or the activation of other dissuasive methods.
  • the system includes methods to avoid the impact of birds with wind turbines:
  • the wind turbine takes a time (about 25 seconds depending on the model) from when the stop order is given until the blades stop.
  • these stop orders are not determined only by proximity or by the trajectory that the bird is describing at that moment; rather, the system obtains the most probable trajectory that the bird will follow by adding the statistics provided by the historical data records of that wind farm, taking into account the most probable behavior of the birds according to numerous variables (the species or group of birds in question, their size, meteorological variables, etc.).
  • if the trajectory that is finally determined as the most probable gives a high possibility of collision, the system generates an alarm that translates into a stop order for one or more wind turbines.
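The Lead Distance reasoning above can be captured in a back-of-the-envelope check: a stop order only helps if it is issued while the bird is still far enough away for the blades to stop (about 25 seconds on some models, per the text). The margin, risk threshold and function names below are illustrative assumptions, not values from the patent.

```python
import math

def lead_distance(bird_speed_ms, stop_time_s=25.0, margin_s=5.0):
    """Minimum distance (m) at which a stop order must be issued so the
    blades have stopped by the time the bird could arrive."""
    return bird_speed_ms * (stop_time_s + margin_s)

def must_alarm(bird_pos, turbine_pos, bird_speed_ms, collision_risk):
    """Issue a stop alarm only when the most-probable trajectory gives a
    high collision risk AND the bird is already inside the lead distance."""
    dist = math.dist(bird_pos, turbine_pos)   # Euclidean distance (Python 3.8+)
    return collision_risk > 0.5 and dist <= lead_distance(bird_speed_ms)
```

For example, a bird flying at 15 m/s toward a turbine needs the order issued no later than 450 m out under these assumed numbers, which is why the text insists the decision cannot wait for the instantaneous trajectory alone.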
  • This order is sent through a data connection to the SCADA of the wind farm, in order for this order to be sent to the wind turbine or turbines to be stopped, as shown in Figure 15.
  • This stop order can also activate deterrence mechanisms found on the wind turbines involved, such as modifying the appearance or morphology of the blades (using lights or other methods).
  • the system can also activate other deterrent methods, such as the emission of sounds or lights, the effectiveness of which depends on the method and the species on which it is applied.
  • when these systems are activated through the use of conventional artificial vision systems, that is, vision systems that are not stereoscopic, numerous totally unnecessary alarms are generated, because such systems cannot determine the position or trajectory of the birds.
  • With the stereoscopic system object of this patent, not only is the emission of false alarms significantly reduced, but there is also the possibility of activating these methods with directivity (Figure 16), that is, using systems that emit with a narrow beam and point at the coordinate in which the bird is located. For the same energy emitted by the deterrent system, the bird receives much more energy than with non-directive systems, which increases the effectiveness of these deterrence methods and reduces the negative impact on the environment.
  • UAVs (Unmanned Aerial Vehicles)
  • This adaptive flight is possible thanks to the fact that the system that controls the UAVs can continuously compare their positions with the position of the bird to be driven away, supplied in real time by the 3D processing and analysis computer, calculating in real time the next spatial coordinates to which the UAVs should be directed.
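The adaptive-flight idea above amounts to a simple control-cycle update: each cycle, move the UAV a bounded step toward the bird's latest 3D coordinates. The step size and names below are illustrative assumptions.

```python
import math

def next_waypoint(uav_pos, bird_pos, max_step=5.0):
    """Return the next (x, y, z) waypoint for the UAV: a step of at most
    max_step toward the bird's current position, recomputed every cycle
    as fresh coordinates arrive from the 3D processing computer."""
    delta = [b - u for u, b in zip(uav_pos, bird_pos)]
    dist = math.sqrt(sum(d * d for d in delta))
    if dist <= max_step:
        return tuple(bird_pos)                 # close enough: go straight there
    scale = max_step / dist                    # clamp the step length
    return tuple(u + d * scale for u, d in zip(uav_pos, delta))
```

Because the target is re-read every cycle, the UAV's path bends continuously as the bird maneuvers, which is the "continuous comparison" the text describes.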
  • the system that is the object of the present invention requires four computers or processing modules in each unit for its correct operation:
  • the computer that performs the three-dimensional analysis is the 3D analysis and processing computer. It is a computer that continuously captures images from 3D cameras and, using artificial vision techniques, analyzes the presence of an object in the observation area.
  • the system incorporates a software for detecting the birds present in the area to be analyzed, determining their coordinates (X, Y, Z, t) and monitoring their flight path so that, thanks to the data provided to the electronic artificial intelligence equipment, a possible collision can be established on the wind turbines present in the Wind Farm.
  • This computer allows: positioning and calibrating each of the pairs of cameras that make up each 3D camera, to obtain a high-resolution three-dimensional measurement system; managing the connection of the cameras of the 3D vision system, modifying their configuration in real time and depending on the ambient lighting conditions, and capturing images at the specified frequency; analyzing in real time the images from the 3D cameras, to detect and determine the trajectory of the birds; calculating the three-dimensional position (X, Y, Z, t) at each moment of each detected bird, determining the trajectory made and predicting the direction of flight of the bird in order to carry out the most exact tracking possible; producing a data output that includes the detection date, detection time, coordinates (X, Y, Z, t), velocity and acceleration in the three axes of space (Vx, Vy, Vz) and (Ax, Ay, Az), and the area in pixels that the bird occupies in the image, from which the size of the bird can be deduced; and recording the sequences in which the passage of birds through the analysis field is detected.
  • - Calibration module, which allows: adjusting the field of view of the optics; calibrating the intrinsic parameters of each optic (aberrations, deformations); determining the orientation and position of the cameras, as well as fixing them correctly in the desired direction of observation (this step is carried out during installation and may be repeated in the event that, after a maintenance intervention, the system loses calibration); calibrating the calculation of the three-dimensional coordinates obtained by the system, through known points in the three-dimensional space where the system is installed; modifying the adjustment parameters (stereoscopic calibration parameters) for the determination of sizes and distances, in order to be able to measure three-dimensional objects, checking that the dimensions of the birds in the images are adequate; verifying that the system maintains calibration; and digitally correcting adjustments that cannot be made mechanically or optically.
  • - Capture module, which allows: capturing the images provided by the 3D cameras; checking the connection with the cameras, verifying that images are received from both sets of detection cameras (UP + DOWN) with a resolution, in our descriptive example, of 5472x3648 pixels, at least at 15 frames per second and synchronously between them; and configuring the parameters of the cameras to make an image capture appropriate to the conditions of the scene (exposure times, gain). It also adjusts the working resolution, which in this case will be 5472x3648 pixels since, being the maximum resolution, it allows the best precision in 3D positioning to be obtained while still working at a frequency of at least 15 images per second (enough to make an adequate prediction of the evolution of the trajectories, with a reduced response time).
  • - Detection module, which analyzes the images of each pair of cameras that make up each 3D camera (upper and lower camera) to detect the presence of birds in the space to be monitored.
  • the detection algorithms take into account the dynamic conditions of the environment in which the system is located (changes in lighting produced by the sun, movement of the clouds, effect of the wind), allowing a robust detection of birds in flight; the module also checks that the detected bird appears in the two images, so that stereoscopic calculations can be made.
  • - Three-dimensional positioning calculation module, which, once it has been verified that the detected bird appears in the two images of the stereoscopic pair, obtains, for each moment t, the position (X, Y, Z) of the bird from the sensor coordinates in each of the images of the upper and lower cameras, that is, the coordinates (Xs1, Ys1) and (Xs2, Ys2) mentioned in previous sections.
  • - Monitoring module, which allows, from the moment a bird is detected, tracking its coordinates (X, Y, Z, t) in all captures and evaluating from these positions the possible trajectory that it will follow. That is, from the components of velocity and acceleration in each of the axes of three-dimensional space (Vx, Vy, Vz) and (Ax, Ay, Az) that are determined in each capture, we can infer where the bird should be found in the next frame, and thus track it more precisely. This module allows simultaneous monitoring of multiple birds within the field of vision, independently managing the trajectory of each one of them.
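The prediction step of such a monitoring module can be sketched with a constant-acceleration model: from the current position, velocity and acceleration, infer where the bird should appear in the next frame. This is an illustrative assumption about the model; the patent only states that the next position is inferred from these components.

```python
def predict_next(pos, vel, acc, dt=1.0 / 15.0):
    """Predict the bird's position one frame ahead (dt seconds) from its
    current position, velocity and acceleration, per axis:
    x' = x + v*dt + 0.5*a*dt^2 (constant-acceleration model)."""
    return tuple(p + v * dt + 0.5 * a * dt * dt
                 for p, v, a in zip(pos, vel, acc))
```

Searching for the bird near the predicted point, rather than over the whole image, is what makes the tracking both faster and more robust when several birds are in the field of vision.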
  • - Image recording module which allows: storing the images captured by each of the cameras of the stereoscopic pair that is detecting the bird.
  • - communications module which allows an Internet connection to be available either by cable, satellite or other means such as GSM, 5G, etc. This connection is used to monitor the behavior of the system remotely, as well as to adjust the operating parameters or to carry out backup copies. All the elements of this system are interconnected through its own network, which in turn is connected to the SCADA system of the wind farm.
  • this network is used to: store all images and data in the registry system; communicate with the electronic artificial intelligence equipment to send the 3D images and the data obtained from them (the coordinates (X, Y, Z, t) and the size of the birds); communicate with the electronic artificial intelligence equipment to receive the data of the birds of which it is required to obtain high-resolution images; and send the coordinates of the detected birds to the computer of the high-resolution equipment so that it is properly oriented and captures high-resolution images of the birds indicated by the electronic artificial intelligence equipment.
  • the movements of the Pan-Tilt control mechanism are controlled by a processing system, the computer of the high-resolution equipment, which, equipped with the appropriate computer tools, calculates in real time, from the coordinates supplied by the 3D processing and analysis computer, the position that each of the motors controlling the vertical and horizontal axes must occupy, as well as the zoom and focus motors of the high-resolution camera optics.
  • the computer of the high resolution unit sends to each one of the servo drives or controllers or "drivers” that control the mentioned motors the appropriate orders so that they go to the calculated position.
  • the parameters of the vertical, horizontal, zoom and focus axis motors are adjusted, ensuring that the bird is in the center of the image captured by the camera and perfectly focused, as seen in Figure 13.
  • the high-resolution equipment's computer sends orders to the camera to capture an image, obtaining a sequence of high-resolution images in which the bird is recorded in flight, captured in different positions.
  • when the electronic artificial intelligence equipment decides that it is necessary, it orders the 3D processing and analysis computer of the stereoscopic equipment to send, in real time, to the Pan-Tilt system the three-dimensional coordinates (X, Y, Z) at which the bird is found. Only then is the bird tracking and high-resolution image capture sequence started on the high-resolution equipment.
  • This sequence consists of the following:
  • when the high-resolution equipment receives coordinates from the 3D processing and analysis computer of the stereoscopic equipment, it automatically positions itself to correctly point at those coordinates. To do this, it must move from its current position to the aiming position, a position in which the system points in the appropriate direction to follow the bird and in which, in addition, the zoom and focus values of the optics are adjusted to obtain correct images.
  • the tracking begins, that is, the system repositions itself in real time to each of the coordinates supplied by the 3D processing and analysis computer of the stereoscopic equipment.
  • the image capture is started according to a previous schedule, in which the number of images to be captured and the time interval between each one have been determined.
  • the high-resolution system is available to carry out a new sequence of monitoring and high-resolution image capture. This is communicated by the high-resolution equipment computer to the stereoscopic equipment's 3D processing and analysis computer so that it can supply the coordinates of another bird, at which point another sequence will begin.
  • This computer has several processing modules:
  • this module determines, from a previously stored table corresponding to the optics that the system has mounted, the position to be reached by the zoom and focus motors of the optics so that they cover and focus correctly on the bird.
  • Image capture module, which is responsible for capturing the images from the high-resolution camera. Once the tracking and focusing of the bird have been achieved, this module, at preprogrammed intervals and for a specified time, sends the camera orders to capture an image, obtaining a sequence of high-resolution images in which the bird is recorded in flight, captured in various positions.
  • the captured images in addition to being sent to the electronic artificial intelligence equipment and the electronic detection recording module, are also stored in the high-resolution equipment's computer.
  • This module is in charge of communications with the rest of the elements of the system: it receives the coordinates (X, Y, Z, t) of the detected bird from the 3D processing and analysis computer; once the positions to be reached by the horizontal-vertical (Pan-Tilt), zoom and focus motors have been calculated, it sends the orders to the servo drives or controllers of these motors; when a high-resolution image capture and tracking sequence ends, the images captured by the high-resolution camera are supplied to the electronic artificial intelligence equipment and the electronic detection recording module; and, at that point, it communicates to the 3D processing and analysis computer that the system is available to start a new sequence.
  • The artificial intelligence computer in the electronic artificial intelligence equipment performs automatic learning based on Deep Learning algorithms, starting from the following data: the data supplied by the 3D processing and analysis computer (the 2D and 3D images; the coordinates in space (X, Y, Z, t); the components of velocity and acceleration in each of the axes of three-dimensional space (Vx, Vy, Vz) and (Ax, Ay, Az); the direction vector of the trajectory followed; the size of the bird); the high-resolution images provided by the high-resolution equipment; the variables provided by the control center of the wind farm, the SCADA (the meteorological variables: wind speed and direction, temperature, relative humidity and precipitation; the state variables of the wind turbines: rotor orientation and blade rotation speed); and the statistical analysis that has made it possible to classify the most probable flight behavior of each species, or group, of birds according to the meteorological characteristics and state of the wind turbines that existed at each moment. With these data, the equipment is capable of generating an alarm signal that activates any mechanism that prevents the bird from impacting the wind turbines.
  • the artificial intelligence computer (OI) receives all the aforementioned data being generated at the moment, that is, data from the stereoscopic equipment (EE), the high-resolution equipment (EA), and the databases generated in the electronic detection recording module (MH); in addition, it has the “Deep Learning” computer system that has been fed with data from the statistical analysis of the historical record of detections (Figure 15).
  • the artificial intelligence computer (OI) makes the decision whether or not to activate (D) an alarm signal to the wind turbine (A), based on the data that reach it in real time, comparing them with the statistical analysis of the historical record of detections with which said electronic artificial intelligence equipment has been trained, an analysis that provides the most probable flight routes that the bird will follow considering the meteorological variables and the state of the wind turbines at that moment.
  • This alarm signal (S) can be translated into the shutdown of the wind turbines or the activation of other dissuasive methods.
  • the computer tools of the artificial intelligence computer consist of several modules: - communications module, which allows: receiving the data and images generated by the 3D processing and analysis computer and the computer of the high-resolution equipment; communication with the SCADA of the wind farm to receive the status data of the wind turbines, that is, the rotor orientation and the rotation speed of each wind turbine; communication with the SCADA of the wind farm to receive data on meteorological variables (wind speed and direction, temperature, relative humidity and precipitation); communication, through the SCADA of the wind farm, with the control systems of the wind turbines, in order to send alarm signals in the event of possible collisions and manage the response of the system; communication with the SCADA of the wind farm to send the orders to stop the wind turbines; communication with other deterrent methods (lights, sounds, UAVs) to indicate possible collision situations and manage the response of these systems; and communication with the registration module in order to store all the data.
  • This module has two sub-modules: o Pre-identification sub-module of the species or group to which the bird belongs, through the images of the detected bird and the position and size variables, that is, from the coordinates (X, Y, Z, t), the components of velocity and acceleration in each of the axes of three-dimensional space (Vx, Vy, Vz) and (Ax, Ay, Az), and the size supplied by the 3D processing and analysis computer. It is very important because it allows an approximate classification of the species or group to which the detected bird belongs using only the data provided by the stereoscopic equipment.
  • this module uses the data provided by the sub-module for identification of the species or group to which the bird belongs through the position and size variables, and orders a priority list so that the high-resolution equipment can sequentially obtain close-up images of the birds and thus fine-tune the classification by means of the sub-module for identification of the species or group to which the bird belongs through the images provided by the stereoscopic and high-resolution equipment.
  • the high-resolution equipment can only track one bird or group of birds, so it is necessary to establish an order of priority in the targeting sequence for each of the birds detected.
  • In the first place, it establishes a priority, based on the data provided by the sub-module for identification of the species or group through the position and size variables, to determine the order in which the high-resolution equipment must capture close-ups of the detected birds; then, based on this priority list, and sequentially, it manages which bird the high-resolution equipment should target to obtain close-up images of it. To do this, it tells the 3D processing and analysis computer to send the high-resolution equipment the coordinates of the corresponding bird on the priority list.
  • the high-resolution equipment When the high-resolution equipment receives coordinates from the 3D processing and analysis computer, it initiates the sequence of tracking and capturing high-resolution images. Once this sequence is complete, the high-resolution equipment is available again and the next bird on the priority list is moved. Once the birds have been classified with the species or group identification submodule to which the bird belongs through the images provided by the stereoscopic and high-resolution equipment, it determines the protection status of the detected species and acts accordingly.
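The targeting sequence above amounts to draining a priority queue with a single consumer (the one high-resolution camera). A sketch in Python, where the scoring rule (raptors first, then larger birds within a class) is an illustrative assumption rather than the patent's actual criterion:

```python
import heapq

# Sketch of the prioritisation sub-module: birds pre-classified from position
# and size variables are scored, and the single high-resolution camera is
# pointed at them one at a time in priority order. The class labels and the
# scoring rule are illustrative assumptions.

def priority_score(pre_class: str, size_m: float) -> int:
    """Lower value = higher priority (heapq is a min-heap)."""
    base = {"raptor": 0, "large_bird": 1, "unknown": 2, "small_bird": 3}
    # Within a class, larger wingspans are targeted earlier.
    return base.get(pre_class, 2) * 10 - int(size_m)

def targeting_order(detections):
    """detections: list of (bird_id, pre_class, size_m).
    Yields bird ids in the order the high-resolution equipment targets them."""
    heap = [(priority_score(c, s), bird_id) for bird_id, c, s in detections]
    heapq.heapify(heap)
    while heap:
        _, bird_id = heapq.heappop(heap)
        yield bird_id  # camera tracks this bird, then becomes available again
```

Because the generator yields one id at a time, the next bird is only dequeued when the previous capture sequence has finished, matching the "available again" behaviour described above.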
  • Each alarm stores the most probable trajectory that has been defined for each bird, as well as all the parameters of interest for later analysis. This module makes it possible to determine which wind turbine or turbines are going to receive a stop order.
  • The alarm generated identifies the wind turbine on which to act: before the Lead Distance is reached, the system generates an alarm for the wind turbine involved. In the event that the analyzed trajectory foresees a possible impact of the bird on several wind turbines, the system generates an alarm for each one of them before the Lead Distance is reached. In the event that the detected bird belongs to a species classified as Endangered, the system does not act on the wind turbines taking the Lead Distance into account, but generates an immediate stop order for all wind turbines on the route of the most probable trajectory.
  • For each alarm, the system stores the species or group to which the Species Identification module has assigned the bird, the actual positions (X, Y, Z, t) of the bird, the most probable trajectory, as well as the size of the bird and any parameters of interest for subsequent analysis (meteorological variables, wind turbine status, monitoring time, alarm duration, expected impact time); and, if other deterrence methods are available, the order for their activation is also sent.
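The decision rule above can be sketched as follows. The function, the `route_radius_m` parameter and all numeric thresholds are illustrative assumptions, not values from the patent:

```python
import math

# Sketch of the alarm module's decision rule: an Endangered species triggers
# an immediate stop of every turbine on the most probable trajectory, while
# other species trigger an alarm once the trajectory comes within the Lead
# Distance of a turbine. route_radius_m ("on the route") is an assumption.

def decide_actions(trajectory, turbines, lead_distance_m, endangered,
                   route_radius_m=150.0):
    """trajectory: predicted (x, y, z) points of the most probable trajectory;
    turbines: dict of turbine id -> (x, y, z) hub position.
    Returns {turbine_id: 'stop' or 'alarm'}."""
    actions = {}
    for tid, pos in turbines.items():
        closest = min(math.dist(p, pos) for p in trajectory)
        if endangered and closest <= route_radius_m:
            # Endangered species: immediate stop, without waiting for the
            # Lead Distance threshold.
            actions[tid] = "stop"
        elif closest <= lead_distance_m:
            # Other species: alarm raised before the Lead Distance is reached.
            actions[tid] = "alarm"
    return actions
```

A trajectory threatening several turbines naturally produces one entry per affected turbine, matching the multi-turbine alarm behaviour described above.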
  • the system for detecting birds in wind farms that is the object of the present invention, where said wind farm comprises a plurality of wind turbines (A) distributed throughout the area of the wind farm and managed by a control center of the wind farm (SCADA), has the particularity, compared with any known system, of comprising: at least one stereoscopic equipment, which comprises a 3D camera (2) made up of at least two 2D cameras forming a stereoscopic pair, the 2D cameras being mounted vertically on a support pole (1), a 3D processing and analysis computer for the images captured by the 3D camera, and a telecommunications module; at least one high-resolution equipment, which comprises a 2D high-resolution camera (5)
  • a method can be developed that solves the problems of detecting, and of taking measures to protect, the birds that may be present in a wind farm, which comprises the stages of: i. taking stereoscopic images with at least one stereoscopic equipment with 3D cameras arranged in the area where the wind turbines are located, detecting in advance the birds whose trajectories may lead to an impact;
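Stage (i) depends on recovering a bird's distance from a rectified stereoscopic pair, for which the classic pinhole relation Z = f·B/d applies (focal length f in pixels, baseline B between the two 2D cameras, disparity d in pixels). A minimal sketch; the numeric values in the test are illustrative:

```python
# Depth recovery for a rectified stereoscopic pair: the further the bird,
# the smaller the disparity between its positions in the two 2D images.

def depth_from_disparity(focal_px: float, baseline_m: float,
                         disparity_px: float) -> float:
    """Classic stereo relation Z = f * B / d for a rectified camera pair."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_m / disparity_px
```

With Z known, the full (X, Y, Z) position follows from the pixel coordinates and the same camera intrinsics, which is what feeds the (X, Y, Z, t) tracks used throughout the system.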

Landscapes

  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • Sustainable Development (AREA)
  • Combustion & Propulsion (AREA)
  • Zoology (AREA)
  • Environmental Sciences (AREA)
  • Pest Control & Pesticides (AREA)
  • Sustainable Energy (AREA)
  • Chemical & Material Sciences (AREA)
  • Wood Science & Technology (AREA)
  • Mechanical Engineering (AREA)
  • General Engineering & Computer Science (AREA)
  • Insects & Arthropods (AREA)
  • Birds (AREA)
  • Catching Or Destruction (AREA)
  • Image Analysis (AREA)

Abstract

The present invention relates to a system and a method for detecting birds in wind farms, where such farms comprise a plurality of wind turbines managed by a wind-farm control center. The system comprises at least one stereoscopic equipment, comprising a 3D camera mounted vertically on a support pole placed in the area where the wind turbines are located; at least one high-resolution equipment, which comprises a 2D high-resolution camera and is also mounted on the support pole; modules for detecting meteorological variables and status variables of the wind turbines of the wind farm; an electronic module for recording the detections; and an artificial intelligence electronic equipment that receives the data from the preceding equipment and modules and comprises a computer with software tools for deciding whether to stop the wind turbines or to activate deterrent systems.
PCT/ES2020/070415 2020-06-29 2020-06-29 System and method for detecting birds in wind farms WO2022003213A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/ES2020/070415 WO2022003213A1 (fr) 2020-06-29 2020-06-29 System and method for detecting birds in wind farms
ES202290033U ES1303416Y (es) 2020-06-29 2020-06-29 System for detecting birds in wind farms

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/ES2020/070415 WO2022003213A1 (fr) 2020-06-29 2020-06-29 System and method for detecting birds in wind farms

Publications (1)

Publication Number Publication Date
WO2022003213A1 true WO2022003213A1 (fr) 2022-01-06

Family

ID=79315606

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/ES2020/070415 WO2022003213A1 (fr) 2020-06-29 2020-06-29 System and method for detecting birds in wind farms

Country Status (2)

Country Link
ES (1) ES1303416Y (fr)
WO (1) WO2022003213A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2017470A1 (fr) * 2006-04-27 2009-01-21 The Tokyo Electric Power Co., Inc. Dispositif de generation d'electricite d'origine eolienne, procede de controle du dispositif de generation d'electricite d'origine eolienne et programme informatique
GB2470806A (en) * 2009-04-09 2010-12-08 Zeiss Carl Optronics Gmbh Detecting objects by comparing digital images
US20130050400A1 (en) * 2011-08-31 2013-02-28 Henrik Stiesdal Arrangement and Method to Prevent a Collision of a Flying Animal with a Wind Turbine
US20160050889A1 (en) * 2014-08-21 2016-02-25 Identiflight, Llc Imaging array for bird or bat detection and identification
US20160055400A1 (en) * 2014-08-21 2016-02-25 Boulder Imaging, Inc. Avian detection systems and methods


Also Published As

Publication number Publication date
ES1303416Y (es) 2023-12-20
ES1303416U (es) 2023-09-29

Similar Documents

Publication Publication Date Title
ES2821735T3 (es) Bird detection system and method
US11751560B2 (en) Imaging array for bird or bat detection and identification
US11017228B2 (en) Method and arrangement for condition monitoring of an installation with operating means
CN110603379B (zh) Inspection tool control device for wind power installation inspection tools
ES2627017T3 (es) Method and system for facilitating the autonomous landing of aerial vehicles on a surface
ES2730975T3 (es) Method and system for examining a surface for material defects
CN103733234B (zh) Surveillance system and method for detecting foreign objects, debris or damage in an airfield
CN109164443A (zh) Railway line foreign object detection method and system based on radar and image analysis
RU2716936C1 (ru) Navigation lighting system for a wind farm, wind farm with such a system, and method for the signal lighting of a wind farm
CN101968913B (zh) Flame tracking method for forest fire areas
ES2886184T3 (es) System for recording collisions of flying animals with wind turbines, its application, and method of recording collisions of flying animals with wind turbines using the system
CN113778137A (zh) Autonomous UAV inspection method for power transmission lines
CN111966121A (zh) Automatic yaw angle correction device for UAV oblique photogrammetry
Pinney et al. Drone path planning and object detection via QR codes; a surrogate case study for wind turbine inspection
WO2022003213A1 (fr) System and method for detecting birds in wind farms
EP4296973A1 (fr) System and method for locating abnormal phenomena in assets
CN205354138U (zh) Unmanned aerial vehicle
ES2945632B2 (es) System for the condition monitoring of wind turbines in operating wind farms and method for said system
RU126173U1 (ru) Video surveillance system from a moving vehicle
ES1306718U (es) Aircraft-transportable electro-optical system for automatically detecting and identifying plantations on the ground
JP2020118619A (ja) Moving object tracking system and moving object tracking method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20943070

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20943070

Country of ref document: EP

Kind code of ref document: A1