GB2470806A - Detecting objects by comparing digital images
- Publication number: GB2470806A
- Application number: GB1005935A (GB201005935A)
- Authority: GB (United Kingdom)
- Legal status: Granted
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C11/00—Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
- G01C11/04—Interpretation of pictures
- G01C11/06—Interpretation of pictures by comparison of two or more pictures of the same area
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64F—GROUND OR AIRCRAFT-CARRIER-DECK INSTALLATIONS SPECIALLY ADAPTED FOR USE IN CONNECTION WITH AIRCRAFT; DESIGNING, MANUFACTURING, ASSEMBLING, CLEANING, MAINTAINING OR REPAIRING AIRCRAFT, NOT OTHERWISE PROVIDED FOR; HANDLING, TRANSPORTING, TESTING OR INSPECTING AIRCRAFT COMPONENTS, NOT OTHERWISE PROVIDED FOR
- B64F1/00—Ground or aircraft-carrier-deck installations
- B64F1/18—Visual or acoustic landing aids
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/50—Depth or shape recovery
- G06T7/55—Depth or shape recovery from multiple images
- G06T7/593—Depth or shape recovery from multiple images from stereo images
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G5/00—Traffic control systems for aircraft, e.g. air-traffic control [ATC]
- G08G5/0004—Transmission of traffic-related information to or from an aircraft
-
- H04N13/0003—
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02B—CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO BUILDINGS, e.g. HOUSING, HOUSE APPLIANCES OR RELATED END-USER APPLICATIONS
- Y02B10/00—Integration of renewable energy sources in buildings
- Y02B10/30—Wind power
Abstract
A method for the detection of an object 6 and/or a group of objects in digital image sequences recorded stereoscopically by a calibrated stereo camera means uses at least first and second cameras (3a, 3b). Relevant image regions, which are grouped to form one or more original clusters, are determined in a first image of the first camera. Clusters corresponding to the original clusters are correlated in a first image of the second camera based on a similarity criterion. The disparities of the respective original clusters from the corresponding clusters are determined in the first image of the second camera, the clusters corresponding to the original clusters are correlated in a second image, and at least one displacement vector of the respective original cluster is determined. Then the individual original clusters are each assigned to an object to be detected or a group of objects to be detected. The position, the distance and the velocity of the at least one object to be detected and/or the at least one group of objects to be detected are determined accordingly. A further method uses an image sensor system which actively emits radiation. The methods can in particular be used on airfields.
Description
METHOD FOR THE DETECTION OF AT LEAST ONE OBJECT
AND/OR AT LEAST ONE GROUP OF OBJECTS, COMPUTER PROGRAM, COMPUTER PROGRAM PRODUCT, STEREO CAMERA MEANS, IMAGE SENSOR SYSTEM ACTIVELY EMITTING RADIATION, AND MONITORING DEVICE [001] The invention relates to a method for the detection of at least one object and/or at least one group of objects in digital image sequences recorded stereoscopically by a calibrated stereo camera means comprising at least a first camera and at least a second camera. Furthermore, the invention relates to a method for the detection of at least one object and/or at least one group of objects in digital image sequences recorded by at least one image sensor system actively emitting radiation. The invention also relates to a computer program and to a computer program product for executing or carrying out methods of this type. In addition, the invention also relates to a stereo camera means, to an image sensor system actively emitting radiation and to a monitoring device for wind power plants, buildings with transparent regions, take-off and landing runways and/or flight corridors of airports.
[002] Aircraft frequently collide with birds or flocks of birds when taking off and landing. The term "flocks of birds" denotes in this case groups of birds, usually of the same size and species, which often fly in the same direction. This risk is greatly increased in particular when the flight paths of regionally or supraregionally migrating birds, which frequently take their bearings from landscape structures such as bodies of water, valleys or coastlines, intersect the flight corridors of airports.
Collisions of this type can lead to damage inter alia to the engines of aircraft.
[003] The earlier, non-anticipatory DE 10 2008 018 880.8 proposes a monitoring device for wind power plants, buildings with transparent regions, take-off and landing runways and/or flight corridors of airports with a stereoscopic detection of approaching or present birds or flocks of birds for carrying out a monitoring method, wherein parameters such as flight altitude, direction of flight, flying velocity, species and size of the birds or the flocks of birds can be ascertained. At least one stereo camera means, which has at least two cameras, in particular thermal imaging cameras, which are arranged at a defined and adapted spacing from one another, and run in synchronisation during recording, the recording moments of which are at least approximately identical and the respective fields of view of which have an overlapping region, is provided in the region of the wind power plants, buildings with transparent regions, take-off and landing runways and/or the flight corridors. The system therefore consists substantially of two cameras which are oriented in parallel and can detect an area of defined size. The cameras can for example be arranged in such a way that the flocks of birds fly into the observation area at an angle of 90 degrees to the orientation of the cameras. An evaluation unit or image processing unit processes the image signals of the two cameras and calculates therefrom the location and the velocity with the direction thereof or the velocity vector of the flocks of birds.
The method is based on the evaluation of data of the calibrated stereoscopic camera system. The evaluation method is intended to determine the above-cited parameters of a flock of birds as accurately and reliably as possible.
[004] DE 10 2005 055 879 Al relates to an air traffic guidance means for inspecting and controlling air traffic in the region of an airport with a plurality of observation cameras which can be aligned with assigned regions of the airport, with a video projection means in a control centre for displaying information detected using the observation cameras as a video panorama.
[005] With regard to further prior art, reference is made to US 2006/0049930 Al and DE 10 2005 008 131 Al.
[006] Starting therefrom, the present invention is based on the object of providing methods, a computer program or a computer program product, a stereo camera means, an image sensor system actively emitting radiation and a monitoring device of the type mentioned at the outset that reliably and accurately detect the location and the velocity or the velocity vector of objects or groups of objects, in particular birds or flocks of birds, and in particular reduce the occurrence or the likelihood of false alarms.
[007] According to the invention, this object is achieved by claim 1 or claim 2. With regard to the computer program or the computer program product, the object is achieved by claim 14 and claim 15 respectively.
With regard to the stereo camera means and the image sensor system actively emitting radiation, the object is achieved by claim 16 and claim 17 respectively. With regard to the monitoring device, the object is achieved by claim 18.
[008] The method according to the invention for the detection of at least one object and/or at least one group of objects in digital image sequences recorded stereoscopically by a calibrated stereo camera means comprising at least a first camera and at least a second camera allows objects or groups of objects to be detected and if appropriate to be identified as birds or groups of birds using a passive measuring method. This allows the likelihood of false alarms in monitoring methods utilising the method according to the invention to be minimised. The position, the flying velocity and direction of flight or the flying velocity vector of a flock of birds is, in particular, determined accurately and reliably in this case.
Within a monitoring method, this allows reliable and prompt advance warning of bird strike to be provided. This allows air traffic control (for example Deutsche Flugsicherung (DFS)) or flight security systems to inform pilots or if appropriate to alter or to set back or to delay take-offs and landings of aircraft in such a way as to effectively prevent a collision with a bird or a flock of birds. A further advantage consists in the fact that the system is a passive system, as operation is carried out close to the airport. Image sensor systems actively emitting radiation, such as radars, lasers or the like, might lead to disturbances of other monitoring systems, in particular systems which are likewise active. The method according to the invention for the stereoscopic detection of at least one object and/or at least one group of objects includes a determination of the positions and distances of the objects or groups of objects to be detected and also a detection of the velocity or the velocity vector with the direction of movement or flight.
[009] What are known as disparities are ascertained for determining the depth information or the distances of objects in a rectified, temporally synchronised pair of stereo images. A (horizontal) disparity is in this case defined as a one-dimensional displacement vector in the direction of the image line and indicates, starting from an image point in the left stereo image, the corresponding image point in the right stereo image. The depth information of the stereo image can then be determined with the aid of the disparities in consideration of the geometry of the stereo camera.
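To make the geometric relationship concrete, the following is a minimal sketch (not taken from the patent) of the standard rectified-stereo relation Z = f * B / d, where f is the focal length in pixels, B the baseline between the two cameras and d the horizontal disparity; the function name and example values are illustrative assumptions.

```python
def depth_from_disparity(disparity_px: float, focal_px: float, baseline_m: float) -> float:
    """Depth of a point from a rectified stereo pair: Z = f * B / d."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a point in front of the cameras")
    return focal_px * baseline_m / disparity_px

# Example (illustrative numbers): f = 2000 px, B = 0.5 m, d = 2.5 px  ->  Z = 400 m
print(depth_from_disparity(2.5, 2000.0, 0.5))
```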
[0010] In the method according to the invention for the detection of at least one object and/or at least one group of objects in digital image sequences recorded stereoscopically by a calibrated stereo camera means comprising at least a first camera and at least a second camera: * relevant image regions, which are grouped to form one or more original clusters, are determined in a first image of the first camera. In order to reduce computing effort and at the same time to increase recognition robustness, only relevant image regions are taken into account. The result is a binary image in which a logic one is assigned to the pixels of the relevant image regions.
The resulting image regions or segments are then subjected to a grouping algorithm in which subclusters or original clusters are produced.
* Afterwards, the clusters corresponding to the respective original clusters are correlated in a first image of the second camera based on a similarity criterion. The extracted content from the original image or the original clusters of the first image of the first camera is used in a correlation algorithm for finding the corresponding image structures in the second stereo image. This produces a correlation field in which the minimum is represented with subpixel accuracy. Should, for example, disturbances (for example caused by clouds or the like in the case of birds as objects) prevent correlation, the current images can be rejected in order to avoid misdetections.
* Subsequently, the disparities of the original clusters from the corresponding clusters are determined in a first image of the second camera. The disparities from these clusters are thus ascertained from the pair of stereo images.
[0011] Thereupon: * the clusters corresponding to the original clusters are correlated in a second image, which is recorded in a temporally offset manner (in other words, it is recorded at a time which is before or after the time when the other image was recorded), of the first camera and/or the second camera based on the similarity criterion and at least one displacement vector of the respective original cluster is determined therefrom. Thus, the displacement vectors for the clusters are determined from the current and a temporally offset image, i.e. a subsequent or preceding image. The extracted content from the original image is accordingly also used for finding the corresponding image structures in the subsequent or preceding image. For this purpose, the same correlation algorithm can be used as in the determination of the disparities of the pair of stereo images.
* Subsequently, the individual original clusters are each assigned, in particular in consideration of the disparity and the at least one displacement vector of the original cluster, to an object to be detected or a group of objects to be detected.
* The position and the distance of the at least one object to be detected and/or the at least one group of objects to be detected from the stereo camera means are ascertained based on the position and the distance of the at least one assigned original cluster from the stereo camera means, which is obtained, taking into account the geometry of the stereo camera means, from the disparity of the at least one assigned original cluster (17a, 17b).
Thus, the depth information is determined taking into account the stereo geometry used. The distance from the corresponding object in the camera coordinate system is determined for each cluster with the aid of the resulting disparities.
* Subsequently, the velocity of the at least one object to be detected and/or the at least one group of objects to be detected is determined from the at least one displacement vector of the at least one assigned original cluster (17a, 17b) taking into account the ascertained position and distance of the at least one assigned original cluster from the stereo camera means. The velocity is ascertained with the aid of the depth information and the displacement vectors. The velocity of the objects or groups of objects is obtained from the assigned displacement vector taking into account the previously determined distance information.
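As a hedged illustration of this last step, the sketch below converts a per-cluster pixel displacement between two frames into a metric velocity using the previously ascertained depth; the simple pinhole scaling, the function name and the parameter values are assumptions rather than the patent's implementation.

```python
import numpy as np

def cluster_velocity(displacement_px: np.ndarray, depth_m: float,
                     focal_px: float, dt_s: float) -> np.ndarray:
    """Scale the image-plane displacement to metres at the cluster's depth and divide by the frame offset."""
    displacement_m = displacement_px * depth_m / focal_px  # pixels -> metres at depth Z
    return displacement_m / dt_s                           # metres per second

# Example: a 6 px horizontal and -1.5 px vertical shift over 0.1 s at 400 m depth
print(cluster_velocity(np.array([6.0, -1.5]), depth_m=400.0, focal_px=2000.0, dt_s=0.1))
```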
[0012] Alternatively, according to the invention, a method is proposed for the detection of at least one object and/or at least one group of objects in digital image sequences recorded by at least one image sensor system actively emitting radiation, wherein * relevant image regions, which are grouped to form one or more original clusters, are determined in a first image of the image sensor system actively emitting radiation, after which * the clusters corresponding to the original clusters are correlated in a second image, which is recorded in a temporally offset manner, of the image sensor system actively emitting radiation based on a similarity criterion and at least one displacement vector of the respective original cluster is determined therefrom, after which * the individual original clusters are each assigned, in particular in consideration of the at least one displacement vector of the original cluster, to an object to be detected or a group of objects to be detected, wherein * the position and the distance of the at least one object to be detected and/or the at least one group of objects to be detected are determined based on the position and the distance of the at least one assigned original cluster from the image sensor system actively emitting radiation, which are directly ascertained by the image sensor system actively emitting radiation, and wherein * the velocity of the at least one object to be detected and/or the at least one group of objects to be detected is determined from the at least one displacement vector of the at least one assigned original cluster taking into account the ascertained position and distance of at least one assigned original cluster from the image sensor system actively emitting radiation.
[0013] Examples of image sensor systems actively emitting radiation are, in particular, radar sensors, lidar sensors, laser sensor systems or laser scanners, runtime cameras or the like.
[0014] It is advantageous if the velocity of the at least one group of objects to be detected is determined from a weighted mean value of the velocities of the different original clusters assigned to the corresponding group of objects. The velocity of a group of objects or an entire flock of birds may be determined by forming the weighted mean value of the velocities of the various clusters.
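A minimal sketch of such a weighted mean, assuming each cluster contributes a velocity vector and a weight (for example its pixel area); the names and numbers are illustrative:

```python
import numpy as np

def flock_velocity(cluster_velocities: np.ndarray, weights: np.ndarray) -> np.ndarray:
    """Weighted mean of the per-cluster velocity vectors (shape: n_clusters x n_dims)."""
    return (cluster_velocities * weights[:, None]).sum(axis=0) / weights.sum()

v = flock_velocity(np.array([[11.0, -3.0], [13.0, -2.0], [12.0, -4.0]]),
                   np.array([40.0, 25.0, 35.0]))
print(v)  # velocity of the whole flock
```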
[0015] It is also advantageous if the relevant image regions display sufficiently high movement-induced change.
[0016] These measures reduce computing effort and at the same time increase recognition robustness. Account is taken only of image regions displaying sufficiently high movement. For this purpose, the content of the current image can be subtracted pixel by pixel from the content of a background image. This background image is constantly updated in order to take account of changes in the surveyed scene. This updating can be performed in a parameterised manner and be optimally controlled in accordance with the velocity of the objects sought. Movement can thus be detected by means of differential formation. As a differential formation of this type returns values or results even in the case of slight differences of the surveyed images caused by noise or the like, for example, it is possible to predefine a threshold, i.e. a threshold value, from which the differential values are classified as relevant. For this purpose, a histogram of the image can be analysed and the main component of the signal energy can be established therein. This produces a binary image in which a logic one is assigned to the pixels of the relevant image regions.
An adaptive threshold value formation is accordingly carried out to generate binary images.
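The following is a sketch of this differential-image and adaptive-threshold step under stated assumptions: the background update rate alpha and the fraction of signal energy retained are illustrative parameters, and the histogram analysis is one plausible reading of the procedure described above, not the patent's exact implementation.

```python
import numpy as np

def relevant_regions(frame: np.ndarray, background: np.ndarray,
                     alpha: float = 0.05, energy_fraction: float = 0.95):
    """Return a binary mask of relevant (moving) pixels and the updated background."""
    diff = np.abs(frame.astype(np.float32) - background)
    # adaptive threshold derived from the histogram of the difference image
    hist, edges = np.histogram(diff, bins=256)
    cumulative = np.cumsum(hist) / hist.sum()
    threshold = edges[np.searchsorted(cumulative, energy_fraction)]
    binary = (diff > threshold).astype(np.uint8)              # logic one = relevant pixel
    background = (1.0 - alpha) * background + alpha * frame   # parameterised background update
    return binary, background
```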
[0017] Furthermore, according to the invention, provision may be made for the relevant image regions to be grouped to form the original clusters based on a stochastic model, in particular at random. Accordingly, the binary image segments are grouped to form clusters of differing shape and size based on a stochastic model. The production of random subclusters increases the likelihood of the formation of structures of different shapes. This advantageously increases the number of correct disparities and also of displacement vectors. The original clusters or the corresponding clusters thereof may be of different shapes and sizes.
[0018] The original clusters can be grouped at different cluster spacings.
The cluster spacing can be reduced, starting from an admissible maximum value, step by step down to an admissible minimum value, new original clusters being grouped and added to the total quantity of original clusters step by step at each newly selected cluster spacing.
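As a rough sketch of this grouping idea (see also Figure 5 further below): single-linkage clustering from SciPy stands in for the unspecified grouping algorithm, and the random subsampling reflects the stochastic grouping mentioned above; the spacing values, step size and subsample ratio are assumed parameters.

```python
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage

def grouped_clusters(points: np.ndarray, max_spacing: float, min_spacing: float,
                     step: float, subsample: float = 0.7,
                     rng: np.random.Generator = np.random.default_rng(0)) -> list:
    """Collect clusters of relevant pixels at successively reduced cluster spacings."""
    all_clusters = []
    spacing = max_spacing
    while spacing >= min_spacing:
        keep = rng.random(len(points)) < subsample            # stochastic subset of the pixels
        subset = points[keep]
        if len(subset) >= 2:
            labels = fcluster(linkage(subset, method="single"),
                              t=spacing, criterion="distance")
            for label in np.unique(labels):
                all_clusters.append(subset[labels == label])  # add to the total quantity
        spacing -= step                                       # reduce the admissible spacing
    return all_clusters
```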
[0019] The original clusters or the corresponding clusters thereof can be identified and/or described by their circumscribing or enclosing rectangles in the images of the first and second camera. For this purpose, these rectangles are used to extract the content from the original image for determining the disparities and the displacement vectors.
[0020] In the correlation, the similarity criterion used may be the sum of the absolute differences. It goes without saying that further criteria which are conventional in stereoscopic evaluation may also be applied.
[0021] It is advantageous if a plausibility algorithm checks, in the at least one object to be detected and/or the at least one group of objects to be detected, the disparities of the respectively assigned original clusters and/or the at least one displacement vector of the respectively assigned original clusters (17a, 17b), defective results being eliminated. A plausibility check is carried out to discard the false disparities and displacement vectors. The plausibility algorithm thus extracts the correct disparities and provides these for ascertaining the depth information. The plausibility algorithm can be formed iteratively, outliers being eliminated step by step from the set of disparities and the displacement vectors taking into account predefined criteria. This increases the reliability and security of the method.
[0022] The objects or groups of objects may be birds or flocks of birds.
[0023] It is advantageous if a temporal consideration of the detection of the at least one object and/or the at least one group of objects is carried out via at least two temporally successive images of the image sequences, in particular by means of a corresponding filtering or averaging. This can additionally increase the stability of the methods according to the invention. A type of tracking of the objects or groups of objects can, thus, be carried out.
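One simple way to realise such a temporal averaging, sketched under assumptions (the blending factor beta and the function name are illustrative, not specified by the patent):

```python
def smooth_estimate(previous, measurement, beta: float = 0.7):
    """Exponentially weighted average of successive position or velocity estimates."""
    if previous is None:           # first detection: nothing to average yet
        return measurement
    return beta * previous + (1.0 - beta) * measurement
```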
[0024] Claim 16 specifies a stereo camera means comprising at least two mutually calibrated cameras which are arranged at an, in particular defined and adapted, spacing and run in synchronisation during recording.
The at least two cameras of the stereo camera means can be designed as cameras in the visual range with CCD or CMOS image sensors or as thermal imaging cameras.
[0025] Claim 17 describes an image sensor system actively emitting radiation.
[0026] The methods according to the invention for the detection of at least one object and/or at least one group of objects are preferably carried out as a computer program on an image processing means of the stereo camera means and/or the image sensor system actively emitting radiation.
For this purpose, the computer program is stored in a memory element of the image processing means. The method is executed by processing on a microprocessor of the image processing means. The computer program can be stored as a computer program product on a computer-readable data carrier (floppy disk, CD, DVD, hard disk, USB memory stick or the like) or an Internet server and be transmitted from there to the memory element of the image processing means.
[0027] A computer program or computer program product of this type with program code means is specified in claim 14 and claim 15 respectively.
[0028] Claim 18 specifies a monitoring device for wind power plants, buildings with transparent regions, take-off and landing runways and/or flight corridors of airports with a stereoscopic detection of approaching or present birds or flocks of birds. In an advantageous manner, the accurate and reliable detection, resulting from the methods according to the invention, of the position, flying velocity or direction of flight of the birds and/or flocks of birds enables the monitoring device to provide prompt and reliable advance warning of bird strike, i.e. of collisions with birds and/or flocks of birds.
[0029] Advantageous configurations and developments of the invention emerge from the sub-claims. The principles of an exemplary embodiment of the invention will be specified hereinafter with reference to the drawings, in which:
Figure 1 is a schematic illustration of a monitoring device according to the invention;
Figure 2 is a simplified illustration of an arrangement of a stereo camera means according to the invention or an image sensor system according to the invention actively emitting radiation in the region of a flight corridor of an aircraft;
Figure 3 is a schematic illustration to clarify the methods according to the invention;
Figure 4 is a schematic flow chart of a method according to the invention;
Figure 5 is a schematic flow chart of a grouping algorithm within a method according to the invention;
Figure 6 is a schematic flow chart of a correlation algorithm within a method according to the invention; and
Figure 7 is a schematic flow chart of a plausibility algorithm for use in a method according to the invention.
[0030] The invention will be described hereinafter within the scope of a monitoring device for wind power plants, buildings with transparent regions, take-off and landing runways and/or flight corridors of airports, in particular against bird strike. It goes without saying that the invention is not limited to these applications. Birds and flocks of birds will accordingly be regarded hereinafter as objects and groups of objects respectively.
[0031] Figure 1 shows a stereo camera means 1 of a monitoring device 2 for take-off and landing runways 14 and/or flight corridors 11 (see Figure 2) of airports with a stereoscopic detection of approaching birds 6 and/or flocks of birds 6, wherein parameters such as flight altitude, direction of flight, flying velocity and species/ size of the birds 6 or the flocks of birds 6' can be ascertained. One or more stereo camera means 1 of this type are arranged in the region of the take-off and landing runways 14 and/or the flight corridors 11 and have at least two thermal imaging cameras 3a, 3b which are arranged at a defined and adapted spacing from one another and run in synchronisation during recording. In further exemplary embodiments (not shown), cameras in the visual range with CCD or CMOS image sensors could also be used. The recording timings of the thermal imaging cameras 3a, 3b are at least approximately identical and their respective fields of view 4a, 4b have an overlapping region 5.
In the overlapping region 5, the object detected is, as may be seen from Figure 1, a bird 6. The two thermal imaging cameras 3a, 3b are adjusted and calibrated to each other. Suitable spectral ranges for the thermal imaging cameras 3a, 3b include the thermal imaging regions LWIR, MWIR, VLWIR and FIR as well as SWIR and NIR. Thermal imaging cameras in the medium-wavelength infrared (MWIR) range are particularly suitable, in particular from about 3 μm to 5 μm, or in the long-wavelength infrared range, in particular from about 7 μm to 14 μm, preferably from about 8 μm to 12 μm. The direction of movement of the bird 6 is indicated in Figure 1 by means of a dashed arrow. The thermal imaging cameras 3a, 3b are preferably oriented in such a way that the birds 6 and/or flocks of birds 6' fly into the observation area at an angle of approximately 90 degrees to the orientation of the cameras.
[0032] The stereo camera means 1 has an evaluation unit or image processing means 7 which is provided for processing the image data or image signals recorded using the two thermal imaging cameras 3a, 3b and which calculates therefrom the position and flying velocity or direction of flight of the birds 6 or flocks of birds 6'.
[0033] In addition, the stereo camera means 1 has a radio station 8 as an interface, in particular a network interface, for communication with further stereo camera means 1 or with control systems, in particular flight security systems 9 (indicated in Figure 1 by the double-headed arrow 8').
The stereo camera means 1 operates autonomously, i.e. on a stand-alone basis. However, further stations or stereo camera means 1 can be connected by means of networking or radio transmission via the radio station 8. Both the information and the recordings are thus available outside the individual stations. These data are mainly transmitted to air traffic control.
[0034] A monitoring method for take-off and landing runways 14 and/or flight corridors 11 of airports, which is used to stereoscopically detect approaching birds 6 or flocks of birds 6' by means of the monitoring device 2 or the stereo camera means 1, proceeds inter alia on the image processing means 7 of the stereo camera means 1, parameters such as flight altitude, direction of flight, flying velocity and species/size of the birds 6 or flocks of birds 6' or the flock density thereof being ascertained. The parameters are in this case determined by means of a stereoscopic evaluation. In this case, absolute spatial points of the birds 6 or flocks of birds 6' to be detected are determined through the at least two viewing angles onto the region 5 or observation region recorded by the at least two thermal imaging cameras 3a, 3b of the stereo camera means 1. The flying velocity or direction of flight of the birds 6 or the flocks of birds 6' is determined by viewing over an appropriate period of time. This takes place with the aid of the method according to the invention for the detection of objects or groups of objects. Even birds 6 or flocks of birds 6' at a greater distance (for example about 20 km) can be detected, a correspondingly longer focal length being used for the two thermal imaging cameras 3a, 3b. In addition, flying objects such as model aircraft, steering kites or the like can also be detected by the stereo camera means 1 (not shown).
[0035] Alternatively or additionally, the monitoring device 2 can also have an image sensor system 1' (indicated in Fig. 1 by dashed lines) which actively emits radiation and is also provided with an evaluation unit or image processing means 7 for processing the recorded image data or image signals, which calculates therefrom the position, distance and flying velocity or direction of flight of the birds 6 or flocks of birds 6'.
Likewise, the image sensor system 1' actively emitting radiation has a radio station 8 as an interface, in particular a network interface, for communication with further image sensor systems 1' actively emitting radiation, stereo camera means 1 or superordinate systems, in particular flight security systems 9. As may also be seen from Fig. 1, birds 6 can be detected in a field of view 4c of the image sensor system 1 actively emitting radiation. Examples of image sensor systems 1' actively emitting radiation are, in particular, radar sensors, lidar sensors, laser sensor systems or laser scanners, runtime cameras or the like.
[0036] An evaluation is carried out and if appropriate a corresponding warning message is issued based on the parameters.
[0037] As may be seen from Figure 2, an image sensor system 1' actively emitting radiation or a stereo camera means 1 with a stereo vision region or overlapping region 5 monitors a known flight path 10 of birds 6 and/or flocks of birds 6'. The camera means 1, 1' are in this case arranged in such a way that a flight corridor 11 or a region 12 of intersection of the flight corridor 11 with the known flight path 10 of the birds 6 or flocks of birds 6' is monitored. An aircraft 13 is shown by way of example in the flight corridor 11. A moment at which the detected birds 6 or the detected flocks of birds 6' arrive at the region 12 of intersection with the flight corridor 11 of the aircraft is also determined. In order to be able to make a precise prediction about the moment of arrival of the detected birds 6 or the detected flocks of birds 6', the camera means 1, 1' are aligned with the known flight path 10 of the birds 6 or flocks of birds 6'. In addition, in further exemplary embodiments, further camera means 1, 1' can be arranged, in particular at greater distances from the region 12 of intersection (for example a plurality of kilometres), preferably along the known flight path 10 of the birds 6 or flocks of birds 6'.
[0038] If birds 6 or flocks of birds 6' are detected, a warning message to introduce countermeasures or reduction or avoidance measures is if appropriate issued to the control system, in particular the flight security system 9, or to taking-off or landing aircraft 13. Within the scope of the monitoring device 2, as indicated by dashed lines in Figure 2, the stereo camera means 1 or the image sensor system 1' actively emitting radiation can also monitor take-off and landing runways 14 or wind power plants 15 or buildings with transparent regions 15'.
[0039] Figure 3 illustrates the principles of the method according to the invention for the detection of birds 6 or flocks of birds 6', 6'' in digital image sequences recorded stereoscopically by the calibrated stereo camera means 1 comprising the first camera 3a and the second camera 3b. In this case, within a first current image recording at a moment t1: * relevant image regions 16, which are grouped to form original clusters 17a, 17b, are determined in a first image L1 of the first camera 3a, after which * the clusters 18a, 18b corresponding to the original clusters 17a, 17b are correlated in a first image R1 of the second camera 3b based on a similarity criterion, after which * the, in particular horizontal, disparities (both cameras 3a, 3b are preferably arranged at the same height) of the respective original clusters 17a, 17b from the corresponding clusters 18a, 18b are determined in the first image R1 of the second camera 3b, after which * within a subsequent second image recording recorded in a temporally offset manner at a moment t2: * the clusters 19a, 19b, 20a, 20b corresponding to the original clusters 17a, 17b are correlated in a temporally successive second image L2, R2 of the first camera 3a and/or the second camera 3b based on the similarity criterion and at least one displacement vector of the respective original cluster 17a, 17b is determined therefrom, after which * the individual original clusters 17a, 17b are each assigned, in particular in consideration of the disparity and the at least one displacement vector of the original cluster 17a, 17b, to an object 6 to be detected or a group of objects 6', 6'' to be detected, wherein * the position and the distance of the at least one bird 6 to be detected and/or the at least one flock of birds 6', 6'' to be detected from the stereo camera means 1 are ascertained based on the position and the distance of the at least one assigned original cluster 17a, 17b from the stereo camera means 1, which is obtained, taking into account the geometry of the stereo camera means 1, from the disparity, determined within the first image recording at the moment t1, of the at least one assigned original cluster 17a, 17b, as a result of which the absolute spatial points of the birds 6 and/or flocks of birds 6' are determined, and after which * the velocities or the velocity vectors with the directions of flight of the birds 6 and/or flocks of birds 6', 6'' to be detected are determined from the at least one displacement vector, determined within the subsequent second image recording recorded in a temporally offset manner at the moment t2, of the at least one assigned original cluster 17a, 17b taking into account the ascertained position and distance of the at least one assigned original cluster (17a, 17b) from the stereo camera means 1.
[0040] In the present exemplary embodiment, a subsequent image L2, R2 within a subsequent second image recording at the moment t2 was used as the second image recorded in a temporally offset manner. Alternatively, in further exemplary embodiments, a preceding image within a preceding second image recording at the moment t2 can also be used, starting from the current first image L1, R1, as the second image recorded in a temporally offset manner. The arrangement of the moments t1, t2 along the timeline in Fig. 3 would then be swapped over.
[0041] Likewise, the left-hand part of Fig. 3 illustrates the principles of the alternative method for the detection of birds 6 or flocks of birds 6', 6'' in digital image sequences recorded by the image sensor system 1' actively emitting radiation, wherein * relevant image regions 16, which are grouped to form one or more original clusters 17a, 17b, are determined in a first image L1 of the image sensor system actively emitting radiation, after which * the clusters 19a, 19b corresponding to the original clusters 17a, 17b are correlated in a subsequent second image L2, which is recorded in a temporally offset manner, of the image sensor system 1' actively emitting radiation based on a similarity criterion and at least one displacement vector of the respective original cluster 17a, 17b is determined therefrom, after which * the individual original clusters 17a, 17b are each assigned, in particular in consideration of the at least one displacement vector of the original cluster 17a, 17b, to a bird 6 to be detected or a flock of birds 6', 6'' to be detected, wherein * the position and the distance of the birds 6 and/or flocks of birds 6', 6'' to be detected are determined based on the position and the distance of the at least one assigned original cluster 17a, 17b from the image sensor system 1' actively emitting radiation, which are directly ascertained by the image sensor system 1' actively emitting radiation, and wherein * the velocity or the velocity vector with the directions of flight of the birds 6 or flocks of birds 6', 6'' to be detected is determined from the at least one displacement vector of the at least one assigned original cluster 17a, 17b taking into account the ascertained position and distance of the at least one assigned original cluster 17a, 17b from the image sensor system 1' actively emitting radiation.
[0042] In this case too, it is conceivable for a preceding image to be used as the second image L2 recorded in a temporally offset manner.
[0043] The velocities of the flocks of birds 6', 6'' are determined from a weighted mean value of the velocities of the different original clusters 17a, 17b assigned to the corresponding group of objects 6', 6''.
[0044] It is also possible to separate different flocks of birds 6', 6'' from one another.
[0045] Flocks of birds can sometimes reach very high velocities, in particular on account of winds. Marked deviations can for example result from the correlation of different flocks. Such deviations can be recognised in consideration of ornithological studies.
[0046] The relevant image regions 16 display sufficiently high movement-induced change. The relevant image regions 16 are grouped to form the original clusters 17a, 17b based on a stochastic model or purely at random. In this case, in further exemplary embodiments (not shown), different numbers of birds could also be grouped to form clusters (for example five or eight birds). The original clusters 17a, 17b or the corresponding clusters 18a, 19a, 20a, 18b, 19b, 20b thereof may be of different shapes and sizes. As indicated by dashed lines in Figure 3, the original clusters 17a, 17b or the corresponding clusters 18a, 19a, 20a, 18b, 19b, 20b thereof are identified by their circumscribing rectangles.
[0047] Figure 4 is a simplified flow chart of the method according to the invention for the detection of birds 6 or flocks of birds 6', 6'' in digital image sequences recorded stereoscopically by the calibrated stereo camera means 1 comprising the first camera 3a and the second camera 3b with further optional method steps. In this case, starting from a left image L, a differential image is formed in a step 101. For this purpose, the content of the current image is subtracted pixel by pixel from the content of a background image. This background image is constantly updated in order to take account of changes in the surveyed scene. Afterwards, a threshold value is formed in a step 102. The cluster formation of the original clusters 17a, 17b, which are identified in step 104 by their circumscribing rectangles, is carried out in a step 103. In a step 105, the original clusters 17a, 17b are used to mask the original image. Thereupon, in the subsequent step 106, the disparities of the original clusters 17a, 17b from the corresponding clusters 18a, 18b are determined in the image R of the second camera 3b. The optical flow, i.e. the displacement vectors, is then ascertained in a step 107. In addition, a plausibility check of the disparities and displacement vectors is carried out in a step 108.
Subsequently, after an assignment of the original clusters 17a, 17b to birds 6 or flocks of birds 6', 6'' to be detected, the distance of the birds 6 or flocks of birds 6', 6'' is determined in step 109. The 3D movement, i.e. the velocity vectors, of the birds 6 or flocks of birds 6', 6'' is then determined in a step 110 from the displacement vectors taking into account the ascertained distances.
[0048] Subsequently, the steps 103, 106 and 108 of the method according to the invention for the detection of birds 6 or flocks of birds 6', 6'' in digital image sequences, recorded stereoscopically by the calibrated stereo camera means 1 comprising the first camera 3a and the second camera 3b, are illustrated in greater detail in Figures 5, 6 and 7. The step 103 illustrated in Figure 5 has a grouping algorithm for the formation of clusters. The original clusters 17a, 17b are grouped at different cluster spacings. The cluster spacing is reduced, starting from an admissible maximum value, step by step down to an admissible minimum value, new original clusters 17a, 17b being grouped and added to the total quantity of original clusters 17a, 17b step by step at each newly selected cluster spacing. A maximum admissible initial cluster spacing is predefined in a step 103a. The clusters or original clusters 17a, 17b at the predefined cluster spacing are now ascertained in a step 103b and added to the total quantity or list of original clusters 17a, 17b in a step 103c. Thereupon, it is established at a branching 103d whether the minimum cluster spacing has already been reached. If this is the case, the step 103 is concluded.
Otherwise, the admissible cluster spacing is reduced in a step 103e, after which the process is continued in a loop with the step 103b and the reduced cluster spacing. The formation of clusters is terminated when the minimum admissible cluster spacing is reached.
[0049] The step 106 for determining the disparities has a correlation algorithm. The correlation algorithm is illustrated in greater detail in Figure 6. The sum of absolute differences (SAD) is used as the similarity criterion in the correlation. Firstly, in a step 106a, the source region A (top, left, height, width) of the original cluster 17a, 17b is determined in the first image L1 of the first camera 3a, after which, in the step 106b, the target region or search region or correlation region S (Xmin, Xmax, Ymin, Ymax) is defined which relates, in the first image R1 of the second camera 3b, to the position of the source region A in the first image L1 of the first camera 3a. Now, in a step 106c, the first or next position is determined in the correlation region S for the purposes of correlation, after which, in a step 106d, the SAD correlation value between the source region A of the original cluster 17a, 17b and the correlation region S is determined and is stored in an array or a similarity matrix in a step 106e.
This is carried out for all of the positions in the correlation region S. Afterwards, it is established at a branching 106f whether all the positions of the correlation region S have already been correlated. If this is not the case, step 106c is continued in a loop; if this is the case, in a step 106g, the subpixel-accurate position of the minimum is determined in the array or the similarity matrix and this position is interpreted as the displacement. After the step 106g, the correlation algorithm is concluded. In other exemplary embodiments, depending on the selected measure, the maximum can also be determined instead of the aforementioned minimum.
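The sketch below illustrates this kind of SAD search along a horizontal search region with a parabolic sub-pixel refinement of the minimum; it is an assumed, simplified stand-in for steps 106a to 106g, not the patent's exact implementation.

```python
import numpy as np

def sad_disparity(patch: np.ndarray, row_strip: np.ndarray) -> float:
    """Slide the source patch over a horizontal strip and return the sub-pixel SAD minimum."""
    h, w = patch.shape
    n_positions = row_strip.shape[1] - w + 1
    sad = np.array([np.abs(patch.astype(np.float32) - row_strip[:, x:x + w]).sum()
                    for x in range(n_positions)])            # similarity array over all positions
    i = int(np.argmin(sad))
    if 0 < i < n_positions - 1:                              # parabolic sub-pixel refinement
        denom = sad[i - 1] - 2.0 * sad[i] + sad[i + 1]
        offset = 0.5 * (sad[i - 1] - sad[i + 1]) / denom if denom != 0 else 0.0
        return i + offset
    return float(i)
```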
[0050] The plausibility check step 108 is illustrated in greater detail in Figure 7. The step 108 has a plausibility algorithm which checks, in the birds 6 to be detected and/or the flocks of birds 6', 6'' to be detected, the disparities of the original clusters 17a, 17b respectively assigned thereto and/or the at least one displacement vector of the respectively assigned original clusters 17a, 17b, defective results being eliminated. This will be described hereinafter first for the displacement vectors. In this case, a field V(x,y) of displacement vectors is checked, a mean value over all the elements of the field V(x,y) being calculated, after which, of all the displacement vectors, those in which a differential of their value from the mean value exceeds a predefined threshold value are removed. This is carried out until no further new elements have to be removed. All the elements in the vector field V(x,y) are set to active in a step 108a.
Afterwards, in a step 108b, a mean value is formed for each dimension of the vector field V(x,y) of all the active elements, after which, in a step 108c, the differential of the values of the active elements of the vector array from the mean value is calculated. At a branching 108d, it is checked whether at least one differential is greater than the predefined threshold value. Should this not be the case, the step 108 is concluded.
Otherwise, in a step 108e, those elements, the differential of which is greater than the threshold value, are set to inactive. Afterwards, step 108b is resumed in a loop.
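A compact sketch of this iterative plausibility check, under the assumption that the deviation is measured as the Euclidean distance of each displacement vector from the current mean; the threshold is the predefined parameter described above, and the function name is illustrative.

```python
import numpy as np

def plausible_vectors(vectors: np.ndarray, threshold: float) -> np.ndarray:
    """Return a boolean mask marking the displacement vectors that pass the plausibility check."""
    active = np.ones(len(vectors), dtype=bool)                # step 108a: all elements active
    while active.any():
        mean = vectors[active].mean(axis=0)                   # step 108b: mean of the active elements
        deviation = np.linalg.norm(vectors - mean, axis=1)    # step 108c: difference from the mean
        outliers = active & (deviation > threshold)
        if not outliers.any():                                # branching 108d: nothing exceeds the threshold
            break
        active &= ~outliers                                   # step 108e: deactivate the outliers
    return active
```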
[0051] In order to increase the reliability or accuracy of the stereoscopic detection of birds 6 or flocks of birds 6', 6'', ascertained detection data or parameters of a stereo camera means 1 can be matched with data of any redundantly present further stereo camera means 1 to check the plausibility. Active standby systems of this type can for example be arranged at a spacing of 500 m and have the same orientation. This allows, for example, for the correction of a velocity of a flock of birds 6', 6'' that has been calculated too high on account of a gust of wind which has occurred. It is equally possible to achieve this using redundant image sensor systems 1' actively emitting radiation. The redundant systems are basically provided in order to immediately take over the failed function, in the event of a failure of a stereo camera means 1 or of an image sensor system 1' actively emitting radiation, so that the monitoring device 2 can continue to operate.
[0052] The methods according to the invention are preferably carried out as a computer program on the image processing means 7 of the stereo camera means 1 and/or the image sensor system 1' actively emitting radiation, other solutions also being possible of course. For this purpose, the computer program is stored in a memory element (not shown in greater detail) of the image processing means 7. The method is executed by processing on a microprocessor of the image processing means 7. The computer program can be stored as a computer program product on a computer-readable data carrier (floppy disk, CD, DVD, hard disk, USB memory stick or the like) or an Internet server and be transmitted from there to the memory element of the image processing means 7.
List of reference numerals
1          stereo camera means
1'         image sensor system actively emitting radiation
2          monitoring device
3a, 3b     thermal imaging cameras
4a, 4b, 4c fields of view
5          overlapping region
6, 6', 6'' birds, flocks of birds
7          image processing means
8          interface
9          flight security system
10         flight path of the birds
11         flight corridor
12         region of intersection
13         aircraft
14         take-off and landing runway
15         wind power plant
15'        buildings with transparent regions
16         relevant image regions
17a, 17b   original clusters, image L1
18a, 18b   corresponding clusters, image R1
19a, 19b   corresponding clusters, image L2
20a, 20b   corresponding clusters, image R2
L1         first image of the first camera
L2         second image of the first camera
R1         first image of the second camera
R2         second image of the second camera
R          right image (second camera)
L          left image (first camera)
101-110    method steps
103a-103e  method steps of the formation of clusters
106a-106g  method steps of the correlation
108a-108e  method steps of the plausibility test
Beginn     Start
Ja         Yes
Nein       No
Ende       End
Claims (24)
CLAIMS
- 1. Method for the detection of at least one object (6) and/or at least one group of objects (6', 6'') in digital image sequences recorded stereoscopically by a calibrated stereo camera means (1) comprising at least a first camera (3a) and at least a second camera (3b), wherein: -relevant image regions (16), which are first grouped to form one or more original clusters (17a, 17b), are determined in a first image (L1) of the first camera (3a), after which -the clusters (18a, 18b) corresponding to the original clusters (17a, 17b) are correlated in a first image (R1) of the second camera (3b) based on a similarity criterion, after which -the disparities of the respective original clusters (17a, 17b) from the corresponding clusters (18a, 18b) are determined in the first image (R1) of the second camera (3b), after which -the clusters (19a, 19b, 20a, 20b) corresponding to the original clusters (17a, 17b) are correlated in a second image (L2, R2), which is recorded in a temporally offset manner, of the first camera (3a) and/or the second camera (3b) based on the similarity criterion and at least one displacement vector of the respective original cluster (17a, 17b) is determined therefrom, after which -the individual original clusters (17a, 17b) are each assigned, in particular in consideration of the disparity and the at least one displacement vector of the original cluster (17a, 17b), to an object (6) to be detected or a group of objects (6', 6'') to be detected, wherein -the position and the distance of the at least one object (6) to be detected and/or the at least one group of objects (6', 6'') to be detected from the stereo camera means (1) are ascertained based on the position and the distance of the at least one assigned original cluster (17a, 17b) from the stereo camera means (1), which is obtained, taking into account the geometry of the stereo camera means (1), from the disparity of the at least one assigned original cluster (17a, 17b), and after which -the velocity of the at least one object (6) to be detected and/or the at least one group of objects (6', 6'') to be detected is determined from the at least one displacement vector of the at least one assigned original cluster (17a, 17b) taking into account the ascertained position and distance of the at least one assigned original cluster (17a, 17b) from the stereo camera means (1).
- 2. Method for the detection of at least one object (6) and/or at least one group of objects (6', 6'') in digital image sequences recorded by at least one image sensor system (1') actively emitting radiation, wherein -relevant image regions (16), which are grouped to form one or more original clusters (17a, 17b), are determined in a first image (L1) of the image sensor system (1') actively emitting radiation, after which -the clusters (19a, 19b) corresponding to the original clusters (17a, 17b) are correlated in a second image (L2), which is recorded in a temporally offset manner, of the image sensor system (1') actively emitting radiation based on a similarity criterion and at least one displacement vector of the respective original cluster (17a, 17b) is determined therefrom, after which -the individual original clusters (17a, 17b) are each assigned, in particular in consideration of the at least one displacement vector of the original cluster (17a, 17b), to an object (6) to be detected or a group of objects (6', 6'') to be detected, wherein -the position and the distance of the at least one object (6) to be detected and/or the at least one group of objects (6', 6'') to be detected from the image sensor system (1') actively emitting radiation are determined based on the position and the distance of the at least one assigned original cluster (17a, 17b) from the image sensor system (1') actively emitting radiation, which are directly ascertained by the image sensor system (1') actively emitting radiation, and wherein -the velocity of the at least one object (6) to be detected and/or the at least one group of objects (6', 6'') to be detected is determined from the at least one displacement vector of the at least one assigned original cluster (17a, 17b) taking into account the ascertained position and distance of the at least one assigned original cluster (17a, 17b) from the image sensor system (1') actively emitting radiation.
- 3. Method according to claim 1 or claim 2, characterised in that the velocity of the at least one group of objects (6', 6'') to be detected is determined from a weighted mean value of the velocities of the different original clusters (17a, 17b) assigned to the corresponding group of objects (6', 6'').
- 4. Method according to any one of claims 1 to 3, characterised in that the relevant image regions (16) exhibit a sufficiently large movement-induced change.
- 5. Method according to any one of claims 1 to 4, characterised in that the relevant image regions (16) are grouped to form the original clusters (17a, 17b) based on a stochastic model, in particular at random.
- 6. Method according to any one of claims 1 to 5, characterised in that the original clusters (17a, 17b) and/or the corresponding clusters (18a, 19a, 20a, 18b, 19b, 20b) thereof are of different sizes and shapes.
- 7. Method according to any one of claims 1 to 6, characterised in that the original clusters (17a, 17b) are grouped at different cluster spacings.
- 8. Method according to claim 7, characterised in that the cluster spacing is reduced, starting from an admissible maximum value, step by step down to an admissible minimum value, new original clusters (17a, 17b) being grouped and added to the total quantity of original clusters (17a, 17b) step by step at each newly selected cluster spacing.
- 9. Method according to any one of claims 1 to 8, characterised in that the original clusters or the corresponding clusters thereof are identified by their circumscribing rectangles (17a, 18a, 19a, 20a, 17b, 18b, 19b, 20b).
- 10. Method according to any one of claims 1 to 9, characterised in that the sum of the absolute differences is used in the correlation as the similarity criterion.
- 11. Method according to any one of claims 1 to 10, characterised in that a plausibility algorithm checks, for the at least one object (6) to be detected and/or the at least one group of objects (6', 6'') to be detected, the disparities of the respectively assigned original clusters (17a, 17b) and/or the at least one displacement vector of the respectively assigned original clusters (17a, 17b), defective results being eliminated.
- 12. Method according to any one of claims 1 to 11, characterised in that the at least one object or the at least one group of objects is at least one bird (6) or, as the case may be, at least one flock of birds (6', 6'').
- 13. Method according to any one of claims 1 to 12, characterised in that a temporal consideration of the detection of the at least one object (6) and/or the at least one group of objects (6', 6'') is carried out via at least two temporally successive images (L1, L2, R1, R2) of the image sequences, in particular by means of a corresponding filtering or averaging.
- 14. Computer program with program code means for carrying out a method according to any one of claims 1 to 13 when the program is executed on a microprocessor of a computer, in particular on an image processing means (7) of a stereo camera means (1) or of an image sensor system (1') actively emitting radiation.
- 15. Computer program product with program code means stored on a computer-readable data carrier for carrying out a method according to any one of claims 1 to 13 when the program is executed on an image processing means (7) of a stereo camera means (1) or of an image sensor system (1') actively emitting radiation.
- 16. Stereo camera means (1) comprising at least two mutually calibrated cameras (3a, 3b) which are arranged at a spacing and run in synchronisation during recording, the recording moments of which are at least approximately identical and the respective fields of view (4a, 4b) of which have an overlapping region (5), and comprising an image processing means (7) configured for executing a computer program according to claim 14 in order to carry out a method according to any one of claims 1 and 3 to 13.
- 17. Image sensor system (1') actively emitting radiation with an image processing means (7) configured for executing a computer program according to claim 14 for carrying out a method according to any one of claims 2 to 13.
- 18. Monitoring device (2) for wind power plants (15), buildings (15') with transparent regions, take-off and landing runways (14) and/or flight corridors (11) of airports with a stereoscopic detection of approaching or present birds (6) or flocks of birds (6', 6'') for carrying out a monitoring method, wherein parameters such as flight altitude, direction of flight, flying velocity, species and size of the birds (6) or the flocks of birds (6', 6'') can be ascertained, based on which an evaluation is carried out and if appropriate a corresponding warning message is issued, and wherein, in the region of the wind power plants (15), buildings (15') with transparent regions, take-off and landing runways (14) and/or the flight corridors (11), at least one image sensor system (1') actively emitting radiation according to claim 17 and/or at least one stereo camera means (1) according to claim 16 is provided with at least two thermal imaging cameras (3a, 3b) which run in synchronisation during recording, the recording moments of which are at least approximately identical and the respective fields of view (4a, 4b) of which have an overlapping region (5).
- 19. A method for the detection of at least one object and/or at least one group of objects substantially as hereinbefore described with reference to one or more of the figures of the accompanying drawings.
- 20. A computer program substantially as hereinbefore described with reference to one or more of the figures of the accompanying drawings.
- 21. A computer program product with program code means stored on a computer-readable data carrier substantially as hereinbefore described with reference to one or more of the figures of the accompanying drawings.
- 22. Stereo camera means substantially as hereinbefore described with reference to one or more of the figures of the accompanying drawings.
- 23. Image sensor system substantially as hereinbefore described with reference to one or more of the figures of the accompanying drawings.
- 24. Monitoring device substantially as hereinbefore described with reference to one or more of the figures of the accompanying drawings.
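As an illustration of the grouping at different cluster spacings described in claims 7 and 8, the following Python sketch grids a binary motion mask at successively smaller spacings and accumulates the circumscribing rectangle of every occupied grid cell. This is an assumed realisation for illustration only; the function name `cluster_rectangles` and the grid-based grouping are not taken from the patent.

```python
import numpy as np

def cluster_rectangles(motion_mask, max_spacing=64, min_spacing=16, factor=2):
    """Group relevant pixels into original clusters at decreasing cluster spacings.

    motion_mask: boolean array marking relevant (movement-induced) image regions.
    Returns a list of circumscribing rectangles (x, y, w, h); clusters found at
    every spacing are accumulated, coarse first, as in claim 8.
    """
    h_img, w_img = motion_mask.shape
    clusters = []
    spacing = max_spacing
    while spacing >= min_spacing:
        for y0 in range(0, h_img, spacing):
            for x0 in range(0, w_img, spacing):
                cell = motion_mask[y0:y0 + spacing, x0:x0 + spacing]
                ys, xs = np.nonzero(cell)
                if ys.size == 0:
                    continue                          # no relevant pixels in this cell
                # circumscribing rectangle of the relevant pixels in this cell
                clusters.append((int(x0 + xs.min()), int(y0 + ys.min()),
                                 int(xs.max() - xs.min() + 1),
                                 int(ys.max() - ys.min() + 1)))
        spacing //= factor                            # step down to the next spacing
    return clusters

# Example: two small moving blobs in a 128x128 motion mask
mask = np.zeros((128, 128), dtype=bool)
mask[20:26, 30:38] = True
mask[90:97, 100:109] = True
print(len(cluster_rectangles(mask)), "original clusters")
```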
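The correlation steps of claims 1, 2 and 10 amount to block matching with the sum of absolute differences (SAD) as the similarity criterion. The sketch below is a minimal, generic block matcher, assuming rectified greyscale images held as NumPy arrays; it finds the disparity of one cluster along the image row of the second camera and the displacement vector of the same cluster in a temporally offset image. It is an illustrative sketch, not the patented implementation, and the names `sad` and `match_cluster_sad` are invented for this example.

```python
import numpy as np

def sad(patch_a, patch_b):
    """Sum of absolute differences (SAD) between two equally sized patches."""
    return float(np.abs(patch_a.astype(np.int32) - patch_b.astype(np.int32)).sum())

def match_cluster_sad(img_ref, img_search, rect, search_x, search_y):
    """Slide the cluster patch taken from img_ref over img_search and return the
    offset (dx, dy) with the smallest SAD, together with that SAD score."""
    x, y, w, h = rect                          # circumscribing rectangle of the original cluster
    patch = img_ref[y:y + h, x:x + w]
    best_offset, best_score = (0, 0), float("inf")
    for dy in search_y:
        for dx in search_x:
            xs, ys = x + dx, y + dy
            if xs < 0 or ys < 0 or xs + w > img_search.shape[1] or ys + h > img_search.shape[0]:
                continue                       # candidate window would leave the image
            score = sad(patch, img_search[ys:ys + h, xs:xs + w])
            if score < best_score:
                best_offset, best_score = (dx, dy), score
    return best_offset, best_score

# Synthetic example: R1 is L1 shifted 7 px to the left (disparity),
# L2 is L1 shifted 3 px horizontally and 2 px vertically (temporal displacement).
rng = np.random.default_rng(0)
L1 = rng.integers(0, 255, (120, 160), dtype=np.uint8)
R1 = np.roll(L1, -7, axis=1)
L2 = np.roll(L1, (2, 3), axis=(0, 1))
rect = (60, 40, 16, 12)                        # (x, y, width, height) of one original cluster

(dx_r, _), _ = match_cluster_sad(L1, R1, rect, range(-20, 21), [0])
flow, _ = match_cluster_sad(L1, L2, rect, range(-10, 11), range(-10, 11))
print("disparity [px]:", abs(dx_r), "displacement vector [px]:", flow)
```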
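For the position, distance and velocity steps at the end of claims 1 and 2, the textbook relations of a rectified pinhole stereo model can serve as a worked example; they are standard stereo geometry, not quoted from the patent.

```latex
% Assumed rectified pinhole stereo model (illustrative only):
% f : focal length in pixels,  B : stereo baseline in metres,
% d : disparity in pixels,     (u, v) : image coordinates w.r.t. the principal point.
Z = \frac{f\,B}{d}, \qquad X = \frac{u\,Z}{f}, \qquad Y = \frac{v\,Z}{f}
% Velocity estimate from a pixel displacement vector (\Delta u, \Delta v) observed over
% a frame interval \Delta t, for motion roughly parallel to the image plane:
v \;\approx\; \frac{Z}{f}\cdot\frac{\sqrt{\Delta u^{2} + \Delta v^{2}}}{\Delta t}
```

With purely illustrative numbers, f = 1000 px, B = 0.5 m and d = 7 px give Z ≈ 71 m; a displacement vector of (3, 2) px between frames 0.04 s apart then corresponds to a lateral speed of roughly 0.071 m/px · 3.6 px / 0.04 s ≈ 6.4 m/s.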
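Claims 3 and 11 combine the per-cluster results into a group-level velocity and eliminate defective measurements. Below is a minimal sketch under assumed choices (weights proportional to cluster area, a median-absolute-deviation outlier test as the plausibility check); neither choice is prescribed by the patent, and `flock_velocity` is an invented name.

```python
import numpy as np

def flock_velocity(cluster_velocities, cluster_areas, max_dev=3.0):
    """Weighted mean velocity of a group of objects from the velocities of its
    assigned original clusters, after removing implausible outliers.

    cluster_velocities: (N, 2) array of per-cluster velocity vectors [m/s]
    cluster_areas:      (N,) array of cluster areas [px^2], used as weights
    max_dev:            clusters whose speed deviates from the median speed by more
                        than max_dev times the median absolute deviation are discarded
    """
    v = np.asarray(cluster_velocities, dtype=float)
    w = np.asarray(cluster_areas, dtype=float)
    speeds = np.linalg.norm(v, axis=1)
    med = np.median(speeds)
    mad = np.median(np.abs(speeds - med)) + 1e-9     # avoid division by zero
    keep = np.abs(speeds - med) / mad <= max_dev     # simple plausibility check
    if not keep.any():                               # nothing plausible left
        return None
    return np.average(v[keep], axis=0, weights=w[keep])

# Example: three consistent clusters and one defective measurement
velocities = [(6.2, 0.4), (6.0, 0.5), (6.5, 0.3), (60.0, -12.0)]
areas = [180, 150, 220, 90]
print(flock_velocity(velocities, areas))   # ≈ [6.27, 0.39]; the outlier is discarded
```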
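The evaluation and warning step of the monitoring device in claim 18 is left open by the claim language. Purely as a hypothetical illustration of how the ascertained position, velocity and size could drive a warning, the sketch below extrapolates a straight-line trajectory and warns when it is predicted to enter a spherical protected region; the class and function names are invented for this example.

```python
from dataclasses import dataclass

@dataclass
class Track:
    """Detection result for one bird or flock of birds."""
    position_m: tuple      # (x, y, z) relative to the monitored site [m]
    velocity_ms: tuple     # (vx, vy, vz) [m/s]
    size_m: float          # estimated wingspan or flock extent [m]

def warn_if_approaching(track, danger_radius_m, horizon_s=30.0):
    """Issue a warning if the extrapolated straight-line trajectory enters the
    protected region (e.g. a rotor disc or flight corridor) within the horizon."""
    x, y, z = track.position_m
    vx, vy, vz = track.velocity_ms
    for t in range(0, int(horizon_s) + 1):
        px, py, pz = x + vx * t, y + vy * t, z + vz * t
        if (px * px + py * py + pz * pz) ** 0.5 <= danger_radius_m:
            print(f"WARNING: object of size {track.size_m:.1f} m predicted inside "
                  f"protected region in {t} s")
            return True
    return False

# Example: a flock roughly 400 m away, closing at about 12 m/s
warn_if_approaching(Track((380.0, 120.0, 60.0), (-11.0, -3.5, -1.0), 4.0), danger_radius_m=80.0)
```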
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE102009016819A DE102009016819B4 (en) | 2009-04-09 | 2009-04-09 | Method for detecting at least one object and/or at least one object group, computer program, computer program product, stereo camera device, actively radiation-emitting image sensor system and monitoring device |
Publications (3)
Publication Number | Publication Date |
---|---|
GB201005935D0 (en) | 2010-05-26 |
GB2470806A (en) | 2010-12-08 |
GB2470806B (en) | 2012-10-03 |
Family
ID=42236071
Family Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GB1005935.0A (granted as GB2470806B, active) | 2009-04-09 | 2010-04-09 | Method and apparatus for the detection of at least one object in digital image sequences |
Country Status (2)
Country | Link |
---|---|
DE (1) | DE102009016819B4 (en) |
GB (1) | GB2470806B (en) |
Families Citing this family (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20130050400A1 (en) | 2011-08-31 | 2013-02-28 | Henrik Stiesdal | Arrangement and Method to Prevent a Collision of a Flying Animal with a Wind Turbine |
GB2495526B (en) * | 2011-10-12 | 2016-08-31 | Savy Tech Ltd | Height measurement apparatus and method |
CN103852066B (en) * | 2012-11-28 | 2016-08-17 | 联想(北京)有限公司 | Method, control method, electronic equipment and the control system of a kind of equipment location |
DE102013016486A1 (en) | 2013-09-13 | 2015-04-02 | Stephan Hörmann | Surveying procedures for building openings and building closure manufacturing processes and devices for carrying them out |
DE102013107597A1 (en) | 2013-01-11 | 2014-08-14 | Stephan Hörmann | Method for measuring width and height of building opening for producing e.g. rolling gate to close opening in garage, involves determining width and/or height by evaluating obtained distance and image data of opening and calibration device |
DE102019000719A1 (en) | 2018-04-24 | 2019-10-24 | Bürgerwindpark Hohenlohe GmbH | Apparatus and method for protecting flying animals from the rotating blades of a wind turbine by a neural deep learning technology |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE19926559A1 (en) * | 1999-06-11 | 2000-12-21 | Daimler Chrysler Ag | Method and device for detecting objects in the vicinity of a road vehicle up to a great distance |
US8111289B2 (en) | 2002-07-15 | 2012-02-07 | Magna B.S.P. Ltd. | Method and apparatus for implementing multipurpose monitoring system |
DE102005008131A1 (en) | 2005-01-31 | 2006-08-03 | Daimlerchrysler Ag | Object e.g. road sign, detecting method for use with e.g. driver assistance system, involves determining position and movement of relevant pixels using filter and combining relevant pixels to objects under given terms and conditions |
DE102005055879A1 (en) | 2005-11-23 | 2007-05-31 | Deutsches Zentrum für Luft- und Raumfahrt e.V. | Air Traffic guide |
DE102008018880A1 (en) | 2008-04-14 | 2009-10-15 | Carl Zeiss Optronics Gmbh | Monitoring procedures and equipment for wind turbines, buildings with transparent areas, runways and / or airport corridors |
- 2009-04-09: DE application DE102009016819A filed, granted as DE102009016819B4 (status: Active)
- 2010-04-09: GB application GB1005935.0A filed, granted as GB2470806B (status: Active)
Patent Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2005000461A1 (en) * | 2003-06-06 | 2005-01-06 | Eastman Chemical Company | Polyester process using a pipe reactor |
Cited By (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP3183603B1 (en) | 2014-08-21 | 2020-02-12 | IdentiFlight International, LLC | Bird or bat detection and identification for wind turbine risk mitigation |
US10883473B2 (en) | 2014-08-21 | 2021-01-05 | Identiflight International, Llc | Bird or bat detection and identification for wind turbine risk mitigation |
US11544490B2 (en) | 2014-08-21 | 2023-01-03 | Identiflight International, Llc | Avian detection systems and methods |
US11555477B2 (en) | 2014-08-21 | 2023-01-17 | Identiflight International, Llc | Bird or bat detection and identification for wind turbine risk mitigation |
US11751560B2 (en) | 2014-08-21 | 2023-09-12 | Identiflight International, Llc | Imaging array for bird or bat detection and identification |
US12048301B2 (en) | 2014-08-21 | 2024-07-30 | Identiflight International, Llc | Bird or bat detection and identification for wind turbine risk mitigation |
GB2545900A (en) * | 2015-12-21 | 2017-07-05 | Canon Kk | Method, device, and computer program for re-identification of objects in images obtained from a plurality of cameras |
US10430667B2 (en) | 2015-12-21 | 2019-10-01 | Canon Kabushiki Kaisha | Method, device, and computer program for re-identification of objects in images obtained from a plurality of cameras |
GB2545900B (en) * | 2015-12-21 | 2020-08-12 | Canon Kk | Method, device, and computer program for re-identification of objects in images obtained from a plurality of cameras |
CN105807332A (en) * | 2016-04-28 | 2016-07-27 | 长春奥普光电技术股份有限公司 | Bird detection system for airport |
WO2022003213A1 (en) * | 2020-06-29 | 2022-01-06 | 3D Observer Project, S.L. | System and method for detecting birdlife in wind farms |
Also Published As
Publication number | Publication date |
---|---|
DE102009016819B4 (en) | 2011-12-15 |
GB2470806B (en) | 2012-10-03 |
DE102009016819A1 (en) | 2010-11-04 |
GB201005935D0 (en) | 2010-05-26 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
GB2470806A (en) | Detecting objects by comparing digital images | |
CN110942449B (en) | Vehicle detection method based on laser and vision fusion | |
CN103733234B (en) | A surveillance system and a method for detecting a foreign object, debris, or damage in an airfield | |
US9520040B2 (en) | System and method for real-time 3-D object tracking and alerting via networked sensors | |
CN111368706A (en) | Data fusion dynamic vehicle detection method based on millimeter wave radar and machine vision | |
CN108227738A (en) | A kind of unmanned plane barrier-avoiding method and system | |
JP2010539740A (en) | Runway monitoring system and method | |
KR20180133745A (en) | Flying object identification system using lidar sensors and pan/tilt zoom cameras and method for controlling the same | |
US10861172B2 (en) | Sensors and methods for monitoring flying objects | |
JP2020112438A (en) | Sea level measurement system, sea level measurement method and sea level measurement program | |
CN115273034A (en) | Traffic target detection and tracking method based on vehicle-mounted multi-sensor fusion | |
US20140355869A1 (en) | System and method for preventing aircrafts from colliding with objects on the ground | |
CN112836634B (en) | Multi-sensor information fusion gate anti-trailing method, device, equipment and medium | |
CN115034324B (en) | Multi-sensor fusion perception efficiency enhancement method | |
CN110189363A (en) | A kind of low multi-view video speed-measuring method of the mobile target of airdrome scene | |
Huang et al. | Moving object tracking based on millimeter-wave radar and vision sensor | |
CN115291219A (en) | Method and device for realizing dynamic obstacle avoidance of unmanned aerial vehicle by using monocular camera and unmanned aerial vehicle | |
CN117409393A (en) | Method and system for detecting laser point cloud and visual fusion obstacle of coke oven locomotive | |
KR102349818B1 (en) | Autonomous UAV Navigation based on improved Convolutional Neural Network with tracking and detection of road cracks and potholes | |
CN117347973A (en) | Regional invasion intelligent control method based on point cloud three-dimensional target tracking | |
CN117197779A (en) | Track traffic foreign matter detection method, device and system based on binocular vision | |
Dang et al. | Moving objects elimination towards enhanced dynamic SLAM fusing LiDAR and mmW-radar | |
Fedorov et al. | Placement strategy of multi-camera volumetric surveillance system for activities monitoring | |
US20190068943A1 (en) | Environment Perception System for Autonomous Vehicle | |
CN103699883B (en) | A kind of method utilizing village landmark group identification to locate buildings |