CN116030501B - Method and device for extracting bird detection data


Info

Publication number
CN116030501B
CN116030501B (application number CN202310125656.5A)
Authority
CN
China
Prior art keywords
bird
data
image data
image
original
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202310125656.5A
Other languages
Chinese (zh)
Other versions
CN116030501A (en)
Inventor
温建伟
邓迪旻
肖占中
袁潮
Current Assignee
Beijing Zhuohe Technology Co Ltd
Original Assignee
Beijing Zhuohe Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Zhuohe Technology Co Ltd filed Critical Beijing Zhuohe Technology Co Ltd
Priority to CN202310125656.5A
Publication of CN116030501A
Application granted
Publication of CN116030501B
Legal status: Active
Anticipated expiration

Landscapes

  • Image Analysis (AREA)

Abstract

The application discloses a method and a device for extracting bird detection data. The method comprises the following steps: acquiring an original bird data image; performing sharpening and highlighting processing on the original bird data image to obtain first bird image data and environment data; obtaining a bird detection judgment result by using the environment data and a historical comparison parameter model; and generating second bird image data according to the bird detection judgment result and the first bird image data. The application solves the technical problem that, in the prior art, the bird obstacle detection process judges the bird state solely by recognizing the bird outline, so that when multiple flying objects are present, or the bird outline is blurred by weather or other factors, the corresponding flying objects cannot be identified accurately and quickly, which can lead to safety accidents.

Description

Method and device for extracting bird detection data
Technical Field
The application relates to the field of image data processing, in particular to a method and a device for extracting bird detection data.
Background
With the continuous development of intelligent science and technology, intelligent devices are used more and more widely in people's life, work and study, and intelligent technical means improve quality of life and increase learning and working efficiency.
At present, when a high-precision camera shoots in real time at high altitude, data on birds or other flying objects are usually identified and returned, so as to avoid hazardous flying objects and guarantee flight safety. However, in the prior art, the bird obstacle detection process judges the bird state solely by recognizing the bird outline; when multiple flying objects are present, or the bird outline is blurred by weather or other factors, the corresponding flying objects cannot be identified accurately and quickly, which can lead to safety accidents.
In view of the above problems, no effective solution has been proposed at present.
Disclosure of Invention
The embodiments of the application provide a method and a device for extracting bird detection data, to at least solve the technical problem that, in the prior art, the bird obstacle detection process judges the bird state solely by recognizing the bird outline, so that when multiple flying objects are present, or the bird outline is blurred by weather or other factors, the corresponding flying objects cannot be identified accurately and quickly, which can lead to safety accidents.
According to one aspect of the embodiments of the application, there is provided a bird detection data extraction method, including: acquiring an original bird data image; performing sharpening and highlighting processing on the original bird data image to obtain first bird image data and environment data; obtaining a bird detection judgment result by using the environment data and a historical comparison parameter model; and generating second bird image data according to the bird detection judgment result and the first bird image data.
Optionally, after acquiring the original bird data image, the method further includes: training the historical comparison parameter model according to demand information.
Optionally, performing sharpening and highlighting processing on the original bird data image to obtain the first bird image data and the environment data includes: splitting the original bird data image to obtain bird target image data and original environment image data; performing sharpening and highlighting processing on the bird target image data according to an index sharpening algorithm to obtain the first bird image data; and performing sharpening and highlighting processing on the original environment image data according to the index sharpening algorithm to obtain the environment data.
Optionally, obtaining the bird detection judgment result by using the environment data and the historical comparison parameter model includes: inputting the environment data into the historical comparison parameter model to obtain a target complex parameter, wherein the target complex parameter represents the target detection range and the target quantity parameter under the current requirement; and performing matching identification between the target complex parameter and the original bird data image to obtain the bird detection judgment result.
According to another aspect of the embodiments of the application, there is also provided a bird detection data extraction device, including: an acquisition module, configured to acquire an original bird data image; a processing module, configured to perform sharpening and highlighting processing on the original bird data image to obtain first bird image data and environment data; a judging module, configured to obtain a bird detection judgment result by using the environment data and the historical comparison parameter model; and a generation module, configured to generate second bird image data according to the bird detection judgment result and the first bird image data.
Optionally, the device further includes: a training module, configured to train the historical comparison parameter model according to demand information.
Optionally, the processing module includes: a splitting unit, configured to split the original bird data image to obtain bird target image data and original environment image data; and a processing unit, configured to perform sharpening and highlighting processing on the bird target image data according to an index sharpening algorithm to obtain the first bird image data, and further configured to perform sharpening and highlighting processing on the original environment image data according to the index sharpening algorithm to obtain the environment data.
Optionally, the judging module includes: an input unit, configured to input the environment data into the historical comparison parameter model to obtain a target complex parameter, wherein the target complex parameter represents the target detection range and the target quantity parameter under the current requirement; and a matching unit, configured to perform matching identification between the target complex parameter and the original bird data image to obtain the bird detection judgment result.
According to another aspect of the embodiments of the application, there is also provided a nonvolatile storage medium including a stored program, wherein, when the program runs, a device in which the nonvolatile storage medium is located is controlled to execute the bird detection data extraction method.
According to another aspect of the embodiments of the application, there is also provided an electronic device including a processor and a memory; the memory stores computer-readable instructions, and the processor is configured to run the computer-readable instructions, which, when run, perform the bird detection data extraction method.
In the embodiments of the application, an original bird data image is acquired; sharpening and highlighting processing is performed on the original bird data image to obtain first bird image data and environment data; a bird detection judgment result is obtained by using the environment data and a historical comparison parameter model; and second bird image data is generated according to the bird detection judgment result and the first bird image data. This solves the technical problem that, in the prior art, the bird obstacle detection process judges the bird state solely by recognizing the bird outline, so that when multiple flying objects are present, or the bird outline is blurred by weather or other factors, the corresponding flying objects cannot be identified accurately and quickly, which can lead to safety accidents.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this specification, illustrate embodiments of the application and together with the description serve to explain the application and do not constitute a limitation on the application. In the drawings:
FIG. 1 is a flow chart of a method of extracting bird detection data according to an embodiment of the present application;
FIG. 2 is a block diagram of a bird detection data extraction device according to an embodiment of the present application;
fig. 3 is a block diagram of a terminal device for performing the method according to the application according to an embodiment of the application;
fig. 4 is a memory unit for holding or carrying program code for implementing a method according to the application, according to an embodiment of the application.
Detailed Description
In order that those skilled in the art will better understand the present application, the technical solutions in the embodiments of the application will be described clearly and completely below with reference to the accompanying drawings. It is apparent that the described embodiments are only some, not all, of the embodiments of the application. All other embodiments obtained by those skilled in the art based on the embodiments of the application without inventive effort shall fall within the scope of protection of the application.
It should be noted that the terms "first," "second," and the like in the description and the claims of the present application and the above figures are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that the embodiments of the application described herein may be implemented in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
According to an embodiment of the present application, there is provided a method embodiment of a bird detection data extraction method, it being noted that the steps shown in the flowchart of the figures may be performed in a computer system such as a set of computer executable instructions, and, although a logical order is shown in the flowchart, in some cases, the steps shown or described may be performed in an order other than that shown or described herein.
Example 1
Fig. 1 is a flowchart of a method for extracting bird detection data according to an embodiment of the present application, as shown in fig. 1, the method includes the steps of:
Step S102: acquiring an original bird data image.
Specifically, in order to solve the technical problem that, in the prior art, the bird obstacle detection process judges the bird state solely by recognizing the bird outline, so that when multiple flying objects are present, or the bird outline is blurred by weather or other factors, the corresponding flying objects cannot be identified accurately and quickly, leading to safety accidents, when a bird or flying object is to be identified, the original bird data image first needs to be captured in real time, and the captured image data is converted and output to the CPU and memory for later monitoring and extraction of the bird data.
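As a minimal sketch of this acquisition step, the snippet below clamps a captured frame to the valid 8-bit range before it is handed to the detector; `camera_read` and the synthetic frame are hypothetical stand-ins, since the patent does not name a camera interface.

```python
import numpy as np

def acquire_frame(camera_read):
    """Pull one frame from a capture callback, clamp it to the valid
    8-bit range, and return it ready for downstream bird detection.
    `camera_read` is a hypothetical stand-in for the real camera
    driver, which the patent does not specify."""
    frame = np.asarray(camera_read(), dtype=float)
    return np.clip(frame, 0, 255).astype(np.uint8)

# Synthetic 4x4 "frame" with over-exposed sensor values.
frame = acquire_frame(lambda: np.full((4, 4), 300))
```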
Optionally, after the acquiring the original bird data image, the method further includes: and training the historical comparison parameter model according to the demand information.
Specifically, after the original bird data is obtained, in order to train the historical comparison parameter model according to the user's demand data, the embodiments of the application may also extract data relating to historical bird identification comparisons from a big data platform, and use the historical data and the demand information as feature input vectors for training, thereby obtaining a complete historical comparison parameter model.
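The training step described above can be sketched as follows; the patent does not specify a model family, so a plain least-squares linear fit over historical feature vectors is assumed purely for illustration.

```python
import numpy as np

def train_history_model(features, targets):
    """Least-squares fit from historical feature vectors to recorded
    detection parameters. The patent only states that historical
    comparison data and demand information serve as feature inputs
    for training; a linear model is an illustrative stand-in."""
    X = np.asarray(features, dtype=float)
    y = np.asarray(targets, dtype=float)
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])  # bias column
    w, *_ = np.linalg.lstsq(Xb, y, rcond=None)
    return w

def predict(w, feature_vector):
    """Apply the fitted weights to one feature vector."""
    x = np.append(np.asarray(feature_vector, dtype=float), 1.0)
    return float(x @ w)

# Toy history: one feature, target = 2*x + 1.
w = train_history_model([[0.0], [1.0], [2.0]], [1.0, 3.0, 5.0])
```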
Step S104: performing sharpening and highlighting processing on the original bird data image to obtain first bird image data and environment data.
Optionally, performing sharpening and highlighting processing on the original bird data image to obtain the first bird image data and the environment data includes: splitting the original bird data image to obtain bird target image data and original environment image data; performing sharpening and highlighting processing on the bird target image data according to an index sharpening algorithm to obtain the first bird image data; and performing sharpening and highlighting processing on the original environment image data according to the index sharpening algorithm to obtain the environment data.
Specifically, in order to optimize the detected bird data for subsequent judgment and identification, the embodiments of the application may perform sharpening and highlighting processing on both the bird image data and the environment data in the original image data, so as to obtain image contour data with clear edges. For example: splitting the original bird data image to obtain bird target image data and original environment image data; performing sharpening and highlighting processing on the bird target image data according to an index sharpening algorithm to obtain the first bird image data; and performing sharpening and highlighting processing on the original environment image data according to the index sharpening algorithm to obtain the environment data.
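A possible reading of the splitting and sharpening steps is sketched below; the patent does not define its "index sharpening algorithm", so a standard Laplacian-based edge-emphasising sharpening is substituted, and the bird mask is assumed to be given rather than computed.

```python
import numpy as np

def split_image(image, bird_mask):
    """Separate the frame into bird-target pixels and environment
    pixels using a boolean mask (how the mask is produced is not
    specified by the patent; here it is assumed given)."""
    img = np.asarray(image, dtype=float)
    return np.where(bird_mask, img, 0.0), np.where(bird_mask, 0.0, img)

def sharpen(image, amount=1.0):
    """Edge-emphasising sharpening via a 4-neighbour Laplacian,
    subtracted so bright features against their surroundings are
    boosted. This is an assumed illustration, not the patent's
    (unspecified) index sharpening algorithm."""
    img = np.asarray(image, dtype=float)
    p = np.pad(img, 1, mode="edge")
    lap = (p[:-2, 1:-1] + p[2:, 1:-1] + p[1:-1, :-2] + p[1:-1, 2:]
           - 4.0 * p[1:-1, 1:-1])
    return np.clip(img - amount * lap, 0, 255)
```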
Step S106: obtaining a bird detection judgment result by using the environment data and the historical comparison parameter model.
Optionally, obtaining the bird detection judgment result by using the environment data and the historical comparison parameter model includes: inputting the environment data into the historical comparison parameter model to obtain a target complex parameter, wherein the target complex parameter represents the target detection range and the target quantity parameter under the current requirement; and performing matching identification between the target complex parameter and the original bird data image to obtain the bird detection judgment result.
Specifically, in order to use the environment data to determine both the number of birds and the range within which birds are to be judged, the environment data needs to be input into the historical comparison parameter model to obtain a target complex parameter, wherein the target complex parameter represents the target detection range and the target quantity parameter under the current requirement; matching identification is then performed between the target complex parameter and the original bird data image to obtain the bird detection judgment result.
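The matching identification step might look like the following sketch, assuming the target complex parameter unpacks into a rectangular detection range and an expected target count; the parameter's actual layout is not disclosed by the patent, so these field names are illustrative.

```python
def match_targets(detections, target_count, detect_range):
    """Match detected centroids against an assumed target complex
    parameter: keep detections inside the predicted rectangular range
    and report whether the expected number of targets was found."""
    (x0, y0), (x1, y1) = detect_range
    matched = [(x, y) for (x, y) in detections
               if x0 <= x <= x1 and y0 <= y <= y1]
    return {"matched": matched, "count_ok": len(matched) == target_count}

# Three candidate centroids; one falls outside the predicted range.
result = match_targets([(5, 5), (50, 60), (200, 10)],
                       target_count=2, detect_range=((0, 0), (100, 100)))
```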
Step S108: generating second bird image data according to the bird detection judgment result and the first bird image data.
Specifically, after the bird detection judgment result is obtained, all the bird data are identified and collected using the sharpened first bird data, so that accurate, useful data on birds or flying objects are extracted, thereby ensuring the safety of aircraft in flight.
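A hedged sketch of this generation step: the second bird image data is read here as the sharpened first bird data retained only when the judgment result confirms the expected targets. This interpretation is an assumption, as the patent leaves the combination of the two inputs abstract.

```python
import numpy as np

def extract_second_bird_data(first_bird_image, judgement):
    """Produce the 'second bird image data': keep the sharpened bird
    image only when the detection judgement confirms the expected
    targets; otherwise signal that nothing usable was extracted.
    The judgement dict layout is assumed, not taken from the patent."""
    if not judgement.get("count_ok", False):
        return None
    return np.asarray(first_bird_image, dtype=float).copy()

second = extract_second_bird_data([[10.0, 20.0]], {"count_ok": True})
```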
Through the above embodiment, the technical problem is solved that, in the prior art, the bird obstacle detection process judges the bird state solely by recognizing the bird outline, so that when multiple flying objects are present, or the bird outline is blurred by weather or other factors, the corresponding flying objects cannot be identified accurately and quickly, which can lead to safety accidents.
Example two
Fig. 2 is a block diagram of a bird detection data extraction device according to an embodiment of the present application, as shown in fig. 2, the device includes:
An acquisition module 20, configured to acquire an original bird data image.
Specifically, in order to solve the technical problem that, in the prior art, the bird obstacle detection process judges the bird state solely by recognizing the bird outline, so that when multiple flying objects are present, or the bird outline is blurred by weather or other factors, the corresponding flying objects cannot be identified accurately and quickly, leading to safety accidents, when a bird or flying object is to be identified, the original bird data image first needs to be captured in real time, and the captured image data is converted and output to the CPU and memory for later monitoring and extraction of the bird data.
Optionally, the device further includes: a training module, configured to train the historical comparison parameter model according to demand information.
Specifically, after the original bird data is obtained, in order to train the historical comparison parameter model according to the user's demand data, the embodiments of the application may also extract data relating to historical bird identification comparisons from a big data platform, and use the historical data and the demand information as feature input vectors for training, thereby obtaining a complete historical comparison parameter model.
A processing module 22, configured to perform sharpening and highlighting processing on the original bird data image to obtain first bird image data and environment data.
Optionally, the processing module includes: a splitting unit, configured to split the original bird data image to obtain bird target image data and original environment image data; and a processing unit, configured to perform sharpening and highlighting processing on the bird target image data according to an index sharpening algorithm to obtain the first bird image data, and further configured to perform sharpening and highlighting processing on the original environment image data according to the index sharpening algorithm to obtain the environment data.
Specifically, in order to optimize the detected bird data for subsequent judgment and identification, the embodiments of the application may perform sharpening and highlighting processing on both the bird image data and the environment data in the original image data, so as to obtain image contour data with clear edges. For example: splitting the original bird data image to obtain bird target image data and original environment image data; performing sharpening and highlighting processing on the bird target image data according to an index sharpening algorithm to obtain the first bird image data; and performing sharpening and highlighting processing on the original environment image data according to the index sharpening algorithm to obtain the environment data.
A judging module 24, configured to obtain a bird detection judgment result by using the environment data and the historical comparison parameter model.
Optionally, the judging module includes: an input unit, configured to input the environment data into the historical comparison parameter model to obtain a target complex parameter, wherein the target complex parameter represents the target detection range and the target quantity parameter under the current requirement; and a matching unit, configured to perform matching identification between the target complex parameter and the original bird data image to obtain the bird detection judgment result.
Specifically, in order to use the environment data to determine both the number of birds and the range within which birds are to be judged, the environment data needs to be input into the historical comparison parameter model to obtain a target complex parameter, wherein the target complex parameter represents the target detection range and the target quantity parameter under the current requirement; matching identification is then performed between the target complex parameter and the original bird data image to obtain the bird detection judgment result.
A generating module 26, configured to generate second bird image data according to the bird detection judgment result and the first bird image data.
Specifically, after the bird detection judgment result is obtained, all the bird data are identified and collected using the sharpened first bird data, so that accurate, useful data on birds or flying objects are extracted, thereby ensuring the safety of aircraft in flight.
Through the above embodiment, the technical problem is solved that, in the prior art, the bird obstacle detection process judges the bird state solely by recognizing the bird outline, so that when multiple flying objects are present, or the bird outline is blurred by weather or other factors, the corresponding flying objects cannot be identified accurately and quickly, which can lead to safety accidents.
According to another aspect of the embodiments of the application, there is also provided a nonvolatile storage medium including a stored program, wherein, when the program runs, a device in which the nonvolatile storage medium is located is controlled to execute the bird detection data extraction method.
Specifically, the method comprises the following steps: acquiring an original bird data image; performing sharpening and highlighting processing on the original bird data image to obtain first bird image data and environment data; obtaining a bird detection judgment result by using the environment data and a historical comparison parameter model; and generating second bird image data according to the bird detection judgment result and the first bird image data. Optionally, after acquiring the original bird data image, the method further includes: training the historical comparison parameter model according to demand information. Optionally, performing sharpening and highlighting processing on the original bird data image to obtain the first bird image data and the environment data includes: splitting the original bird data image to obtain bird target image data and original environment image data; performing sharpening and highlighting processing on the bird target image data according to an index sharpening algorithm to obtain the first bird image data; and performing sharpening and highlighting processing on the original environment image data according to the index sharpening algorithm to obtain the environment data. Optionally, obtaining the bird detection judgment result by using the environment data and the historical comparison parameter model includes: inputting the environment data into the historical comparison parameter model to obtain a target complex parameter, wherein the target complex parameter represents the target detection range and the target quantity parameter under the current requirement; and performing matching identification between the target complex parameter and the original bird data image to obtain the bird detection judgment result.
According to another aspect of the embodiments of the application, there is also provided an electronic device including a processor and a memory; the memory stores computer-readable instructions, and the processor is configured to run the computer-readable instructions, which, when run, perform the bird detection data extraction method.
Specifically, the method comprises the following steps: acquiring an original bird data image; performing sharpening and highlighting processing on the original bird data image to obtain first bird image data and environment data; obtaining a bird detection judgment result by using the environment data and a historical comparison parameter model; and generating second bird image data according to the bird detection judgment result and the first bird image data. Optionally, after acquiring the original bird data image, the method further includes: training the historical comparison parameter model according to demand information. Optionally, performing sharpening and highlighting processing on the original bird data image to obtain the first bird image data and the environment data includes: splitting the original bird data image to obtain bird target image data and original environment image data; performing sharpening and highlighting processing on the bird target image data according to an index sharpening algorithm to obtain the first bird image data; and performing sharpening and highlighting processing on the original environment image data according to the index sharpening algorithm to obtain the environment data. Optionally, obtaining the bird detection judgment result by using the environment data and the historical comparison parameter model includes: inputting the environment data into the historical comparison parameter model to obtain a target complex parameter, wherein the target complex parameter represents the target detection range and the target quantity parameter under the current requirement; and performing matching identification between the target complex parameter and the original bird data image to obtain the bird detection judgment result.
The foregoing embodiment numbers of the present application are merely for the purpose of description, and do not represent the advantages or disadvantages of the embodiments.
In the foregoing embodiments of the present application, the descriptions of the embodiments are emphasized, and for a portion of this disclosure that is not described in detail in this embodiment, reference is made to the related descriptions of other embodiments.
In the several embodiments provided in the present application, it should be understood that the disclosed technology may be implemented in other manners. The above-described embodiments of the apparatus are merely exemplary, and the division of the units, for example, may be a logic function division, and may be implemented in another manner, for example, a plurality of units or components may be combined or may be integrated into another system, or some features may be omitted, or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be through some interfaces, units or modules, or may be in electrical or other forms.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, fig. 3 is a schematic hardware structure of a terminal device according to an embodiment of the present application. As shown in fig. 3, the terminal device may include an input device 30, a processor 31, an output device 32, a memory 33, and at least one communication bus 34. The communication bus 34 is used to enable communication connections between the elements. The memory 33 may comprise a high-speed RAM memory or may further comprise a non-volatile memory NVM, such as at least one magnetic disk memory, in which various programs may be stored for performing various processing functions and implementing the method steps of the present embodiment.
Alternatively, the processor 31 may be implemented as, for example, a central processing unit (Central Processing Unit, abbreviated as CPU), an Application Specific Integrated Circuit (ASIC), a Digital Signal Processor (DSP), a Digital Signal Processing Device (DSPD), a Programmable Logic Device (PLD), a Field Programmable Gate Array (FPGA), a controller, a microcontroller, a microprocessor, or other electronic components, and the processor 31 is coupled to the input device 30 and the output device 32 through wired or wireless connections.
Alternatively, the input device 30 may include a variety of input devices, for example, may include at least one of a user-oriented user interface, a device-oriented device interface, a programmable interface of software, a camera, and a sensor. Optionally, the device interface facing the device may be a wired interface for data transmission between devices, or may be a hardware insertion interface (such as a USB interface, a serial port, etc.) for data transmission between devices; alternatively, the user-oriented user interface may be, for example, a user-oriented control key, a voice input device for receiving voice input, and a touch-sensitive device (e.g., a touch screen, a touch pad, etc. having touch-sensitive functionality) for receiving user touch input by a user; optionally, the programmable interface of the software may be, for example, an entry for a user to edit or modify a program, for example, an input pin interface or an input interface of a chip, etc.; optionally, the transceiver may be a radio frequency transceiver chip, a baseband processing chip, a transceiver antenna, etc. with a communication function. An audio input device such as a microphone may receive voice data. The output device 32 may include a display, audio, or the like.
In this embodiment, the processor of the terminal device may include functions for executing each module of the data processing apparatus in each device, and specific functions and technical effects may be referred to the above embodiments and are not described herein again.
Fig. 4 is a schematic hardware structure of a terminal device according to another embodiment of the present application. Fig. 4 is a specific embodiment of the implementation of fig. 3. As shown in fig. 4, the terminal device of the present embodiment includes a processor 41 and a memory 42.
The processor 41 executes the computer program code stored in the memory 42 to implement the methods of the above-described embodiments.
The memory 42 is configured to store various types of data to support operation at the terminal device. Examples of such data include instructions for any application or method operating on the terminal device, such as messages, pictures, and videos. The memory 42 may include a Random Access Memory (RAM) and may also include a non-volatile memory, such as at least one magnetic disk memory.
Optionally, the processor 41 is provided in a processing component 40. The terminal device may further include: a communication component 43, a power supply component 44, a multimedia component 45, an audio component 46, an input/output interface 47, and/or a sensor component 48. The specific components included in the terminal device are set according to actual requirements, which are not limited in this embodiment.
The processing component 40 generally controls the overall operation of the terminal device. The processing component 40 may include one or more processors 41 to execute instructions to perform all or part of the steps of the methods described above. Further, the processing component 40 may include one or more modules that facilitate interactions between the processing component 40 and other components. For example, processing component 40 may include a multimedia module to facilitate interaction between multimedia component 45 and processing component 40.
The power supply component 44 provides power to the various components of the terminal device. The power supply component 44 may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for the terminal device.
The multimedia component 45 includes a display screen that provides an output interface between the terminal device and the user. In some embodiments, the display screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the display screen includes a touch panel, it may be implemented as a touch screen to receive input signals from the user. The touch panel includes one or more touch sensors to sense touches, swipes, and gestures on the touch panel. The touch sensors may sense not only the boundary of a touch or swipe action, but also the duration and pressure associated with the touch or swipe operation.
The audio component 46 is configured to output and/or input audio signals. For example, the audio component 46 includes a microphone (MIC) configured to receive external audio signals when the terminal device is in an operational mode, such as a speech recognition mode. The received audio signals may be further stored in the memory 42 or transmitted via the communication component 43. In some embodiments, the audio component 46 further includes a speaker for outputting audio signals.
The input/output interface 47 provides an interface between the processing component 40 and peripheral interface modules, which may be click wheels, buttons, and the like. These buttons may include, but are not limited to: a volume button, a start button, and a lock button.
The sensor component 48 includes one or more sensors for providing status assessments of various aspects of the terminal device. For example, the sensor component 48 may detect the open/closed state of the terminal device, the relative positioning of components, and the presence or absence of user contact with the terminal device. The sensor component 48 may include a proximity sensor configured to detect the presence of nearby objects without any physical contact, including detecting the distance between the user and the terminal device. In some embodiments, the sensor component 48 may also include a camera or the like.
The communication component 43 is configured to facilitate wired or wireless communication between the terminal device and other devices. The terminal device may access a wireless network based on a communication standard, such as WiFi, 2G, or 3G, or a combination thereof. In one embodiment, the terminal device may include a SIM card slot for inserting a SIM card, so that the terminal device can log into a GPRS network and establish communication with a server through the Internet.
From the above, it will be appreciated that the communication component 43, the audio component 46, the input/output interface 47, and the sensor component 48 referred to in the embodiment of Fig. 4 may be implemented as the input device in the embodiment of Fig. 3.
In the several embodiments provided in the present application, it should be understood that the disclosed technology may be implemented in other manners. The above-described apparatus embodiments are merely exemplary; for example, the division of the units is merely a logical function division, and there may be another division manner in actual implementation; for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted or not performed. In addition, the mutual coupling or direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection through some interfaces, units, or modules, and may be in electrical or other forms.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units; they may be located in one place or distributed across a plurality of units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
If the integrated units are implemented in the form of software functional units and sold or used as stand-alone products, they may be stored in a computer-readable storage medium. Based on such an understanding, the technical solution of the present application, in essence, or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product stored in a storage medium, including instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) to perform all or part of the steps of the methods according to the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a Read-Only Memory (ROM), a Random Access Memory (RAM), a removable hard disk, a magnetic disk, or an optical disk.
The foregoing is merely a preferred embodiment of the present application. It should be noted that modifications and adaptations may be made by those skilled in the art without departing from the principles of the present application, and such modifications and adaptations are intended to be comprehended within the scope of the present application.

Claims (4)

1. A method of extracting bird detection data, comprising:
acquiring an original bird data image;
performing sharpening highlighting processing on the original bird data image to obtain first bird image data and environment data;
obtaining a bird detection judgment result by using the environment data and a historical comparison parameter model; and
generating second bird image data according to the bird detection judgment result and the first bird image data;
wherein, after the acquiring of the original bird data image, the method further comprises:
training the historical comparison parameter model according to demand information;
wherein the performing of the sharpening highlighting processing on the original bird data image to obtain the first bird image data and the environment data comprises:
splitting the original bird data image to obtain bird target image data and original environment image data;
performing sharpening highlighting processing on the bird target image data according to an index sharpening algorithm to obtain the first bird image data; and
sharpening the original environment image data according to the index sharpening algorithm to obtain the environment data;
wherein the obtaining of the bird detection judgment result by using the environment data and the historical comparison parameter model comprises:
inputting the environment data into the historical comparison parameter model to obtain a target complex parameter, wherein the target complex parameter represents the range and number of target detection parameters under the current requirement; and
performing matching identification between the target complex parameter and the original bird data image to obtain the bird detection judgment result.
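The splitting and sharpening steps of claim 1 are not given concrete formulas in the specification. The sketch below, in Python with NumPy, shows one plausible reading: the original bird data image is split by a binary bird mask (assumed to come from an upstream detector, which the patent does not specify) into bird target image data and original environment image data, and each part is sharpened by a Laplacian high-pass whose response is weighted exponentially. The exponential weighting is our assumption about the "index sharpening algorithm"; the function names and the `gain` parameter are illustrative only.

```python
import numpy as np

def index_sharpen(img, gain=0.6):
    """Sharpen a grayscale image (float array, values in [0, 1]) with a
    4-neighbour Laplacian high-pass whose response is emphasised
    exponentially. The exponential weighting is an assumption about the
    patent's "index sharpening algorithm"; the claim names no formula."""
    padded = np.pad(img, 1, mode="edge")
    lap = (padded[:-2, 1:-1] + padded[2:, 1:-1] +
           padded[1:-1, :-2] + padded[1:-1, 2:] - 4.0 * img)
    # expm1(x) ~ x for small x, so weak edges are boosted almost linearly
    # while strong edges are amplified super-linearly.
    boost = np.sign(lap) * np.expm1(np.abs(lap))
    return np.clip(img - gain * boost, 0.0, 1.0)

def split_bird_and_environment(img, bird_mask):
    """Split the original bird data image into bird target image data and
    original environment image data using a binary mask (hypothetical; the
    patent does not describe how the split is computed)."""
    bird_target = np.where(bird_mask, img, 0.0)
    environment = np.where(bird_mask, 0.0, img)
    return bird_target, environment
```

In this sketch a flat region has a zero Laplacian and is left unchanged, while an isolated bright pixel is pushed toward white and its neighbours are darkened; that edge-emphasis behaviour is what the claim relies on to make the bird contour stand out from its surroundings before matching against the target complex parameter.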
2. A bird detection data extraction device, comprising:
the acquisition module is used for acquiring an original bird data image;
the processing module is used for performing sharpening highlighting processing on the original bird data image to obtain first bird image data and environment data;
the judging module is used for obtaining a bird detection judgment result by using the environment data and a historical comparison parameter model;
the generation module is used for generating second bird image data according to the bird detection judgment result and the first bird image data;
the apparatus further comprises:
the training module, used for training the historical comparison parameter model according to demand information;
the processing module comprises:
the splitting unit, used for splitting the original bird data image to obtain bird target image data and original environment image data;
the processing unit, used for performing sharpening highlighting processing on the bird target image data according to an index sharpening algorithm to obtain the first bird image data;
the processing unit being further used for sharpening the original environment image data according to the index sharpening algorithm to obtain the environment data;
the judging module comprises:
the input unit, used for inputting the environment data into the historical comparison parameter model to obtain a target complex parameter, wherein the target complex parameter represents the range and number of target detection parameters under the current requirement; and
the matching unit, used for performing matching identification between the target complex parameter and the original bird data image to obtain the bird detection judgment result.
3. A non-volatile storage medium comprising a stored program, wherein the program, when run, controls a device in which the non-volatile storage medium resides to perform the method of claim 1.
4. An electronic device comprising a processor and a memory, wherein the memory stores computer-readable instructions for execution by the processor, and the computer-readable instructions, when executed, perform the method of claim 1.
CN202310125656.5A 2023-02-15 2023-02-15 Method and device for extracting bird detection data Active CN116030501B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310125656.5A CN116030501B (en) 2023-02-15 2023-02-15 Method and device for extracting bird detection data

Publications (2)

Publication Number Publication Date
CN116030501A (en) 2023-04-28
CN116030501B (en) 2023-10-10

Family

Family ID: 86079541

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310125656.5A Active CN116030501B (en) 2023-02-15 2023-02-15 Method and device for extracting bird detection data

Country Status (1)

Country Link
CN (1) CN116030501B (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101694681A (en) * 2008-11-28 2010-04-14 北京航空航天大学 Bird striking risk assessment system and assessment method thereof
CN102150653A (en) * 2011-03-11 2011-08-17 湖南继善高科技有限公司 Movable airfield avian detection and directional anti-bird device
TW202114518A (en) * 2019-10-02 2021-04-16 國立中興大學 Method for automatically detecting and repelling birds
CN113978468A (en) * 2021-12-16 2022-01-28 诺博汽车系统有限公司 Vehicle speed control method, device, equipment and medium based on water accumulation environment
JP2022095408A (en) * 2020-12-16 2022-06-28 パナソニックIpマネジメント株式会社 Processing system, flight vehicle, processing method, and program
CN114842424A (en) * 2022-06-07 2022-08-02 北京拙河科技有限公司 Intelligent security image identification method and device based on motion compensation
NO20210472A1 (en) * 2021-04-15 2022-10-17 Spoor As Bird detection and species determination

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
Air-to-Air Visual Detection of Micro-UAVs: An Experimental Evaluation of Deep Learning; Ye Zheng et al.; IEEE Robotics and Automation Letters; Vol. 6, No. 2; pp. 1020-1027 *
Detection of bird species related to transmission line faults based on lightweight convolutional neural network; Zhibin Qiu et al.; Wiley; pp. 869-881 *
Bird-detecting radar system based on two scanning modes; Chen Weishi et al.; Journal of Beijing University of Aeronautics and Astronautics; Vol. 35, No. 3; pp. 380-383 *
Aerial flying object recognition algorithm based on region convolutional neural network; Liu Congcong; Transducer and Microsystem Technologies; Vol. 40, No. 1; pp. 110-113 and 117 *

Similar Documents

Publication Publication Date Title
CN115623336B (en) Image tracking method and device for hundred million-level camera equipment
CN107291238B (en) Data processing method and device
CN115375582A (en) Moire digestion method and device based on low-order Taylor decomposition
CN116614453B (en) Image transmission bandwidth selection method and device based on cloud interconnection
CN116261044B (en) Intelligent focusing method and device for hundred million-level cameras
CN115409869B (en) Snow field track analysis method and device based on MAC tracking
CN116030501B (en) Method and device for extracting bird detection data
CN115527045A (en) Image identification method and device for snow field danger identification
CN115984333B (en) Smooth tracking method and device for airplane target
CN116228593B (en) Image perfecting method and device based on hierarchical antialiasing
CN116468883B (en) High-precision image data volume fog recognition method and device
CN116579965B (en) Multi-image fusion method and device
CN116088580B (en) Flying object tracking method and device
CN116758165B (en) Image calibration method and device based on array camera
CN116664413B (en) Image volume fog eliminating method and device based on Abbe convergence operator
CN115187570B (en) Singular traversal retrieval method and device based on DNN deep neural network
CN116744102B (en) Ball machine tracking method and device based on feedback adjustment
CN115914819B (en) Picture capturing method and device based on orthogonal decomposition algorithm
CN115546053B (en) Method and device for eliminating diffuse reflection of graphics on snow in complex terrain
CN116723419B (en) Acquisition speed optimization method and device for billion-level high-precision camera
CN116757981A (en) Multi-terminal image fusion method and device
CN116485841A (en) Motion rule identification method and device based on multiple wide angles
CN115511735B (en) Snow field gray scale picture optimization method and device
CN117367455A (en) Deep learning algorithm unmanned aerial vehicle route design method and device for photovoltaic power station
CN115994872A (en) Smooth adjustment method and device for objects around aircraft

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant