CN117557789B - Intelligent detection method and system for offshore targets - Google Patents

Intelligent detection method and system for offshore targets

Info

Publication number
CN117557789B
CN117557789B (Application CN202410044869.XA)
Authority
CN
China
Prior art keywords
information
target
image information
track
preset track
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202410044869.XA
Other languages
Chinese (zh)
Other versions
CN117557789A (en)
Inventor
滕哲
李烨
王娇颖
邱千钧
陈健
宋健
陈琴琴
王青瑜
王超
张芳
潘兰波
马政伟
林洪文
毛建舟
李东
王发龙
郭安邦
郭涵子
罗飞
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
91526 Troops Of Chinese Pla
Chinese People's Liberation Army 91959 Unit
Science And Technology Innovation Research Center Of Naval Research Institute Of People's Liberation Army Of China
Srif Software Co ltd
Xian institute of Applied Optics
PLA Dalian Naval Academy
Original Assignee
91526 Troops Of Chinese Pla
Chinese People's Liberation Army 91959 Unit
Science And Technology Innovation Research Center Of Naval Research Institute Of People's Liberation Army Of China
Srif Software Co ltd
Xian institute of Applied Optics
PLA Dalian Naval Academy
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 91526 Troops Of Chinese Pla, Chinese People's Liberation Army 91959 Unit, Science And Technology Innovation Research Center Of Naval Research Institute Of People's Liberation Army Of China, Srif Software Co ltd, Xian institute of Applied Optics, PLA Dalian Naval Academy filed Critical 91526 Troops Of Chinese Pla
Priority to CN202410044869.XA priority Critical patent/CN117557789B/en
Publication of CN117557789A publication Critical patent/CN117557789A/en
Application granted granted Critical
Publication of CN117557789B publication Critical patent/CN117557789B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02A - TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A90/00 - Technologies having an indirect contribution to adaptation to climate change
    • Y02A90/10 - Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation

Landscapes

  • Image Analysis (AREA)

Abstract

The invention discloses an intelligent detection method and system for offshore targets, relating to the technical field of data processing. The method comprises the following steps: acquiring offshore image information; performing target identification according to the offshore image information to obtain a target identification result; inputting the target position information and the target type information into a number matching database to obtain a target similar number list; traversing the target similar number list and calling a number unit preset track list; collecting continuous frame image information according to the target position information and the target type information; screening the number unit preset track list based on the continuous frame image information to obtain the matching number unit; and adding the matching number unit into the first time node detection result. The invention solves the technical problems in the prior art that offshore target detection cannot be performed accurately and that the degree of detection intelligence is low, and achieves the technical effects of reducing interference items and improving detection accuracy.

Description

Intelligent detection method and system for offshore targets
Technical Field
The invention relates to the technical field of data processing, in particular to an intelligent detection method and system for an offshore target.
Background
Offshore target detection is currently widely used in a variety of fields, such as marine rescue, sea-area monitoring, and port ship information management. However, because of complex sea conditions and the diversity of offshore targets, the amount of usable information about an offshore target is small and target detection is difficult.
Traditional offshore target detection relies mainly on manual inspection by maritime staff. This not only consumes a large amount of manpower and is limited by the staff's ability, so that targets cannot be detected accurately, but also easily causes visual fatigue during long inspection sessions, leading to missed targets. Therefore, the prior art suffers from the technical problems that offshore target detection cannot be performed accurately and that the degree of detection intelligence is low.
Disclosure of Invention
The application provides an intelligent detection method and system for an offshore target, which are used to solve the technical problems in the prior art that offshore target detection cannot be performed accurately and that the degree of detection intelligence is low.
In view of the above problems, the present application provides an intelligent detection method and system for an offshore target.
In a first aspect of the present application, there is provided an intelligent detection method for an offshore target, the method comprising:
Acquiring offshore image information, wherein the offshore image information belongs to a first time node;
performing target recognition according to the offshore image information to obtain a target recognition result, wherein the target recognition result comprises target position information and target type information;
inputting the target position information and the target type information into a number matching database to obtain a target similarity number list;
traversing the target similar number list, and calling a number unit preset track list, wherein the number unit preset track list starts from the first time node;
collecting continuous frame image information according to the target position information and the target type information;
screening the serial number unit preset track list based on the continuous frame image information to obtain a matched serial number unit;
and adding the matching number unit into a first time node detection result.
In a second aspect of the present application, there is provided an intelligent detection system for an offshore target, the system comprising:
the image information acquisition module is used for acquiring offshore image information, wherein the offshore image information belongs to a first time node;
The identification result obtaining module is used for carrying out target identification according to the offshore image information to obtain a target identification result, wherein the target identification result comprises target position information and target type information;
the number list obtaining module is used for inputting the target position information and the target type information into a number matching database to obtain a target similar number list;
the track list calling module is used for traversing the target similar number list and calling a number unit preset track list, wherein the number unit preset track list starts from the first time node;
the continuous frame information acquisition module is used for acquiring continuous frame image information according to the target position information and the target type information;
the number unit obtaining module is used for screening the number unit preset track list based on the continuous frame image information to obtain a matched number unit;
and the number unit adding module is used for adding the matched number units into the first time node detection result.
One or more technical solutions provided in the present application have at least the following technical effects or advantages:
according to the method, offshore image information is acquired, the offshore image information belongs to a first time node, target identification is carried out according to the offshore image information, a target identification result is acquired, the target identification result comprises target position information and target type information, a target similar number list is acquired by inputting the target position information and the target type information into a number matching database, then the target similar number list is traversed, a number unit preset track list is acquired, the number unit preset track list starts from the first time node, continuous frame image information is acquired according to the target position information and the target type information, the number unit preset track list is screened based on the continuous frame image information, a matching number unit is acquired, and the matching number unit is added into a first time node detection result. The technical effects of improving the accuracy of the detection result, shortening the detection time, improving the detection efficiency and determining the intelligent degree of the offshore target are achieved.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings required for the description of the embodiments will be briefly described below, and it is apparent that the drawings in the following description are only some embodiments of the present invention, and other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a schematic flow chart of an intelligent detection method for an offshore target according to an embodiment of the present application;
fig. 2 is a schematic flow chart of obtaining a target recognition result in the method for intelligently detecting an offshore target according to the embodiment of the present application;
fig. 3 is a schematic flow chart of obtaining a target similarity number list in the method for intelligently detecting an offshore target according to the embodiment of the present application;
fig. 4 is a schematic diagram of a backbone network-recursive feature pyramid-decoupling detection head network layer in an intelligent detection method for an offshore target according to an embodiment of the present application;
fig. 5 is a schematic diagram of a decoupling detection head in an intelligent detection method for an offshore target according to an embodiment of the present application;
fig. 6 is a schematic structural diagram of an intelligent detection system for an offshore target according to an embodiment of the present application.
Reference numerals illustrate: the device comprises an image information obtaining module 11, a recognition result obtaining module 12, a number list obtaining module 13, a track list calling module 14, a continuous frame information obtaining module 15, a number unit obtaining module 16 and a number unit adding module 17.
Detailed Description
The application provides an intelligent detection method for an offshore target, which is used to solve the technical problems in the prior art that offshore target detection cannot be performed accurately and that the degree of detection intelligence is low.
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application. It will be apparent that the described embodiments are only some, but not all, of the embodiments of the present application. All other embodiments, which can be made by one of ordinary skill in the art based on the embodiments herein without making any inventive effort, are intended to be within the scope of the present application.
It should be noted that the terms "first," "second," and the like in the description and claims of the present application and the above figures are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that embodiments of the present application described herein may be implemented in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or server that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed but may include other steps or modules not expressly listed or inherent to such process, method, article, or apparatus.
Example 1
As shown in fig. 1, the present application provides an intelligent detection method for an offshore target, where the method includes:
step S100: acquiring offshore image information, wherein the offshore image information belongs to a first time node;
specifically, the image information of the sea is obtained by performing image acquisition on an image on the sea level through an image acquisition device. Preferably, infrared imaging techniques are used to capture the infrared radiation generated by the moving object itself at sea level, thereby imaging it. Has the advantages of strong anti-interference capability and low cost. The marine image information is information obtained by carrying out image description on a ship running on the sea level, and comprises marine images and image acquisition time points. By acquiring the offshore image of the first time node, the position of the target on the sea level of the first time node is visually displayed, and the technical effect of providing basic information for the follow-up intelligent detection of the target is achieved.
Step S200: performing target recognition according to the offshore image information to obtain a target recognition result, wherein the target recognition result comprises target position information and target type information;
Further, as shown in fig. 2, the performing target recognition according to the offshore image information, and obtaining a target recognition result, where the target recognition result includes target location information and target type information, step S200 in this embodiment of the present application further includes:
step S210: acquiring a neural network model framework, wherein the neural network model framework comprises a backbone network-recursive feature pyramid-decoupling detection head network layer and a long-short-time network layer;
step S220: inputting the offshore image information into the backbone network-recursive feature pyramid-decoupling detection head network layer, and acquiring the target type information and the first time node position information;
step S230: inputting the target type information and the first time node position information into the long-short-time network layer to obtain position sequence information;
step S240: setting the position sequence information as the target position information.
Specifically, the target located on the sea surface at the first time node is determined by intelligently identifying the target presented in the offshore image information using the neural network model framework. Preferably, target recognition is carried out on the obtained offshore image information through the backbone network-recursive feature pyramid-decoupling detection head network layer in the neural network model framework to obtain the target recognition result. The target recognition result comprises the target position information and the target type information. The target position information describes the geographic position of the target at the first time node, expressed in longitude and latitude. The target type information describes the type of target presented in the offshore image information and comprises a target type and a target specification. In other words, the target type information describes the type of ship traveling on the sea surface, including fishing boats, cruise ships, container ships, barges, and the like. The target specification describes size-related attributes of the target, such as its dimensions, draft, overall size, and age.
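As a small illustration (not part of the patent text), the target recognition result described above could be carried in a structure like the following; the field names and example values are assumptions chosen for readability.

```python
from dataclasses import dataclass

@dataclass
class TargetRecognitionResult:
    # hypothetical container for the recognition result described above
    latitude: float        # geographic latitude of the target at the first time node
    longitude: float       # geographic longitude of the target at the first time node
    ship_type: str         # e.g. "fishing boat", "cruise ship", "container ship", "barge"
    specification: dict    # size-related attributes such as dimensions and draft

result = TargetRecognitionResult(
    latitude=38.92, longitude=121.63,
    ship_type="container ship",
    specification={"length_m": 180.0, "draft_m": 9.5},
)
```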
Specifically, the backbone network-recursive feature pyramid-decoupling detection head network layer is a combined network layer obtained by connecting the backbone network, the recursive feature pyramid, and the decoupling detection head network layer in series; it performs feature extraction, recursive feature fusion, and information prediction on the offshore image information. As an example, as shown in fig. 4, feature extraction is first performed on the input offshore image information through the backbone network, and feature aggregation for the target is completed using a channel attention mechanism, feature channel fusion, and similar methods. The obtained features are input into the recursive pyramid for a first recursive operation to obtain fused features; the fused features are input into the backbone network again for feature extraction, and the resulting features are input into the recursive pyramid for a second recursive operation, yielding feature information with stronger representation capability after recursive feature aggregation. Finally, the decoupling detection head performs prediction on the obtained feature information to produce the target type information and the first time node position information.
Preferably, the backbone network is constructed from a YOLOv5 model, the offshore image information is input into the model, and an output layer is placed on the shallower feature map L_out1. The input offshore image information is first downsampled using convolution + BN (batch normalization) + SiLU activation, and a 4x downsampled image is obtained with a convolution layer of stride 2. A channel attention method is then applied: the importance of each channel is obtained by a global pooling operation and redistributed according to the weights, realizing aggregation of target features and yielding the output layer L_out1. This avoids serious loss of target information during downsampling, enhances the features useful for small-scale detection, and suppresses the features less useful for the current task. The SiLU activation function is as follows:
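SiLU(x) = x · σ(x) = x / (1 + e^(−x)), where σ(x) is the sigmoid function; this is the standard sigmoid-weighted linear unit.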
further, toL out1 Building two branches for subsequent processing, one using max pooling operation Maxpooling (MP), the other convolving with 3x3 with step size 2, and finally channel fusion using 1x1 convolution for further downsampling, utilizing andL out1 the same channel attention method obtains the target feature output layer after aggregationL out2 Output layer for continuing downsampling and obtaining semantic aggregation feature information by using channel attention methodL out3 . The backbone network has an input layer, a convolution layer, and an output layerL out1 Output layerL out2 Output layerL out3
Further, by inputting the offshore image information into the backbone network, three feature output layers B_1^1, B_2^1, B_3^1 are obtained. These three feature output layers are fed into the feature pyramid for a first feature fusion, giving the fused outputs f_1^1, f_2^1, f_3^1. To further improve feature aggregation and feature extraction, the outputs of the first fusion f_1^1, f_2^1, f_3^1 are fed into the backbone network again to obtain new output layers, which are then fed into the feature pyramid to obtain the second, final outputs f_1^2, f_2^2, f_3^2. With this recursive pyramid structure, the backbone network of a single detector looks at the picture twice and aggregates features recursively, so the resulting feature representation becomes progressively stronger. Then, as shown in fig. 5, the decoupling detection head network layer is constructed as a decoupled head consisting of one 1x1 convolution for channel dimension reduction followed by two parallel branches of 3x3 convolutions. The decoupling detection head can attend to texture and edge information to different degrees for classification and localization respectively, further improving target detection performance; the final detection result, namely the target type information and the first time node position information, is obtained through the decoupled head. This achieves the technical effect of effectively improving the pertinence of target feature extraction during feature aggregation for target detection.
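To make the data flow concrete, the following is a minimal, illustrative PyTorch sketch of the backbone, recursive feature pyramid, and decoupled head pipeline described above. It is an assumption-laden toy: the channel widths, the stage layout, and the adapter 1x1 convolutions that feed fused features back into the backbone stages are all illustrative choices, not the patent's YOLOv5-based implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def conv_bn_silu(c_in, c_out, stride=2):
    # convolution + batch normalization + SiLU, used here for downsampling
    return nn.Sequential(
        nn.Conv2d(c_in, c_out, 3, stride=stride, padding=1, bias=False),
        nn.BatchNorm2d(c_out),
        nn.SiLU(),
    )

class TinyBackbone(nn.Module):
    """Produces three output layers, stand-ins for L_out1 / L_out2 / L_out3."""
    def __init__(self):
        super().__init__()
        self.stem   = conv_bn_silu(3, 32)     # 2x downsample
        self.stage1 = conv_bn_silu(32, 64)    # 4x downsample -> B1
        self.stage2 = conv_bn_silu(64, 128)   #                -> B2
        self.stage3 = conv_bn_silu(128, 256)  #                -> B3

    def forward(self, x):
        b1 = self.stage1(self.stem(x))
        b2 = self.stage2(b1)
        b3 = self.stage3(b2)
        return b1, b2, b3

class TinyFPN(nn.Module):
    """Top-down feature fusion over the three backbone outputs."""
    def __init__(self, chans=(64, 128, 256), out_ch=64):
        super().__init__()
        self.lateral = nn.ModuleList([nn.Conv2d(c, out_ch, 1) for c in chans])

    def forward(self, feats):
        f = [lat(x) for lat, x in zip(self.lateral, feats)]
        for i in range(len(f) - 1, 0, -1):            # top-down: upsample and add
            f[i - 1] = f[i - 1] + F.interpolate(f[i], size=f[i - 1].shape[-2:])
        return f                                      # (f1, f2, f3)

class DecoupledHead(nn.Module):
    """1x1 channel reduction, then two parallel 3x3 branches for
    classification and box regression (cf. fig. 5)."""
    def __init__(self, c_in=64, num_classes=4):
        super().__init__()
        self.reduce = nn.Conv2d(c_in, 64, 1)
        self.cls = nn.Sequential(nn.Conv2d(64, 64, 3, padding=1), nn.SiLU(),
                                 nn.Conv2d(64, num_classes, 1))
        self.reg = nn.Sequential(nn.Conv2d(64, 64, 3, padding=1), nn.SiLU(),
                                 nn.Conv2d(64, 4, 1))

    def forward(self, f):
        f = self.reduce(f)
        return self.cls(f), self.reg(f)

backbone, fpn, head = TinyBackbone(), TinyFPN(), DecoupledHead()
# adapters map fused FPN features back to the channel widths of the backbone stages
adapters = nn.ModuleList([nn.Conv2d(64, 64, 1), nn.Conv2d(64, 128, 1), nn.Conv2d(64, 256, 1)])

image = torch.randn(1, 3, 256, 256)
b1, b2, b3 = backbone(image)                 # first backbone pass: B_1^1, B_2^1, B_3^1
f1, f2, f3 = fpn((b1, b2, b3))               # first fusion:        f_1^1, f_2^1, f_3^1
# recursive pass: fused features are adapted and injected back into the backbone stages
b1r = b1 + adapters[0](f1)
b2r = backbone.stage2(b1r) + adapters[1](F.interpolate(f2, size=b2.shape[-2:]))
b3r = backbone.stage3(b2r) + adapters[2](F.interpolate(f3, size=b3.shape[-2:]))
f1, f2, f3 = fpn((b1r, b2r, b3r))            # second fusion:       f_1^2, f_2^2, f_3^2
cls_out, box_out = head(f1)                  # per-level predictions from the decoupled head
```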
Specifically, the long-short-time network layer is a long short-term memory (LSTM) neural network model, in which the ship track information that should pass through the sea-surface position indicated by the first time node position information is stored. The target type information and the first time node position information are input into the long-short-time network layer, and the track information of a ship passing through this position at the first time node is obtained, for the time nodes before the first time node, according to the preset traveling track, thereby yielding the position sequence information. The position sequence information describes the positions of the traveling ship at different time nodes before the first time node and reflects the ship's traveling track. Irrelevant ships whose traveling tracks do not match the position sequence information can therefore be excluded, which improves the accuracy and processing efficiency of subsequent detection.
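The following is a minimal sketch, under assumed dimensions, of an LSTM layer that maps the target type and the first time node position to a position sequence for earlier time nodes. It is an illustrative stand-in for the trained long-short-time network layer described above, not the patent's model; the embedding size, sequence length, and field layout are assumptions.

```python
import torch
import torch.nn as nn

class PositionSequenceLSTM(nn.Module):
    def __init__(self, num_types=8, hidden=64, seq_len=5):
        super().__init__()
        self.seq_len = seq_len
        self.type_embed = nn.Embedding(num_types, 16)
        self.lstm = nn.LSTM(input_size=16 + 2, hidden_size=hidden, batch_first=True)
        self.to_position = nn.Linear(hidden, 2)      # predicts (latitude, longitude)

    def forward(self, type_id, first_node_position):
        # type_id: (batch,) integer ship type; first_node_position: (batch, 2) lat/lon
        feat = torch.cat([self.type_embed(type_id), first_node_position], dim=-1)
        # repeat the conditioning feature once per earlier time node to be estimated
        steps = feat.unsqueeze(1).repeat(1, self.seq_len, 1)
        out, _ = self.lstm(steps)
        return self.to_position(out)                 # (batch, seq_len, 2) position sequence

model = PositionSequenceLSTM()
positions = model(torch.tensor([2]), torch.tensor([[38.92, 121.63]]))
print(positions.shape)   # torch.Size([1, 5, 2])
```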
Step S300: inputting the target position information and the target type information into a number matching database to obtain a target similarity number list;
further, as shown in fig. 3, the target location information and the target type information are input into a number matching database to obtain a target similarity number list, and step S300 in the embodiment of the present application further includes:
Step S310: according to the target type information, a similar unit number set is screened from the number matching database;
step S320: traversing the similar unit number set to obtain a similar unit preset track set, wherein the similar unit preset track set is truncated to the first time node;
step S330: traversing the similar unit preset track set to set a fluctuation interval, and obtaining a similar unit fluctuation track set;
step S340: screening similar fluctuation tracks of the same kind of units from the similar unit fluctuation track set according to the target position information;
step S350: and constructing the target similarity number list according to the similar fluctuation tracks of the similar units.
Specifically, the number matching database is a database that stores the numbers of ships traveling at sea; different numbers correspond to different ships, and information such as the ship type and the traveling track is recorded in the database. A target similar number is a number corresponding to a ship whose traveling track is similar to that of the detection target.
Specifically, similar-number screening is performed in the number matching database using the type features of the detection target reflected in the target type information as the index, yielding the similar unit number set. The similar unit number set contains the numbers that conform to the type features reflected in the target type information. Preset-track queries are then performed one by one on the similar unit number set to obtain the similar unit preset track set, which is truncated to the first time node. In other words, the traveling tracks of the ships in the similar unit number set from their starting time to the first time node are collected, and a fluctuation interval is set for each preset track according to the ship's traveling speed at sea and the traveling environment. The fluctuation interval is the track error allowed between the actual traveling track of the ship and the preset track owing to complex sea conditions; it is set by the operator and is not limited here. By setting the fluctuation interval, the original track is changed from a line into an area, so that the resulting fluctuation track is more accurate and better matches actual traveling conditions, thereby improving the accuracy of subsequent detection.
Specifically, the similar unit preset track set is corrected and converted according to the fluctuation interval, yielding the similar unit fluctuation track set. Using the target position information as the matching object for similarity screening, the fluctuation tracks containing the target position are screened out of the similar unit fluctuation track set, giving the similar fluctuation tracks of the similar units. The target similar number list is then obtained from these similar fluctuation tracks according to the mapping relation between tracks and unit numbers. The target similar number list contains the numbers corresponding to ships whose traveling tracks are similar to that of the detection target. This narrows the detection range for offshore target detection and excludes irrelevant ships.
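The following is a minimal sketch, under assumed planar x/y coordinates, of how a preset track polyline can be widened by a fluctuation interval and tested against the detected target position. The tolerance plays the role of the fluctuation interval, and the unit numbers are hypothetical; the patent does not prescribe this particular geometry.

```python
import math

def point_segment_distance(p, a, b):
    """Distance from point p to the segment a-b (all 2-tuples)."""
    ax, ay = a; bx, by = b; px, py = p
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return math.hypot(px - ax, py - ay)
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
    cx, cy = ax + t * dx, ay + t * dy
    return math.hypot(px - cx, py - cy)

def within_fluctuation_track(target_pos, preset_track, tolerance):
    """True if the target position lies inside the track widened by `tolerance`."""
    return any(point_segment_distance(target_pos, a, b) <= tolerance
               for a, b in zip(preset_track, preset_track[1:]))

# keep only the same-type units whose widened track contains the target position
preset_tracks = {"HX-1023": [(0, 0), (2, 1), (4, 3)],
                 "HX-2077": [(5, 5), (7, 6), (9, 8)]}
target_position = (2.2, 1.3)
similar_units = [unit for unit, track in preset_tracks.items()
                 if within_fluctuation_track(target_position, track, tolerance=0.5)]
print(similar_units)   # ['HX-1023']
```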
Step S400: and traversing the target similar number list, and calling a number unit preset track list, wherein the number unit preset track list starts from the first time node.
Specifically, according to the ship numbers contained in the target similar number list, the preset tracks of the number units with matching ship numbers are retrieved from the number matching database. The number unit preset track list is the list obtained by collecting the preset tracks with the first time node as the track starting point; in this list, the number units correspond one-to-one with the preset tracks.
Step S500: collecting continuous frame image information according to the target position information and the target type information;
Specifically, the objects requiring continuous-frame acquisition and their positions are determined according to the target position information and the target type information. Preferably, infrared imaging is used for continuous-frame acquisition, yielding the continuous frame image information. The continuous frame image information reflects the movement, over a period of time after the first time node, of all ships conforming to the target position information and the target type information, and it comprises a plurality of image frames.
Step S600: screening the serial number unit preset track list based on the continuous frame image information to obtain a matched serial number unit;
further, based on the continuous frame image information, the filtering is performed on the preset track list of the number unit to obtain a matching number unit, and step S600 in the embodiment of the present application further includes:
step S610: acquiring first frame image information and second frame image information to kth frame image information according to the continuous frame image information, wherein the first frame image information and the second frame image information to the kth frame image information are in one-to-one correspondence with the time frame number of the serial number unit preset track list;
Step S620: and traversing the first frame image information and the second frame image information until the kth frame image information, and carrying out multi-level screening on the number unit preset track list to obtain the matching number unit.
Further, the traversing the first frame image information and the second frame image information until the kth frame image information performs multi-level filtering on the preset track list of the number unit to obtain the matching number unit, and step S620 in the embodiment of the present application further includes:
step S621: generating first frame track information according to the first frame image information and the target position information;
step S622: traversing the preset track list of the number unit according to the first frame track information to screen, and obtaining a one-time screening result of the preset track of the number unit;
step S623: repeating screening based on the primary screening result of the preset track of the number unit to obtain the k screening results of the preset track of the number unit;
step S624: and obtaining the matched number unit according to the k screening results of the preset track of the number unit.
Further, according to the first frame track information, traversing the number unit preset track list to screen, and obtaining a number unit preset track primary screening result, step S622 in this embodiment of the present application further includes:
Step S6221: traversing the serial number unit preset track list to calculate the similarity based on the first frame track information, and obtaining a plurality of similarities;
step S6222: setting a similarity threshold;
step S6223: screening the number unit preset track list with the similarity being greater than or equal to the similarity threshold, and adding the number unit preset track list into a number unit preset track one-time screening result.
Further, based on the first frame track information, traversing the number unit preset track list to perform similarity calculation, and obtaining a plurality of similarities, where step S622 in this embodiment of the present application further includes:
step S6224: obtaining a similarity evaluation formula:
wherein D represents the similarity between any number unit preset track and the first frame track information, t0 denotes the first time node, t1 denotes the time node corresponding to the first frame track information, lt denotes the distance between the number unit preset track and the first frame track information, and the remaining two parameters are the area weight parameter and the head-to-tail deviation weight parameter, respectively;
step S6225: and traversing the number unit preset track list to calculate the similarity based on the first frame track information according to the similarity evaluation formula, and acquiring the multiple similarities.
Specifically, image extraction is performed one by one according to the number of frames from the continuous frame image information, and the first frame image information, the second frame image information and the up to the kth frame image information are obtained. The first frame image information is a first frame image acquired after a first time node. And carrying out multi-level screening on the number unit preset track list based on the first frame image information, the second frame image information and the kth frame image information, and screening out the number units of the objects meeting the detection requirements from the number unit preset track list. Wherein the matching number unit is a number corresponding to a target to be detected.
Specifically, the target position in the first frame image information is extracted as a first position coordinate point, which is connected with the position coordinate point in the target position information to generate the first frame track information. The first frame track information reflects the route along which the detection target moves in the time period from the first time node to the time node corresponding to the first frame image information. The first frame track is then taken as the matching object and compared one by one with the tracks in the number unit preset track list; the matched preset tracks are retained and the unmatched preset tracks are removed from the number unit preset track list, giving the number unit preset track one-time screening result.
Specifically, the number unit preset track primary screening result is taken as a screened object, the target position in the second frame image information is extracted to obtain a second position coordinate point, and the second position coordinate point is connected with the position coordinate point in the target position information and the first position coordinate point to generate the second frame track information. And taking the second frame track as a matched object, matching the second frame track with the tracks in the number unit preset track primary screening result one by one, reserving the matched preset tracks, and removing the non-matched preset tracks from the number unit preset track primary screening result to obtain the number unit preset track secondary screening result.
Specifically, based on the same screening method, the obtained screening result is subjected to progressive screening for a plurality of times, so that a k-time screening result of the preset track of the number unit is obtained. The number unit preset track k times screening result is obtained by taking a track obtained by connecting the coordinate points in the first frame image information and the second frame image information to the coordinate points in the kth frame image information and the position coordinate points in the target position information as a matched object, screening the number unit preset track k-1 times screening result, and calling numbers in a number matching database according to the track in the screening result to obtain the matched number unit.
Specifically, according to the similarity evaluation formula, similarity calculation is performed between the data in the first frame track information and each preset track in the number unit preset track list, yielding the plurality of similarities. The plurality of similarities reflect the degree to which the tracks in the number unit preset track list coincide with the track in the first frame track information. Compared with conventional similarity evaluation, this evaluation method has stronger accuracy and reference value because it takes the area and the head-to-tail deviation into account. The similarity threshold is the similarity range within which the degree of similarity between tracks meets the similarity requirement. The preset tracks in the number unit preset track list are screened one by one with the similarity threshold as the screening criterion, and the tracks whose similarity is greater than or equal to the similarity threshold are added to the number unit preset track one-time screening result.
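The following is a minimal sketch of the multi-level screening loop described above. The patent's similarity evaluation formula (with its area and head-to-tail deviation weights) is not reproduced here, so `track_similarity` below uses a simple mean point distance as a stand-in, and the unit numbers, threshold, and coordinates are hypothetical.

```python
import math

def track_similarity(frame_track, preset_track):
    """Higher is more similar; placeholder for the patent's evaluation formula."""
    n = min(len(frame_track), len(preset_track))
    mean_dist = sum(math.hypot(fx - px, fy - py)
                    for (fx, fy), (px, py) in zip(frame_track[:n], preset_track[:n])) / n
    return 1.0 / (1.0 + mean_dist)

def multilevel_screening(frame_positions, start_position, preset_tracks, threshold=0.5):
    """frame_positions: positions extracted from frames 1..k after the first time node.
    preset_tracks: {unit_number: [(x, y), ...]} starting at the first time node.
    Each round extends the observed track by one frame and keeps only the
    preset tracks whose similarity stays above the threshold."""
    candidates = dict(preset_tracks)
    observed = [start_position]                      # position at the first time node
    for frame_pos in frame_positions:                # frame 1, frame 2, ..., frame k
        observed.append(frame_pos)
        candidates = {unit: track for unit, track in candidates.items()
                      if track_similarity(observed, track) >= threshold}
    return list(candidates)                          # matching number unit(s)

preset_tracks = {"HX-1023": [(0, 0), (1, 1), (2, 2), (3, 3)],
                 "HX-2077": [(0, 0), (1, -1), (2, -2), (3, -3)]}
matches = multilevel_screening(frame_positions=[(1.1, 0.9), (2.0, 2.1), (3.1, 2.9)],
                               start_position=(0.0, 0.1),
                               preset_tracks=preset_tracks,
                               threshold=0.5)
print(matches)   # ['HX-1023']
```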
Step S700: and adding the matching number unit into a first time node detection result.
Specifically, by adding the matching number unit into the first time node detection result, the purpose of detecting the offshore target at the first time node is achieved, and the technical effects of effectively extracting and screening information, reducing information complexity and improving detection performance are achieved.
In summary, the embodiments of the present application have at least the following technical effects:
according to the method, the offshore image of the first time node is acquired, the acquired image is used as basic information to conduct target identification, the position and the target type of the target are analyzed and acquired, furthermore, the number corresponding to the target position and the number corresponding to the target type are matched in the number matching database by taking the target position and the target type as indexes, so that the preset track corresponding to the number is inquired, after the inquiring result, namely a number unit preset track list is obtained, the object acquired by the continuous frame is determined by using the target position information and the target type information, then the continuous frame is acquired, continuous frame image information is obtained, the continuous frame image information is used as screening basis, the plurality of tracks in the number unit preset track list are subjected to matching screening, the number unit corresponding to the continuous frame image reflection track is obtained, and furthermore, the matching number unit is added into the first time node detection result. The intelligent degree of the offshore target monitoring is improved, and the technical effect of improving the detection efficiency is achieved.
Example two
Based on the same inventive concept as the method for intelligent detection of an offshore target in the foregoing embodiments, as shown in fig. 6, the present application provides an intelligent detection system for an offshore target, and the system and method embodiments in the embodiments of the present application are based on the same inventive concept. Wherein the system comprises:
An image information obtaining module 11, where the image information obtaining module 11 is configured to obtain offshore image information, where the offshore image information belongs to a first time node;
the identification result obtaining module 12 is configured to perform target identification according to the offshore image information, and obtain a target identification result, where the target identification result includes target location information and target type information;
a number list obtaining module 13, where the number list obtaining module 13 is configured to input the target location information and the target type information into a number matching database to obtain a target similar number list;
the track list calling module 14 is configured to traverse the target similar number list, and call a number unit preset track list, where the number unit preset track list starts from the first time node;
a continuous frame information obtaining module 15, where the continuous frame information obtaining module 15 is configured to collect continuous frame image information according to the target position information and the target type information;
a number unit obtaining module 16, where the number unit obtaining module 16 is configured to screen the number unit preset track list based on the continuous frame image information to obtain a matching number unit;
The number unit adding module 17 is configured to add the matching number unit to the first time node detection result.
Further, the system further comprises:
the model framework acquisition unit is used for acquiring a neural network model framework, wherein the neural network model framework comprises a backbone network-recursive feature pyramid-decoupling detection head network layer and a long-short-time network layer;
the position information obtaining unit is used for inputting the offshore image information into the backbone network-recursive feature pyramid-decoupling detection head network layer and obtaining the target type information and the first time node position information;
the sequence information acquisition unit is used for inputting the target type information and the first time node position information into the long-short-time network layer to acquire position sequence information;
and a target position setting unit configured to set the position sequence information as the target position information.
Further, the system further comprises:
the number set screening unit is used for screening similar unit number sets from the number matching database according to the target type information;
The track set acquisition unit is used for traversing the similar unit number sets to acquire similar unit preset track sets, wherein the similar unit preset track sets are cut off to the first time node;
the fluctuation track obtaining unit is used for traversing the similar unit preset track set to set a fluctuation interval and obtaining a similar unit fluctuation track set;
the same-kind fluctuation track screening unit is used for screening similar fluctuation tracks of the same-kind units from the same-kind unit fluctuation track set according to the target position information;
and the similar number list construction unit is used for constructing the target similar number list according to the similar fluctuation tracks of the similar units.
Further, the system further comprises:
the K frame image obtaining unit is used for obtaining first frame image information and second frame image information to kth frame image information according to the continuous frame image information, wherein the time frames from the first frame image information to the second frame image information to the kth frame image information correspond to the number unit preset track list one by one;
The multi-level screening unit is used for traversing the first frame image information and the second frame image information until the kth frame image information, and carrying out multi-level screening on the number unit preset track list to obtain the matching number unit.
Further, the system further comprises:
a first frame track generation unit, configured to generate first frame track information according to the first frame image information and the target position information;
the primary screening result obtaining unit is used for traversing the number unit preset track list to screen according to the first frame track information so as to obtain a number unit preset track primary screening result;
k screening result obtaining units, wherein the k screening result obtaining units are used for repeatedly screening based on the number unit preset track primary screening result to obtain the number unit preset track k screening result;
and the number unit matching unit is used for acquiring the matching number unit according to k screening results of the preset track of the number unit.
Further, the system further comprises:
The similarity obtaining unit is used for traversing the serial number unit preset track list to perform similarity calculation based on the first frame track information so as to obtain a plurality of similarities;
a similarity threshold setting unit for setting a similarity threshold;
and the screening result adding unit is used for screening the number unit preset track lists with the similarity being greater than or equal to the similarity threshold and adding the number unit preset track one-time screening results into the number unit preset track list.
Further, the system further comprises:
a similarity evaluation formula obtaining unit for obtaining a similarity evaluation formula:
wherein D represents the similarity between any number unit preset track and the first frame track information, t0 denotes the first time node, t1 denotes the time node corresponding to the first frame track information, lt denotes the distance between the number unit preset track and the first frame track information, and the remaining two parameters are the area weight parameter and the head-to-tail deviation weight parameter, respectively;
the similarity obtaining units are used for performing similarity calculation through traversing the number unit preset track list based on the first frame track information according to the similarity evaluation formula to obtain the similarity.
It should be noted that the sequence of the embodiments of the present application is merely for description, and does not represent the advantages and disadvantages of the embodiments. And the foregoing description has been directed to specific embodiments of this specification. Other embodiments are within the scope of the following claims. In some cases, the actions or steps recited in the claims can be performed in a different order than in the embodiments and still achieve desirable results. In addition, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In some embodiments, multitasking and parallel processing are also possible or may be advantageous.
The foregoing description of the preferred embodiments of the present application is illustrative only and is not intended to limit the invention to the particular embodiments described; the scope of protection of the present application is not limited to these specific embodiments.
The specification and drawings are merely exemplary of the application and are to be regarded as covering any and all modifications, variations, combinations, or equivalents that are within the scope of the application. It will be apparent to those skilled in the art that various modifications and variations can be made in the present application without departing from the scope of the application. Thus, if such modifications and variations of the present application fall within the scope of the present application and the equivalents thereof, the present application is intended to cover such modifications and variations.

Claims (4)

1. An intelligent detection method for an offshore target is characterized by comprising the following steps:
acquiring offshore image information, wherein the offshore image information comprises an image acquisition time point and an offshore image, and the image acquisition time point is a first time node;
performing target recognition according to the offshore image information to obtain a target recognition result, wherein the target recognition result comprises target position information and target type information;
inputting the target position information and the target type information into a number matching database to obtain a target similarity number list, wherein the target similarity number is a number corresponding to a ship whose running track is similar to that of the detection target, and the number matching database is a database storing the numbers of ships running on the sea, and different numbers correspond to different ships;
traversing the target similar number list, and calling a number unit preset track list, wherein the number unit preset track list is obtained by taking a first time node as a track starting point to acquire a preset track;
collecting continuous frame image information according to the target position information and the target type information;
Screening the serial number unit preset track list based on the continuous frame image information to obtain a matched serial number unit, wherein the matched serial number unit is a serial number corresponding to a detection target;
adding the matching number unit into a first time node detection result;
screening the serial number unit preset track list based on the continuous frame image information to obtain a matched serial number unit, wherein the method comprises the following steps:
acquiring first frame image information and second frame image information to kth frame image information according to the continuous frame image information, wherein the first frame image information and the second frame image information to the kth frame image information are in one-to-one correspondence with the time frame number of the serial number unit preset track list;
traversing the first frame image information and the second frame image information until the kth frame image information, and carrying out multi-level screening on the number unit preset track list to obtain the matching number unit;
traversing the first frame image information and the second frame image information until the kth frame image information, performing multi-level screening on the number unit preset track list to obtain the matching number unit, wherein the method comprises the following steps:
Generating first frame track information according to the first frame image information and the target position information;
traversing the preset track list of the number unit according to the first frame track information to screen, and obtaining a one-time screening result of the preset track of the number unit;
repeating screening based on the primary screening result of the preset track of the number unit to obtain the k screening results of the preset track of the number unit;
obtaining the matching number unit according to the k screening results of the preset track of the number unit;
traversing the number unit preset track list for screening according to the first frame track information to obtain a number unit preset track one-time screening result, wherein the method comprises the following steps:
traversing the serial number unit preset track list to calculate the similarity based on the first frame track information, and obtaining a plurality of similarities;
setting a similarity threshold;
screening the number unit preset track lists with the similarity being greater than or equal to the similarity threshold value, and adding the number unit preset track list into a number unit preset track one-time screening result;
based on the first frame track information, traversing the serial number unit preset track list to calculate the similarity, and acquiring a plurality of similarities, wherein the method comprises the following steps:
Obtaining a similarity evaluation formula:
wherein D represents the similarity between any number unit preset track and the first frame track information, t0 characterizes the first time node, t1 represents the time node corresponding to the first frame track information, lt represents the distance between the number unit preset track and the first frame track information, and the remaining two parameters represent the area weight parameter and the head-to-tail deviation weight parameter, respectively;
and traversing the number unit preset track list to calculate the similarity based on the first frame track information according to the similarity evaluation formula, and acquiring the multiple similarities.
2. The method for intelligent detection of an offshore target according to claim 1, wherein target recognition is performed according to the offshore image information to obtain a target recognition result, wherein the target recognition result includes target position information and target type information, and the method comprises:
acquiring a neural network model framework, wherein the neural network model framework comprises a backbone network-recursive feature pyramid-decoupling detection head network layer and a long-short-time network layer;
inputting the offshore image information into the backbone network-recursive feature pyramid-decoupling detection head network layer, and acquiring the target type information and the first time node position information;
Inputting the target type information and the first time node position information into the long-short time network layer, and acquiring position sequence information, wherein the position sequence information is information describing the operation position of a ship at different time nodes before the first time node;
setting the position sequence information as the target position information.
3. The method for intelligent detection of an offshore object according to claim 2, wherein inputting the object location information and the object type information into a number matching database, obtaining a list of object similarity numbers, comprises:
according to the target type information, a similar unit number set is screened from the number matching database;
traversing the similar unit number set to obtain a similar unit preset track set, wherein the similar unit preset track set is truncated to the first time node;
traversing the similar unit preset track set to set a fluctuation interval, and obtaining a similar unit fluctuation track set;
screening similar fluctuation tracks of the same kind of units from the similar unit fluctuation track set according to the target position information;
and constructing the target similarity number list according to the similar fluctuation tracks of the similar units.
4. An intelligent detection system for an offshore target, comprising:
the system comprises an image information acquisition module, a data acquisition module and a data processing module, wherein the image information acquisition module is used for acquiring marine image information, the marine image information comprises an image acquisition time point and a marine image, and the image acquisition time point is a first time node;
the identification result obtaining module is used for carrying out target identification according to the offshore image information to obtain a target identification result, wherein the target identification result comprises target position information and target type information;
the number list obtaining module is used for inputting the target position information and the target type information into a number matching database to obtain a target similar number list, wherein the target similar number is a number corresponding to a ship whose running track is similar to that of the detection target, the number matching database is a database storing the numbers of ships running on the sea, and different numbers correspond to different ships;
the track list calling module is used for traversing the target similar number list and calling a number unit preset track list, wherein the number unit preset track list starts from the first time node, the number matching database is a database in which ships running on the sea are numbered and stored, and different numbers correspond to different ships;
The continuous frame information acquisition module is used for acquiring continuous frame image information according to the target position information and the target type information;
the number unit obtaining module is used for screening the number unit preset track list based on the continuous frame image information to obtain a matched number unit, wherein the matched number unit is a number corresponding to a detection target;
the number unit adding module is used for adding the matched number units into the first time node detection result;
the K frame image obtaining unit is used for obtaining first frame image information and second frame image information to kth frame image information according to the continuous frame image information, wherein the time frames from the first frame image information to the second frame image information to the kth frame image information correspond to the number unit preset track list one by one;
the multi-level screening unit is used for traversing the first frame image information and the second frame image information to the kth frame image information, carrying out multi-level screening on the number unit preset track list and obtaining the matching number unit;
A first frame track generation unit, configured to generate first frame track information according to the first frame image information and the target position information;
the primary screening result obtaining unit is used for traversing the number unit preset track list to screen according to the first frame track information so as to obtain a number unit preset track primary screening result;
k screening result obtaining units, wherein the k screening result obtaining units are used for repeatedly screening based on the number unit preset track primary screening result to obtain the number unit preset track k screening result;
the number unit matching unit is used for obtaining the matching number unit according to k screening results of the preset track of the number unit;
The similarity obtaining unit is used for traversing the number unit preset track lists and performing similarity calculation based on the first frame track information to obtain a plurality of similarities;
The similarity threshold setting unit is used for setting a similarity threshold;
The screening result adding unit is used for selecting the number unit preset tracks whose similarity is greater than or equal to the similarity threshold and adding them to the number unit preset track primary screening result;
The similarity evaluation formula obtaining unit is used for obtaining a similarity evaluation formula (the formula itself is not reproduced in this text), wherein D represents the similarity between any number unit preset track and the first frame track information, t0 represents the first time node, t1 represents the time node corresponding to the first frame track information, l_t represents the distance between any number unit preset track and the first frame track information, and two further parameters represent an area weight parameter and a head-to-tail deviation weight parameter respectively;
The similarity obtaining unit is used for traversing the number unit preset track lists based on the first frame track information and performing similarity calculation according to the similarity evaluation formula to obtain the plurality of similarities.
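To make the multi-level screening concrete, the following Python sketch illustrates the idea under stated assumptions: it is not the patented implementation, the function and variable names (multi_level_screen, similarity, preset_tracks, frame_positions, threshold) are invented for illustration, tracks are simplified to lists of (x, y) positions, and a plain mean point-to-point distance stands in for the patent's own similarity evaluation formula.

```python
# Minimal sketch of per-frame (multi-level) screening of candidate preset tracks.
# All names and data shapes are assumptions made for illustration only.
import math
from typing import Dict, List, Tuple

Point = Tuple[float, float]   # target position (x, y) at one time node
Track = List[Point]           # positions ordered by frame / time node


def similarity(preset_track: Track, observed_track: Track) -> float:
    """Stand-in similarity in (0, 1]: higher when the tracks are closer.

    The patent defines its own similarity evaluation formula (with area and
    head-to-tail deviation weights); a mean point distance is used here instead.
    """
    n = min(len(preset_track), len(observed_track))
    mean_dist = sum(math.dist(preset_track[i], observed_track[i]) for i in range(n)) / n
    return 1.0 / (1.0 + mean_dist)


def multi_level_screen(
    preset_tracks: Dict[str, Track],   # number unit -> preset track starting at t0
    frame_positions: Track,            # target positions in frames 1..k
    threshold: float = 0.8,
) -> List[str]:
    """Screen the candidates once per frame; survivors of round k are the match."""
    candidates = dict(preset_tracks)
    observed: Track = []
    for position in frame_positions:                  # screening rounds 1..k
        observed.append(position)                     # frame track up to this round
        candidates = {
            unit: track
            for unit, track in candidates.items()
            if similarity(track[: len(observed)], observed) >= threshold
        }
    return list(candidates)                           # matched number unit(s)


# Example: two candidate tracks, three observed frames.
if __name__ == "__main__":
    presets = {"V-001": [(0, 0), (1, 1), (2, 2)], "V-002": [(0, 5), (1, 7), (2, 9)]}
    print(multi_level_screen(presets, [(0.1, 0.0), (1.0, 1.1), (2.0, 2.1)]))
```

With this toy data the output is ['V-001'], i.e. the only candidate whose preset track stays close to the observed positions in every screening round.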
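The primary (one-time) screening by similarity threshold can be sketched in the same spirit. Because the similarity evaluation formula is only described above, not reproduced, evaluate_similarity below is an assumed placeholder that merely decreases with the track distance l_t and takes hypothetical area and head-to-tail deviation weights; only the thresholding logic mirrors the description, and the names primary_screen, area_weight and head_tail_weight are illustrative.

```python
# Hypothetical sketch of the primary screening step; evaluate_similarity is a
# placeholder for the patent's (unreproduced) similarity evaluation formula.
from typing import Dict, List


def evaluate_similarity(l_t: float, area_weight: float, head_tail_weight: float) -> float:
    """Placeholder D in (0, 1]: a larger track distance l_t gives a lower similarity."""
    return 1.0 / (1.0 + (area_weight + head_tail_weight) * l_t)


def primary_screen(
    track_distances: Dict[str, float],   # number unit -> distance l_t to the first frame track
    threshold: float,
    area_weight: float = 0.5,
    head_tail_weight: float = 0.5,
) -> List[str]:
    """Keep the number units whose similarity D is at or above the threshold."""
    return [
        unit
        for unit, l_t in track_distances.items()
        if evaluate_similarity(l_t, area_weight, head_tail_weight) >= threshold
    ]


# Example: with these weights and a threshold of 0.5, only distances up to 1.0 survive.
print(primary_screen({"V-001": 0.2, "V-002": 3.0}, threshold=0.5))  # ['V-001']
```

Repeating such a screening on the survivors for the second through kth frames would yield the kth screening result used by the number unit matching unit.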
CN202410044869.XA 2024-01-12 2024-01-12 Intelligent detection method and system for offshore targets Active CN117557789B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202410044869.XA CN117557789B (en) 2024-01-12 2024-01-12 Intelligent detection method and system for offshore targets

Publications (2)

Publication Number Publication Date
CN117557789A CN117557789A (en) 2024-02-13
CN117557789B true CN117557789B (en) 2024-04-09

Family

ID=89811507

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202410044869.XA Active CN117557789B (en) 2024-01-12 2024-01-12 Intelligent detection method and system for offshore targets

Country Status (1)

Country Link
CN (1) CN117557789B (en)

Citations (26)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006146823A (en) * 2004-11-24 2006-06-08 Nippon Hoso Kyokai <NHK> Video object trajectory adding system and video object trajectory adding program
CN105788271A (en) * 2016-05-17 2016-07-20 厦门市美亚柏科信息股份有限公司 Method and apparatus for identifying target moving object through trajectory matching
CN108875666A (en) * 2018-06-27 2018-11-23 腾讯科技(深圳)有限公司 Acquisition methods, device, computer equipment and the storage medium of motion profile
JP2019008507A (en) * 2017-06-23 2019-01-17 株式会社東芝 Transformation matrix calculating apparatus, location estimating apparatus, transformation matrix calculating method, and location estimating method
CN110175654A (en) * 2019-05-29 2019-08-27 广州小鹏汽车科技有限公司 A kind of update method and system of track road sign
CN111107319A (en) * 2019-12-25 2020-05-05 眸芯科技(上海)有限公司 Target tracking method, device and system based on regional camera
CN111354013A (en) * 2020-03-13 2020-06-30 北京字节跳动网络技术有限公司 Target detection method and device, equipment and storage medium
CN111553223A (en) * 2020-04-21 2020-08-18 中国人民解放军海军七〇一工厂 Ship target identification method, device, equipment and readable storage medium
CN111814914A (en) * 2020-08-26 2020-10-23 珠海大横琴科技发展有限公司 Target object identification method and device
CN112037245A (en) * 2020-07-22 2020-12-04 杭州海康威视数字技术股份有限公司 Method and system for determining similarity of tracked target
CN112241686A (en) * 2020-09-16 2021-01-19 四川天翼网络服务有限公司 Trajectory comparison matching method and system based on feature vectors
CN112307148A (en) * 2020-10-30 2021-02-02 平安普惠企业管理有限公司 Indoor vehicle searching method and device, server and computer readable storage medium
CN112307805A (en) * 2019-07-25 2021-02-02 南京理工大学 Automatic identification method for offshore targets
CN112506221A (en) * 2020-12-04 2021-03-16 国网湖北省电力有限公司检修公司 Unmanned aerial vehicle route planning processing method based on laser point cloud
CN113538974A (en) * 2021-07-14 2021-10-22 电子科技大学 Multi-source data fusion-based flight target anomaly detection method
CN113934800A (en) * 2021-10-12 2022-01-14 广州汇智通信技术有限公司 Temporary number vehicle accompanying relation identification method, device, terminal and medium
CN114676756A (en) * 2022-03-04 2022-06-28 重庆中科云从科技有限公司 Image recognition method, image recognition device and computer storage medium
CN114820765A (en) * 2022-03-09 2022-07-29 亚信科技(中国)有限公司 Image recognition method and device, electronic equipment and computer readable storage medium
CN114898307A (en) * 2022-07-11 2022-08-12 浙江大华技术股份有限公司 Object tracking method and device, electronic equipment and storage medium
CN115147449A (en) * 2022-05-26 2022-10-04 北京旷视科技有限公司 Multi-target tracking method, electronic equipment, storage medium and product
CN115205655A (en) * 2022-09-15 2022-10-18 中国科学院长春光学精密机械与物理研究所 Infrared dark spot target detection system under dynamic background and detection method thereof
WO2023275544A1 (en) * 2021-06-29 2023-01-05 Sirius Constellation Limited Methods and systems for detecting vessels
CN116258748A (en) * 2023-03-15 2023-06-13 阿维塔科技(重庆)有限公司 Track tracking method
CN116580056A (en) * 2023-05-05 2023-08-11 武汉理工大学 Ship detection and tracking method and device, electronic equipment and storage medium
CN116630888A (en) * 2023-05-24 2023-08-22 平安科技(深圳)有限公司 Unmanned aerial vehicle monitoring method, unmanned aerial vehicle monitoring device, electronic equipment and storage medium
CN116821709A (en) * 2023-07-05 2023-09-29 中科世通亨奇(北京)科技有限公司 Airplane identity recognition method based on behavior characteristics

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103971082A (en) * 2013-01-31 2014-08-06 威联通科技股份有限公司 Video object detecting system and method based on area conversion
CN110517293A (en) * 2019-08-29 2019-11-29 京东方科技集团股份有限公司 Method for tracking target, device, system and computer readable storage medium

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
Shanwei Liu et al. An Improved Kuhn Munkres Algorithm for Ship Matching in Optical Satellite Images. IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing. 2023, 4724-4738. *
Songgang Bi et al. Classification and Identification of Moving Targets at Sea. 2019 5th International Conference on Big Data and Information Analytics (BigDIA). 2019, 176-184. *
宁祥云. Deep learning-based multi-target detection and tracking at sea. China Master's Theses Full-text Database, Engineering Science and Technology II. 2021, Vol. 2021, No. 5, C032-17. *
张寅 et al. Aerial multi-target detection combining spatio-temporal information and track association. Geomatics and Information Science of Wuhan University. 2020, Vol. 45, No. 10, 1533-1540. *

Also Published As

Publication number Publication date
CN117557789A (en) 2024-02-13

Similar Documents

Publication Publication Date Title
CN111222526B (en) Method, device, equipment and storage medium for identifying real-time fishing behavior of fishing vessel
CN108615071B (en) Model testing method and device
CN110796048A (en) Ship target real-time detection method based on deep neural network
CN112200045A (en) Remote sensing image target detection model establishing method based on context enhancement and application
CN111161224A (en) Casting internal defect grading evaluation system and method based on deep learning
CN111126278A (en) Target detection model optimization and acceleration method for few-category scene
CN110751077A (en) Optical remote sensing picture ship detection method based on component matching and distance constraint
CN113723178A (en) Method and device for detecting video monitoring fire
CN115497015A (en) River floating pollutant identification method based on convolutional neural network
CN114721403B (en) Automatic driving control method and device based on OpenCV and storage medium
CN117765482B (en) Garbage identification method and system for garbage enrichment area of coastal zone based on deep learning
CN113723371B (en) Unmanned ship cleaning route planning method and device, computer equipment and storage medium
CN111539456A (en) Target identification method and device
CN109215059B (en) Local data association method for tracking moving vehicle in aerial video
CN118151119A (en) Millimeter wave radar open-set gait recognition method oriented to search task
CN117557789B (en) Intelligent detection method and system for offshore targets
CN111695048B (en) Epidemic situation tracing method and medium
CN113867410A (en) Unmanned aerial vehicle aerial photography data acquisition mode identification method and system
Choi et al. Automatic sea fog detection and estimation of visibility distance on CCTV
CN109598712A (en) Quality determining method, device, server and the storage medium of plastic foam cutlery box
CN117557780A (en) Target detection algorithm for airborne multi-mode learning
CN107886049B (en) Visibility recognition early warning method based on camera probe
CN115345021A (en) Measuring system for marine biofluid drag coefficient
CN114708519B (en) Elk identification and morphological contour parameter extraction method based on unmanned aerial vehicle remote sensing
CN115376023A (en) Cultivation area detection method based on deformation convolutional network, unmanned aerial vehicle and medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant