CN113658196A - Method and device for detecting ship in infrared image, electronic equipment and medium - Google Patents

Method and device for detecting ship in infrared image, electronic equipment and medium

Info

Publication number
CN113658196A
Authority
CN
China
Prior art keywords
image
ship
detection
edge
filtering
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110949228.5A
Other languages
Chinese (zh)
Other versions
CN113658196B (en)
Inventor
张韵东
隋红丽
刘小涛
徐祥
崔顺
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Zhongxingtianshi Technology Co ltd
Original Assignee
Beijing Zhongxingtianshi Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Zhongxingtianshi Technology Co ltd filed Critical Beijing Zhongxingtianshi Technology Co ltd
Priority to CN202110949228.5A priority Critical patent/CN113658196B/en
Publication of CN113658196A publication Critical patent/CN113658196A/en
Application granted granted Critical
Publication of CN113658196B publication Critical patent/CN113658196B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/13Edge detection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/21Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/002D [Two Dimensional] image generation
    • G06T11/40Filling a planar surface by adding surface attributes, e.g. colour or texture
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/20Image enhancement or restoration using local operators
    • G06T5/30Erosion or dilatation, e.g. thinning
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/70Denoising; Smoothing
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/11Region-based segmentation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/136Segmentation; Edge detection involving thresholding
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10048Infrared image

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Data Mining & Analysis (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Artificial Intelligence (AREA)
  • General Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Image Analysis (AREA)

Abstract

The embodiments of the present disclosure disclose a method and apparatus for detecting a ship in an infrared image, an electronic device, and a medium. One embodiment of the method comprises: acquiring an infrared image; inputting the infrared image into a pre-trained ship detection model to obtain a detection image and a target pixel value; in response to the detection image not including a region-of-interest frame, or the target pixel value being less than or equal to a predetermined threshold, filtering the detection image to obtain a filtered image; performing edge detection on the filtered image to obtain edge feature information; performing dilation processing on the filtered image based on the edge feature information; performing flood-fill processing on the dilated filtered image; performing threshold segmentation on the color-filled filtered image to obtain a ship region image; and determining ship position information based on the ship region image. The method and apparatus can improve the recall rate of ship detection results in complex scenes.

Description

Method and device for detecting ship in infrared image, electronic equipment and medium
Technical Field
The embodiment of the disclosure relates to the technical field of computers, in particular to a method and a device for detecting a ship in an infrared image, electronic equipment and a medium.
Background
Object detection, also called object extraction, is image segmentation based on the geometric and statistical features of a target. It plays an important role in the real-time processing of targets in infrared images captured from unmanned aerial vehicles. Existing approaches to detecting ships in images generally rely on either a single object detection model or a traditional edge-based image detection algorithm.
However, the above approaches often suffer from the following technical problems:
First, when the image resolution is low or the image size is small, it is difficult for the object detection model to detect the target in the image, so the recall rate of the ship detection result is low.
Second, after the ship image is preliminarily detected, the resulting edge features have low saliency, so the accuracy of the information obtained from further processing of the image is low, and consequently the finally determined ship position information agrees poorly with the actual ship position.
Disclosure of Invention
This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the detailed description. This summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter.
Some embodiments of the present disclosure propose a method, an apparatus, an electronic device, and a medium for detecting a ship in an infrared image to solve one or more of the technical problems mentioned in the above background section.
In a first aspect, some embodiments of the present disclosure provide a method for detecting a ship in an infrared image, the method including: acquiring an infrared image sent by a shooting terminal; inputting the infrared image into a pre-trained ship detection model to obtain a detection image and a target pixel value; determining whether the detection image includes a region-of-interest frame and whether the target pixel value is greater than a first predetermined threshold; in response to the detection image not including a region-of-interest frame, or the target pixel value being less than or equal to the predetermined threshold, filtering the detection image to obtain a filtered image; performing edge detection on the filtered image to obtain edge feature information; performing dilation processing on the filtered image based on the edge feature information to obtain a dilated filtered image; performing flood-fill processing on the dilated filtered image to obtain a color-filled filtered image; performing threshold segmentation on the color-filled filtered image to obtain a ship region image; and determining ship position information based on the ship region image.
In a second aspect, some embodiments of the present disclosure provide an apparatus for detecting a ship in an infrared image, the apparatus including: an acquisition unit configured to acquire an infrared image sent by a shooting terminal; an input unit configured to input the infrared image into a pre-trained ship detection model to obtain a detection image and a target pixel value; a first determination unit configured to determine whether the detection image includes a region-of-interest frame and whether the target pixel value is greater than a first predetermined threshold; a filtering processing unit configured to filter the detection image to obtain a filtered image in response to the detection image not including a region-of-interest frame, or the target pixel value being less than or equal to the predetermined threshold; an edge detection unit configured to perform edge detection on the filtered image to obtain edge feature information; a dilation processing unit configured to perform dilation processing on the filtered image based on the edge feature information to obtain a dilated filtered image; a flood fill processing unit configured to perform flood-fill processing on the dilated filtered image to obtain a color-filled filtered image; a threshold segmentation unit configured to perform threshold segmentation on the color-filled filtered image to obtain a ship region image; and a second determination unit configured to determine ship position information based on the ship region image.
In a third aspect, some embodiments of the present disclosure provide an electronic device, comprising: one or more processors; a storage device having one or more programs stored thereon, which when executed by one or more processors, cause the one or more processors to implement the method described in any of the implementations of the first aspect.
In a fourth aspect, some embodiments of the present disclosure provide a computer readable medium on which a computer program is stored, wherein the program, when executed by a processor, implements the method described in any of the implementations of the first aspect.
The above embodiments of the present disclosure have the following advantages: the method for detecting a ship in an infrared image improves the recall rate of ship detection results. Specifically, the recall rate of ship detection results is low because an object detection model has difficulty detecting a target when the image resolution is low or the image size is small. Based on this, the detection method of some embodiments of the present disclosure first acquires the infrared image sent by the shooting terminal, providing a data basis for the subsequent steps. Next, the infrared image is input into a pre-trained ship detection model to obtain a detection image and a target pixel value, and it is determined whether the detection image includes a region-of-interest frame and whether the target pixel value is greater than a first predetermined threshold; this establishes whether the ship in the detection image is easy to recognize. Then, in response to the detection image not including a region-of-interest frame, or the target pixel value being less than or equal to the predetermined threshold, the detection image is filtered to obtain a filtered image; once the ship has been determined to be hard to identify, filtering reduces image noise and facilitates the subsequent identification of the ship region. Edge detection is then performed on the filtered image to obtain edge feature information, from which the edge of the ship region can be determined. Based on the edge feature information, the filtered image is dilated, and the dilated filtered image is flood-filled to obtain a color-filled filtered image; this makes the ship region in the detection image more salient and easier to identify. Threshold segmentation is then applied to the color-filled filtered image to obtain a highly accurate ship region image. Finally, ship position information is determined based on the ship region image. By fusing an object detection model with a traditional edge-based image detection algorithm, the ship in the image can be detected even when the image resolution is low or the image size is small, thereby improving the recall rate of the ship detection result.
Drawings
The above and other features, advantages, and aspects of various embodiments of the present disclosure will become more apparent from the following detailed description taken in conjunction with the accompanying drawings. Throughout the drawings, the same or similar reference numbers refer to the same or similar elements. It should be understood that the drawings are schematic and that the depicted elements are not necessarily drawn to scale.
Fig. 1 is a schematic diagram of one application scenario of a method of detection of a vessel in an infrared image, according to some embodiments of the present disclosure;
Fig. 2 is a flow diagram of some embodiments of a method for detecting a ship in an infrared image according to the present disclosure;
Fig. 3 is a schematic structural diagram of some embodiments of an apparatus for detecting a ship in an infrared image according to the present disclosure;
Fig. 4 is a schematic structural diagram of an electronic device suitable for implementing the method for detecting a ship in an infrared image according to the present disclosure.
Detailed Description
Embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While certain embodiments of the present disclosure are shown in the drawings, it is to be understood that the disclosure may be embodied in various forms and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided for a more thorough and complete understanding of the present disclosure. It should be understood that the drawings and embodiments of the disclosure are for illustration purposes only and are not intended to limit the scope of the disclosure.
It should be noted that, for convenience of description, only the portions related to the related invention are shown in the drawings. The embodiments and features of the embodiments in the present disclosure may be combined with each other without conflict.
It should be noted that the terms "first", "second", and the like in the present disclosure are only used for distinguishing different devices, modules or units, and are not used for limiting the order or interdependence relationship of the functions performed by the devices, modules or units.
It should be noted that the modifiers "a", "an", and "the" in this disclosure are illustrative rather than restrictive; those skilled in the art will understand that they should be read as "one or more" unless the context clearly indicates otherwise.
The names of messages or information exchanged between devices in the embodiments of the present disclosure are for illustrative purposes only, and are not intended to limit the scope of the messages or information.
The present disclosure will be described in detail below with reference to the accompanying drawings in conjunction with embodiments.
Fig. 1 is a schematic diagram 100 of one application scenario of a method of detection of a vessel in an infrared image, according to some embodiments of the present disclosure.
In the application scenario of fig. 1, first, the computing device 101 may acquire an infrared image 102 sent by a shooting terminal. The computing device 101 may then input the infrared image 102 into a pre-trained ship detection model 103 to obtain a detection image 104 and a target pixel value 105. Thereafter, the computing device 101 may determine whether a region-of-interest frame is included in the detection image 104 and whether the target pixel value 105 is greater than a first predetermined threshold. Next, in response to the detection image 104 not including a region-of-interest frame, or the target pixel value 105 being less than or equal to the predetermined threshold, the computing device 101 may filter the detection image 104 to obtain a filtered image 106. The computing device 101 may then perform edge detection on the filtered image 106 to obtain edge feature information 107. Thereafter, the computing device 101 may perform dilation processing on the filtered image based on the edge feature information 107 to obtain a dilated filtered image 108. Next, the computing device 101 may perform flood-fill processing on the dilated filtered image 108 to obtain a color-filled filtered image 109. The computing device 101 may then perform threshold segmentation on the color-filled filtered image 109 to obtain a ship region image 110. Finally, the computing device 101 may determine the ship position information 111 based on the ship region image 110.
The computing device 101 may be hardware or software. When the computing device is hardware, it may be implemented as a distributed cluster composed of multiple servers or terminal devices, or may be implemented as a single server or a single terminal device. When the computing device is embodied as software, it may be installed in the hardware devices enumerated above. It may be implemented, for example, as multiple software or software modules to provide distributed services, or as a single software or software module. And is not particularly limited herein.
It should be understood that the number of computing devices in FIG. 1 is merely illustrative. There may be any number of computing devices, as implementation needs dictate.
With continued reference to fig. 2, a flow 200 of some embodiments of a method of detection of a vessel in an infrared image is shown, in accordance with the present disclosure. The method for detecting the ship in the infrared image comprises the following steps:
step 201, acquiring an infrared image sent by a shooting terminal.
In some embodiments, an execution subject of the method for detecting a ship in an infrared image (such as the computing device 101 shown in fig. 1) may acquire the infrared image sent by the shooting terminal through a wired or wireless connection. The infrared image may be an infrared image containing a ship, captured by an unmanned aerial vehicle.
Step 202, inputting the infrared image into a pre-trained ship detection model to obtain a detection image and a target pixel value.
In some embodiments, the executing body may input the infrared image to a pre-trained ship detection model, so as to obtain a detection image and a target pixel value. The pre-trained ship detection model may be a model capable of outputting an image with a region-of-interest frame. Meanwhile, related information of the input image (e.g., a maximum pixel value of the image, category information of the image, etc.) may also be output. The pre-trained ship detection model may be a network model trained by CNN (Convolutional Neural Networks), RNN (Recurrent Neural Networks), or DNN (Deep Neural Networks). The detection image may be an image with or without a region-of-interest frame. The target pixel value may be a maximum pixel value detected after the infrared image is input to a pre-trained ship detection model.
Step 203, determining whether the detection image includes a region of interest frame and whether the target pixel value is greater than a first predetermined threshold value.
In some embodiments, the execution subject may determine whether a region of interest box is included in the detection image and whether the target pixel value is greater than a first predetermined threshold. When the resolution of the ship image in the input infrared image is lower than a third predetermined threshold value, or the size of the ship image in the infrared image is lower than a fourth predetermined threshold value, the ship may not be identified by the detection image obtained by inputting the infrared image to the pre-trained ship detection model, and thus, the detection image output by the pre-trained ship detection model may not include the region-of-interest frame. In the case where the target pixel value output by the above-described vessel detection model trained in advance is equal to or less than the first predetermined threshold value, it may be difficult to determine the vessel position information. The number of the region of interest boxes may be one or more. The target pixel value may be a maximum pixel value detected after the infrared image is input to a pre-trained ship detection model.
As an example, the first predetermined threshold may correspond to the size of a 12 × 12 pixel area. For instance, a picture may have a resolution of 1920 × 1080 dpi, where dpi denotes dots per inch; the 12 in 12 × 12 pixels refers to the number of pixels of resolution along each side of the area.
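As an illustrative sketch added for clarity (not part of the original disclosure), the branch between trusting the model output and falling back to the traditional edge-based pipeline may be written in Python as follows; the model interface, the threshold value, and the traditional_pipeline placeholder are all assumptions:

    FIRST_PREDETERMINED_THRESHOLD = 144  # illustrative value only

    def traditional_pipeline(detection_image):
        # Placeholder for steps 204-209: filtering, edge detection,
        # dilation, flood fill, threshold segmentation, contours.
        raise NotImplementedError

    def detect_ship(infrared_image, ship_detection_model):
        # The pre-trained model is assumed to return the detection image,
        # an optional region-of-interest box, and the maximum pixel value.
        detection_image, roi_box, target_pixel_value = ship_detection_model(infrared_image)
        if roi_box is not None and target_pixel_value > FIRST_PREDETERMINED_THRESHOLD:
            # The ship is easy to recognize: use the model output directly.
            return roi_box
        # Otherwise fall back to the traditional edge-based pipeline.
        return traditional_pipeline(detection_image)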
Step 204, in response to the detection image not including a region-of-interest frame, or the target pixel value being less than or equal to the predetermined threshold, filtering the detection image to obtain a filtered image.
In some embodiments, the execution subject may perform filtering processing on the detection image to obtain a filtered image in response to that the detection image does not include the region of interest frame or the target pixel value is less than or equal to a predetermined threshold value. The filtering process may be a process of reducing noise of the image by changing values of pixel points of the detected image.
As an example, the execution subject may call the GaussianBlur() function in OpenCV to perform the filtering. OpenCV is an open-source, cross-platform computer vision and machine learning software library.
As another example, the filtering applied to the detection image may be Gaussian filtering, which reduces the noise of the detection image. Gaussian filtering is a linear smoothing filter, is well suited to removing Gaussian noise, and is widely used for noise reduction in image processing.
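A minimal sketch of this step, assuming Python with OpenCV (the input path, kernel size, and sigma are illustrative choices, not values from the patent):

    import cv2

    # Placeholder input: the detection image, loaded as grayscale.
    detection_image = cv2.imread("detection.png", cv2.IMREAD_GRAYSCALE)

    # Gaussian filtering to suppress noise; a 5x5 kernel with sigma
    # derived automatically (sigmaX=0) is a common default.
    filtered_image = cv2.GaussianBlur(detection_image, (5, 5), 0)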
In some optional implementations of some embodiments, before the filtering of the detection image in response to the detection image not including a region-of-interest frame, or the target pixel value being less than or equal to the predetermined threshold, the method may further include the following step:
and determining ship position information in response to the fact that the region of interest frame is included in the detection image and the target pixel value is larger than a first preset threshold value. The ship position information may include a contour upper left corner coordinate point, a contour upper right corner coordinate point, a contour height value, and a contour width value. The contour upper left corner coordinate point may be a coordinate point of an upper left corner of the rectangular bounding box. The contour upper right corner coordinate point may be a coordinate point of an upper right corner of the rectangular bounding box. The contour height value may be a height value of a rectangular bounding box. The contour width value may be a width value of a rectangular bounding box. The rectangular enclosure may be an enclosure that encloses the ship image.
Step 205, performing edge detection on the filtered image to obtain edge feature information.
In some embodiments, the execution subject may perform edge detection on the filtered image using the Canny edge detection algorithm to obtain edge feature information. The Canny edge detection algorithm is an algorithm for computing image edges.
As an example, the execution subject may call the Canny() function in OpenCV to implement the edge detection.
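A minimal sketch of the Canny step, again assuming Python with OpenCV (the input path and the hysteresis thresholds 50 and 150 are assumptions, not values from the patent):

    import cv2

    # Placeholder input: the filtered image from the previous step.
    filtered_image = cv2.imread("filtered.png", cv2.IMREAD_GRAYSCALE)

    # Canny edge detection with illustrative hysteresis thresholds.
    edges = cv2.Canny(filtered_image, 50, 150)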
In some optional implementation manners of some embodiments, the performing edge detection on the filtered image to obtain edge feature information may include the following steps:
firstly, determining the gradient amplitude of each pixel point in the filtering image to obtain a gradient amplitude set.
And secondly, screening out gradient amplitudes meeting a first preset condition from the gradient amplitude set to serve as edge amplitudes, and obtaining an edge amplitude set. Wherein, the first preset condition may be that the gradient magnitude is greater than a predetermined threshold. In practice, the gradient amplitude of the edge of the region is larger than that of other regions, and the edge of the target region can be preliminarily determined by screening the gradient amplitudes larger than a predetermined threshold. The target region may be a region of a ship image.
And thirdly, determining the edge amplitude set as edge characteristic information. The edge feature information corresponds to at least one pixel point in the filtered image.
As an example, the edge feature information may be feature information of a ship region, that is, a set of pixel points enclosing the ship region.
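A sketch of the gradient-magnitude screening described above, assuming Sobel derivatives (the patent does not name a specific gradient operator) and an illustrative threshold:

    import cv2
    import numpy as np

    filtered_image = cv2.imread("filtered.png", cv2.IMREAD_GRAYSCALE)  # placeholder

    # Per-pixel gradient magnitude from Sobel derivatives (operator assumed).
    gx = cv2.Sobel(filtered_image, cv2.CV_64F, 1, 0, ksize=3)
    gy = cv2.Sobel(filtered_image, cv2.CV_64F, 0, 1, ksize=3)
    gradient_magnitude = np.sqrt(gx ** 2 + gy ** 2)

    # First preset condition: keep magnitudes above a predetermined
    # threshold; the surviving values form the edge amplitude set.
    GRADIENT_THRESHOLD = 100.0  # illustrative value
    edge_mask = gradient_magnitude > GRADIENT_THRESHOLD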
Step 206, performing dilation processing on the filtered image based on the edge feature information to obtain a dilated filtered image.
In some embodiments, the execution subject may perform dilation processing on the filtered image based on the edge feature information to obtain a dilated filtered image. The dilation process may add pixels to the perceived boundary of an object in the image, thereby enlarging the bright white regions in the image.
As an example, the perceived object boundary in the image may be the edge of the ship region. The execution subject may call the dilate() function in OpenCV to implement the dilation.
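A sketch of the dilation, assuming Python with OpenCV and the Canny edge map as binary input (the kernel size and iteration count are assumptions):

    import cv2
    import numpy as np

    edges = cv2.imread("edges.png", cv2.IMREAD_GRAYSCALE)  # placeholder edge map

    # Dilation thickens the ship edges so that small gaps in the contour
    # close up; a 3x3 rectangular structuring element is assumed here.
    kernel = np.ones((3, 3), np.uint8)
    dilated = cv2.dilate(edges, kernel, iterations=2)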
In some optional implementations of some embodiments, the performing dilation processing on the filtered image based on the edge feature information to obtain a dilated filtered image may include the following steps:
firstly, at least one pixel point in the filtering image corresponding to the edge characteristic information is determined as an edge point set.
And secondly, carrying out binary conversion on the pixel value of each pixel point in the filtering image to obtain a binary data set. Wherein the binary data in the binary data set is data including 0 or 1.
And thirdly, assigning the binary data corresponding to the edge point set in the filtering image to obtain an assigned filtering image. The assignment filtering image comprises an edge point set assigned by binary data.
As an example, when the ship region visibility determined by the edge feature information is lower than a predetermined threshold, the value of the pixel point corresponding to the edge of the ship region may be assigned, so that the visibility of the edge of the ship region may be improved.
And fourthly, determining the assigned filtered image including the connected point set as the dilated filtered image. The connected point set is the edge point set assigned by binary data in the assigned filtered image.
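A sketch of the binary conversion and assignment steps above, assuming Python with OpenCV and NumPy (the placeholder inputs stand in for the filtered image and its edge point set):

    import cv2
    import numpy as np

    filtered_image = cv2.imread("filtered.png", cv2.IMREAD_GRAYSCALE)  # placeholder
    edges = cv2.imread("edges.png", cv2.IMREAD_GRAYSCALE)  # placeholder edge map

    # Binary conversion: every pixel value becomes 0 or 1.
    binary = np.zeros_like(filtered_image, dtype=np.uint8)

    # Assignment: pixels in the edge point set are set to 1, so the ship
    # edge is distinguished from the other regions of the image.
    binary[edges > 0] = 1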
Step 207, performing flood-fill processing on the dilated filtered image to obtain a color-filled filtered image.
In some embodiments, the execution subject may perform flood-fill processing on the dilated filtered image to obtain a color-filled filtered image. The flood-fill process may fill a connected region with a specific color, and different filling effects can be achieved by setting the upper and lower pixel bounds of the connected region and the connectivity mode.
As an example, the connected region may be the ship region in the infrared image. The execution subject may call the floodFill() function in OpenCV to implement the color filling.
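A sketch of the flood fill, assuming Python with OpenCV (the seed point and the loDiff/upDiff tolerances are assumptions; in practice the seed would lie inside the ship region):

    import cv2
    import numpy as np

    dilated = cv2.imread("dilated.png", cv2.IMREAD_GRAYSCALE)  # placeholder

    # floodFill requires a mask two pixels larger than the image.
    h, w = dilated.shape
    mask = np.zeros((h + 2, w + 2), np.uint8)

    # Fill the connected region around the seed point with white (255);
    # loDiff/upDiff set the pixel tolerance of the connected region.
    seed_point = (w // 2, h // 2)  # hypothetical seed inside the ship region
    cv2.floodFill(dilated, mask, seed_point, 255, loDiff=10, upDiff=10)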
In some optional implementations of some embodiments, the performing flood-fill processing on the dilated filtered image to obtain a color-filled filtered image may include the following steps:
Firstly, connecting the connected points in the dilated filtered image that satisfy a second preset condition in the connected point set to generate a connected region, obtaining a connected filtered image. The connected filtered image may be a filtered image including the connected region. The connected region may be a region formed by connecting the connected points in the connected point set that satisfy the second preset condition. The second preset condition may be that the pixel value of the connected point is greater than a predetermined threshold.
And secondly, filling the connected region included in the connected filtered image with a preset color to obtain the color-filled filtered image.
The above step 206 and step 207 constitute an inventive point of the embodiments of the present disclosure, and solve the second technical problem mentioned in the background: after the ship image is preliminarily detected, the resulting edge features have low saliency, so the accuracy of the information obtained from further processing of the image is low, and the finally determined ship position information agrees poorly with the actual ship position. If this cause is addressed, the agreement between the determined ship position information and the actual ship position can be improved. To achieve this effect, first, the pixel values of the pixels in the filtered image may be converted to binary form to obtain a binary data set, which facilitates the subsequent assignment of pixels. Then, the binary data corresponding to the edge point set in the filtered image may be assigned to obtain an assigned filtered image; the pixel values on the edge of the ship region can thus be set to 1, distinguishing the edge of the ship region from the other regions of the detection image. Next, the connected points in the dilated filtered image that satisfy the second preset condition may be connected to generate a connected region, yielding a connected filtered image, i.e., a ship region enclosed by the edges of the ship region. Finally, the connected region in the connected filtered image is filled with a preset color to obtain the color-filled filtered image. In this way, even when the preliminarily detected edge features have low saliency, the ship region becomes more salient, which improves the agreement between the finally determined ship position information and the actual ship position.
Step 208, performing threshold segmentation on the color-filled filtered image to obtain a ship region image.
In some embodiments, the execution subject may perform threshold segmentation on the color-filled filtered image to obtain a ship region image. The threshold segmentation may be a region-based image segmentation technique. The threshold segmentation can divide pixel points in the image into a plurality of classes, so that region segmentation is realized.
In some optional implementation manners of some embodiments, the performing threshold segmentation on the color-filled filtered image to obtain a ship region image may include the following steps:
firstly, screening out pixel points with pixel values larger than a second preset threshold value from the filter image after color filling to serve as ship pixel points, and obtaining a ship pixel point set. In the filtered image after color filling, the pixel values and the saturation of the color filling area are different from those of other areas, and the pixel points corresponding to the ship area can be determined according to the difference between the pixel values and the saturation.
And secondly, determining an image area corresponding to the ship pixel point set as a ship area image.
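A sketch of the threshold segmentation, assuming Python with OpenCV (the second threshold value of 200 is an assumption):

    import cv2

    filled = cv2.imread("filled.png", cv2.IMREAD_GRAYSCALE)  # placeholder

    # Keep only pixels brighter than the second predetermined threshold;
    # the surviving pixels form the ship pixel point set.
    SECOND_THRESHOLD = 200  # illustrative value
    _, ship_region = cv2.threshold(filled, SECOND_THRESHOLD, 255, cv2.THRESH_BINARY)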
Step 209, determining ship position information based on the ship region image.
In some embodiments, the execution subject may determine ship position information based on the ship region image.
In some optional implementations of some embodiments, the determining the ship position information based on the ship region image may include:
firstly, extracting the contour information of the ship area image to obtain ship contour information. The ship contour information comprises a contour upper left corner coordinate point, a contour upper right corner coordinate point, a contour height value and a contour width value. The contour upper left corner coordinate point may be a coordinate point of an upper left corner of the rectangular bounding box. The contour upper right corner coordinate point may be a coordinate point of an upper right corner of the rectangular bounding box. The contour height value may be a height value of a rectangular bounding box. The contour width value may be a width value of a rectangular bounding box. The rectangular enclosure may be an enclosure that encloses the ship image.
As an example, the execution subject may call the findContours() function in OpenCV to extract the contour information of the ship region image.
And secondly, determining the ship contour information as ship position information.
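A sketch of the contour extraction, assuming Python with OpenCV 4.x (whose findContours returns two values); the corner coordinates described above are derived from boundingRect:

    import cv2

    ship_region = cv2.imread("ship_region.png", cv2.IMREAD_GRAYSCALE)  # placeholder

    # Extract the external contours of the segmented ship region.
    contours, _ = cv2.findContours(ship_region, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)

    for contour in contours:
        # Bounding rectangle: top-left corner (x, y), width w, height h;
        # the upper-right corner follows as (x + w, y).
        x, y, w, h = cv2.boundingRect(contour)
        ship_position = {"top_left": (x, y), "top_right": (x + w, y),
                         "height": h, "width": w}
        print(ship_position)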
With further reference to fig. 3, as an implementation of the methods shown in the above figures, the present disclosure provides some embodiments of an apparatus for detecting a ship in an infrared image. These apparatus embodiments correspond to the method embodiments described above with reference to fig. 2, and the apparatus may be applied in various electronic devices.
As shown in fig. 3, the apparatus 300 for detecting a ship in an infrared image of some embodiments includes: an acquisition unit 301, an input unit 302, a first determination unit 303, a filtering processing unit 304, an edge detection unit 305, a dilation processing unit 306, a flood fill processing unit 307, a threshold segmentation unit 308, and a second determination unit 309. The acquisition unit 301 is configured to acquire an infrared image sent by a shooting terminal; the input unit 302 is configured to input the infrared image into a pre-trained ship detection model to obtain a detection image and a target pixel value; the first determination unit 303 is configured to determine whether the detection image includes a region-of-interest frame and whether the target pixel value is greater than a first predetermined threshold; the filtering processing unit 304 is configured to filter the detection image to obtain a filtered image in response to the detection image not including a region-of-interest frame, or the target pixel value being less than or equal to the predetermined threshold; the edge detection unit 305 is configured to perform edge detection on the filtered image to obtain edge feature information; the dilation processing unit 306 is configured to perform dilation processing on the filtered image based on the edge feature information to obtain a dilated filtered image; the flood fill processing unit 307 is configured to perform flood-fill processing on the dilated filtered image to obtain a color-filled filtered image; the threshold segmentation unit 308 is configured to perform threshold segmentation on the color-filled filtered image to obtain a ship region image; and the second determination unit 309 is configured to determine ship position information based on the ship region image.
It will be understood that the units described in the apparatus 300 correspond to the various steps in the method described with reference to fig. 2. Thus, the operations, features and resulting advantages described above with respect to the method are also applicable to the apparatus 300 and the units included therein, and are not described herein again.
Referring now to FIG. 4, a block diagram of an electronic device (e.g., computing device 101 of FIG. 1)400 suitable for use in implementing some embodiments of the present disclosure is shown. The electronic device shown in fig. 4 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present disclosure.
As shown in fig. 4, electronic device 400 may include a processing device (e.g., central processing unit, graphics processor, etc.) 401 that may perform various appropriate actions and processes in accordance with a program stored in a Read Only Memory (ROM)402 or a program loaded from a storage device 408 into a Random Access Memory (RAM) 403. In the RAM 403, various programs and data necessary for the operation of the electronic apparatus 400 are also stored. The processing device 401, the ROM 402, and the RAM 403 are connected to each other via a bus 404. An input/output (I/O) interface 405 is also connected to bus 404.
Generally, the following devices may be connected to the I/O interface 405: input devices 406 including, for example, a touch screen, touch pad, keyboard, mouse, camera, microphone, accelerometer, gyroscope, etc.; an output device 407 including, for example, a Liquid Crystal Display (LCD), a speaker, a vibrator, and the like; storage 408 including, for example, tape, hard disk, etc.; and a communication device 409. The communication means 409 may allow the electronic device 400 to communicate wirelessly or by wire with other devices to exchange data. While fig. 4 illustrates an electronic device 400 having various means, it is to be understood that not all illustrated means are required to be implemented or provided. More or fewer devices may alternatively be implemented or provided. Each block shown in fig. 4 may represent one device or may represent multiple devices as desired.
In particular, according to some embodiments of the present disclosure, the processes described above with reference to the flow diagrams may be implemented as computer software programs. For example, some embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method illustrated in the flow chart. In some such embodiments, the computer program may be downloaded and installed from a network through the communication device 409, or from the storage device 408, or from the ROM 402. The computer program, when executed by the processing apparatus 401, performs the above-described functions defined in the methods of some embodiments of the present disclosure.
It should be noted that the computer readable medium described in some embodiments of the present disclosure may be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In some embodiments of the disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In some embodiments of the present disclosure, however, a computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, optical cables, RF (radio frequency), etc., or any suitable combination of the foregoing.
In some embodiments, clients and servers may communicate using any currently known or future developed network protocol, such as HTTP (HyperText Transfer Protocol), and may be interconnected with digital data communication (e.g., a communication network) in any form or medium. Examples of communication networks include a local area network ("LAN"), a wide area network ("WAN"), an internetwork (e.g., the Internet), and peer-to-peer networks (e.g., ad hoc peer-to-peer networks), as well as any currently known or future developed network.
The computer readable medium may be embodied in the electronic device, or may exist separately without being assembled into the electronic device. The computer readable medium carries one or more programs which, when executed by the electronic device, cause the electronic device to: acquire an infrared image sent by a shooting terminal; input the infrared image into a pre-trained ship detection model to obtain a detection image and a target pixel value; determine whether the detection image includes a region-of-interest frame and whether the target pixel value is greater than a first predetermined threshold; in response to the detection image not including a region-of-interest frame, or the target pixel value being less than or equal to the predetermined threshold, filter the detection image to obtain a filtered image; perform edge detection on the filtered image to obtain edge feature information; perform dilation processing on the filtered image based on the edge feature information to obtain a dilated filtered image; perform flood-fill processing on the dilated filtered image to obtain a color-filled filtered image; perform threshold segmentation on the color-filled filtered image to obtain a ship region image; and determine ship position information based on the ship region image.
Computer program code for carrying out operations for embodiments of the present disclosure may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, or C++, and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in some embodiments of the present disclosure may be implemented by software or by hardware. The described units may also be provided in a processor, which may be described as: a processor including an acquisition unit, an input unit, a first determination unit, a filtering processing unit, an edge detection unit, a dilation processing unit, a flood fill processing unit, a threshold segmentation unit, and a second determination unit. The names of these units do not in some cases limit the units themselves; for example, the acquisition unit may also be described as "a unit that acquires an infrared image sent by a shooting terminal".
The functions described herein above may be performed, at least in part, by one or more hardware logic components. For example, without limitation, exemplary types of hardware logic components that may be used include: field Programmable Gate Arrays (FPGAs), Application Specific Integrated Circuits (ASICs), Application Specific Standard Products (ASSPs), systems on a chip (SOCs), Complex Programmable Logic Devices (CPLDs), and the like.
The foregoing description is only of the preferred embodiments of the present disclosure and an illustration of the technical principles employed. It will be appreciated by those skilled in the art that the scope of the invention in the embodiments of the present disclosure is not limited to technical solutions formed by the specific combination of the above features, and should also cover other technical solutions formed by any combination of the above features or their equivalents without departing from the inventive concept, for example, a technical solution formed by replacing the above features with (but not limited to) technical features with similar functions disclosed in the embodiments of the present disclosure.

Claims (10)

1. A method for detecting ships in infrared images comprises the following steps:
acquiring an infrared image sent by a shooting terminal;
inputting the infrared image into a pre-trained ship detection model to obtain a detection image and a target pixel value;
determining whether a region of interest box is included in the detection image and whether a target pixel value is greater than a first predetermined threshold value;
in response to the detection image not including a region-of-interest frame, or the target pixel value being less than or equal to the predetermined threshold, filtering the detection image to obtain a filtered image;
performing edge detection on the filtered image to obtain edge feature information;
performing dilation processing on the filtered image based on the edge feature information to obtain a dilated filtered image;
performing flood-fill processing on the dilated filtered image to obtain a color-filled filtered image;
performing threshold segmentation on the color-filled filtered image to obtain a ship region image;
and determining ship position information based on the ship region image.
2. The method according to claim 1, wherein before the filtering of the detection image in response to the detection image not including a region-of-interest frame, or the target pixel value being less than or equal to the predetermined threshold, the method further comprises:
and determining ship position information in response to the fact that the region of interest frame is included in the detection image and the target pixel value is larger than a first preset threshold value.
3. The method of claim 2, wherein the performing edge detection on the filtered image to obtain edge feature information comprises:
determining the gradient amplitude of each pixel point in the filtering image to obtain a gradient amplitude set;
screening out gradient amplitudes meeting a first preset condition from the gradient amplitude set to serve as edge amplitudes, and obtaining an edge amplitude set;
and determining the edge amplitude set as edge characteristic information, wherein the edge characteristic information corresponds to at least one pixel point in the filtering image.
4. The method according to claim 3, wherein the performing dilation processing on the filtered image based on the edge feature information to obtain a dilated filtered image comprises:
determining at least one pixel point in the filtered image corresponding to the edge feature information as an edge point set;
performing binary conversion on the pixel value of each pixel point in the filtered image to obtain a binary data set;
assigning the binary data corresponding to the edge point set in the filtered image to obtain an assigned filtered image, wherein the assigned filtered image includes the edge point set assigned by the binary data;
determining the assigned filtered image including a connected point set as the dilated filtered image, wherein the connected point set is the edge point set assigned by binary data in the assigned filtered image.
5. The method of claim 4, wherein the performing flood-fill processing on the dilated filtered image to obtain a color-filled filtered image comprises:
connecting the connected points in the dilated filtered image that satisfy a second preset condition in the connected point set to generate a connected region, obtaining a connected filtered image, wherein the connected filtered image is a filtered image including the connected region, and the connected region is a region formed by connecting the connected points in the connected point set that satisfy the second preset condition;
and filling the connected region included in the connected filtered image with a preset color to obtain the color-filled filtered image.
6. The method of claim 5, wherein the performing threshold segmentation on the color-filled filtered image to obtain a ship region image comprises:
screening out pixel points with pixel values larger than a second preset threshold value from the filtered image after color filling to serve as ship pixel points, and obtaining a ship pixel point set;
and determining the image area corresponding to the ship pixel point set as a ship area image.
7. The method of claim 6, wherein the determining ship position information based on the ship region image comprises:
extracting contour information of the ship region image to obtain ship contour information, wherein the ship contour information comprises a contour upper left corner coordinate point, a contour upper right corner coordinate point, a contour height value and a contour width value;
and determining the ship contour information as ship position information.
8. An apparatus for detecting a ship in an infrared image, comprising:
an acquisition unit configured to acquire an infrared image transmitted by a photographing terminal;
an input unit configured to input the infrared image into a pre-trained ship detection model to obtain a detection image and a target pixel value;
a first determination unit configured to determine whether the detection image includes a region-of-interest frame and whether the target pixel value is greater than a first preset threshold;
a filtering processing unit configured to perform filtering processing on the detection image to obtain a filtered image, in response to the detection image not including a region-of-interest frame or the target pixel value being less than or equal to the first preset threshold;
an edge detection unit configured to perform edge detection on the filtered image to obtain edge feature information;
a dilation processing unit configured to perform dilation processing on the filtered image based on the edge feature information to obtain a dilated filtered image;
a flood fill processing unit configured to perform flood fill processing on the dilated filtered image to obtain a color-filled filtered image;
a threshold segmentation unit configured to perform threshold segmentation on the color-filled filtered image to obtain a ship region image;
and a second determination unit configured to determine ship position information based on the ship region image.
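Read together, the units of claim 8 form a fallback pipeline: accept the detection model's output when it is confident, otherwise run the classical chain. A hypothetical composition of the sketches above; has_roi_box, boxes_from, FIRST_THRESH, the median filter, and the model interface are all placeholders, not parts of the patent:

    import cv2

    FIRST_THRESH = 0.5  # placeholder for the first preset threshold

    def detect_ship(infrared_img, model):
        # `model` is assumed to return (detection_image, target_pixel_value).
        detection_img, target_val = model(infrared_img)
        if has_roi_box(detection_img) and target_val > FIRST_THRESH:
            return boxes_from(detection_img)         # model output accepted as-is
        filtered = cv2.medianBlur(detection_img, 5)  # filtering unit (assumed filter)
        _, edge_mask = edge_features(filtered)       # edge detection unit
        dilated = dilate_edges(filtered, edge_mask)  # dilation processing unit
        filled = flood_fill_regions(dilated)         # flood fill processing unit
        ship_mask, _ = segment_ship(filled)          # threshold segmentation unit
        return ship_position(ship_mask)              # second determination unit
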
9. An electronic device, comprising:
one or more processors;
a storage device having one or more programs stored thereon;
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method of any one of claims 1-7.
10. A computer-readable medium, on which a computer program is stored, wherein the program, when executed by a processor, implements the method of any one of claims 1-7.
CN202110949228.5A 2021-08-18 2021-08-18 Ship detection method and device in infrared image, electronic equipment and medium Active CN113658196B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110949228.5A CN113658196B (en) 2021-08-18 2021-08-18 Ship detection method and device in infrared image, electronic equipment and medium

Publications (2)

Publication Number Publication Date
CN113658196A true CN113658196A (en) 2021-11-16
CN113658196B (en) 2024-07-30

Family

ID=78481014

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110949228.5A Active CN113658196B (en) 2021-08-18 2021-08-18 Ship detection method and device in infrared image, electronic equipment and medium

Country Status (1)

Country Link
CN (1) CN113658196B (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030039402A1 (en) * 2001-08-24 2003-02-27 Robins David R. Method and apparatus for detection and removal of scanned image scratches and dust
WO2017111257A1 (en) * 2015-12-23 2017-06-29 Hanwha Techwin Co., Ltd. Image processing apparatus and image processing method
US20180165522A1 (en) * 2016-02-26 2018-06-14 The Boeing Company Target Object Recognition in Infrared Images
WO2020113989A1 (en) * 2018-12-03 2020-06-11 Zhejiang Dahua Technology Co., Ltd. Method and system for image processing
KR102221096B1 (en) * 2020-09-23 2021-02-26 Agency for Defense Development Deep learning training method and system using infrared image
CN112464933A (en) * 2020-11-30 2021-03-09 南京莱斯电子设备有限公司 Intelligent recognition method for small dim target of ground-based staring infrared imaging

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
SHUAI TAO; MINEICHI KUDO; HIDETOSHI NONAKA; JUN TOYAMA: "Camera view usage of binary infrared sensors for activity recognition", Proceedings of the 21st International Conference on Pattern Recognition (ICPR2012), 31 December 2012 (2012-12-31) *
XU DAQI; NI GUOQIANG; XU TINGFA: "Research on automatic recognition algorithms for aircraft targets in medium and high resolution remote sensing images", Optical Technique, no. 06, 20 November 2006 (2006-11-20) *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114994671A (en) * 2022-05-31 2022-09-02 南京慧尔视智能科技有限公司 Target detection method, device, equipment and medium based on radar image
CN114994671B (en) * 2022-05-31 2023-11-28 南京慧尔视智能科技有限公司 Target detection method, device, equipment and medium based on radar image
CN115082810A (en) * 2022-07-28 2022-09-20 中国科学院空天信息创新研究院 Synchronous orbit satellite infrared image ship detection method, device, equipment and medium
CN115082810B (en) * 2022-07-28 2022-11-08 中国科学院空天信息创新研究院 Method, device, equipment and medium for detecting infrared image ship by synchronous orbit satellite

Similar Documents

Publication Publication Date Title
CN110163080B (en) Face key point detection method and device, storage medium and electronic equipment
CN111369427B (en) Image processing method, image processing device, readable medium and electronic equipment
CN112733820B (en) Obstacle information generation method and device, electronic equipment and computer readable medium
CN107622504B (en) Method and device for processing pictures
CN113658196B (en) Ship detection method and device in infrared image, electronic equipment and medium
CN110930296A (en) Image processing method, device, equipment and storage medium
CN110062157B (en) Method and device for rendering image, electronic equipment and computer readable storage medium
CN110059623B (en) Method and apparatus for generating information
CN110211195B (en) Method, device, electronic equipment and computer-readable storage medium for generating image set
EP4432215A1 (en) Image processing method and device
CN111783777B (en) Image processing method, apparatus, electronic device, and computer readable medium
CN114399814B (en) Deep learning-based occlusion object removing and three-dimensional reconstructing method
CN111757100A (en) Method and device for determining camera motion variation, electronic equipment and medium
CN115272182A (en) Lane line detection method, lane line detection device, electronic device, and computer-readable medium
CN115731341A (en) Three-dimensional human head reconstruction method, device, equipment and medium
CN112085733B (en) Image processing method, image processing device, electronic equipment and computer readable medium
CN114120423A (en) Face image detection method and device, electronic equipment and computer readable medium
CN111784709B (en) Image processing method, image processing device, electronic equipment and computer readable medium
CN115100536A (en) Building identification method, building identification device, electronic equipment and computer readable medium
CN114972020A (en) Image processing method and device, storage medium and electronic equipment
CN114723933A (en) Region information generation method and device, electronic equipment and computer readable medium
CN110348374B (en) Vehicle detection method and device, electronic equipment and storage medium
CN111325210B (en) Method and device for outputting information
CN111784710B (en) Image processing method, device, electronic equipment and medium
CN116993637B (en) Image data processing method, device, equipment and medium for lane line detection

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant