CN113705565A - Ship detection method, device, electronic equipment and computer readable medium - Google Patents

Ship detection method, device, electronic equipment and computer readable medium

Info

Publication number
CN113705565A
CN113705565A
Authority
CN
China
Prior art keywords
image
target
sub
ship
target ship
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110927399.8A
Other languages
Chinese (zh)
Inventor
张韵东
隋红丽
韩建辉
刘小涛
徐祥
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Zhongxingtianshi Technology Co ltd
Original Assignee
Beijing Zhongxingtianshi Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Zhongxingtianshi Technology Co ltd filed Critical Beijing Zhongxingtianshi Technology Co ltd
Priority to CN202110927399.8A
Publication of CN113705565A
Status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/10 Segmentation; Edge detection
    • G06T7/11 Region-based segmentation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/22 Matching criteria, e.g. proximity measures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/10 Segmentation; Edge detection
    • G06T7/136 Segmentation; Edge detection involving thresholding

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Artificial Intelligence (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • General Engineering & Computer Science (AREA)
  • Image Analysis (AREA)

Abstract

Embodiments of the disclosure provide a ship detection method and device, electronic equipment, and a computer readable medium. One embodiment of the method comprises: acquiring a target ship image; performing labeling processing on the target ship image to obtain a labeled target ship image; cutting the labeled target ship image to obtain a target ship sub-image set; for each target ship sub-image in the target ship sub-image set, selecting a detection sub-model matching the target ship sub-image from a small-scale detection sub-model, a medium-scale detection sub-model and a large-scale detection sub-model; for each target ship sub-image in the target ship sub-image set, inputting the target ship sub-image into the detection sub-model matched with it to obtain a ship detection result; and sending the ship detection results to a target terminal for display. This embodiment increases the speed of ship detection while keeping the detection accuracy unchanged.

Description

Ship detection method, device, electronic equipment and computer readable medium
Technical Field
The embodiment of the disclosure relates to the technical field of computers, in particular to a ship detection method, a ship detection device, electronic equipment and a computer readable medium.
Background
Object detection is a technique for determining whether an object is present in an image or video and locating it. At present, ships are generally detected as follows: a ship detection model is deployed on the detection device, and model compression is used to reduce the computational cost of the model.
However, when ships are detected in the above manner, the following technical problems often arise:
first, when the image to be detected is too large, the detection device may take a long time to complete detection, or even fail, because its computing power is insufficient;
second, when model compression is used to reduce the computational cost of the detection model, the accuracy of ship detection decreases, leading to poor or even erroneous detection results.
Disclosure of Invention
This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the detailed description. This summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter.
Some embodiments of the present disclosure propose methods, apparatuses, devices and computer readable media for ship detection to address one or more of the technical problems mentioned in the background section above.
In a first aspect, some embodiments of the present disclosure provide a method for ship detection, the method comprising: acquiring a target ship image, wherein the target ship image comprises at least one ship area; performing labeling processing on the target ship image to obtain a labeled target ship image; cutting the marked target ship image to obtain a target ship subimage set; for each target ship subimage in the target ship subimage set, selecting a detection submodel matched with the target ship subimage from a small-scale detection submodel, a medium-scale detection submodel and a large-scale detection submodel; for each target ship subimage in the target ship subimage set, inputting the target ship subimage into a detection submodel matched with the target ship subimage to obtain a ship detection result; and sending the ship detection result of each target ship subimage in the target ship subimage set to a target terminal for displaying.
In a second aspect, some embodiments of the present disclosure provide a ship detection apparatus, the apparatus comprising: an acquisition unit configured to acquire a target ship image, wherein the target ship image includes at least one ship region; the labeling unit is configured to label the target ship image to obtain a labeled target ship image; the cutting unit is configured to cut the marked target ship image to obtain a target ship subimage set; a selection unit configured to select, for each target vessel sub-image in the target vessel sub-image set, a detection sub-model matching the target vessel sub-image from a small-scale detection sub-model, a medium-scale detection sub-model, and a large-scale detection sub-model; the detection unit is configured to input the target ship subimage into a detection submodel matched with the target ship subimage to obtain a ship detection result for each target ship subimage in the target ship subimage set; a sending unit configured to send the ship detection result of each target ship sub-image in the target ship sub-image set to a target terminal for display.
In a third aspect, some embodiments of the present disclosure provide an electronic device, comprising: one or more processors; a storage device having one or more programs stored thereon, which when executed by one or more processors, cause the one or more processors to implement the method described in any of the implementations of the first aspect.
In a fourth aspect, some embodiments of the present disclosure provide a computer readable medium on which a computer program is stored, wherein the program, when executed by a processor, implements the method described in any of the implementations of the first aspect.
The above embodiments of the present disclosure have the following advantages: with the ship detection method of some embodiments of the present disclosure, the detection speed can be increased while the accuracy is kept unchanged. Specifically, ship detection takes a long time, or even fails, for the following reason: when the image to be detected is too large, the detection device may take a relatively long time to complete detection, or even fail, due to insufficient computing power. Based on this, the ship detection method of some embodiments of the present disclosure first acquires a target ship image, wherein the target ship image includes at least one ship region. The acquired target ship image thus provides data support for the subsequent labeling processing of the target ship image. Next, labeling processing is performed on the target ship image to obtain a labeled target ship image. In this way, a region of interest in the target ship image, i.e., a region that may include a ship, can be distinguished by the labeled region. Then, the labeled target ship image is cropped to obtain a target ship sub-image set. Cropping the target ship image thus yields a target ship sub-image set that the detection sub-models can process. Next, for each target ship sub-image in the target ship sub-image set, a detection sub-model matching the target ship sub-image is selected from a small-scale detection sub-model, a medium-scale detection sub-model and a large-scale detection sub-model. The target ship sub-images in the target ship sub-image set can therefore be distributed to different detection sub-models for processing, which increases their processing speed. Then, for each target ship sub-image in the target ship sub-image set, the target ship sub-image is input into the detection sub-model matched with it to obtain a ship detection result, so that detection sub-models of different scales process the input target ship sub-images and produce their detection results. Finally, the ship detection result of each target ship sub-image in the target ship sub-image set is sent to a target terminal for display, so that the detection result of every target ship sub-image in the set can be displayed. Because the target detection image is divided into pieces of different sizes that are processed by detection sub-models of different scales, the load on any single model during detection is reduced and the detection speed of the model is increased.
Drawings
The above and other features, advantages and aspects of various embodiments of the present disclosure will become more apparent by referring to the following detailed description when taken in conjunction with the accompanying drawings. Throughout the drawings, the same or similar reference numbers refer to the same or similar elements. It should be understood that the drawings are schematic and that elements are not necessarily drawn to scale.
Fig. 1 is a schematic diagram of one application scenario of a ship detection method of some embodiments of the present disclosure;
fig. 2 is a flow diagram of some embodiments of a ship detection method according to the present disclosure;
fig. 3 is a schematic structural diagram of some embodiments of a ship detection apparatus according to the present disclosure;
FIG. 4 is a schematic block diagram of an electronic device suitable for use in implementing some embodiments of the present disclosure.
Detailed Description
Embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While certain embodiments of the present disclosure are shown in the drawings, it is to be understood that the disclosure may be embodied in various forms and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided for a more thorough and complete understanding of the present disclosure. It should be understood that the drawings and embodiments of the disclosure are for illustration purposes only and are not intended to limit the scope of the disclosure.
It should be noted that, for convenience of description, only the portions related to the related invention are shown in the drawings. The embodiments and features of the embodiments in the present disclosure may be combined with each other without conflict.
It should be noted that the terms "first", "second", and the like in the present disclosure are only used for distinguishing different devices, modules or units, and are not used for limiting the order or interdependence relationship of the functions performed by the devices, modules or units.
It should be noted that references to "a", "an", and "the" in the present disclosure are illustrative rather than limiting; those skilled in the art will understand that they mean "one or more" unless the context clearly indicates otherwise.
The names of messages or information exchanged between devices in the embodiments of the present disclosure are for illustrative purposes only, and are not intended to limit the scope of the messages or information.
The present disclosure will be described in detail below with reference to the accompanying drawings in conjunction with embodiments.
Fig. 1 is a schematic diagram of one application scenario of a ship detection method of some embodiments of the present disclosure.
In the application scenario of fig. 1, first, the computing device 101 may acquire a target ship image 102. Wherein, the target ship image 102 includes at least one ship region. Next, the computing device 101 may perform annotation processing on the target ship image 102 to obtain an annotated target ship image 103. Then, the computing device 101 may perform a cropping process on the labeled target ship image 103 to obtain a target ship sub-image set 104. Then, for each target ship sub-image in the target ship sub-image set 104, the computing device 101 may select a detection sub-model 105 matching the target ship sub-image from the small-scale detection sub-model 1051, the medium-scale detection sub-model 1052, and the large-scale detection sub-model 1053. Then, the computing device 101 may input, for each target ship sub-image in the target ship sub-image set 104, the target ship sub-image into the detection sub-model 105 matched with the target ship sub-image, so as to obtain a ship detection result 106. Finally, the computing device 101 may send the ship detection result 106 for each target ship sub-image in the target ship sub-image collection 104 to the target terminal 107 for display.
The computing device 101 may be hardware or software. When the computing device is hardware, it may be implemented as a distributed cluster composed of multiple servers or terminal devices, or may be implemented as a single server or a single terminal device. When the computing device is embodied as software, it may be installed in the hardware devices enumerated above. It may be implemented, for example, as multiple software or software modules to provide distributed services, or as a single software or software module. And is not particularly limited herein.
It should be understood that the number of computing devices in FIG. 1 is merely illustrative. There may be any number of computing devices, as implementation needs dictate.
With continued reference to fig. 2, a flow 200 of some embodiments of a ship detection method according to the present disclosure is shown. The ship detection method comprises the following steps:
step 201, acquiring a target ship image.
In some embodiments, an execution subject of the ship detection method (e.g., the computing device 101 shown in fig. 1) may acquire the target ship image from a terminal through a wired connection or a wireless connection. The target ship image includes at least one ship region. It should be noted that the wireless connection means may include, but are not limited to, a 3G/4G connection, a WiFi connection, a Bluetooth connection, a WiMAX connection, a ZigBee connection, a UWB (Ultra WideBand) connection, and other wireless connection means now known or developed in the future. The acquired target ship image can thus provide data support for the subsequent labeling processing of the target ship image.
In some optional implementations of some embodiments, the execution subject may acquire a target ship image uploaded by a user or a target ship image captured by a camera. The target ship image captured by the camera may include an image captured by a camera mounted on an unmanned aerial vehicle or on a satellite.
And 202, performing labeling processing on the target ship image to obtain a labeled target ship image.
In some embodiments, the execution subject may perform labeling processing on the target ship image to obtain a labeled target ship image. The labels mark the regions of the ships in the target ship image. In practice, the execution subject may perform the labeling processing on the target ship image by using a classification algorithm to obtain the labeled target ship image. The classification algorithm may include, but is not limited to, at least one of the following: the k-means algorithm, ISODATA (Iterative Self-Organizing Data Analysis Technique), and PCA (Principal Component Analysis). In this way, a region of interest in the target ship image, i.e., a region that may include a ship, can be distinguished by the labeled region.
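As an illustration of how a classification algorithm such as k-means could mark candidate ship regions, the following Python sketch clusters pixel intensities into two groups and keeps the brighter cluster as the label mask. This is a minimal, hypothetical example, not the procedure prescribed by the disclosure; the function name, the two-cluster setup, and the assumption that ships appear brighter than open water are all illustrative assumptions.

```python
import numpy as np

def kmeans_label_mask(gray_image: np.ndarray, iterations: int = 10) -> np.ndarray:
    """Cluster pixel intensities into two groups with a tiny 1-D k-means and
    return a binary mask in which 1 marks the brighter (candidate ship) cluster.
    A grayscale image (2-D uint8/float array) is assumed."""
    pixels = gray_image.astype(np.float32).ravel()
    centers = np.array([pixels.min(), pixels.max()], dtype=np.float32)
    for _ in range(iterations):
        # Assign each pixel to the nearest of the two centers, then update centers.
        assign = np.abs(pixels[:, None] - centers[None, :]).argmin(axis=1)
        for k in range(2):
            members = pixels[assign == k]
            if members.size:
                centers[k] = members.mean()
    assign = np.abs(pixels[:, None] - centers[None, :]).argmin(axis=1)
    bright_cluster = centers.argmax()
    return (assign == bright_cluster).reshape(gray_image.shape).astype(np.uint8)
```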
In some optional implementations of some embodiments, the execution subject may first perform region division processing on the target ship image to obtain a region-divided target ship image, where the region-divided target ship image includes at least one region to be labeled. Optionally, the execution subject may divide the length and the width of the target ship image into the same number of equal parts. For example, if the target ship image is 200 pixels long and 100 pixels wide, dividing both its length and its width into 10 equal parts yields 100 regions, each 20 pixels long and 10 pixels wide; each such region is a region to be labeled.
Then, the execution subject may label the regions to be labeled included in the region-divided target ship image to obtain the labeled target ship image. Optionally, the execution subject may label the regions to be labeled by using a region-sensitive identification network, which may include, but is not limited to, at least one of the following: the k-means algorithm, ISODATA, and PCA. In this way, the position of the region where the target ship is located can be identified by marking it with the target labeled value.
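A minimal sketch of the optional implementation above: the image is divided into the same number of equal parts along its length and width, and each resulting tile is given a label value. The variance-based score used here is only a stand-in for the region-sensitive identification network mentioned in the text, and the function names and threshold are illustrative assumptions.

```python
import numpy as np

def divide_into_regions(image: np.ndarray, parts: int = 10):
    """Split the image into parts x parts equal tiles.
    E.g. a 200 x 100 pixel image with parts=10 yields 100 tiles of 20 x 10 pixels.
    Returns (row_slice, col_slice) pairs, one per tile."""
    h, w = image.shape[:2]
    th, tw = h // parts, w // parts
    return [(slice(i * th, (i + 1) * th), slice(j * tw, (j + 1) * tw))
            for i in range(parts) for j in range(parts)]

def label_regions(image: np.ndarray, tiles, score_threshold: float = 10.0):
    """Give each tile a label: 1 if it is likely to contain a ship, else 0.
    The standard-deviation score is a placeholder for the region-sensitive
    identification network (e.g. k-means / ISODATA / PCA based) in the text."""
    labels = []
    for rows, cols in tiles:
        region = image[rows, cols].astype(np.float32)
        labels.append(1 if region.std() > score_threshold else 0)
    return labels
```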
And 203, cutting the marked target ship image to obtain a target ship sub-image set.
In some embodiments, the execution main body may perform clipping processing on the labeled target ship image to obtain a target ship sub-image set. In practice, the execution main body can cut the labeled target ship image, and the region corresponding to the target labeled value is used as a target ship sub-image, so as to obtain a target ship sub-image set. The target labeled value may be a preset threshold, and the setting of the preset threshold is not limited herein. Therefore, the target ship sub-image set capable of being detected by the detection sub-model can be obtained by cutting the target ship image.
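Continuing the sketch, cropping then keeps only the tiles whose label equals the target labeled value and returns them as the target ship sub-image set. In a real system adjacent labeled tiles might first be merged, which is why the resulting sub-images can differ in size; the helper below is an illustrative assumption rather than the disclosure's exact procedure.

```python
def crop_labeled_subimages(image, tiles, labels, target_label: int = 1):
    """Return the sub-images (crops) whose tile label matches the target
    labeled value; unlabeled tiles are discarded."""
    return [image[rows, cols]
            for (rows, cols), lab in zip(tiles, labels)
            if lab == target_label]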
And 204, selecting a detection sub-model matched with the target ship sub-image from a small-scale detection sub-model, a medium-scale detection sub-model and a large-scale detection sub-model for each target ship sub-image in the target ship sub-image set.
In some embodiments, the executing subject may select, for each target vessel sub-image in the target vessel sub-image set, a detection sub-model matching the target vessel sub-image from a small-scale detection sub-model, a medium-scale detection sub-model, and a large-scale detection sub-model. In practice, the execution subject may select the detection sub-model matched with each target ship sub-image in a cyclic manner. The cyclic manner may refer to assigning the first target vessel sub-image in the target vessel sub-image set to the small-scale detection sub-model, the second to the medium-scale detection sub-model, the third to the large-scale detection sub-model, the fourth to the small-scale detection sub-model, and so on. Therefore, the target ship sub-images in the target ship sub-image set can be distributed to different detection sub-models for processing, which increases the processing speed of the target ship sub-images.
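The cyclic assignment described above can be expressed in a few lines of Python; the sub-model names and the placeholder sub-image list are illustrative assumptions standing in for real crops and trained models.

```python
from itertools import cycle

sub_models = ["small_scale", "medium_scale", "large_scale"]   # placeholder names
sub_images = ["sub_0", "sub_1", "sub_2", "sub_3"]             # stands in for real crops

# First sub-image -> small scale, second -> medium, third -> large, fourth -> small, ...
assignment = dict(zip(sub_images, cycle(sub_models)))
print(assignment)
# {'sub_0': 'small_scale', 'sub_1': 'medium_scale', 'sub_2': 'large_scale', 'sub_3': 'small_scale'}
```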
In some optional implementations of some embodiments, the selecting, by the execution main body, for each target ship sub-image in the target ship sub-image set, a detection sub-model matching the target ship sub-image from a small-scale detection sub-model, a medium-scale detection sub-model, and a large-scale detection sub-model may include:
in the first step, in response to the fact that the number of pixels included in the target ship sub-image is smaller than or equal to a first preset threshold value, a small-scale detection sub-model is selected as a detection sub-model matched with the target ship sub-image.
The number of pixels included in the target vessel sub-image is the product of the sub-image's length in pixels and its width in pixels. The first preset threshold may be a preset value, and its setting is not limited herein. In practice, it may first be determined whether the number of pixels included in the target ship sub-image is less than or equal to the first preset threshold. Then, in response to determining yes, the small-scale detection sub-model may be selected as the detection sub-model that matches the target vessel sub-image. Therefore, a target ship sub-image meeting the processing condition of the small-scale detection sub-model can be allocated to the small-scale detection sub-model for processing.
As an example, one of the target ship sub-images in the target ship sub-image set may include 30 × 30 = 900 pixels, and the first preset threshold may be 2000. It may be determined that the number of pixels (900) included in the target vessel sub-image is less than the first preset threshold (2000), and the small-scale detection sub-model is selected as the detection sub-model matching the target vessel sub-image.
And secondly, in response to the fact that the number of pixels included in the target ship subimage is larger than a first preset threshold value and smaller than or equal to a second preset threshold value, selecting a mesoscale detection submodel as a detection submodel matched with the target ship subimage.
And the second preset threshold is greater than the first preset threshold. The second preset threshold may be a preset threshold, and the setting of the preset threshold is not limited herein. In practice, first, the execution subject may determine whether the number of pixels included in the target ship sub-image is greater than a first preset threshold and is less than or equal to a second preset threshold. Then, in response to determining yes, a mesoscale detection submodel may be selected as the detection submodel that matches the target vessel subimage. Therefore, the target ship sub-image meeting the processing condition of the mesoscale detection sub-model can be allocated to the mesoscale detection sub-model for processing.
As an example, one of the target ship sub-images in the target ship sub-image set may include 80 × 60 = 4800 pixels. The first preset threshold may be 2000, and the second preset threshold may be 8000. It may be determined that the number of pixels (4800) included in the target vessel sub-image is greater than the first preset threshold (2000) and less than or equal to the second preset threshold (8000), and the mesoscale detection sub-model is selected as the detection sub-model matched with the target vessel sub-image.
And thirdly, in response to the fact that the number of pixels included in the target ship subimage is larger than a second preset threshold value, selecting a large-scale detection submodel as a detection submodel matched with the target ship subimage.
In practice, first, the execution subject may determine whether the number of pixels included in the target ship sub-image is greater than the second preset threshold. Then, in response to determining yes, a large-scale detection submodel may be selected as the detection submodel that matches the target vessel subimage. Therefore, the target ship sub-image meeting the processing condition of the large-scale detection sub-model can be allocated to the large-scale detection sub-model for processing.
As an example, one of the target ship sub-images in the target ship sub-image set may include 200 × 200 = 40000 pixels. The second preset threshold may be 8000. It may be determined that the number of pixels (40000) included in the target vessel sub-image is greater than the second preset threshold (8000), and the large-scale detection sub-model is selected as the detection sub-model matched with the target vessel sub-image.
Therefore, through the preset conditions, each target ship sub-image in the target ship sub-image set can be allocated to a detection sub-model of a different scale for processing, which increases the processing speed of the target ship sub-images without affecting ship detection accuracy.
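The pixel-count rule in the three steps above maps directly onto a small selection function. The thresholds 2000 and 8000 come from the worked examples in the text and are stand-ins for the configurable first and second preset thresholds; the function and model names are illustrative.

```python
import numpy as np

def select_sub_model(sub_image: np.ndarray,
                     first_threshold: int = 2000,
                     second_threshold: int = 8000) -> str:
    """Select a detection sub-model by pixel count (height x width)."""
    pixels = sub_image.shape[0] * sub_image.shape[1]
    if pixels <= first_threshold:
        return "small_scale"
    if pixels <= second_threshold:
        return "medium_scale"
    return "large_scale"

print(select_sub_model(np.zeros((30, 30))))    # 900 pixels   -> small_scale
print(select_sub_model(np.zeros((80, 60))))    # 4800 pixels  -> medium_scale
print(select_sub_model(np.zeros((200, 200))))  # 40000 pixels -> large_scale
```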
Step 205, inputting the target vessel subimage into a detection submodel matched with the target vessel subimage for each target vessel subimage in the target vessel subimage set to obtain a vessel detection result.
In some embodiments, the execution subject may input, for each target vessel sub-image in the target vessel sub-image set, the target vessel sub-image into a detection sub-model matched with the target vessel sub-image, so as to obtain a vessel detection result.
In practice, the execution subject may, for the target vessel sub-images in the target vessel sub-image set, input the sub-images whose number of pixels is less than or equal to the first preset threshold into the small-scale detection sub-model to obtain the detection results of those sub-images; input the sub-images whose number of pixels is greater than the first preset threshold and less than or equal to the second preset threshold into the mesoscale detection sub-model to obtain the detection results of those sub-images; and input the sub-images whose number of pixels is greater than the second preset threshold into the large-scale detection sub-model to obtain the detection results of those sub-images. In this way, detection sub-models of different scales process the input target vessel sub-images and produce the detection results of the target vessel sub-images.
As an example, the execution subject may input a target ship sub-image containing 30 × 30 = 900 pixels (no more than 2000) into the small-scale detection sub-model to obtain the detection result of that sub-image; input a target ship sub-image containing 80 × 60 = 4800 pixels (more than 2000 and no more than 8000) into the mesoscale detection sub-model to obtain the detection result of that sub-image; and input a target ship sub-image containing 200 × 200 = 40000 pixels (more than 8000) into the large-scale detection sub-model to obtain the detection result of that sub-image.
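Step 205 then amounts to running each sub-image through its matched sub-model and collecting the results. The detector class below is a stub standing in for the trained small-, medium- and large-scale detection sub-models, whose interface the disclosure does not specify; the selection rule is the same one sketched for step 204, and all names are assumptions.

```python
import numpy as np

class StubDetector:
    """Placeholder for a trained detection sub-model; a real model would
    return bounding boxes such as [(x, y, w, h, confidence), ...]."""
    def __init__(self, name: str):
        self.name = name
    def detect(self, sub_image: np.ndarray):
        h, w = sub_image.shape[:2]
        return [(0, 0, w, h, 0.0)]          # dummy box covering the whole crop

def select_sub_model(img, t1=2000, t2=8000):  # same pixel-count rule as in step 204
    n = img.shape[0] * img.shape[1]
    return "small_scale" if n <= t1 else "medium_scale" if n <= t2 else "large_scale"

detectors = {name: StubDetector(name)
             for name in ("small_scale", "medium_scale", "large_scale")}

sub_images = [np.zeros((30, 30)), np.zeros((80, 60)), np.zeros((200, 200))]
ship_detection_results = [detectors[select_sub_model(img)].detect(img)
                          for img in sub_images]
```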
And step 206, sending the ship detection result of each target ship sub-image in the target ship sub-image set to a target terminal for displaying.
In some embodiments, the execution subject may send the ship detection result of each target ship sub-image in the target ship sub-image set to a target terminal for display. The target terminal may include a computer display screen. In this way, the detection result of each target ship sub-image in the target ship sub-image set can be displayed.
Optionally, the execution subject may perform image fusion processing on the ship detection result of each target ship sub-image in the target ship sub-image set and the target ship image to obtain a fused target ship image. In practice, a fusion algorithm may be adopted to perform the image fusion processing on the target ship image. The fusion algorithm may include, but is not limited to, at least one of the following: DLF (Deep Learning Framework) or ResNet (Residual Network). The fused target ship image is then sent to the target terminal for display. In this way, the whole image obtained by fusing the detection result of each target ship sub-image with the target ship image can be displayed.
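One simple part of the fusion step can be shown concretely: each sub-image's detections are shifted back into the coordinate frame of the original target ship image using the crop's top-left offset, so that a single fused picture can be sent to the terminal. The disclosure mentions DLF/ResNet-style fusion; the sketch below covers only this minimal geometric mapping, and the data layout is an assumption.

```python
def fuse_detections(crop_offsets, crop_results):
    """Map per-crop detections back into original-image coordinates.

    crop_offsets: list of (x_off, y_off) top-left corners of each sub-image
    crop_results: list of detection lists, one per sub-image, each detection
                  being (x, y, w, h, confidence) in that crop's coordinates
    """
    fused = []
    for (x_off, y_off), detections in zip(crop_offsets, crop_results):
        for x, y, w, h, conf in detections:
            fused.append((x + x_off, y + y_off, w, h, conf))
    return fused
```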
The above embodiments of the present disclosure have the following advantages: with the ship detection method of some embodiments of the present disclosure, the detection speed can be increased while the accuracy is kept unchanged. Specifically, ship detection takes a long time, or even fails, for the following reason: when the image to be detected is too large, the detection device may take a relatively long time to complete detection, or even fail, due to insufficient computing power. Based on this, the ship detection method of some embodiments of the present disclosure first acquires a target ship image, wherein the target ship image includes at least one ship region. The acquired target ship image thus provides data support for the subsequent labeling processing of the target ship image. Next, labeling processing is performed on the target ship image to obtain a labeled target ship image. In this way, a region of interest in the target ship image, i.e., a region that may include a ship, can be distinguished by the labeled region. Then, the labeled target ship image is cropped to obtain a target ship sub-image set. Cropping the target ship image thus yields a target ship sub-image set that the detection sub-models can process. Next, for each target ship sub-image in the target ship sub-image set, a detection sub-model matching the target ship sub-image is selected from a small-scale detection sub-model, a medium-scale detection sub-model and a large-scale detection sub-model. The target ship sub-images in the target ship sub-image set can therefore be distributed to different detection sub-models for processing, which increases their processing speed. Then, for each target ship sub-image in the target ship sub-image set, the target ship sub-image is input into the detection sub-model matched with it to obtain a ship detection result, so that detection sub-models of different scales process the input target ship sub-images and produce their detection results. Finally, the ship detection result of each target ship sub-image in the target ship sub-image set is sent to a target terminal for display, so that the detection result of every target ship sub-image in the set can be displayed. Because the target detection image is divided into pieces of different sizes that are processed by detection sub-models of different scales, the load on any single model during detection is reduced and the detection speed of the model is increased.
With further reference to fig. 3, as an implementation of the methods shown in the above figures, the present disclosure provides some embodiments of a ship detection apparatus, which correspond to those of the method embodiments shown in fig. 2, and which may be applied in various electronic devices in particular.
As shown in fig. 3, the ship detection apparatus 300 of some embodiments includes: an acquisition unit 301, a labeling unit 302, a clipping unit 303, a selection unit 304, a detection unit 305, and a transmission unit 306. The acquisition unit 301 is configured to acquire a target ship image, where the target ship image includes at least one ship region; the labeling unit 302 is configured to label the target ship image to obtain a labeled target ship image; the cutting unit 303 is configured to cut the labeled target ship image to obtain a target ship sub-image set; a selecting unit 304 configured to select, for each target vessel sub-image in the target vessel sub-image set, a detection sub-model matching the target vessel sub-image from a small-scale detection sub-model, a medium-scale detection sub-model, and a large-scale detection sub-model; a detection unit 305 configured to input, for each target vessel sub-image in the target vessel sub-image set, the target vessel sub-image into a detection sub-model matched with the target vessel sub-image, so as to obtain a vessel detection result; a sending unit 306 configured to send the ship detection result of each target ship sub-image in the target ship sub-image set to a target terminal for display.
It will be understood that the units described in the apparatus 300 correspond to the various steps in the method described with reference to fig. 2. Thus, the operations, features and resulting advantages described above with respect to the method are also applicable to the apparatus 300 and the units included therein, and are not described herein again.
Referring now to FIG. 4, a block diagram of an electronic device (such as computing device 101 shown in FIG. 1)400 suitable for use in implementing some embodiments of the present disclosure is shown. The electronic device shown in fig. 4 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present disclosure.
As shown in fig. 4, electronic device 400 may include a processing device (e.g., central processing unit, graphics processor, etc.) 401 that may perform various appropriate actions and processes in accordance with a program stored in a Read Only Memory (ROM) 402 or a program loaded from a storage device 408 into a Random Access Memory (RAM) 403. In the RAM 403, various programs and data necessary for the operation of the electronic device 400 are also stored. The processing device 401, the ROM 402, and the RAM 403 are connected to each other via a bus 404. An input/output (I/O) interface 405 is also connected to the bus 404.
Generally, the following devices may be connected to the I/O interface 405: input devices 406 including, for example, a touch screen, touch pad, keyboard, mouse, camera, microphone, accelerometer, gyroscope, etc.; an output device 407 including, for example, a Liquid Crystal Display (LCD), a speaker, a vibrator, and the like; storage 408 including, for example, tape, hard disk, etc.; and a communication device 409. The communication means 409 may allow the electronic device 400 to communicate wirelessly or by wire with other devices to exchange data. While fig. 4 illustrates an electronic device 400 having various means, it is to be understood that not all illustrated means are required to be implemented or provided. More or fewer devices may alternatively be implemented or provided. Each block shown in fig. 4 may represent one device or may represent multiple devices as desired.
In particular, according to some embodiments of the present disclosure, the processes described above with reference to the flow diagrams may be implemented as computer software programs. For example, some embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method illustrated in the flow chart. In some such embodiments, the computer program may be downloaded and installed from a network through the communication device 409, or from the storage device 408, or from the ROM 402. The computer program, when executed by the processing apparatus 401, performs the above-described functions defined in the methods of some embodiments of the present disclosure.
It should be noted that the computer readable medium described in some embodiments of the present disclosure may be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In some embodiments of the disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In some embodiments of the present disclosure, however, a computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, optical cables, RF (radio frequency), etc., or any suitable combination of the foregoing.
In some embodiments, the clients and servers may communicate using any currently known or future developed network protocol, such as HTTP (HyperText Transfer Protocol), and may interconnect with digital data communication (e.g., a communication network) in any form or medium. Examples of communication networks include a local area network ("LAN"), a wide area network ("WAN"), an internetwork (e.g., the Internet), and peer-to-peer networks (e.g., ad hoc peer-to-peer networks), as well as any currently known or future developed network.
The computer readable medium may be embodied in the electronic device; or may exist separately without being assembled into the electronic device. The computer readable medium carries one or more programs which, when executed by the electronic device, cause the electronic device to: and acquiring a target ship image, wherein the target ship image comprises at least one ship area. And performing annotation processing on the target ship image to obtain an annotated target ship image. And cutting the marked target ship image to obtain a target ship subimage set. And for each target ship subimage in the target ship subimage set, selecting a detection submodel matched with the target ship subimage from a small-scale detection submodel, a medium-scale detection submodel and a large-scale detection submodel. And for each target vessel subimage in the target vessel subimage set, inputting the target vessel subimage into a detection submodel matched with the target vessel subimage to obtain a vessel detection result. And sending the ship detection result of each target ship subimage in the target ship subimage set to a target terminal for displaying.
Computer program code for carrying out operations for embodiments of the present disclosure may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, C++, and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in some embodiments of the present disclosure may be implemented by software, and may also be implemented by hardware. The described units may also be provided in a processor, and may be described as: a processor includes an acquisition unit, a labeling unit, a clipping unit, a selection unit, a detection unit, and a transmission unit. The names of these units do not limit the unit itself in some cases, and for example, the labeling unit may also be described as a "unit that performs labeling processing on the target ship image to obtain a labeled target ship image".
The functions described herein above may be performed, at least in part, by one or more hardware logic components. For example, without limitation, exemplary types of hardware logic components that may be used include: field Programmable Gate Arrays (FPGAs), Application Specific Integrated Circuits (ASICs), Application Specific Standard Products (ASSPs), systems on a chip (SOCs), Complex Programmable Logic Devices (CPLDs), and the like.
The foregoing description is merely a description of preferred embodiments of the present disclosure and of the technical principles employed. It will be appreciated by those skilled in the art that the scope of the invention in the embodiments of the present disclosure is not limited to technical solutions formed by the specific combination of the above-mentioned features, and also covers other technical solutions formed by any combination of the above-mentioned features or their equivalents without departing from the above inventive concept, for example, a technical solution formed by replacing the above features with (but not limited to) technical features having similar functions disclosed in the embodiments of the present disclosure.

Claims (10)

1. A ship detection method comprising:
acquiring a target ship image, wherein the target ship image comprises at least one ship area;
performing labeling processing on the target ship image to obtain a labeled target ship image;
cutting the marked target ship image to obtain a target ship sub-image set;
for each target ship sub-image in the target ship sub-image set, selecting a detection sub-model matched with the target ship sub-image from a small-scale detection sub-model, a medium-scale detection sub-model and a large-scale detection sub-model;
for each target ship sub-image in the target ship sub-image set, inputting the target ship sub-image into a detection sub-model matched with the target ship sub-image to obtain a ship detection result;
and sending the ship detection result of each target ship subimage in the target ship subimage set to a target terminal for displaying.
2. The method of claim 1, wherein the acquiring a target vessel image comprises:
and acquiring a target ship image uploaded by a user or a target ship image shot by a camera.
3. The method of claim 1, wherein the labeling the target vessel image to obtain a labeled target vessel image comprises:
performing regional division processing on the target ship image to obtain a regional divided target ship image, wherein the regional divided target ship image comprises at least one region to be marked;
and marking the region to be marked included in the target ship image after the region division to obtain a marked target ship image.
4. The method of claim 3, wherein the performing region partition processing on the target vessel image comprises:
and respectively carrying out equal division processing on the length and the width of the target ship image.
5. The method according to claim 3, wherein the labeling the region to be labeled included in the region-divided target ship image comprises:
and marking the region to be marked included in the target ship image after the region division by adopting a region sensitive identification network.
6. The method of claim 1, wherein the selecting, for each target vessel sub-image in the set of target vessel sub-images, a detection sub-model that matches the target vessel sub-image from a small-scale detection sub-model, a medium-scale detection sub-model, and a large-scale detection sub-model comprises:
in response to determining that the number of pixels included in the target vessel sub-image is less than or equal to a first preset threshold, selecting the small-scale detection sub-model as a detection sub-model matched with the target vessel sub-image;
in response to determining that the number of pixels included in the target vessel subimage is greater than the first preset threshold and less than or equal to a second preset threshold, selecting the mesoscale detection submodel as a detection submodel matched with the target vessel subimage, wherein the second preset threshold is greater than the first preset threshold;
in response to determining that the number of pixels included in the target vessel sub-image is greater than the second preset threshold, selecting the large-scale detection sub-model as the detection sub-model matching the target vessel sub-image.
7. The method of claim 1, wherein the method further comprises:
carrying out image fusion processing on the ship detection result of each target ship subimage in the target ship subimage set and the target ship image to obtain a fused target ship image;
and sending the fused target ship image to the target terminal for display.
8. A ship detection device comprising:
an acquisition unit configured to acquire a target vessel image, wherein the target vessel image includes at least one vessel region therein;
the labeling unit is configured to label the target ship image to obtain a labeled target ship image;
the cutting unit is configured to cut the marked target ship image to obtain a target ship sub-image set;
a selection unit configured to select, for each target vessel sub-image in the set of target vessel sub-images, a detection sub-model matching the target vessel sub-image from a small-scale detection sub-model, a medium-scale detection sub-model, and a large-scale detection sub-model;
the detection unit is configured to input the target ship sub-image into a detection sub-model matched with the target ship sub-image for each target ship sub-image in the target ship sub-image set to obtain a ship detection result;
a sending unit configured to send a ship detection result of each target ship sub-image in the target ship sub-image set to a target terminal for display.
9. An electronic device, comprising:
one or more processors;
a storage device having one or more programs stored thereon;
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method of any one of claims 1-7.
10. A computer-readable medium, on which a computer program is stored, wherein the program, when executed by a processor, implements the method of any one of claims 1 to 7.
CN202110927399.8A 2021-08-10 2021-08-10 Ship detection method, device, electronic equipment and computer readable medium Pending CN113705565A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110927399.8A CN113705565A (en) 2021-08-10 2021-08-10 Ship detection method, device, electronic equipment and computer readable medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110927399.8A CN113705565A (en) 2021-08-10 2021-08-10 Ship detection method, device, electronic equipment and computer readable medium

Publications (1)

Publication Number Publication Date
CN113705565A 2021-11-26

Family

ID=78652617

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110927399.8A Pending CN113705565A (en) 2021-08-10 2021-08-10 Ship detection method, device, electronic equipment and computer readable medium

Country Status (1)

Country Link
CN (1) CN113705565A (en)

Patent Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2019120543A (en) * 2017-12-28 2019-07-22 古野電気株式会社 Target detection device
CN110807362A (en) * 2019-09-23 2020-02-18 腾讯科技(深圳)有限公司 Image detection method and device and computer readable storage medium
CN111008585A (en) * 2019-11-29 2020-04-14 西安电子科技大学 Ship target detection method based on self-adaptive layered high-resolution SAR image
CN112991349A (en) * 2019-12-17 2021-06-18 阿里巴巴集团控股有限公司 Image processing method, device, equipment and storage medium
CN111444365A (en) * 2020-03-27 2020-07-24 Oppo广东移动通信有限公司 Image classification method and device, electronic equipment and storage medium
CN111507958A (en) * 2020-04-15 2020-08-07 全球能源互联网研究院有限公司 Target detection method, training method of detection model and electronic equipment
JP2021192223A (en) * 2020-05-18 2021-12-16 ベイジン バイドゥ ネットコム サイエンス テクノロジー カンパニー リミテッド Method and apparatus for labeling object, electronic device, computer readable storage medium and computer program
US20210209344A1 (en) * 2020-06-29 2021-07-08 Beijing Baidu Netcom Science And Technology Co., Ltd. Image recognition method and apparatus, device, and computer storage medium
CN112347895A (en) * 2020-11-02 2021-02-09 北京观微科技有限公司 Ship remote sensing target detection method based on boundary optimization neural network
CN113011390A (en) * 2021-04-23 2021-06-22 电子科技大学 Road pedestrian small target detection method based on image partition
CN113221895A (en) * 2021-05-31 2021-08-06 北京灵汐科技有限公司 Small target detection method, device, equipment and medium

Similar Documents

Publication Publication Date Title
CN109308490B (en) Method and apparatus for generating information
CN109242801B (en) Image processing method and device
CN111784712B (en) Image processing method, device, equipment and computer readable medium
CN109947989B (en) Method and apparatus for processing video
CN110516678B (en) Image processing method and device
CN111598006B (en) Method and device for labeling objects
CN110059623B (en) Method and apparatus for generating information
CN110084317B (en) Method and device for recognizing images
CN112907628A (en) Video target tracking method and device, storage medium and electronic equipment
CN117690063A (en) Cable line detection method, device, electronic equipment and computer readable medium
CN110084298B (en) Method and device for detecting image similarity
CN112418054B (en) Image processing method, apparatus, electronic device, and computer readable medium
CN111586295B (en) Image generation method and device and electronic equipment
CN109819026B (en) Method and device for transmitting information
CN111815654A (en) Method, apparatus, device and computer readable medium for processing image
CN113808134B (en) Oil tank layout information generation method, oil tank layout information generation device, electronic apparatus, and medium
CN113705565A (en) Ship detection method, device, electronic equipment and computer readable medium
CN115086541B (en) Shooting position determining method, device, equipment and medium
CN111488928B (en) Method and device for acquiring samples
CN112418233B (en) Image processing method and device, readable medium and electronic equipment
CN115082516A (en) Target tracking method, device, equipment and medium
CN112712070A (en) Question judging method and device for bead calculation questions, electronic equipment and storage medium
CN115086538A (en) Shooting position determining method, device, equipment and medium
CN110991312A (en) Method, apparatus, electronic device, and medium for generating detection information
CN111680754A (en) Image classification method and device, electronic equipment and computer-readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination