CN116030298A - Scene complexity classification method, storage medium and device for ship navigation image

Scene complexity classification method, storage medium and device for ship navigation image

Info

Publication number
CN116030298A
Authority
CN
China
Prior art keywords
complexity
image
ship navigation
scene
preset
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211656202.2A
Other languages
Chinese (zh)
Inventor
石兵华
李睿恒
何舟
易娜
吴家兴
曹盼
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
HUBEI UNIVERSITY OF ECONOMICS
Original Assignee
HUBEI UNIVERSITY OF ECONOMICS
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by HUBEI UNIVERSITY OF ECONOMICS filed Critical HUBEI UNIVERSITY OF ECONOMICS
Priority to CN202211656202.2A
Publication of CN116030298A

Landscapes

  • Image Processing (AREA)

Abstract

The invention discloses a scene complexity classification method, a storage medium and a device for ship navigation images. The method comprises the following steps: determining a dataset of ship navigation images according to preset scene complexity classification levels of ship navigation images and preset complexity vectors of ship navigation images; training a preset image complexity classification model on the dataset of ship navigation images to obtain a target image complexity classification model; and classifying a dataset of target ship navigation images with the target image complexity classification model to obtain the complexity level of each target ship navigation image. Because the dataset of ship navigation images encodes the correspondence between complexity classification levels and complexity vectors, the trained target image complexity classification model can perceive the complexity of ship navigation scenes; the complexity level of a target ship navigation image can therefore be obtained through the model, which alleviates the long-tail effect in visual-image-based navigation scene construction.

Description

Scene complexity classification method, storage medium and device for ship navigation image
Technical Field
The invention relates to the technical field of intelligent ships, in particular to a scene complexity classification method, a storage medium and a device for ship navigation images.
Background
At present, unmanned and intelligent operation has become the mainstream direction of ship development. To achieve truly remote, automatic, or autonomous driving of intelligent ships, massive real-ship test data are required to verify the safety and reliability of the navigation system.
The visual-image-based navigation scene construction technology offers a mature scheme, low cost, and a high degree of scene restoration, so it is widely used to reconstruct ship navigation scenes in an all-round, multi-view manner. The biggest challenge in this construction process is that navigation scenes which occur with small probability but carry high risk are difficult to cover, i.e., the "long-tail problem".
The foregoing is provided merely for the purpose of facilitating understanding of the technical solutions of the present invention and is not intended to represent an admission that the foregoing is prior art.
Disclosure of Invention
The invention mainly aims to provide a scene complexity classification method, a storage medium and a device for ship navigation images, and aims to solve the technical problem of the long-tail effect existing in visual-image-based navigation scene construction.
In order to achieve the above object, the present invention provides a scene complexity classification method of a ship navigation image, the scene complexity classification method of the ship navigation image comprising the steps of:
determining a data set of the ship navigation image according to the scene complexity classification level of the preset ship navigation image and the complexity vector of the preset ship navigation image;
training a preset image complexity classification model according to the data set of the ship navigation image to obtain a target image complexity classification model;
and classifying the data set of the target ship navigation image according to the complexity classification model of the target image to obtain the complexity level of the target ship navigation image.
Optionally, before the step of determining the dataset of the ship navigation image according to the scene complexity classification level of the preset ship navigation image and the complexity vector of the preset ship navigation image, the method further includes:
acquiring scene pictures of navigation along the ship according to the navigation route of the ship;
determining a complex factor influencing ship navigation according to the element characteristics of the scene picture;
and determining scene complexity classification grades of the preset ship navigation images according to the complexity factors.
Optionally, the step of determining a scene complexity classification level of the preset ship navigation image according to the complexity factor includes:
and arranging and combining the complex factors, and determining scene complexity classification levels of the preset ship navigation images according to the result of the arrangement and combination of the complex factors.
Optionally, after the step of determining the scene complexity classification level of the preset ship navigation image according to the complexity factor, the method further includes:
determining the gray-level distribution, information content, sharpness, local features and feature similarity of the scene image according to a preset image complexity algorithm and the scene image;
and determining a complexity vector of the preset ship navigation image according to the gray-level distribution, information content, sharpness, local features and feature similarity of the scene image.
Optionally, the step of training a preset image complexity classification model according to the dataset of the ship navigation image to obtain a target image complexity classification model includes:
initializing weights of the dataset of the ship navigation image;
training the preset image complexity classification model according to the dataset of the ship navigation image to obtain a target recognition layer in the preset image complexity classification model;
and determining the target image complexity classification model according to the target recognition layer.
Optionally, the step of training the preset image complexity classification model according to the dataset of the ship navigation image to obtain the target recognition layer in the preset image complexity classification model includes:
classifying complexity vectors of the dataset of the ship navigation image according to the preset image complexity classification model;
updating the weights of the dataset of the ship navigation image according to the classification result;
and when a preset number of classification rounds is reached, determining the target recognition layer in the preset image complexity classification model according to the updated weights.
Optionally, the step of determining the target recognition layer in the preset image complexity classification model according to the updated weights when the preset number of classification rounds is reached includes:
when the preset number of classification rounds is reached, updating the weights of the dataset of the ship navigation image and determining the confidence of the updated weights;
determining a first weight of each recognition layer in the preset image complexity classification model;
and determining the target recognition layer in the preset image complexity classification model according to the first weight and the confidence.
In addition, in order to achieve the above object, the present invention also proposes a scene complexity classification device for a ship navigation image, comprising a memory, a processor, and a scene complexity classification program of a ship navigation image stored on the memory and executable on the processor, the program being configured to implement the scene complexity classification method of a ship navigation image as described above.
In addition, in order to achieve the above object, the present invention also proposes a storage medium having stored thereon a scene complexity classification program of a ship navigation image, which when executed by a processor, implements the scene complexity classification method of a ship navigation image as described above.
In addition, in order to achieve the above object, the present invention also provides a scene complexity classification device for a ship navigation image, the scene complexity classification device for a ship navigation image comprising: the system comprises a data set determining module, a model determining module and a grading module;
the data set determining module is used for determining a data set of the ship navigation image according to the scene complexity classification level of the preset ship navigation image and the complexity vector of the preset ship navigation image;
the model determining module is used for training a preset image complexity classification model according to the dataset of the ship navigation image to obtain a target image complexity classification model;
the grading module is used for classifying the data set of the target ship navigation image according to the complexity classification model of the target image to obtain the complexity grade of the target ship navigation image.
The invention discloses a scene complexity classification method, a storage medium and a device for ship navigation images. The method comprises the following steps: determining a dataset of ship navigation images according to preset scene complexity classification levels of ship navigation images and preset complexity vectors of ship navigation images; training a preset image complexity classification model on the dataset of ship navigation images to obtain a target image complexity classification model; and classifying a dataset of target ship navigation images with the target image complexity classification model to obtain the complexity level of each target ship navigation image. Because the dataset of ship navigation images encodes the correspondence between complexity classification levels and complexity vectors, the trained target image complexity classification model can perceive the complexity of ship navigation scenes; the complexity level of a target ship navigation image can therefore be obtained through the model, which alleviates the long-tail effect in visual-image-based navigation scene construction.
Drawings
Fig. 1 is a schematic structural diagram of a scene complexity classification device for ship navigation images of a hardware operation environment according to an embodiment of the present invention;
FIG. 2 is a flow chart of a first embodiment of a scene complexity classification method for ship navigation images according to the present invention;
FIG. 3 is a flow chart of a second embodiment of a scene complexity classification method for ship navigation images according to the present invention;
FIG. 4 is a flow chart of a third embodiment of a scene complexity classification method for ship navigation images according to the present invention;
FIG. 5 is a training structure diagram of a model for classifying the complexity of a preset image according to an embodiment of the method for classifying the complexity of a scene of a ship navigation image of the present invention;
fig. 6 is a block diagram illustrating a first embodiment of a scene complexity classification apparatus for ship navigation images according to the present invention.
The achievement of the objects, functional features and advantages of the present invention will be further described with reference to the accompanying drawings, in conjunction with the embodiments.
Detailed Description
It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the invention.
Referring to fig. 1, fig. 1 is a schematic view of a scene complexity classification device for a ship navigation image of a hardware operation environment according to an embodiment of the present invention.
As shown in fig. 1, the scene complexity classification device of a ship navigation image may include: a processor 1001, such as a central processing unit (Central Processing Unit, CPU), a communication bus 1002, a user interface 1003, a network interface 1004, and a memory 1005. The communication bus 1002 is used to enable connection and communication between these components. The user interface 1003 may include a display (Display); optionally, the user interface 1003 may also include standard wired and wireless interfaces, and in the present invention the wired interface of the user interface 1003 may be a USB interface. The network interface 1004 may optionally include a standard wired interface and a wireless interface (e.g., a Wireless-Fidelity (Wi-Fi) interface). The memory 1005 may be a high-speed random access memory (Random Access Memory, RAM) or a non-volatile memory (NVM), such as a disk memory. Optionally, the memory 1005 may also be a storage device separate from the processor 1001.
It will be appreciated by those skilled in the art that the structure shown in fig. 1 does not constitute a limitation of the scene complexity classification device of the ship navigation image, and may include more or fewer components than illustrated, or may combine certain components, or may be a different arrangement of components.
As shown in fig. 1, the memory 1005, as a computer storage medium, may include an operating system, a network communication module, a user interface module, and a scene complexity classification program for a ship navigation image.
In the scene complexity classification device of the ship navigation image shown in fig. 1, the network interface 1004 is mainly used for connecting a background server, and performing data communication with the background server; the user interface 1003 is mainly used for connecting user equipment; the scene complexity classification device of the ship navigation image invokes a scene complexity classification program of the ship navigation image stored in the memory 1005 through the processor 1001, and executes the scene complexity classification method of the ship navigation image provided by the embodiment of the invention.
Based on the hardware structure, the embodiment of the scene complexity classification method of the ship navigation image is provided.
Referring to fig. 2, fig. 2 is a flowchart of a first embodiment of a scene complexity classification method for a ship navigation image according to the present invention, and the first embodiment of the scene complexity classification method for a ship navigation image according to the present invention is provided.
Step S10: and determining a data set of the ship navigation image according to the scene complexity classification level of the preset ship navigation image and the complexity vector of the preset ship navigation image.
It should be noted that the execution body of this embodiment may be a computing device having data processing, network communication, and program execution functions, for example, a scene complexity classification device for ship navigation images, or another electronic device capable of implementing the same or similar functions; this embodiment is not limited in this respect.
It should be understood that, in recent years, new ship technologies related to autonomous navigation have emerged continuously, and unmanned, intelligent operation has become the main direction of ship development. Advanced sensing, planning, decision-making and control methods at home and abroad support the research and development of intelligent ships. Intelligent ship development generally follows the path from "manned to unmanned" and from "dinghy to cargo ship". To achieve truly remote automatic or autonomous driving of an intelligent ship, massive real-ship test data are also required to verify the safety and reliability of its navigation system. Directly testing intelligent, remote, autonomous, unmanned navigation and other functions and performance of the ship under test in public waters is extremely inefficient and costly, and may even pose serious safety hazards to the ship and surrounding vessels. Therefore, before a real-ship test, virtual-real combined simulation technologies are usually required to build intelligent ship navigation scenes for simulation testing.
The visual-image-based navigation scene construction technology offers a mature scheme, low cost, and a high degree of scene restoration, so it is widely used to reconstruct ship navigation scenes in an all-round, multi-view manner. The biggest challenge in this construction process is that navigation scenes which occur with small probability but carry high risk are difficult to cover, i.e., the "long-tail problem". To essentially address the long-tail effect of scene coverage, the diversity and complexity of constructed scenes need to be increased. Diversity can be achieved by manually selecting different test conditions; the complexity of navigation scenes, however, is a relatively subjective perception and requires a unified perception standard. For the intelligent ship test system as a whole, the complexity of a navigation scene reflects both the difficulty of constructing the scene and the navigation capability required of the ship under test.
Considering that navigation scenes are displayed as continuous image sequences or video, research on the complexity of intelligent ship navigation scenes can draw on related results in the field of image engineering. The main idea is to let a computer simulate human visual perception so as to quantitatively determine perceived image visual complexity. Common solutions include image complexity calculation methods based on information entropy, average information gain, and salient-region compression rate; these quantitatively perceive overall image characteristics such as gray level, color, edge, and texture, and finally obtain a complexity perception result using classification and regression methods such as machine learning and neural networks. Among these characteristics, texture is one of the most commonly used features in image content complexity calculation. For example, the literature describes image complexity using the regularity, orientation, density, roughness, and familiarity of texture features; the literature also proposes using a BP neural network to train weight coefficients for five indices (energy, entropy, contrast, inverse difference moment, and correlation) and building an image complexity perception model on that basis, and experimental results show that the evaluation results of such a model can accurately describe the complexity of an image.
In order to overcome the above drawbacks, this embodiment analyzes the complexity of ship navigation scenes as follows. Typical complex scenes are considered and sorted out; the scene complexity classification levels of the preset ship navigation images are obtained by grading these typical complex scenes, and the complexity vector of each preset ship navigation image is obtained from the typical complex scene pictures by the gray-level co-occurrence matrix (GLCM) method. The dataset of ship navigation images is then determined from the scene complexity classification levels and the complexity vectors of the preset ship navigation images, so that the classification levels and the complexity vectors in the dataset have a one-to-one correspondence. The preset image complexity classification model is trained on this dataset; the resulting target image complexity classification model can perceive complex ship navigation scenes, that is, it can grade their complexity, and the complexity level of a target ship navigation image is obtained from the target image complexity classification model.
It should be noted that the long-tail effect means that navigation scenes with a small probability of occurrence and high risk are difficult to cover. To essentially solve the long-tail effect of scene coverage, the diversity and complexity of constructed scenes must be increased, so image complexity needs to be estimated through an image complexity perception model. However, most methods in the field of image engineering describe complexity qualitatively from the perspective of the overall image, and results focused on analyzing the complexity of intelligent ship navigation scenes are lacking. Moreover, no literature has yet studied the association between image texture features and navigation scene complexity. In addition, during remote assisted-driving tests of ships, the complexity of images or videos is easily affected by illumination, reflection from object surfaces, internal performance parameters of the imaging sensor, and other factors; in particular, the complexity of scenes under different navigation conditions varies greatly, which increases the difficulty of scene complexity calculation and perception. The target image complexity classification model obtained by training can therefore perceive complex ship navigation scenes, that is, grade their complexity, and thus alleviate the long-tail effect existing in visual-image-based navigation scene construction.
It should be noted that, because a sailing ship is affected by the surrounding environment, weather and other factors in its sea area, the images collected in a navigation scene contain many uncertain factors, and hence many complex scenes arise. These complex scenes differ in degree, so the complexity of ship navigation scenes needs to be graded in combination with typical scene influence factors.
It can be appreciated that there may be four scene complexity classification levels for the preset ship navigation images, for example: a non-complex scene, a slightly complex scene, a moderately complex scene, and a very complex scene; this embodiment is not limited thereto.
It should be noted that texture features in an intelligent ship navigation scene are an important basis for judging whether the scene is complex. Texture information of an image can be obtained by the gray-level co-occurrence matrix method, which computes characteristic parameters of the image from its gray-level co-occurrence matrix; the characteristic parameters obtained from the gray-level co-occurrence matrix form the complexity vector of the preset ship navigation image.
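The co-occurrence-matrix step above can be sketched as follows. This is a minimal illustration, not the patent's implementation: the quantization to 8 gray levels, the single (1, 0) pixel offset, and all function names are assumptions; the five texture parameters (energy, entropy, contrast, inverse difference moment, correlation) match the indices named elsewhere in this description.

```python
import numpy as np

def glcm(img, levels, dx=1, dy=0):
    """Normalized symmetric gray-level co-occurrence matrix for offset (dx, dy)."""
    m = np.zeros((levels, levels), dtype=np.float64)
    h, w = img.shape
    for yy in range(h - dy):
        for xx in range(w - dx):
            m[img[yy, xx], img[yy + dy, xx + dx]] += 1.0
    m += m.T                                  # count each pixel pair in both directions
    return m / m.sum()

def complexity_vector(img, levels=8):
    """5-D texture vector: energy, entropy, contrast, IDM, correlation."""
    p = glcm(np.asarray(img), levels)
    i, j = np.indices(p.shape)
    energy = (p ** 2).sum()                   # angular second moment
    nz = p[p > 0]
    entropy = -(nz * np.log2(nz)).sum()
    contrast = (p * (i - j) ** 2).sum()
    idm = (p / (1.0 + (i - j) ** 2)).sum()    # inverse difference moment
    mu_i, mu_j = (i * p).sum(), (j * p).sum()
    sd_i = np.sqrt((((i - mu_i) ** 2) * p).sum())
    sd_j = np.sqrt((((j - mu_j) ** 2) * p).sum())
    corr = (((i - mu_i) * (j - mu_j)) * p).sum() / (sd_i * sd_j + 1e-12)
    return np.array([energy, entropy, contrast, idm, corr])

# Stand-in for an 8-level quantized navigation frame
frame = np.random.default_rng(0).integers(0, 8, size=(64, 64))
print(complexity_vector(frame))
```

A real pipeline would first quantize each grayscale frame to `levels` bins (e.g. `(gray / 256 * levels).astype(int)`) and could average several offsets and angles, as is common with co-occurrence features.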
It should be noted that there is a one-to-one correspondence between the scene complexity classification level of a ship navigation image and its complexity vector in the dataset of ship navigation images, i.e. S = {(x_i, y_i)}, where x_i is the complexity vector of the i-th ship navigation image and

y_i ∈ {L0, L1, L2, L3}, i = 1, 2, ..., N,

where L0 is a non-complex scene, L1 is a slightly complex scene, L2 is a moderately complex scene, and L3 is a very complex scene.
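Concretely, the labelled set S can be assembled by pairing each complexity vector with its annotated level. A minimal sketch follows; the numeric vectors, annotations, and helper name are purely illustrative assumptions, not values from the patent.

```python
LEVELS = ("L0", "L1", "L2", "L3")  # non-complex .. very complex

def build_dataset(vectors, labels):
    """Pair each complexity vector x_i with its annotated scene level y_i."""
    if len(vectors) != len(labels):
        raise ValueError("each vector needs exactly one level label")
    for lbl in labels:
        if lbl not in LEVELS:
            raise ValueError("unknown level: " + lbl)
    return list(zip(vectors, labels))

# Two made-up 5-D complexity vectors with hand-annotated levels
S = build_dataset(
    [[0.91, 0.40, 1.20, 0.85, 0.30],   # e.g. smooth open-water frame -> L0
     [0.12, 3.10, 9.70, 0.22, 0.05]],  # e.g. cluttered port frame    -> L3
    ["L0", "L3"],
)
print(S[1][1])
```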
The dataset of ship navigation images includes the SMD dataset, the MODD dataset, and the YRNSD dataset (a self-collected dataset). The SMD dataset was collected in Singapore waters from July 2015 to May 2016 under various environmental conditions, such as before sunrise (40 minutes before sunrise), sunrise, noon, afternoon, evening, after sunset (2 hours after sunset), haze, and rainfall. The MODD dataset was acquired in a gulf on the Slovenian coast using a camera fixed on an unmanned boat, over a time span of about 15 months; the camera captured video of a given resolution at 10 frames per second. The self-collected YRNSD dataset was taken on a section of the Yangtze River and comprises 64 videos covering various types of obstacles and meteorological conditions. Based on whether a frame image contains typical complex scene elements and on their number, the three datasets were graded for complexity by manual classification; the corresponding grading results are shown in Table 1 (dataset sizes are given in frames):
Table 1 - Dataset and complexity grading table (the table contents appear as an image in the original publication and are not reproduced here)
The experimental environment of this example was an Intel Core i7-8700K CPU @ 3.70 GHz (12 threads), an NVIDIA GeForce GTX 1080Ti GPU, and 32 GB RAM. To ensure training efficiency and save computing resources, every frame in the datasets was unified to a resolution of 500 x 280 before classification and grading, and after texture feature parameter extraction the network input is a 5-dimensional feature vector. The above datasets were randomly divided into training and test sample sets at 70% and 30%. The classification accuracy ACC on the test set is taken as the evaluation index of the final navigation scene complexity perception effect:
ACC = (TP_1 + TP_2 + ... + TP_k) / Total

where TP_i represents the number of test samples of level i whose navigation scene complexity is correctly perceived, Total represents the total number of navigation scenes in the test sample set, and k represents the number of complexity levels to be perceived.
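The ACC metric above is plain multi-class accuracy: correctly perceived samples are summed over all k levels and divided by the total. A small sketch, with illustrative labels and values:

```python
def scene_acc(y_true, y_pred):
    """ACC = (sum over all k levels of correctly perceived samples) / Total."""
    if not y_true or len(y_true) != len(y_pred):
        raise ValueError("label lists must be non-empty and equal in length")
    correct = sum(t == p for t, p in zip(y_true, y_pred))
    return correct / len(y_true)

# Four of five test scenes perceived at the right complexity level
print(scene_acc(["L0", "L1", "L2", "L3", "L2"],
                ["L0", "L1", "L2", "L1", "L2"]))  # 0.8
```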
Step S20: training a preset image complexity classification model according to the data set of the ship navigation image to obtain a target image complexity classification model.
It should be noted that the preset image complexity classification model may be an ensemble learning network model; complexity level classification is performed through the ensemble learning network model, that is, multiple machine learning models are combined according to a certain strategy to obtain a classifier stronger than any single model. The AdaBoost algorithm in the preset image complexity classification model is a typical ensemble algorithm: sample subsets are obtained by operating on the sample set, a series of weak classifiers is generated by training a weak classification algorithm on these subsets, and finally the weak classifiers are combined into a classifier with stronger performance. A navigation scene complexity perception model based on the AdaBoost algorithm can model the nonlinear process of a multi-input multi-output classification problem, so that the ensemble framework and the weak classifiers are combined more tightly, which can improve the scientific soundness and effectiveness of the intelligent ship's perception of the current navigation environment complexity.
In a specific implementation, the weights of the dataset of ship navigation images are initialized; a weak classifier of the preset image complexity classification model is trained on the dataset; when the number of training rounds reaches a threshold, the sample weights of the dataset and their confidence are updated; the weight of each weak classifier is calculated from the sample weights; and the weak classifiers are combined by weighted summation into a strong classifier, thereby obtaining the target image complexity classification model.
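The loop just described (initialize sample weights, fit weak classifiers, reweight misclassified samples, combine the weak classifiers by weighted voting) can be sketched as a minimal multi-class AdaBoost of the SAMME flavor with decision stumps as weak classifiers. This is an illustrative reconstruction under assumptions, not the patent's actual implementation; the function names, the stump weak learner, and the synthetic four-level data are all invented for the example.

```python
import numpy as np

def weighted_mode(y, w, classes):
    """Class with the largest total sample weight (classes[0] if empty)."""
    if y.size == 0:
        return classes[0]
    return classes[int(np.argmax([w[y == c].sum() for c in classes]))]

class Stump:
    """Depth-1 decision stump: the weak classifier of the ensemble."""
    def fit(self, X, y, w, classes):
        best_err = np.inf
        for f in range(X.shape[1]):
            for thr in np.unique(X[:, f]):
                mask = X[:, f] <= thr
                left = weighted_mode(y[mask], w[mask], classes)
                right = weighted_mode(y[~mask], w[~mask], classes)
                pred = np.where(mask, left, right)
                err = w[pred != y].sum()
                if err < best_err:
                    best_err = err
                    self.f, self.thr, self.left, self.right = f, thr, left, right
        return self

    def predict(self, X):
        return np.where(X[:, self.f] <= self.thr, self.left, self.right)

def train_adaboost(X, y, rounds=20):
    """SAMME-style multi-class AdaBoost; returns a prediction function."""
    classes = np.unique(y)
    K, n = len(classes), len(y)
    w = np.full(n, 1.0 / n)                  # initialize sample weights uniformly
    ensemble = []
    for _ in range(rounds):
        stump = Stump().fit(X, y, w, classes)
        pred = stump.predict(X)
        err = max(w[pred != y].sum(), 1e-12)
        if err >= 1.0 - 1.0 / K:             # no better than random guessing: stop
            break
        alpha = np.log((1.0 - err) / err) + np.log(K - 1.0)  # weak-classifier weight
        w *= np.exp(alpha * (pred != y))     # boost weights of misclassified samples
        w /= w.sum()
        ensemble.append((alpha, stump))

    def predict_fn(Xq):
        votes = np.zeros((len(Xq), K))
        for alpha, stump in ensemble:
            idx = np.searchsorted(classes, stump.predict(Xq))
            votes[np.arange(len(Xq)), idx] += alpha   # weighted voting
        return classes[np.argmax(votes, axis=1)]

    return predict_fn

# Synthetic demo: four shifted 5-D clusters standing in for levels L0..L3
rng = np.random.default_rng(42)
X = rng.normal(size=(120, 5)) + 3.0 * np.repeat(np.arange(4), 30)[:, None]
y = np.repeat(np.arange(4), 30)
predict = train_adaboost(X, y, rounds=15)
print("training ACC:", float((predict(X) == y).mean()))
```

In a full experiment the same `predict` function would be applied to the held-out 30% test split to compute ACC; here it is only run on the training data to show the interface.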
Step S30: and classifying the data set of the target ship navigation image according to the complexity classification model of the target image to obtain the complexity level of the target ship navigation image.
It is understood that the dataset of the target vessel voyage image may be a test set divided from the dataset of the vessel voyage image.
It can be appreciated that the dataset of target ship navigation images is classified by the strong classifier of the target image complexity classification model, so that the pictures in the dataset are assigned to the corresponding complexity levels. This overcomes the difficulty of calculating and perceiving ship navigation scene complexity in remote assisted-driving tests caused by illumination, reflection from object surfaces, internal performance parameters of the imaging sensor, and other factors.
According to the embodiment, a dataset of ship navigation images is determined according to preset scene complexity classification levels and preset complexity vectors of ship navigation images; a preset image complexity classification model is trained on this dataset to obtain a target image complexity classification model; and a dataset of target ship navigation images is classified by the target image complexity classification model to obtain the complexity level of each target ship navigation image. Because the dataset encodes the correspondence between complexity classification levels and complexity vectors, the trained target image complexity classification model can perceive the complexity of ship navigation scenes; the complexity level of a target ship navigation image can therefore be obtained through the model, which alleviates the long-tail effect in visual-image-based navigation scene construction.
Referring to fig. 3, fig. 3 is a flowchart illustrating a second embodiment of the method for classifying scene complexity of a ship navigation image according to the present invention, and based on the first embodiment shown in fig. 2, the second embodiment of the method for classifying scene complexity of a ship navigation image according to the present invention is provided.
In a second embodiment, before the step S10, the method includes:
step S00: and obtaining scene pictures of the navigation along the ship according to the navigation route of the ship.
It will be appreciated that, in order to solve the long tail effect, it is necessary to obtain as many as possible of the complex scenes encountered by the ship during navigation. Different influencing factors exist on the ship's navigation route, which may be buildings on shore, weather, or equipment for sensing environmental complexity; this embodiment does not limit them. Therefore, it is necessary to acquire pictures in these complex scenes for analysis.
It should be understood that the scene pictures along the ship's voyage may be acquired by manual shooting and collection, or captured by the ship's own camera equipment during navigation.
Step S01: and determining a complex factor influencing ship navigation according to the element characteristics of the scene picture.
It can be appreciated that whether a navigation scene belongs to open waters or complex waters can be judged from the scene pictures acquired during the ship's voyage. Open waters generally refer to open ocean areas with a wide field of view, while complex waters generally refer to heavy-traffic inland rivers, ports, etc. Typical complex scenes usually occur in complex waters because many uncertainty factors exist there; classifying these factors yields the complexity factors contained in a typical complex scene.
It should be noted that the complexity factors contained in a typical complex scene may include: (1) complex background element: the background occupies a certain proportion of the scene picture, and when the background is complex, the pixel values of the corresponding region vary in a complex manner. For example, distant buildings, green belts and vessels moored in the water serve no purpose other than increasing scene complexity, and instead interfere with the ship's perception of its environment. (2) camera shake element: the shake and tilt of the camera itself should be taken into account when constructing scenes from visual images. For example, even a stationary ship is subject to surface wind, waves and currents, causing camera shake, and this is more pronounced when the ship is moving. Camera shake can tilt, rotate or even distort the scene image, increasing the complexity of the navigation scene. (3) drop shadow element: the sun, clouds, clusters of shoreside buildings and vessels moving in the scene all create shadows, e.g. the reflection of shoreside building clusters on the water surface. By type, shadows can be classified into self-shadows and drop shadows: a self-shadow is formed on an object itself when part of the direct light is blocked; a drop shadow is the shadow an object casts onto the surfaces of other objects when light is blocked. These shadows cast onto the scene image form alternately bright and dark pixel blocks, causing the navigation scene complexity to vary.
(4) light variation element: stable light intensity is critical to the scene. In practice, however, natural light changes, camera exposure, and the turning on or off of other vessels' lights, etc., cause gradual or abrupt changes in the overall brightness of the navigation scene picture; for example, sunset afterglow forms overly bright pixel blocks on the water surface. Such light changes have a significant impact on the complexity of the navigation scene. (5) special weather element: special meteorological conditions such as night, rain, snow and haze affect the complexity of navigation scenes, and different weather types lead to different scene appearances. For example, raindrops and snowflakes in rainy and snowy weather are easily captured and interfere with the perception of other targets, while haze reduces imaging contrast, blurs the image and loses detail.
Step S02: and determining scene complexity classification grades of the preset ship navigation images according to the complexity factors.
It should be noted that the complexity factors are combined according to the actual ship navigation situation to form the scene complexity classification levels of the preset ship navigation image, which may be divided into: (1) L0 non-complex scene (Non-complexity): contains no typical complex scene element. (2) L1 slightly complex scene (Low complexity): contains any one typical complex scene element. (3) L2 moderately complex scene (Middle complexity): contains any two typical complex scene elements. (4) L3 highly complex scene (High complexity): contains any three typical complex scene elements.
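As an illustration, the mapping from the number of typical complex scene elements in a frame to the L0 to L3 grades described above can be sketched as follows (a hypothetical helper; the function name and the treatment of more than three elements as L3 are assumptions, not part of the patent):

```python
def complexity_level(n_elements):
    """Map the count of distinct typical complex scene elements to a grade.

    Hypothetical helper: frames with more than three elements are
    assumed to also fall into the highest grade L3.
    """
    levels = ["L0 non-complex", "L1 slightly complex",
              "L2 moderately complex", "L3 highly complex"]
    return levels[min(n_elements, 3)]
```

For example, a frame containing only a drop shadow element maps to L1, while one containing a complex background, a drop shadow and camera shake maps to L3.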
Further, in order to solve the long tail effect occurring in navigation scene construction of the visual image, step S02 of this embodiment may include:
and arranging and combining the complex factors, and determining scene complexity classification levels of the preset ship navigation images according to the result of the arrangement and combination of the complex factors.
It should be noted that if the permutation and combination of the complexity factors contains no obstacle occlusion, or only small obstacles that are far away and do not affect the safe navigation of the ship, the scene may be classified as non-complex; if it contains a drop shadow element, the scene may be classified as slightly complex; if it contains both a complex background element and a drop shadow element, the scene may be classified as moderately complex; and if it contains a complex background element, a drop shadow element and a dynamic interference element at the same time, the scene is classified as highly complex. This embodiment does not limit the results of the permutation and combination of the complexity factors; different permutation results may be assigned to different complexity classes.
Further, in order to solve the long tail effect occurring in navigation scene construction of the visual image, step S02 of this embodiment may include:
determining gray level distribution conditions, information content, definition, local features and feature similarity of the scene image according to a preset image complexity algorithm and the scene image;
and determining a complexity vector of a preset ship navigation image according to the gray level distribution condition, the information content, the definition, the local characteristics and the characteristic similarity of the scene image.
It should be noted that texture features in intelligent ship navigation scenes are an important reference for whether a scene is complex. Drawing on existing work in the field of image complexity, the texture feature information of an image can be obtained by the gray level co-occurrence matrix method, which derives feature parameters by computing the gray level co-occurrence matrix of an image. Haralick proposed 14 statistical feature quantities to describe the texture complexity of different images, specifically: energy, entropy, contrast, homogeneity, correlation, variance, sum average, sum variance, sum entropy, difference variance, difference average, difference entropy, information measures of correlation, and maximal correlation coefficient. However, repetition and redundancy exist among these feature quantities. To address this, the embodiment screens out 5 texture feature quantities that are weakly correlated and easy to compute, namely energy, entropy, contrast, inverse difference moment and correlation, and uses them for intelligent ship navigation scene complexity perception training and calculation.
In a specific implementation, assume a navigation scene image $I$ of size $M \times N$, and let $(x_1, y_1)$ and $(x_2, y_2)$ be two pixels in $I$ separated by distance $d$ in direction $\theta$. The gray level co-occurrence matrix of the navigation scene image $I$ is calculated as:

$$P(i, j, d, \theta) = \{(x_1, y_1), (x_2, y_2) \in M \times N \mid I(x_1, y_1) = i,\ I(x_2, y_2) = j\}$$
Energy (ASM) is often used to describe the uniformity of the gray level distribution of a navigation scene image. When the elements of the gray level co-occurrence matrix are concentrated near the main diagonal, a larger ASM indicates that the pixel gray level distribution is more uniform and the texture finer; otherwise, the pixel gray level distribution is uneven and the texture coarse. ASM is calculated as:
$$\mathrm{ASM} = \sum_{i} \sum_{j} P(i, j)^2$$
Entropy (ENT) describes the amount of information contained in the navigation scene image. If the scene image contains no texture features, its gray level co-occurrence matrix is a zero matrix and the corresponding entropy value is zero; conversely, the more texture information the scene contains, the larger the corresponding entropy value. ENT is calculated as:
$$\mathrm{ENT} = -\sum_{i} \sum_{j} P(i, j) \log P(i, j)$$
Contrast (CON) reflects the depth of the grooves of the image texture and the clarity of the image. In a given navigation scene, the clearer the image texture, the larger the difference between adjacent gray level pairs and the larger the CON value; conversely, the smaller the CON value. CON is calculated as:
$$\mathrm{CON} = \sum_{i} \sum_{j} (i - j)^2 P(i, j)$$
The inverse difference moment (IDM) is a statistical feature quantity reflecting the degree of local variation of the image texture. A larger IDM value indicates smaller variation between the textures of different regions in the navigation scene; conversely, the variation between the textures of different regions is larger. IDM is calculated as:
$$\mathrm{IDM} = \sum_{i} \sum_{j} \frac{P(i, j)}{1 + (i - j)^2}$$
Correlation (COV) measures the similarity of the elements of the gray level co-occurrence matrix in the row or column direction. When the row or column similarity is high, the COV value is larger and the complexity of the corresponding scene image is smaller; otherwise, the complexity is larger. COV is calculated as:
$$\mathrm{COV} = \frac{\sum_{i} \sum_{j} i \cdot j \cdot P(i, j) - u_1 u_2}{\delta_1 \delta_2}$$
where $u_1$ and $u_2$ respectively denote the means of the elements of the normalized gray level co-occurrence matrix along the row and column directions, and $\delta_1$ and $\delta_2$ respectively denote the corresponding mean square deviations.
It can be understood that, for any intelligent ship navigation scene image, the 5 different feature parameters can be extracted and combined into a texture feature vector $E = [\mathrm{ASM}, \mathrm{ENT}, \mathrm{CON}, \mathrm{IDM}, \mathrm{COV}]$.
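For illustration, the five texture feature quantities can be computed from a gray level co-occurrence matrix with NumPy (a minimal sketch; the quantization to 8 gray levels, the fixed horizontal direction θ = 0, and all function names are assumptions, not taken from the patent):

```python
import numpy as np

def glcm(img, d=1, levels=8):
    """Normalized gray level co-occurrence matrix for a horizontal
    offset d (direction theta = 0 assumed)."""
    # Quantize 8-bit gray values down to `levels` gray levels.
    q = (img.astype(np.float64) / 256.0 * levels).astype(int)
    q = np.clip(q, 0, levels - 1)
    P = np.zeros((levels, levels))
    for a, b in zip(q[:, :-d].ravel(), q[:, d:].ravel()):  # pixel pairs at distance d
        P[a, b] += 1
    return P / P.sum()                                     # normalize to probabilities

def texture_vector(P):
    """E = [ASM, ENT, CON, IDM, COV] from a normalized co-occurrence matrix."""
    i, j = np.indices(P.shape)
    asm = (P ** 2).sum()                                   # energy
    ent = -(P[P > 0] * np.log(P[P > 0])).sum()             # entropy
    con = ((i - j) ** 2 * P).sum()                         # contrast
    idm = (P / (1.0 + (i - j) ** 2)).sum()                 # inverse difference moment
    u1, u2 = (i * P).sum(), (j * P).sum()                  # row/column means
    d1 = np.sqrt((((i - u1) ** 2) * P).sum())              # row/column deviations
    d2 = np.sqrt((((j - u2) ** 2) * P).sum())
    cov = ((i * j * P).sum() - u1 * u2) / (d1 * d2)        # correlation
    return np.array([asm, ent, con, idm, cov])
```

A frame with fine, uniform texture yields a concentrated co-occurrence matrix (high ASM, low ENT), while a cluttered navigation scene spreads the matrix out (low ASM, high ENT).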
According to the embodiment, a scene picture of ship navigation along the way is obtained according to a ship navigation route; determining a complex factor influencing ship navigation according to the element characteristics of the scene picture; and determining scene complexity classification grades of the preset ship navigation images according to the complexity factors. According to the method, the complex factors influencing ship navigation are determined according to the scene pictures of ship navigation along the way, the complex factors are arranged and combined, the arranged and combined results are classified into scene complexity classification grades of preset ship navigation images, and intelligent ship navigation scenes are subjected to complexity division in the technical field of intelligent ships, so that the calculation and perception difficulties of the complex scenes are reduced.
Referring to fig. 4, fig. 4 is a flowchart illustrating a third embodiment of the method for classifying scene complexity of a ship navigation image according to the present invention, and based on the first embodiment shown in fig. 2, the third embodiment of the method for classifying scene complexity of a ship navigation image according to the present invention is provided.
In a third embodiment, the step S20 includes:
step S201: initializing weights of the dataset of vessel voyage images.
It will be appreciated that the weights of the dataset of ship navigation images are initialized: for a dataset of N ship navigation images, the weight of each sample is initialized to D 1 (i) = 1/N.
Step S202: training a preset image complexity classification model according to the data set of the ship navigation image to obtain a target identification layer in the preset image complexity classification model.
The weak classifiers in the preset image complexity classification model are trained on the dataset of ship navigation images, and the weights of the dataset are adjusted; when the number of training rounds reaches the preset number, the weight of each weak classifier is calculated, and the weak classifiers are weighted and summed to obtain the strong classifier.
It should be noted that, the target recognition layer in the preset image complexity classification model may be a strong classifier in the preset image complexity classification model.
Further, in order to improve the efficiency of testing the data set, step S202 of this embodiment may further include:
classifying complexity vectors of the data set of the ship navigation image according to a preset image complexity classification model;
updating the weight of the dataset of the ship navigation image according to the classification result;
and when the preset classification times are reached, determining a target identification layer in the preset image complexity classification model according to the updated weight.
It should be noted that, in the dataset of ship navigation images, there is a one-to-one correspondence between the scene complexity classification level of the preset ship navigation image and the complexity vector of the preset ship navigation image. The weak classifier of the preset image complexity classification model therefore classifies the complexity vectors of the dataset; when the weak classifier assigns a ship navigation image to the correct complexity level according to its complexity vector, the weight of that image is reduced, and otherwise its weight is increased.
It can be understood that when the preset number of rounds is reached, the weight of each weak classifier is determined from the updated weights of the dataset of ship navigation images, and the weak classifiers are weighted and summed to obtain the target recognition layer in the preset image complexity classification model, i.e., the strong classifier in the preset image complexity classification model.
Further, in order to improve the efficiency of testing the data set, step S202 of this embodiment may further include:
when the preset classification times are reached, updating the weight of the data set of the ship navigation image and determining the confidence coefficient of the updated weight;
determining a first weight of each identification layer in the preset image complexity classification model;
and determining a target identification layer in the preset image complexity classification model according to the first weight and the confidence level.
Taking the t-th weak classifier (1 ≤ t ≤ T) as an example: when the number of training classifications reaches the preset number, the confidence is calculated from the dataset weights of the ship navigation images adjusted after the previous round of training, and the first weight of each recognition layer in the preset image complexity classification model is determined from the confidence, i.e., the weight of each weak classifier in the preset image complexity classification model is determined from the confidence, which can be expressed as follows:
$$\alpha_t = \lambda \left[ \ln \frac{1 - e_t}{e_t} + \ln (K - 1) \right]$$
where $\lambda$ is the learning rate, $e_t$ is the confidence, and $K$ represents the number of complexity levels to be perceived.
It can be appreciated that a weighted summation is performed according to the weight and confidence of each weak classifier to obtain the strong classifier, which can be expressed as follows:
$$H(x) = \arg\max_{k} \sum_{t=1}^{T} \alpha_t \cdot \mathbb{1}\left(h_t(x) = k\right)$$
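For illustration, the training procedure of steps S201 to S202 can be sketched as a minimal SAMME-style AdaBoost over texture feature vectors (a sketch under assumptions: decision stumps serve as the weak classifiers, the patent does not specify the weak learner, and all function names are hypothetical):

```python
import numpy as np

def fit_stump(X, y, w, K):
    """Weighted decision stump: pick the (feature, threshold, left/right class)
    combination with minimal weighted error. Hypothetical weak learner."""
    best_err, best = np.inf, None
    for f in range(X.shape[1]):
        for thr in np.unique(X[:, f]):
            left = X[:, f] <= thr
            cl = np.argmax([w[left & (y == k)].sum() for k in range(K)])
            cr = np.argmax([w[~left & (y == k)].sum() for k in range(K)])
            err = w[np.where(left, cl, cr) != y].sum()
            if err < best_err:
                best_err, best = err, (f, thr, cl, cr)
    return best

def stump_predict(params, X):
    f, thr, cl, cr = params
    return np.where(X[:, f] <= thr, cl, cr)

def train_samme(X, y, K, T=30, lam=1.0):
    w = np.full(len(X), 1.0 / len(X))     # step S201: each sample weight = 1/N
    stumps, alphas = [], []
    for _ in range(T):
        p = fit_stump(X, y, w, K)
        pred = stump_predict(p, X)
        e = w[pred != y].sum()            # weighted error, the confidence e_t
        if e <= 0 or e >= 1 - 1.0 / K:    # no edge over random guessing: stop
            break
        a = lam * (np.log((1 - e) / e) + np.log(K - 1))
        w = w * np.exp(a * (pred != y))   # raise weights of misclassified samples
        w = w / w.sum()
        stumps.append(p)
        alphas.append(a)
    return stumps, alphas

def predict_samme(stumps, alphas, X, K):
    """Strong classifier: weighted vote over the weak classifiers."""
    votes = np.zeros((len(X), K))
    for p, a in zip(stumps, alphas):
        votes[np.arange(len(X)), stump_predict(p, X)] += a
    return votes.argmax(axis=1)
```

The design choice here is that each misclassified sample's weight grows by a factor of exp(α_t), so later stumps concentrate on the complexity levels that earlier stumps got wrong.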
step S203: and determining a target image complexity classification model according to the target recognition layer.
It can be understood that the scene complexity classification level of each ship navigation image in the test set can be obtained by classifying the test set with the strong classifier in the preset image complexity classification model. For ease of understanding, refer to fig. 5, which is a training structure diagram of the preset image complexity classification model.
It should be noted that the target image complexity classification model based on the AdaBoost algorithm does not require manually weighing the importance of each texture feature index, which avoids the subjectivity of assigning weights to the feature indices. Texture feature indices of 18 frames of navigation scene images were randomly selected from the 3 navigation scene datasets as test samples and tested with the trained scene complexity perception model; the test results are shown in Table 2. Each feature index in Table 2 is a normalized value distributed in (0, 1), and the reference value is the complexity perception result based on human vision. As can be seen from Table 2, the complexity perception model proposed in this embodiment is substantially consistent with the human visual perception results.
Table 2: Complexity perception result table

[Table 2 appears as an image in the original document; it lists the normalized texture feature indices and the perceived complexity level for the 18 test frames alongside the human-vision reference.]
To further verify and compare the effectiveness of the method proposed in this embodiment, several other typical multi-classifiers were selected: the BP neural network, k-nearest neighbors (kNN), the support vector machine (SVM) and the convolutional neural network (CNN). Their perception classification results were compared, and the results are shown in Table 3. The results show that these algorithms all achieve high recognition accuracy when perceiving navigation scene complexity. The comparison shows that the algorithm presented herein outperforms the other algorithms in both accuracy and training time.
Table 3: Algorithm comparison table

[Table 3 appears as an image in the original document; it compares the accuracy and training time of the proposed algorithm against the BP neural network, kNN, SVM and CNN classifiers.]
From the results of Table 3, the 3 datasets contain different numbers of frame images, and the accuracy of the final complexity perception grading results also differs slightly. In an actual navigation scene test, the actual distribution of scene complexity is balanced. In the experiments of this embodiment, however, both the classical datasets and the self-collected dataset are concentrated on scenes of high complexity, and non-complex scenes account for a relatively small share. Taking the self-collected data YRNSD as an example, the Yangtze River basin leg belongs to a typical inland water area, and the collected frame images are characterized by heavy traffic, complex backgrounds on both banks and a narrow channel, so the proportion of non-complex scenes in the dataset is small.
To balance the data samples, data augmentation operations of rotation, translation, scaling and mirroring were performed on the non-complex navigation scenes in the original 3 datasets, enlarging the data samples 5-fold. However, re-clustering and recomputing on the enlarged datasets did not significantly improve navigation scene complexity perception. The likely reason is the gray level co-occurrence matrix method applied in the texture extraction process: rotation, translation, scaling and mirroring only apply linear transformations to the image without substantially changing the distribution of textures within it. This also illustrates that the energy, entropy, contrast, inverse difference moment and correlation feature indices are insensitive to rotation, translation, scaling and mirroring of the navigation scene and can be used to perceive and distinguish navigation scenes of different complexity.
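As an illustration, the four augmentation operations mentioned above can be sketched with NumPy (a hypothetical helper; the function name, the shift amount, a fixed 90° rotation standing in for general rotation, and a center crop standing in for scaling are all assumptions):

```python
import numpy as np

def augment(img, shift=10):
    """Return four augmented variants of a scene frame (hypothetical sketch)."""
    h, w = img.shape[:2]
    return [
        np.rot90(img),                          # rotation (90 degrees)
        np.roll(img, shift, axis=1),            # translation (circular shift)
        img[h // 4: 3 * h // 4,
            w // 4: 3 * w // 4],                # scaling (center crop ~ zoom-in)
        np.fliplr(img),                         # mirroring (horizontal flip)
    ]
```

Applying all four variants to each non-complex frame, together with the original, yields the 5-fold enlargement of the non-complex class described above.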
This embodiment initializes the weights of the dataset of ship navigation images; trains a preset image complexity classification model on the dataset to obtain the target recognition layer in the preset image complexity classification model; and determines the target image complexity classification model from the target recognition layer. By training the preset image complexity classification model on the dataset of ship navigation images to obtain the target image complexity classification model, the test set can be classified according to the target image complexity classification model, thereby improving the efficiency of simulation tests that construct intelligent ship navigation scenes using various virtual-real combined simulation technologies.
In addition, the embodiment of the invention also provides a storage medium, wherein the storage medium is stored with a scene complexity classification program of the ship navigation image, and the scene complexity classification program of the ship navigation image realizes the scene complexity classification method of the ship navigation image when being executed by a processor.
In addition, referring to fig. 6, an embodiment of the present invention further provides a device for classifying scene complexity of a ship navigation image, where the device for classifying scene complexity of a ship navigation image includes: a data set determining module 10, a model determining module 20, a grading module 30;
the data set determining module 10 is configured to determine a data set of the ship navigation image according to a scene complexity classification level of the preset ship navigation image and a complexity vector of the preset ship navigation image;
the model determining module 20 is configured to train a preset image complexity classification model according to the data set of the ship navigation image, so as to obtain a target image complexity classification model;
the grading module 30 is configured to classify the dataset of the target ship navigation image according to the complexity classification model of the target image, so as to obtain the complexity grade of the target ship navigation image.
According to the embodiment, a data set of the ship navigation image is determined according to the scene complexity classification level of the preset ship navigation image and the complexity vector of the preset ship navigation image; training a preset image complexity classification model according to a data set of the ship navigation image to obtain a target image complexity classification model; and classifying the data set of the target ship navigation image according to the complexity classification model of the target image to obtain the complexity level of the target ship navigation image. According to the embodiment, the complexity classification level and the corresponding relation of the complexity vector exist in the data set of the ship navigation image, so that the obtained target image complexity classification model can sense the complexity of the ship navigation scene, the complexity level of the target ship navigation image can be obtained through the model, and the long tail effect in the construction of the navigation scene of the visual image can be solved.
Based on the first embodiment of the scene complexity classification device for ship navigation images, a second embodiment of the scene complexity classification device for ship navigation images is provided.
In this embodiment, the device for classifying scene complexity of ship navigation images further includes a level determining module 00.
In this embodiment, the level determining module 00 is configured to obtain a scene picture of the navigation along the ship according to the navigation route of the ship.
Further, the level determining module 00 is further configured to determine a complexity factor affecting ship navigation according to the element characteristics of the scene picture.
Further, the level determining module 00 is further configured to determine a scene complexity classification level of the preset ship navigation image according to the complexity factor.
Further, the level determining module 00 is further configured to rank and combine the complexity factors, and determine a scene complexity classification level of the preset ship navigation image according to a result of rank and combine the complexity factors.
Further, the level determining module 00 is further configured to determine a gray level distribution condition, an information content, a definition, a local feature and a feature similarity of the scene image according to a preset image complexity algorithm and the scene image.
Further, the level determining module 00 is further configured to determine a complexity vector of a preset ship navigation image according to the gray level distribution condition, the information content, the definition, the local features and the feature similarity of the scene image.
Further, the model determination module 20 is also configured to initialize weights of the dataset of the ship navigation image.
Further, the model determining module 20 is further configured to train a preset image complexity classification model according to the dataset of the ship navigation image, so as to obtain a target recognition layer in the preset image complexity classification model.
Further, the model determining module 20 is further configured to determine a target image complexity classification model according to the target recognition layer.
Further, the model determining module 20 is further configured to classify the complexity vector of the dataset of the ship navigation image according to a preset image complexity classification model.
Further, the model determining module 20 is further configured to update the weight of the dataset of the ship navigation image according to the classification result.
Further, the model determining module 20 is further configured to determine, when the preset classification times are reached, a target recognition layer in the preset image complexity classification model according to the updated weights.
Further, the model determining module 20 is further configured to update the weight of the dataset of the ship navigation image and determine the confidence level of the updated weight when the preset classification times are reached.
Further, the model determining module 20 is further configured to determine a first weight of each recognition layer in the preset image complexity classification model.
Further, the model determining module 20 is further configured to determine an object recognition layer in the preset image complexity classification model according to the first weight and the confidence level.
Other embodiments or specific implementation manners of the scene complexity classification device for ship navigation images according to the present invention may refer to the above method embodiments, and are not described herein again.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or system that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or system. Without further limitation, an element defined by the phrase "comprising one … …" does not exclude the presence of other like elements in a process, method, article, or system that comprises the element.
The foregoing embodiment numbers of the present invention are merely for the purpose of description, and do not represent the advantages or disadvantages of the embodiments.
From the above description of the embodiments, it will be clear to those skilled in the art that the method of the above embodiments may be implemented by means of software plus a necessary general hardware platform, and of course also by hardware, but in many cases the former is the preferred embodiment. Based on such understanding, the technical solution of the present invention, in essence or in the part contributing to the prior art, may be embodied in the form of a software product stored in a storage medium (e.g. read-only memory (ROM)/random access memory (RAM), magnetic disk, optical disk), comprising instructions for causing a terminal device (which may be a mobile phone, a computer, a server, an air conditioner, a network device, etc.) to perform the method according to the embodiments of the present invention.
The foregoing description is only of the preferred embodiments of the present invention, and is not intended to limit the scope of the invention, but rather is intended to cover any equivalents of the structures or equivalent processes disclosed herein or in the alternative, which may be employed directly or indirectly in other related arts.

Claims (10)

1. The scene complexity classification method of the ship navigation image is characterized by comprising the following steps of:
determining a data set of the ship navigation image according to the scene complexity classification level of the preset ship navigation image and the complexity vector of the preset ship navigation image;
training a preset image complexity classification model according to the data set of the ship navigation image to obtain a target image complexity classification model;
and classifying the data set of the target ship navigation image according to the complexity classification model of the target image to obtain the complexity level of the target ship navigation image.
2. The method for classifying scene complexity of a ship navigation image according to claim 1, wherein before the step of determining the dataset of the ship navigation image based on the scene complexity classification level of the preset ship navigation image and the complexity vector of the preset ship navigation image, further comprising:
acquiring scene pictures of navigation along the ship according to the navigation route of the ship;
determining a complex factor influencing ship navigation according to the element characteristics of the scene picture;
and determining scene complexity classification grades of the preset ship navigation images according to the complexity factors.
3. The scene complexity classification method of a ship navigation image according to claim 2, wherein the step of determining the scene complexity classification level of the preset ship navigation image according to the complexity factors comprises:
permuting and combining the complexity factors, and determining the scene complexity classification level of the preset ship navigation image according to the results of the permutation and combination.
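Claim 3's grading by combining complexity factors might, under the assumption that each subset of co-occurring factors defines one scene class, look like the following sketch. The factor names are hypothetical; the patent does not enumerate them.

```python
from itertools import combinations

# Hypothetical complexity factors influencing ship navigation.
factors = ["rain", "fog", "backlight", "dense_traffic"]

# Each subset of co-occurring factors defines one candidate scene class;
# here the classification level is simply the number of factors present.
scene_classes = [
    (subset, len(subset))
    for r in range(len(factors) + 1)
    for subset in combinations(factors, r)
]
print(len(scene_classes))  # 16 combinations of 4 factors
```

In practice the mapping from factor combinations to levels could weight factors unequally; counting factors is only the simplest choice.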
4. The scene complexity classification method of a ship navigation image according to claim 2, wherein, after the step of determining the scene complexity classification level of the preset ship navigation image according to the complexity factors, the method further comprises:
determining the gray-level distribution, information content, sharpness, local features and feature similarity of the scene image according to a preset image complexity algorithm and the scene image;
and determining the complexity vector of the preset ship navigation image according to the gray-level distribution, information content, sharpness, local features and feature similarity of the scene image.
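Claim 4 names five per-image measures but not their formulas. A sketch using common stand-ins for three of them — standard deviation for gray-level distribution, Shannon entropy for information content, gradient-magnitude variance for sharpness — might read as follows; local features and feature similarity are omitted because the patent gives no basis for choosing a particular detector.

```python
import numpy as np

def complexity_vector(gray):
    """Partial complexity vector for a grayscale image with values in [0, 255].
    These formulas are assumptions, not the patent's (undisclosed) algorithm."""
    hist, _ = np.histogram(gray, bins=256, range=(0, 256))
    p = hist / hist.sum()
    gray_spread = gray.std()                  # gray-level distribution
    p_nz = p[p > 0]
    entropy = -(p_nz * np.log2(p_nz)).sum()   # information content (bits)
    gy, gx = np.gradient(gray.astype(float))
    sharpness = np.hypot(gx, gy).var()        # sharpness via gradient magnitude
    return np.array([gray_spread, entropy, sharpness])

rng = np.random.default_rng(0)
v = complexity_vector(rng.integers(0, 256, size=(64, 64)))
print(v.shape)  # (3,)
```

Each measure is scalar, so concatenating them yields the per-image complexity vector the claims feed into the classification model.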
5. The scene complexity classification method of a ship navigation image according to claim 1, wherein the step of training a preset image complexity classification model according to the data set of the ship navigation image to obtain a target image complexity classification model comprises:
initializing weights of the data set of the ship navigation image;
training the preset image complexity classification model according to the data set of the ship navigation image to obtain a target recognition layer in the preset image complexity classification model;
and determining the target image complexity classification model according to the target recognition layer.
6. The scene complexity classification method of a ship navigation image according to claim 5, wherein the step of training the preset image complexity classification model according to the data set of the ship navigation image to obtain the target recognition layer in the preset image complexity classification model comprises:
classifying the complexity vectors of the data set of the ship navigation image according to the preset image complexity classification model;
updating the weights of the data set of the ship navigation image according to the classification results;
and when a preset number of classification passes is reached, determining the target recognition layer in the preset image complexity classification model according to the updated weights.
7. The scene complexity classification method of a ship navigation image according to claim 6, wherein the step of determining the target recognition layer in the preset image complexity classification model according to the updated weights when the preset number of classification passes is reached comprises:
when the preset number of classification passes is reached, updating the weights of the data set of the ship navigation image and determining the confidence of the updated weights;
determining a first weight of each recognition layer in the preset image complexity classification model;
and determining the target recognition layer in the preset image complexity classification model according to the first weights and the confidence.
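The procedure of claims 5 through 7 — initialize data-set weights, reweight the samples after each classification pass, and keep each recognition layer together with a confidence derived from the updated weights — closely resembles boosting. The following is a minimal AdaBoost-style sketch with one-feature threshold "layers"; treating the claims as boosting is an assumption, since the patent does not name the algorithm.

```python
import numpy as np

def train_boosted(X, y, rounds=5):
    """Initialize sample weights, reweight after each pass, and record each
    selected layer with its confidence (alpha) — mirroring claims 5-7."""
    n = len(y)
    w = np.full(n, 1.0 / n)                    # initialize data-set weights
    layers = []
    for _ in range(rounds):                    # preset number of passes
        best = None
        for f in range(X.shape[1]):            # exhaustive threshold "layers"
            for t in np.unique(X[:, f]):
                pred = np.where(X[:, f] <= t, 0, 1)
                err = w[pred != y].sum()
                if best is None or err < best[0]:
                    best = (err, f, t, pred)
        err, f, t, pred = best
        err = min(max(err, 1e-10), 1 - 1e-10)  # avoid log(0)
        alpha = 0.5 * np.log((1 - err) / err)  # layer confidence
        w *= np.exp(np.where(pred == y, -alpha, alpha))
        w /= w.sum()                           # update data-set weights
        layers.append((alpha, f, t))
    return layers

def predict(layers, X):
    # Confidence-weighted vote of the retained recognition layers.
    score = sum(a * np.where(X[:, f] <= t, -1, 1) for a, f, t in layers)
    return (score > 0).astype(int)

X = np.array([[0.0], [1.0], [2.0], [3.0]])
y = np.array([0, 0, 1, 1])
layers = train_boosted(X, y)
print(predict(layers, X))  # -> [0 0 1 1]
```

The "first weight" of claim 7 would correspond to alpha here, and the "target recognition layer" to the learner retained in each pass.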
8. A scene complexity classification device of a ship navigation image, characterized in that the device comprises: a memory, a processor, and a scene complexity classification program of a ship navigation image stored on the memory and executable on the processor, wherein the program, when executed by the processor, implements the steps of the scene complexity classification method of a ship navigation image according to any one of claims 1 to 7.
9. A storage medium, having stored thereon a scene complexity classification program of a ship navigation image, characterized in that the program, when executed by a processor, implements the steps of the scene complexity classification method of a ship navigation image according to any one of claims 1 to 7.
10. A scene complexity classification apparatus of a ship navigation image, characterized in that the apparatus comprises: a data set determining module, a model determining module and a grading module;
the data set determining module is configured to determine a data set of the ship navigation image according to a scene complexity classification level of a preset ship navigation image and a complexity vector of the preset ship navigation image;
the model determining module is configured to train a preset image complexity classification model according to the data set of the ship navigation image to obtain a target image complexity classification model;
and the grading module is configured to classify a data set of a target ship navigation image according to the target image complexity classification model to obtain a complexity level of the target ship navigation image.
CN202211656202.2A 2022-12-22 2022-12-22 Scene complexity classification method, storage medium and device for ship navigation image Pending CN116030298A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211656202.2A CN116030298A (en) 2022-12-22 2022-12-22 Scene complexity classification method, storage medium and device for ship navigation image


Publications (1)

Publication Number Publication Date
CN116030298A

Family

ID=86069879

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211656202.2A Pending CN116030298A (en) 2022-12-22 2022-12-22 Scene complexity classification method, storage medium and device for ship navigation image

Country Status (1)

Country Link
CN (1) CN116030298A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117649635A (en) * 2024-01-30 2024-03-05 湖北经济学院 Method, system and storage medium for detecting shadow eliminating point of narrow water channel scene


Similar Documents

Publication Publication Date Title
CN108596101B (en) Remote sensing image multi-target detection method based on convolutional neural network
CN110443143B (en) Multi-branch convolutional neural network fused remote sensing image scene classification method
Bahnsen et al. Rain removal in traffic surveillance: Does it matter?
Santra et al. Learning a patch quality comparator for single image dehazing
CN113688723B (en) Infrared image pedestrian target detection method based on improved YOLOv5
CN111914686B (en) SAR remote sensing image water area extraction method, device and system based on surrounding area association and pattern recognition
Yang et al. Single image haze removal via region detection network
CN113780296B (en) Remote sensing image semantic segmentation method and system based on multi-scale information fusion
CN110796009A (en) Method and system for detecting marine vessel based on multi-scale convolution neural network model
US10325371B1 (en) Method and device for segmenting image to be used for surveillance using weighted convolution filters for respective grid cells by converting modes according to classes of areas to satisfy level 4 of autonomous vehicle, and testing method and testing device using the same
CN111968088B (en) Building detection method based on pixel and region segmentation decision fusion
CN112597815A (en) Synthetic aperture radar image ship detection method based on Group-G0 model
CN110826411B (en) Vehicle target rapid identification method based on unmanned aerial vehicle image
US20230281913A1 (en) Radiance Fields for Three-Dimensional Reconstruction and Novel View Synthesis in Large-Scale Environments
Cheng et al. A highway traffic image enhancement algorithm based on improved GAN in complex weather conditions
CN115272876A (en) Remote sensing image ship target detection method based on deep learning
CN116030298A (en) Scene complexity classification method, storage medium and device for ship navigation image
CN110633633B (en) Remote sensing image road extraction method based on self-adaptive threshold
CN113850783B (en) Sea surface ship detection method and system
CN115861756A (en) Earth background small target identification method based on cascade combination network
Zhao et al. Image dehazing based on haze degree classification
Babu et al. An efficient image dahazing using Googlenet based convolution neural networks
CN114048536A (en) Road structure prediction and target detection method based on multitask neural network
KR20220045762A (en) System for automatic recognition and monitoring of vessel using artificial intelligence image processing and method for providing the same
Alsharay et al. Improved sea-ice identification using semantic segmentation with raindrop removal

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination