CN114125228A - Wide dynamic image processing method of marine 360-degree panoramic image system - Google Patents


Info

Publication number
CN114125228A
CN114125228A (application CN202111392528.4A)
Authority
CN
China
Prior art keywords
scene
image data
image
scene image
marine
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111392528.4A
Other languages
Chinese (zh)
Inventor
董晓斐
王晓原
姜雨函
张朋元
李�一
夏国强
杨顺利
桑文征
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Navigation Brilliance Qingdao Technology Co Ltd
Original Assignee
Navigation Brilliance Qingdao Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Navigation Brilliance Qingdao Technology Co Ltd filed Critical Navigation Brilliance Qingdao Technology Co Ltd
Priority to CN202111392528.4A priority Critical patent/CN114125228A/en
Publication of CN114125228A publication Critical patent/CN114125228A/en

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/50Constructional details
    • H04N23/55Optical parts specially adapted for electronic image sensors; Mounting thereof
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/50Image enhancement or restoration by the use of more than one image, e.g. averaging, subtraction
    • G06T5/90
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/698Control of cameras or camera modules for achieving an enlarged field of view, e.g. panoramic image capture
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70Circuitry for compensating brightness variation in the scene
    • H04N23/73Circuitry for compensating brightness variation in the scene by influencing the exposure time
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80Camera processing pipelines; Components thereof
    • H04N23/81Camera processing pipelines; Components thereof for suppressing or minimising disturbance in the image signal generation
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/222Studio circuitry; Studio devices; Studio equipment
    • H04N5/262Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
    • H04N5/265Mixing
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10024Color image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20081Training; Learning

Abstract

The invention discloses a wide dynamic image processing method for a marine 360-degree panoramic image system, comprising the following steps: acquiring scene image data covering the dynamic range of the ship's surrounding environment and preprocessing it; acquiring chrominance information in the marine scene from the preprocessed scene image data; obtaining a luminance image of the scene image data from the chrominance information, and updating the chrominance information based on the luminance image; generating a feature vector for the scene image data from per-pixel saturation, exposure and contrast; labeling the scene image data, constructing training samples from the feature vectors and label values, and inputting the training samples into a support vector regression (SVR) model to obtain a trained model; reordering the luminance images with the trained model and fusing them; and linearly combining the updated chrominance information with the fused luminance image to obtain a 360-degree panoramic image. The invention obtains clear images of high-dynamic-range areas in all weather, providing a safety guarantee for intelligent ship navigation.

Description

Wide dynamic image processing method of marine 360-degree panoramic image system
Technical Field
The invention relates to the technical field of picture optimization of a ship navigation auxiliary system, in particular to a wide dynamic image processing method of a 360-degree panoramic image system for a ship.
Background
During ship navigation, illumination causes two main problems: first, the dynamic range of a conventional camera is too small, so under strong illumination it cannot capture the parts of a sea-surface scene that exceed its dynamic range; second, the sea surface is too dark at night, so the camera cannot acquire clear pictures while sailing.
To address the camera's limited dynamic range, the marine 360-degree panoramic image system uses the multi-exposure fusion method from wide dynamic range technology. Wide dynamic range technology clearly renders both the extremely bright and the extremely dark parts of the surrounding scene produced by different exposures during navigation, yielding clear images. It requires that a conventional camera, facing both high dynamic range (HDR) and low dynamic range (LDR) sea-surface scenes, retain the detail of bright and dark areas in the picture; the main approach is to process low dynamic range pictures with multi-exposure fusion. At present the dominant multi-exposure fusion method is direct weighted averaging, which loses a large amount of detail in the fused image and degrades the assisted navigation of the whole system. To address the unclear pictures caused by insufficient sea-surface illumination during night navigation, the marine 360-degree panoramic image system adopts a night vision mode. Every object emits infrared radiation; night vision, as a technology for image acquisition and processing, mainly raises the infrared reflection intensity of an object through active infrared illumination and extends the visual ability of the human eye through the photoelectric effect and photoelectronic imaging. Night vision technology currently divides into passive night vision and active night vision.
Passive night vision identifies and monitors using the infrared already present in natural light, and must detect the heat and specific wavelengths emitted by an object's infrared radiation; but its infrared penetrating power is weak under heat-source interference, so it is easily shielded and blocked. Active night vision, by contrast, artificially enhances the weak radiation of the object, giving stronger anti-interference capability, high target brightness and large scene contrast, and can improve imaging clarity at lower cost. Although less covert, active night vision is a suitable choice for ordinary marine navigation, where high covertness is not required. However, active night vision usually uses ordinary near-infrared radiation as its light source, with a radiation distance of at most 150 m; in severe weather such as fog during navigation, night vision performance drops markedly, limiting the working distance of an active night vision system.
Disclosure of Invention
The invention aims to provide a wide dynamic image processing method for a marine 360-degree panoramic image system that solves the problems in the prior art, obtains clear images of high-dynamic-range areas in all weather, and provides a safety guarantee for intelligent ship navigation.
In order to achieve the purpose, the invention provides the following scheme: the invention provides a wide dynamic image processing method of a marine 360-degree panoramic image system, which comprises the following steps:
s1, acquiring scene image data of a dynamic range in the surrounding environment of the ship in the navigation process of the ship, and preprocessing the acquired scene image data; the scene image data under the same scene comprises a plurality of pictures with different brightness;
s2, acquiring chromaticity information in the marine scene based on the preprocessed scene image data;
s3, obtaining a brightness image of the scene image data based on the chrominance information in the marine scene, and updating the chrominance information in the marine scene based on the brightness image;
s4, performing saturation, exposure and contrast calculation on each pixel in the preprocessed scene image data respectively to generate a feature vector of the scene image data;
s5, labeling the preprocessed scene image data, constructing a training sample based on the feature vector and the labeled value of the scene image data, and inputting the training sample into a Support Vector Regression (SVR) model to obtain a training model;
s6, reordering the brightness images of the scene image data through the training model, and performing image fusion on the reordered brightness images;
and S7, linearly combining the chrominance information updated in the step S3 and the luminance image fused in the step S6 to obtain a 360-degree panoramic image of the surrounding environment in the ship sailing process.
Preferably, in S1, the process of acquiring the scene image data of the dynamic range in the ship surrounding environment further includes a night vision mode.
Preferably, the night vision mode adopts an active night vision method, and the active night vision method is realized by the following steps:
in the ship sailing process, an infrared emitter is used for irradiating a marine target object, and an infrared reflection signal is formed through the reflection of the target object on infrared rays;
receiving the infrared reflection signal by using a receiving device, wherein the light wavelength of the infrared reflection signal is 800-1000 nm;
and converting the received infrared reflection signal from an optical signal into an electrical signal at the cathode surface of the infrared image converter tube in the camera to form a visible image of the target object, namely the scene image data.
Preferably, the camera adopts a CCD sensor.
Preferably, in S2, the specific method for acquiring the chromaticity information in the marine scene includes:
sequencing the scene image data according to the exposure;
selecting the two images of intermediate exposure according to the sorting result, and fusing the two selected images with a weighted average fusion algorithm to obtain the chrominance information in the marine scene; the fusion is given by the following formula:

F(m, n) = w1 × A(m, n) + w2 × B(m, n),  m = 1, 2, ..., M;  n = 1, 2, ..., N

where (m, n) is a pixel in the image, A(m, n) and B(m, n) are the two selected images, F(m, n) is the fusion result of the two images, w1 is the weight coefficient of image A(m, n), w2 is the weight coefficient of image B(m, n), M is the number of pixel rows in the image, and N is the number of pixel columns.
Preferably, the S3 specifically includes:
obtaining a luminance image Is from the chrominance information Rs, Gs and Bs in the marine scene, as shown in the following formula:

Is = 0.299×Rs + 0.587×Gs + 0.114×Bs

updating the chrominance information Rs, Gs and Bs based on the luminance image Is, as shown in the following formulas:

R′s = (Rs/Is)^λ
G′s = (Gs/Is)^λ
B′s = (Bs/Is)^λ

where R′s, G′s and B′s are the updated chrominance information, and λ is a saturation adjustment parameter.
Preferably, in S4, the calculating of the exposure level of the scene image data includes: respectively carrying out Gaussian processing on the red component, the green component and the blue component of the scene image data, and multiplying the Gaussian processing results of the red component, the green component and the blue component to obtain the exposure of the scene image data;
the calculating of the contrast of the scene image data comprises: filtering the luminance of the scene image data through a Laplacian filter to obtain the contrast of the scene image data.
Preferably, in S5, the method for labeling the preprocessed scene image data includes:
outputting a labeled upper boundary value bound(i) using the index value of each normal distribution region in the preprocessed scene image data, as shown in the following formula:

[formula not reproduced in the source]

where i denotes the index value, Q denotes the number of scene image data, bound(0) = 0, and bound(Q+1) = 0;
the label value label of the preprocessed scene image data is calculated as follows:

[formula not reproduced in the source]

where Comi(m, n) is the composite feature value of the pixels in region i, Comi(m, n) = S(m, n)×E(m, n)×C(m, n); maxComi and minComi are respectively the maximum and minimum of the composite feature of the pixels in region i.
Preferably, the S6 specifically includes:
inputting the luminance images of the scene image data into the training model, and reordering them by the support vector regression outputs of the training model to obtain I′s,1, I′s,2, ..., I′s,Q, where I′s,1 has the smallest support vector regression output, I′s,Q has the largest, and Q is the number of scene image data;
performing image fusion on the reordered luminance images according to a weighted average formula, the fused luminance image If being as shown in the following formula:

If = Σ(i=1..Q) w(i) × I′s,i

where w(i) is the weight of the ith luminance image.
Preferably, in S7, the 360-degree panoramic image of the surrounding environment during ship navigation is obtained by linear combination as follows:

Rout = R′s × If
Gout = G′s × If
Bout = B′s × If

where Rout, Gout and Bout are respectively the red, green and blue components of the 360-degree panoramic image, and If is the fused luminance image.
The invention discloses the following technical effects:
(1) The invention provides a wide dynamic image processing method for a marine 360-degree panoramic image system. The same scene is sampled multiple times with different exposure times through multi-exposure fusion to capture regional details of different brightness. A multi-exposure fusion algorithm based on support vector regression then effectively fuses the differently exposed images of the same scene into a high dynamic range image, suppressing random noise, preserving the global and local contrast of the fused image, and retaining as much detail as possible in both bright and dark regions, so that the camera obtains a clear image even in high-dynamic-range areas.
(2) The invention provides an active night vision method for acquiring scene image data. In night vision mode, a near-infrared laser emits infrared light of a specified wavelength, raising the target's infrared radiation reflectivity so that the camera's optical lens can collect the reflected light; the receiver then converts the photoelectric signal to feed back clear images, providing clear pictures around the clock.
(3) The all-weather clear imaging of high-dynamic-range areas provided by the invention optimizes system functions such as assisted berthing, assisted driving and real-time sea-surface monitoring, providing a safety guarantee for intelligent ship navigation.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings needed to be used in the embodiments will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and it is obvious for those skilled in the art to obtain other drawings without inventive exercise.
FIG. 1 is a flow chart of a wide dynamic image processing method of a 360-degree panoramic image system for a ship according to an embodiment of the present invention;
fig. 2 is a block diagram of the active infrared night vision system in the embodiment of the invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
In order to make the aforementioned objects, features and advantages of the present invention comprehensible, embodiments accompanied with figures are described in further detail below.
Referring to fig. 1, the present embodiment provides a wide dynamic image processing method for a 360-degree panoramic image system for a ship, including:
s1, acquiring scene image data of a dynamic range in the surrounding environment of the ship in the navigation process of the ship, and preprocessing the acquired scene image data; the scene image data under the same scene comprises a plurality of pictures with different brightness;
in the step, a camera installed on the ship is started, the surrounding environment of the ship is monitored in real time in the process of sailing, and real high-dynamic-range scene image data are collected. In the acquisition process, the same scene is sampled for multiple times by using different exposure times to obtain pictures with different brightness in the same scene;
the preprocessing of the scene image data comprises the following steps: and adjusting the scene image data to the same size.
Because ships must navigate both by day and at night, and clear scene image data cannot be captured at night, the present application also provides a night vision mode for collecting scene image data during night navigation.
In this embodiment, the night vision mode adopts an active night vision method, and this method needs to use a near-infrared laser lamp to actively irradiate a target object in the surrounding sea surface environment during navigation, so as to enhance the intensity of the infrared radiation reflection of the actual target. And then collecting the infrared signal reflected by the object through an optical lens of a camera on the ship.
One key point of the active night vision method is the choice of camera, whose main component is the image sensor. In building an active night vision system, CCD and CMOS sensors are the two image sensor types commonly used as the camera's sensing receiver. Given that navigation requires a camera with high sensitivity, a CCD sensor is chosen as the sensing receiver of the marine 360-degree system. Compared with a CMOS sensor, a CCD sensor offers high sensitivity, low distortion, long service life, resistance to vibration and magnetic fields, small size, and no residual image; it converts light into charge and transfers the stored charge. The CCD sensor is therefore an ideal image pickup device for this patent.
Another key point of the active night vision method of the present application is the choice of light source. In an active night vision system, the light source determines the working distance. A typical active night vision system uses an ordinary near-infrared illumination lamp as the light source, but such a lamp can only illuminate up to 150 m at sea and is easily affected by severe weather such as heavy fog, limiting the working distance during navigation. Combining the spectral response characteristic of the CCD with the effective absorption rate of the retina, a near-infrared laser with a wavelength of 760–960 nm is determined to be suitable as the light source of the active night vision system, and the laser power density must be kept within a safety threshold so as not to harm human eyes. A laser modulation circuit is designed to achieve a modulation frequency of 10 kHz, giving the laser anti-interference capability.
The block diagram of the active infrared night vision system in this embodiment is shown in fig. 2, and the implementation steps of the night vision mode include:
firstly, irradiating a marine target object by using an infrared emitter in the ship sailing process, and forming an infrared reflection signal through the reflection of the target object on infrared rays;
secondly, receiving the infrared reflection signal by using an optical lens of a receiving device, wherein the light wavelength of the infrared reflection signal is 800-1000 nm;
and converting the received infrared reflection signal from an optical signal into an electrical signal at the cathode surface of the infrared image converter tube in the camera, thereby forming a visible image of the target object, namely the scene image data.
S2, acquiring chromaticity information in the marine scene based on the preprocessed scene image data;
the specific implementation method of the step is as follows:
firstly, sequencing scene image data according to exposure;
secondly, selecting the two images of intermediate exposure according to the sorting result, and fusing the two selected images with a weighted average fusion algorithm to obtain the chrominance information in the marine scene, denoted Rs, Gs and Bs.
The principle of the weighted average fusion algorithm is to give the source images different weight coefficients and sum them to obtain the fusion result F(m, n), as shown in the following formula:

F(m, n) = w1 × A(m, n) + w2 × B(m, n),  m = 1, 2, ..., M;  n = 1, 2, ..., N

where (m, n) is a pixel in the image, A(m, n) and B(m, n) are the two selected images, w1 is the weight coefficient of image A(m, n), w2 is the weight coefficient of image B(m, n), M is the number of pixel rows in the image, and N is the number of pixel columns.
The key of the weighted average fusion algorithm is the values of the weight coefficients w1 and w2, which are generally set by experience but can also be calculated from the image gray values, as shown in the following formula:

[formula not reproduced in the source]

The weighted average fusion algorithm thus yields new pixel points F(m, n), which constitute the fused chrominance information Rs, Gs and Bs.
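A minimal sketch of the weighted average fusion step above; the function name and the default equal weights are illustrative assumptions, since the patent leaves the weight values to experience:

```python
import numpy as np

def weighted_average_fusion(A, B, w1=0.5, w2=0.5):
    """Pixel-wise fusion F(m, n) = w1*A(m, n) + w2*B(m, n) of two
    middle-exposure images A and B of identical size."""
    A = np.asarray(A, dtype=np.float64)
    B = np.asarray(B, dtype=np.float64)
    if A.shape != B.shape:
        raise ValueError("A and B must have the same size")
    return w1 * A + w2 * B
```

With equal weights this reduces to a plain average; the channels of the fused result supply the chrominance information Rs, Gs and Bs used in step S3.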
S3, obtaining a luminance image of the scene image data based on the chrominance information Rs, Gs and Bs in the marine scene, and updating the chrominance information in the marine scene based on the luminance image;
the method specifically comprises the following steps:
first, the luminance image Is is obtained from the chrominance information Rs, Gs and Bs in the marine scene, as shown in the following formula:

Is = 0.299×Rs + 0.587×Gs + 0.114×Bs

second, the chrominance information Rs, Gs and Bs is updated based on the luminance image Is, as shown in the following formulas:

R′s = (Rs/Is)^λ
G′s = (Gs/Is)^λ
B′s = (Bs/Is)^λ

where R′s, G′s and B′s are the updated chrominance information, and λ is a saturation adjustment parameter used to adjust the saturation of the scene image data;
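The luminance/chrominance step can be sketched as follows, assuming floating-point channel data; the epsilon guard against division by zero in dark pixels is an implementation assumption not stated in the patent:

```python
import numpy as np

def split_luminance_chrominance(Rs, Gs, Bs, lam=1.0, eps=1e-6):
    """Compute Is = 0.299*Rs + 0.587*Gs + 0.114*Bs, then update the
    chrominance as R's = (Rs/Is)^lam, G's = (Gs/Is)^lam, B's = (Bs/Is)^lam."""
    Is = 0.299 * Rs + 0.587 * Gs + 0.114 * Bs
    safe = np.maximum(Is, eps)   # avoid division by zero in fully dark pixels
    Rp = (Rs / safe) ** lam
    Gp = (Gs / safe) ** lam
    Bp = (Bs / safe) ** lam
    return Is, Rp, Gp, Bp
```

Raising λ above 1 stretches the chrominance ratios away from 1, which is how the saturation adjustment works.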
s4, performing saturation, exposure and contrast calculation on each pixel in the preprocessed scene image data respectively to generate a feature vector of the scene image data;
in this step, the saturation S of the scene image data is calculated as the standard deviation of the three color components at each pixel, as shown in the following formulas:

S(m, n) = sqrt( [ (R(m, n) − Z(m, n))² + (G(m, n) − Z(m, n))² + (B(m, n) − Z(m, n))² ] / 3 )
Z(m, n) = ( R(m, n) + G(m, n) + B(m, n) ) / 3

where R(m, n), G(m, n) and B(m, n) are respectively the red, green and blue components of pixel (m, n), and Z(m, n) is the average of the red, green and blue components of pixel (m, n).
The calculation of the exposure E of the scene image data includes:
first, Gaussian processing is performed on the red, green and blue components of the scene image data, as shown in the following formulas:

Rexposure(m, n) = exp( −(R(m, n) − 0.5)² / (2σ²) )
Gexposure(m, n) = exp( −(G(m, n) − 0.5)² / (2σ²) )
Bexposure(m, n) = exp( −(B(m, n) − 0.5)² / (2σ²) )

where Rexposure(m, n), Gexposure(m, n) and Bexposure(m, n) are respectively the Gaussian processing results of the red, green and blue components of pixel (m, n), and σ is the standard deviation; this embodiment uses the default value σ = 0.2.
Then, based on the product of the gaussian processing results of the red component, the green component, and the blue component, the exposure E of the scene image data is obtained as shown in the following formula:
E(m,n)=Rexposure(m,n)×Gexposure(m,n)×Bexposure(m,n)。
the calculation of the contrast C of the scene image data includes:
filtering the luminance of the scene image data through a Laplacian filter to obtain the contrast C of the scene image data;
the template h of the Laplacian filter is shown as the following formula:

h = [ 0, 1, 0; 1, −4, 1; 0, 1, 0 ]

the contrast C is calculated as follows:

C(m, n) = | L(m, n) ⊗ h |

where L(m, n) is the luminance value of pixel (m, n), ⊗ denotes the convolution operation, and h is the Laplacian filter template.
In this embodiment, the size of the scene image data is 50 × 20, and S (m, n), E (m, n), and C (m, n) are calculated for each pixel (m, n) to form a feature vector [ S (m, n), E (m, n), C (m, n) ].
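The three per-pixel features can be sketched as below. Where the patent shows the formulas only as images, the standard exposure-fusion formulations are assumed (standard deviation for saturation, a Gaussian well-exposedness curve for exposure, the Laplacian magnitude for contrast), with channels scaled to [0, 1]:

```python
import numpy as np

def laplacian(L):
    """4-neighbour Laplacian of a 2-D array, with edge replication."""
    P = np.pad(L, 1, mode="edge")
    return (P[:-2, 1:-1] + P[2:, 1:-1] +
            P[1:-1, :-2] + P[1:-1, 2:] - 4.0 * L)

def feature_vector(R, G, B, sigma=0.2):
    """Per-pixel feature vector [S(m, n), E(m, n), C(m, n)]."""
    Z = (R + G + B) / 3.0                                 # channel mean Z(m, n)
    S = np.sqrt(((R - Z) ** 2 + (G - Z) ** 2 + (B - Z) ** 2) / 3.0)
    well = lambda v: np.exp(-((v - 0.5) ** 2) / (2.0 * sigma ** 2))
    E = well(R) * well(G) * well(B)                       # product of Gaussians
    L = 0.299 * R + 0.587 * G + 0.114 * B                 # luminance
    C = np.abs(laplacian(L))                              # contrast
    return np.stack([S, E, C], axis=-1)
```

A uniform mid-gray image gives zero saturation, maximal exposure and zero contrast, which is a quick sanity check on the formulation.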
And S5, labeling the preprocessed scene image data, constructing a training sample based on the feature vector and the labeled value of the scene image data, and inputting the training sample into a Support Vector Regression (SVR) model to obtain the training model.
Since the preprocessed scene image data follows a Gaussian distribution, the index value of each normal distribution region is used to output the labeled upper boundary value bound(i), as shown in the following formula:

[formula not reproduced in the source]

where i denotes the index value, Q denotes the number of scene image data, bound(0) = 0, and bound(Q+1) = 0.
The label value label of the preprocessed scene image data is calculated as follows:

[formula not reproduced in the source]

where Comi(m, n) is the composite feature value of the pixels in region i, Comi(m, n) = S(m, n)×E(m, n)×C(m, n); maxComi and minComi are respectively the maximum and minimum of the composite feature of the pixels in region i.
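Training the SVR model on (feature vector, label) pairs might look like the following sketch; scikit-learn's SVR is used here as one possible implementation (the patent does not name a library), and the training data are stand-ins for the per-pixel samples described above:

```python
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)
X_train = rng.random((200, 3))        # stand-in [S, E, C] feature vectors
# stand-in labels: the composite feature Com = S*E*C, as in the text
y_train = X_train.prod(axis=1)

model = SVR(kernel="rbf", C=1.0, epsilon=0.01)
model.fit(X_train, y_train)
scores = model.predict(X_train)       # regression outputs used for reordering
```

The regression outputs of this model are what step S6 uses to reorder the luminance images.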
S6, reordering the brightness images of the scene image data through the training model, and performing image fusion on the reordered brightness images;
in this step, the luminance images of the scene image data obtained in step S3 are reordered to obtain I′s,1, I′s,2, ..., I′s,Q, where I′s,1 has the smallest support vector regression output and I′s,Q has the largest.
Image fusion is performed on the reordered luminance images according to a weighted average formula, the fused luminance image If being as shown in the following formula:

If = Σ(i=1..Q) w(i) × I′s,i

where w(i) is the weight of the ith luminance image.
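The reorder-and-fuse step can be sketched as below; normalizing the weights to sum to 1 is an assumption, since the patent does not state how w(i) is chosen:

```python
import numpy as np

def fuse_luminance(images, scores, weights=None):
    """Sort luminance images by ascending SVR score, then fuse them as
    If = sum_i w(i) * I's,i."""
    order = np.argsort(scores)
    stack = np.stack([np.asarray(images[i], dtype=float) for i in order])
    Q = len(images)
    w = np.full(Q, 1.0 / Q) if weights is None else np.asarray(weights, float)
    w = w / w.sum()                         # normalize so the weights sum to 1
    return np.tensordot(w, stack, axes=1)   # If has the shape of one image
```

With uniform weights this degenerates to a plain average of the exposure stack's luminance images.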
S7, the chrominance information R′s, G′s and B′s updated in step S3 is linearly combined with the luminance image fused in step S6 to obtain a 360-degree panoramic image of the surrounding environment during ship navigation.
In this step, the output image after linear combination is as follows:
Rout=R′s×If
Gout=G′s×If
Bout=B′s×If
where Rout, Gout and Bout are respectively the red, green and blue components of the output image.
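A sketch of the final recombination; note that with λ = 1 and If equal to Is, the round trip recovers the original channels, which gives a simple sanity check:

```python
import numpy as np

def recombine(Rp, Gp, Bp, If):
    """Rout = R's * If, Gout = G's * If, Bout = B's * If."""
    return Rp * If, Gp * If, Bp * If
```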
The invention has the following beneficial effects:
First, regarding high dynamic range scenes in navigation: because of the huge gap between the dynamic range of conventional camera equipment and that of the actual scene, images collected by the camera lose much detail in bright and dark areas, reducing the safety of the assisted berthing and assisted driving functions of the marine 360-degree panoramic image system. The method processes the real-time images acquired by the camera by adjusting the dynamic range of image luminance, so that the camera obtains a clear image even in high-dynamic-range areas. Second, regarding insufficient light during navigation: with too little illumination, the camera cannot acquire clear images and cannot track and feed back target movements in real time from night imagery. The invention uses active night vision, emitting infrared light of a specified wavelength from a near-infrared laser to raise the target's infrared radiation reflectivity; the camera's optical lens then collects the reflected light, and the receiver converts the photoelectric signal to feed back clear images. Applied in a marine 360-degree panoramic image system, these two technologies optimize system functions such as assisted berthing, assisted driving and real-time sea-surface monitoring, providing a safety guarantee for intelligent ship navigation.
The above-described embodiments merely illustrate preferred embodiments of the present invention and do not limit its scope; those skilled in the art can make various modifications and improvements to the technical solutions of the present invention without departing from its spirit, and such modified solutions fall within the scope of the invention as defined by the claims.

Claims (10)

1. A wide dynamic image processing method for a 360-degree panoramic image system for a ship is characterized by comprising the following steps:
s1, acquiring dynamic-range scene image data of the surrounding environment of the ship during navigation, and preprocessing the acquired scene image data; the scene image data for the same scene comprises a plurality of pictures with different brightness;
s2, acquiring chromaticity information in the marine scene based on the preprocessed scene image data;
s3, obtaining a brightness image of the scene image data based on the chrominance information in the marine scene, and updating the chrominance information in the marine scene based on the brightness image;
s4, performing saturation, exposure and contrast calculation on each pixel in the preprocessed scene image data respectively to generate a feature vector of the scene image data;
s5, labeling the preprocessed scene image data, constructing a training sample based on the feature vector and the labeled value of the scene image data, and inputting the training sample into a Support Vector Regression (SVR) model to obtain a training model;
s6, reordering the brightness images of the scene image data through the training model, and performing image fusion on the reordered brightness images;
and S7, linearly combining the chrominance information updated in the step S3 and the luminance image fused in the step S6 to obtain a 360-degree panoramic image of the surrounding environment in the ship sailing process.
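As an informal sketch (not part of the claims), steps S2, S3, and S7 of claim 1 can be strung together with NumPy. The SVR-based reordering and weighting of steps S5–S6 is replaced here by a simple uniform-weight luminance average as a stand-in, and the function name, the epsilon guard, and the default λ are assumptions for illustration:

```python
import numpy as np

def fuse_bracket(frames, lam=0.6):
    """Toy wide-dynamic fusion over an exposure bracket.

    frames: list of HxWx3 float arrays in [0, 1], same scene, varying exposure.
    lam: saturation adjustment parameter (the patent's lambda).
    """
    eps = 1e-6
    # S2: chrominance from the two middle-exposure frames (weighted average, w1 = w2 = 0.5)
    order = np.argsort([f.mean() for f in frames])
    mid_a = frames[order[len(order) // 2 - 1]]
    mid_b = frames[order[len(order) // 2]]
    chroma = 0.5 * mid_a + 0.5 * mid_b
    # S3: luminance of the chrominance image, then the (channel / luminance)^lambda update
    lum = (0.299 * chroma[..., 0] + 0.587 * chroma[..., 1]
           + 0.114 * chroma[..., 2])
    ratio = (chroma / (lum[..., None] + eps)) ** lam
    # S6 stand-in: uniform-weight average of the per-frame luminances
    lums = [0.299 * f[..., 0] + 0.587 * f[..., 1] + 0.114 * f[..., 2]
            for f in frames]
    fused_lum = np.mean(lums, axis=0)
    # S7: linear combination of updated chrominance and fused luminance
    return ratio * fused_lum[..., None]
```

In the patented method the fusion weights would come from the trained SVR model rather than a uniform average; this sketch only shows how the per-step quantities fit together.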
2. The wide dynamic image processing method of marine 360-degree panoramic image system of claim 1, wherein in step S1, the process of collecting the dynamic range scene image data in the surrounding environment of the ship further includes a night vision mode.
3. The wide dynamic image processing method of marine 360-degree panoramic image system of claim 2, wherein the night vision mode is an active night vision method, and the active night vision method is implemented by the following steps:
in the ship sailing process, an infrared emitter is used for irradiating a marine target object, and an infrared reflection signal is formed through the reflection of the target object on infrared rays;
receiving the infrared reflection signal by using a receiving device, wherein the light wavelength of the infrared reflection signal is 800-1000 nm;
and converting the received infrared reflection signal from an optical signal into an electric signal through the cathode surface of the infrared image converter tube in the camera, forming a visible image of the target object, namely the scene image data.
4. The wide dynamic image processing method of marine 360-degree panoramic image system of claim 3, wherein the camera uses a CCD sensor.
5. The wide dynamic image processing method of marine 360-degree panoramic image system of claim 1, wherein in S2, the specific method for acquiring chrominance information in the marine scene comprises:
sorting the scene image data by exposure;
selecting the two images with intermediate exposure according to the sorted result, and fusing the two selected images with a weighted average fusion algorithm to obtain the chrominance information in the marine scene; the fusion of the two selected images is as shown in the following formula:
F(m, n) = w1 × A(m, n) + w2 × B(m, n),  m = 1, 2, ..., M;  n = 1, 2, ..., N
where (m, n) is a pixel in the image, A(m, n) and B(m, n) are the two selected images, F(m, n) is their fusion result, w1 is the weight coefficient of image A(m, n), and w2 is the weight coefficient of image B(m, n); M is the number of rows of pixels in the image and N is the number of columns.
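The weighted average fusion of claim 5 is a straightforward per-pixel operation; a minimal sketch (the function name and default weights are illustrative, not from the patent):

```python
import numpy as np

def weighted_average_fusion(img_a, img_b, w1=0.5, w2=0.5):
    """Per-pixel weighted average: F(m, n) = w1 * A(m, n) + w2 * B(m, n).

    img_a, img_b: same-shape float arrays (the two middle-exposure images).
    """
    return w1 * img_a + w2 * img_b
```

With w1 = w2 = 0.5 this reduces to the arithmetic mean of the two middle-exposure images.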
6. The wide dynamic image processing method of marine 360-degree panoramic image system of claim 1, wherein S3 specifically comprises:
obtaining a luminance image Is from the chrominance information Rs, Gs, and Bs in the marine scene, as shown in the following formula:
Is = 0.299 × Rs + 0.587 × Gs + 0.114 × Bs
updating the chrominance information Rs, Gs, and Bs based on the luminance image Is, as shown in the following formulas:
R′s = (Rs / Is)^λ
G′s = (Gs / Is)^λ
B′s = (Bs / Is)^λ
where R′s, G′s, and B′s are the updated chrominance information and λ is a saturation adjustment parameter.
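The luminance extraction and chrominance update of claim 6 can be sketched as follows; the epsilon guard against division by zero and the default λ value are assumptions added for robustness, not taken from the patent:

```python
import numpy as np

def update_chrominance(rs, gs, bs, lam=0.6, eps=1e-6):
    """Compute the luminance image with the 0.299/0.587/0.114 weights,
    then update each chrominance channel as (channel / luminance)^lambda."""
    i_s = 0.299 * rs + 0.587 * gs + 0.114 * bs
    r_p = (rs / (i_s + eps)) ** lam
    g_p = (gs / (i_s + eps)) ** lam
    b_p = (bs / (i_s + eps)) ** lam
    return i_s, r_p, g_p, b_p
```

A λ below 1 compresses the chrominance ratios toward 1, which damps saturation; λ above 1 amplifies it.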
7. The wide dynamic image processing method of marine 360-degree panoramic image system of claim 1, wherein the calculating of the exposure of the scene image data in S4 comprises: applying Gaussian processing to the red, green, and blue components of the scene image data respectively, and multiplying the three Gaussian processing results to obtain the exposure of the scene image data;
the calculating of the contrast of the scene image data comprises: filtering the luminance of the scene image data with a Laplacian filter to obtain the contrast of the scene image data.
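A sketch of the three per-pixel features of claims 1 and 7. The patent does not spell out the saturation measure or the exact Gaussian; the interpretation below (channel standard deviation for saturation, a Gaussian "well-exposedness" curve centered at mid-gray, and a 4-neighbour Laplacian for contrast) follows the common exposure-fusion convention and is an assumption:

```python
import numpy as np

def features(img, sigma=0.2):
    """Per-pixel saturation S, exposure E, and contrast C for an
    HxWx3 float image with values in [0, 1]."""
    # Saturation: standard deviation across the R, G, B channels
    sat = img.std(axis=-1)
    # Exposure: product of per-channel Gaussian closeness to mid-gray 0.5
    expo = np.prod(np.exp(-((img - 0.5) ** 2) / (2 * sigma ** 2)), axis=-1)
    # Contrast: magnitude of a 4-neighbour Laplacian of the luminance
    lum = 0.299 * img[..., 0] + 0.587 * img[..., 1] + 0.114 * img[..., 2]
    padded = np.pad(lum, 1, mode='edge')
    lap = (padded[:-2, 1:-1] + padded[2:, 1:-1]
           + padded[1:-1, :-2] + padded[1:-1, 2:] - 4 * lum)
    return sat, expo, np.abs(lap)
```

The product S(m, n) × E(m, n) × C(m, n) of these three maps is the composite feature Com used for labeling in claim 8.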
8. The wide dynamic image processing method of marine 360-degree panoramic image system of claim 1, wherein in S5 the method for labeling the preprocessed scene image data comprises:
computing and outputting the upper boundary value bound of the label from the index value of each normal-distribution region in the preprocessed scene image data, as shown in the following formula:
(formula presented as an image in the original publication)
where i denotes the index value, Q denotes the number of scene image data, bound(0) = 0, and bound(Q+1) = 0;
the label value label of the preprocessed scene image data is calculated as follows:
(formula presented as an image in the original publication)
where Comi(m, n) is the composite feature value of the pixels in region i, Comi(m, n) = S(m, n) × E(m, n) × C(m, n); maxComi and minComi are the maximum and minimum composite feature values of the pixels in region i, respectively.
9. The wide dynamic image processing method of marine 360-degree panoramic image system of claim 1, wherein S6 specifically comprises:
inputting the luminance images of the scene image data into the training model, and reordering them by the support vector regression output of the training model to obtain the sequence I(1), I(2), ..., I(Q), where I(1) has the smallest support vector regression output, I(Q) has the largest, and Q is the number of scene image data;
performing image fusion on the reordered luminance images according to a weighted average formula; the fused luminance image If is obtained as shown in the following formula:
If = Σ(i = 1 to Q) w(i) × I(i)
where w(i) is the weight of the ith luminance image.
10. The wide dynamic image processing method of marine 360-degree panoramic image system of claim 6, wherein in step S7 the 360-degree panoramic image of the surrounding environment during ship navigation is obtained by linear combination as shown in the following formulas:
Rout = R′s × If
Gout = G′s × If
Bout = B′s × If
where Rout, Gout, and Bout are respectively the red, green, and blue components of the 360-degree panoramic image, and If is the fused luminance image.
CN202111392528.4A 2021-11-23 2021-11-23 Wide dynamic image processing method of marine 360-degree panoramic image system Pending CN114125228A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111392528.4A CN114125228A (en) 2021-11-23 2021-11-23 Wide dynamic image processing method of marine 360-degree panoramic image system

Publications (1)

Publication Number Publication Date
CN114125228A true CN114125228A (en) 2022-03-01

Family

ID=80439976

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111392528.4A Pending CN114125228A (en) 2021-11-23 2021-11-23 Wide dynamic image processing method of marine 360-degree panoramic image system

Country Status (1)

Country Link
CN (1) CN114125228A (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1975550A (en) * 2006-12-11 2007-06-06 厦门大学 Active infrared polarization laser night vision imaging instrument
CN101214851A (en) * 2008-01-10 2008-07-09 黄席樾 Intelligent all-weather actively safety early warning system and early warning method thereof for ship running
CN103802727A (en) * 2012-11-14 2014-05-21 上海市闵行区知识产权保护协会 Low-visibility guiding system
CN107172418A (en) * 2017-06-08 2017-09-15 宁波大学 A kind of tone scale map image quality evaluating method analyzed based on exposure status
CN107635117A (en) * 2017-11-02 2018-01-26 南京船行天下信息科技有限公司 A kind of inland navigation craft Big Dipper video monitoring multi-channel video device
US20180115777A1 (en) * 2016-10-26 2018-04-26 Dolby Laboratories Licensing Corporation Screen-adaptive decoding of high dynamic range video
CN110335221A (en) * 2019-03-21 2019-10-15 西安电子科技大学 A kind of more exposure image fusion methods based on unsupervised learning

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
ZHOU Hongming; CHENG Linyi; ZHU Guozhang: "Design of a laser-illumination night vision system for high-speed ships", Optics & Optoelectronic Technology, no. 06 *
WANG Jinhua: "Research on visualization techniques for high dynamic range scenes", China Doctoral Dissertations Full-text Database (Information Science and Technology), pages 4 *

Similar Documents

Publication Publication Date Title
CN109636754B (en) Extremely-low-illumination image enhancement method based on generation countermeasure network
Wang et al. Underwater image restoration via maximum attenuation identification
CN110378845B (en) Image restoration method based on convolutional neural network under extreme conditions
CN110166692B (en) Method and device for improving automatic focusing accuracy and speed of camera
CN107784642B (en) A kind of infrared video and visible light video method for self-adaption amalgamation
CN107580163A (en) A kind of twin-lens black light camera
CN105447838A (en) Method and system for infrared and low-level-light/visible-light fusion imaging
CN114119378A (en) Image fusion method, and training method and device of image fusion model
CN110349114A (en) Applied to the image enchancing method of AOI equipment, device and road video monitoring equipment
CN104966108A (en) Visible light and infrared image fusion method based on gradient transfer
CN108093175B (en) A kind of adaptive defogging method of real-time high-definition video and device
US10721448B2 (en) Method and apparatus for adaptive exposure bracketing, segmentation and scene organization
CN112561996A (en) Target detection method in autonomous underwater robot recovery docking
CN113034417A (en) Image enhancement system and image enhancement method based on generation countermeasure network
CN107016343A (en) A kind of traffic lights method for quickly identifying based on Bel's format-pattern
CN113762161B (en) Intelligent obstacle monitoring method and system
CN112561813B (en) Face image enhancement method and device, electronic equipment and storage medium
CN114125228A (en) Wide dynamic image processing method of marine 360-degree panoramic image system
CN105118032B (en) A kind of wide method for dynamically processing of view-based access control model system
CN111541886A (en) Vision enhancement system applied to muddy underwater
Raigonda et al. Haze Removal Of Underwater Images Using Fusion Technique
CN116757949A (en) Atmosphere-ocean scattering environment degradation image restoration method and system
CN112926367A (en) Living body detection equipment and method
CN116664460A (en) Infrared night vision fusion method
Zou et al. Self-tuning underwater image fusion method based on dark channel prior

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination