CN115641487B - Multi-stage judgment fusion method and system based on neutrons and X rays - Google Patents

Multi-stage judgment fusion method and system based on neutrons and X rays

Info

Publication number
CN115641487B
CN115641487B (application CN202211033724.7A)
Authority
CN
China
Prior art keywords
fusion
image
neutron
ray
level
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202211033724.7A
Other languages
Chinese (zh)
Other versions
CN115641487A (en)
Inventor
Name withheld at the inventor's request
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Neutron Times Qingdao Innovation Technology Co ltd
Original Assignee
Neutron Times Qingdao Innovation Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Neutron Times Qingdao Innovation Technology Co ltd filed Critical Neutron Times Qingdao Innovation Technology Co ltd
Priority to CN202211033724.7A priority Critical patent/CN115641487B/en
Publication of CN115641487A publication Critical patent/CN115641487A/en
Application granted granted Critical
Publication of CN115641487B publication Critical patent/CN115641487B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02EREDUCTION OF GREENHOUSE GAS [GHG] EMISSIONS, RELATED TO ENERGY GENERATION, TRANSMISSION OR DISTRIBUTION
    • Y02E30/00Energy generation of nuclear origin
    • Y02E30/10Nuclear fusion reactors

Landscapes

  • Image Processing (AREA)

Abstract

The invention discloses a multi-stage judgment fusion method and system based on neutrons and X rays, and relates to the field of nondestructive ray detection. The method comprises the following steps: obtaining multi-level dominance information based on a neutron image and an X-ray image of an object to be detected; calculating fusion weights of the neutron image and the X-ray image at each level according to the multi-level dominance information; and fusing the neutron image and the X-ray image according to the fusion weights at each level to obtain a target fusion image. The fused image is higher in geometric sharpness, signal-to-noise ratio, edge gradient, and other image-quality measures than images produced by traditional fusion methods.

Description

Multi-stage judgment fusion method and system based on neutrons and X rays
Technical Field
The invention relates to the field of nondestructive ray detection, in particular to a multi-stage judgment fusion method and system based on neutrons and X rays.
Background
In the nondestructive ray detection industry, conventional means for inspecting special equipment include neutron photography, X-ray, gamma-ray, and the like. However, X-rays and neutrons interact with matter through different mechanisms: X-rays act on the electrons outside the atomic nucleus, and their interaction cross-section has a definite functional relation with the atomic number of the nuclide, whereas neutrons interact directly with the nucleus, and the size of the scattering cross-section has no direct relation to the atomic number. Traditional fusion schemes extract only a single kind of characteristic information from the target images (gray-level difference, boundary gradient, and the like) and rely solely on gray-level difference as the threshold condition in fusion judgment. Such single, simple fusion calculations cannot obtain the top-level (abstract) information of multi-source images with large differences, so general fusion algorithms cannot accurately locate the dominance information of the multi-source images.
Disclosure of Invention
Aiming at the defects of the prior art, the invention provides a multi-stage judgment fusion method and system based on neutrons and X rays.
The technical scheme for solving the technical problems is as follows:
a multi-stage judgment fusion method based on neutrons and X rays comprises the following steps:
s1, obtaining multi-stage dominance information based on a neutron image and an X-ray image of an object to be detected, wherein the multi-stage dominance information comprises the following characteristics, signal levels and pixel levels: confidence, boundary gradient value, and pixel value;
s2, respectively calculating fusion weights of the neutron image and the X-ray image in each stage according to the multi-stage dominance information;
and S3, carrying out fusion processing on the neutron image and the X-ray image according to the fusion weights in each stage so as to obtain a target fusion image.
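As a minimal illustration of steps S1 to S3, the sketch below uses toy stand-ins for each level's dominance measure; all function names and the per-level measures are illustrative assumptions, not the patent's exact computations:

```python
import numpy as np

def dominance_weights(xray, neutron, eps=1e-8):
    """S1/S2 (toy): one scalar X-ray dominance weight per level."""
    # Pixel level: relative mean gray value (illustrative stand-in).
    wx_pix = xray.mean() / (xray.mean() + neutron.mean() + eps)
    # Signal level: relative total gradient magnitude.
    gx = np.abs(np.gradient(xray.astype(float))).sum()
    gn = np.abs(np.gradient(neutron.astype(float))).sum()
    wx_sig = gx / (gx + gn + eps)
    # Feature level: placeholder confidence (a real system would use a
    # segmentation network's confidence output).
    wx_feat = 0.5
    return {"feature": wx_feat, "signal": wx_sig, "pixel": wx_pix}

def fuse(xray, neutron, weights):
    """S3 (toy): combine the per-level X-ray weights, then blend."""
    wx = np.mean(list(weights.values()))  # combined X-ray weight
    return wx * xray + (1.0 - wx) * neutron
```

In the patent the three levels produce separate fused images that are later reconstructed together; here they are simply averaged to keep the sketch short.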
The beneficial effects of the invention are as follows: according to the technical scheme, the multi-level dominance information of the neutron image and the X-ray image is extracted at the feature level, the signal level, and the pixel level; the fusion weights of the neutron image and the X-ray image at each level are calculated according to the multi-level dominance information; and the neutron image and the X-ray image are fused according to the fusion weights at each level to obtain a target fusion image. The fused image is higher in geometric sharpness, signal-to-noise ratio, edge gradient, and other image-quality measures than images produced by traditional fusion algorithms.
Further, the step S2 specifically includes:
extracting the confidence coefficient of the X-ray image at the feature level, and distributing a first X-ray fusion weight according to the confidence coefficient;
extracting boundary gradient values of the X-ray image at a signal level, and distributing second X-ray fusion weights according to the boundary gradient values;
and extracting pixel values of the X-ray image at a pixel level, and distributing a third X-ray fusion weight according to the pixel values.
Further, the step S2 further specifically includes:
extracting the confidence coefficient of the neutron image at the feature level, and distributing a first neutron fusion weight according to the confidence coefficient;
extracting boundary gradient values of the neutron images at a signal level, and distributing second neutron fusion weights according to the boundary gradient values;
and extracting the pixel value of the neutron image at the pixel level, and assigning a third neutron fusion weight according to the pixel value.
Further, the step S3 specifically includes:
preprocessing the X-ray image and the neutron image;
performing feature level fusion processing on the preprocessed X-ray image and the preprocessed neutron image according to the first X-ray fusion weight and the first neutron fusion weight to obtain a first fusion image;
performing signal level fusion processing on the preprocessed X-ray image and the preprocessed neutron image according to the second X-ray fusion weight and the second neutron fusion weight to obtain a second fusion image;
performing pixel-level fusion processing on the preprocessed X-ray image and the preprocessed neutron image according to the third X-ray fusion weight and the third neutron fusion weight to obtain a third fusion image;
and reconstructing from the first fusion image, the second fusion image, and the third fusion image to obtain the target fusion image.
Further, the preprocessing includes: denoising, scatter correction, and contrast adjustment.
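A hedged sketch of such a preprocessing chain; the box-filter denoiser, the flat background subtraction standing in for scatter correction, and the min-max contrast stretch are illustrative simplifications, not the patent's actual algorithms:

```python
import numpy as np

def preprocess(img, bg=0.0):
    """Denoise, scatter-correct, and contrast-adjust (toy stand-ins)."""
    img = img.astype(float)
    # Denoising: 3x3 box filter via padded neighborhood averaging.
    p = np.pad(img, 1, mode="edge")
    den = sum(p[i:i + img.shape[0], j:j + img.shape[1]]
              for i in range(3) for j in range(3)) / 9.0
    # Scatter correction: a flat background estimate is subtracted here
    # (a real system would model the scatter field).
    den = np.clip(den - bg, 0.0, None)
    # Contrast adjustment: min-max stretch to [0, 1].
    lo, hi = den.min(), den.max()
    return (den - lo) / (hi - lo) if hi > lo else np.zeros_like(den)
```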
The other technical scheme for solving the technical problems is as follows:
a neutron and X-ray based multi-level decision fusion system, comprising: the system comprises an information extraction module, a fusion weight calculation module and a fusion module;
the information extraction module is used for obtaining multi-stage dominance information based on neutron images and X-ray images of an object to be detected, wherein the multi-stage dominance information comprises the following components: confidence, boundary gradient value, and pixel value;
the fusion weight calculation module is used for calculating fusion weights of the neutron image and the X-ray image in each stage according to the multi-stage dominance information;
and the fusion module is used for carrying out fusion processing on the neutron image and the X-ray image according to the fusion weights in each stage so as to obtain a target fusion image.
The beneficial effects of the invention are as follows: according to the technical scheme, the multi-level dominance information of the neutron image and the X-ray image is extracted at the feature level, the signal level, and the pixel level; the fusion weights of the neutron image and the X-ray image at each level are calculated according to the multi-level dominance information; and the neutron image and the X-ray image are fused according to the fusion weights at each level to obtain a target fusion image. The fused image is higher in geometric sharpness, signal-to-noise ratio, edge gradient, and other image-quality measures than images produced by traditional fusion algorithms.
Further, the fusion weight calculation module is specifically configured to extract a confidence coefficient of the X-ray image at a feature level, and allocate a first X-ray fusion weight according to the confidence coefficient;
extracting boundary gradient values of the X-ray image at a signal level, and distributing second X-ray fusion weights according to the boundary gradient values;
and extracting pixel values of the X-ray image at a pixel level, and distributing a third X-ray fusion weight according to the pixel values.
Further, the fusion weight calculation module is specifically configured to extract a confidence coefficient of the neutron image at a feature level, and allocate a first neutron fusion weight according to the confidence coefficient;
extracting boundary gradient values of the neutron images at a signal level, and distributing second neutron fusion weights according to the boundary gradient values;
and extracting the pixel value of the neutron image at the pixel level, and assigning a third neutron fusion weight according to the pixel value.
Further, the fusion module is specifically configured to pre-process the X-ray image and the neutron image;
performing feature level fusion processing on the preprocessed X-ray image and the preprocessed neutron image according to the first X-ray fusion weight and the first neutron fusion weight to obtain a first fusion image;
performing signal level fusion processing on the preprocessed X-ray image and the preprocessed neutron image according to the second X-ray fusion weight and the second neutron fusion weight to obtain a second fusion image;
performing pixel-level fusion processing on the preprocessed X-ray image and the preprocessed neutron image according to the third X-ray fusion weight and the third neutron fusion weight to obtain a third fusion image;
and reconstructing from the first fusion image, the second fusion image, and the third fusion image to obtain the target fusion image.
Further, the preprocessing includes: denoising, scatter correction, and contrast adjustment.
Additional aspects of the invention will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the invention.
Drawings
FIG. 1 is a schematic flow chart of a multi-level judgment fusion method based on neutrons and X rays according to an embodiment of the present invention;
FIG. 2 is a block diagram of a multi-level decision fusion system based on neutrons and X rays according to an embodiment of the present invention;
FIG. 3 is a flow chart of multi-level decision provided by other embodiments of the present invention.
Detailed Description
The principles and features of the present invention are described below with reference to the drawings; the illustrated embodiments are provided for illustration only and are not intended to limit the scope of the present invention.
As shown in fig. 1, a multi-stage determination fusion method based on neutrons and X-rays according to an embodiment of the present invention includes:
s1, obtaining multi-stage dominance information based on a neutron image and an X-ray image of an object to be detected, wherein the multi-stage dominance information comprises the following characteristics, signal levels and pixel levels: confidence, boundary gradient value, and pixel value;
s2, respectively calculating fusion weights of the neutron image and the X-ray image in each stage according to the multi-stage dominance information;
it should be noted that, in an embodiment, as shown in fig. 3, the multi-level determination of the feature level, the signal level, and the pixel level may include:
the method mainly comprises the steps of calculating weight factors of different areas of different images according to different levels of features, wherein feature level information is the most abstract information, obtaining the highest basic weight distributed by the mask1 of the most interesting area by using a segmentation neural network, and multiplying the segmentation confidence as a left threshold and a right threshold. Next, a secondary region of interest of the signal level is calculated using edge detection, and this region mask2 is weighted moderately, wherein the XRAY image (X-ray image) threshold is higher than ZRAY (neutron image). And finally, extracting the information of the bottom-layer pixel level under the large, medium and small fields by using the FPN, and distributing the lowest weight according to the difference of gray values of different levels, wherein the threshold value of the ZRAY image is higher than XRAY. The split neural network may be a split neural network. The highest basis weight may be greater than 0.5; the segmentation confidence can be output according to the neural network after detection, and the segmentation confidence is a number between 0 and 1; the secondary region of interest of the signal level calculated by the edge detection may include: extracting edges by using an extractor for edge detection such as canny, sob l e and the like; wherein the medium weight may be greater than 0.2 based on the specific gravity of the edge mean after neutron and x-ray edge extraction, similar to the last basis weight. The lowest weight can be set according to the gray average value of the neutron x-ray image according to the difference of gray values of different levels, and the calculation formula is as follows: the neutron roi gray scale average value/x-ray roi line gray scale average value + neutron roi gray scale average value is less than 0.2.
And S3, carrying out fusion processing on the neutron image and the X-ray image according to the fusion weights in each stage so as to obtain a target fusion image.
It should be noted that the fusion processing procedure may include:
First, the X-ray image and the neutron image are preprocessed (the X-ray image and the neutron image are processed separately in the following steps). The preprocessed images are then decomposed with a Laplacian pyramid to obtain images at different fields of view, downsampled by factors of 2, 4, 8, 16, 32, and so on; these are brought back to a common field of view using linear-interpolation upsampling, and the two images at the same field of view (the neutron image and the X-ray image) are subtracted to obtain the high-frequency parts at each field of view. The X-ray high-frequency information and the neutron low-frequency image (the downsampled map) are then linearly superposed according to the multi-level weight factors obtained in the previous step; the neutron low-frequency image may be obtained by image-pyramid downsampling, or by using a Fourier transform or a low-pass filter such as Butterworth or Gaussian. Finally, the images at the various fields of view undergo inverse Laplacian-pyramid reconstruction to obtain the final fused image. Regarding the feature, pixel, and signal levels: the feature level is based on deep learning, with a convolutional neural network realizing the feature-extraction function; the pixel level is based on the FPN, which performs image-pyramid layering, with a low-pass convolution kernel designed for pixel-level extraction; the signal level is based on edge detection, performing feature fusion on the edges of the two images. The preprocessing may include denoising, scatter correction, contrast adjustment, and the like, to improve image quality and reduce image noise.
The linear superposition of the X-ray high-frequency information and the neutron low-frequency image by the multi-level weight factors specifically includes: multiplying the X-ray image and the neutron image by the weights obtained in the previous section, respectively, then adding them and normalizing.
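A compact sketch of the pyramid-based superposition described above, using block-mean downsampling and nearest-neighbour upsampling as stand-ins for the Laplacian-pyramid operators; the single weight `w_high` plays the role of the multi-level weight factors, and all names are illustrative:

```python
import numpy as np

def down2(img):
    """Downsample by 2 (simple 2x2 block mean)."""
    h, w = img.shape
    return img[:h // 2 * 2, :w // 2 * 2].reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

def up2(img, shape):
    """Upsample back to `shape` by nearest-neighbour repetition."""
    u = np.repeat(np.repeat(img, 2, axis=0), 2, axis=1)
    return u[:shape[0], :shape[1]]

def fuse_pyramid(xray, neutron, w_high=0.7, levels=3):
    """Superpose X-ray high-frequency detail onto the neutron
    low-frequency base, then reconstruct level by level."""
    xs, ns = [xray.astype(float)], [neutron.astype(float)]
    for _ in range(levels):
        xs.append(down2(xs[-1]))
        ns.append(down2(ns[-1]))
    # Base of the pyramid: the neutron low-frequency image.
    fused = ns[-1]
    for lvl in range(levels - 1, -1, -1):
        # High-frequency residual of the X-ray image at this level.
        hi = xs[lvl] - up2(xs[lvl + 1], xs[lvl].shape)
        # Inverse-transform step: upsample and add the weighted detail.
        fused = up2(fused, xs[lvl].shape) + w_high * hi
    return fused
```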
According to the technical scheme, the multi-level dominance information of the neutron image and the X-ray image is extracted at the feature level, the signal level, and the pixel level; the fusion weights of the neutron image and the X-ray image at each level are calculated according to the multi-level dominance information; and the neutron image and the X-ray image are fused according to the fusion weights at each level to obtain a target fusion image. The fused image is higher in geometric sharpness, signal-to-noise ratio, edge gradient, and other image-quality measures than images produced by traditional fusion algorithms.
In one embodiment, in addition to the feature-level, signal-level, and pixel-level scheme described above, fusion may involve more levels, or a fusion method using only two levels. The feature-level signal can be extracted by other deep-learning algorithms besides a deep-learning segmentation algorithm, and the signal level can be extracted by schemes such as frequency-domain calculation (Fourier transform) in addition to edge detection. Moreover, this multi-level fusion scheme can be used not only for fusing neutron and X-ray dual-source images but also for other multi-source images, such as infrared, RGB, and X-ray, provided the different source images have sufficiently large differences in characteristic signals.
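As a sketch of the frequency-domain alternative mentioned for the signal level, the following FFT high-pass keeps only high-frequency content; the radial-cutoff convention is an assumption, not taken from the patent:

```python
import numpy as np

def highpass_signal(img, cutoff=0.25):
    """Extract signal-level (high-frequency) content via an FFT high-pass.

    `cutoff` is the fraction of the half-spectrum radius to suppress;
    this is an illustrative stand-in for the frequency-domain scheme
    mentioned as an alternative to edge detection.
    """
    f = np.fft.fftshift(np.fft.fft2(img.astype(float)))
    h, w = img.shape
    yy, xx = np.ogrid[:h, :w]
    r = np.hypot(yy - h / 2, xx - w / 2)
    f[r < cutoff * min(h, w) / 2] = 0  # zero out the low frequencies
    return np.real(np.fft.ifft2(np.fft.ifftshift(f)))
```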
In another embodiment, the hierarchical processing may include:
When 0 < l < N, for the l-th layer image of the Laplacian pyramid decomposition, the regional energy is first calculated:

(formula image not reproduced in the source)

where AER represents the total regional energy of the image to be fused; LA_N represents the N-th layer of the pyramid; and ω represents a low-pass filter.

(formula image not reproduced in the source)

where P = 1, Q = 1.

(formula image not reproduced in the source)

The fusion result for the other hierarchy images is:

(formula image not reproduced in the source)

where LA_i represents the pixel values of the i-th layer of the image pyramid, and AER represents the energy value at the i-th pyramid layer; the product of these two terms gives the pixel-level feature;

mask represents the target region detected by deep learning, and confidence represents the confidence calculated by deep learning; the product of these two terms gives the feature-level feature;

canny represents the binary image obtained by Canny edge detection, and thresh represents the threshold assigned at the signal level; the product of these two terms gives the signal-level feature.

After the fused images LF_1, LF_2, …, LF_N are obtained at each pyramid level, the final fused image is obtained by the reconstruction described above.
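The regional-energy formula referenced above is not reproduced in this text (its equation images are missing). The standard regional-energy definition used with Laplacian pyramids, consistent with the stated P = Q = 1 window, takes the following form; it is offered only as a hedged reconstruction, not the patent's exact expression:

```latex
\mathrm{AER}(i,j) \,=\, \sum_{p=-P}^{P}\sum_{q=-Q}^{Q}
  \omega(p,q)\,\bigl[\mathrm{LA}_{l}(i+p,\, j+q)\bigr]^{2},
\qquad P = Q = 1,
```

where ω(p,q) is a normalized low-pass window, for example the 3 × 3 kernel (1/16) [1 2 1; 2 4 2; 1 2 1].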
Optionally, in any embodiment of the foregoing, the S2 specifically includes:
extracting the confidence coefficient of the X-ray image at the feature level, and distributing a first X-ray fusion weight according to the confidence coefficient;
extracting boundary gradient values of the X-ray image at a signal level, and distributing second X-ray fusion weights according to the boundary gradient values;
and extracting pixel values of the X-ray image at a pixel level, and distributing a third X-ray fusion weight according to the pixel values.
Optionally, in any embodiment of the foregoing, the S2 further specifically includes:
extracting the confidence coefficient of the neutron image at the feature level, and distributing a first neutron fusion weight according to the confidence coefficient;
extracting boundary gradient values of the neutron images at a signal level, and distributing second neutron fusion weights according to the boundary gradient values;
and extracting the pixel value of the neutron image at the pixel level, and assigning a third neutron fusion weight according to the pixel value.
Optionally, in any embodiment of the foregoing, the S3 specifically includes:
preprocessing the X-ray image and the neutron image;
performing feature level fusion processing on the preprocessed X-ray image and the preprocessed neutron image according to the first X-ray fusion weight and the first neutron fusion weight to obtain a first fusion image;
performing signal level fusion processing on the preprocessed X-ray image and the preprocessed neutron image according to the second X-ray fusion weight and the second neutron fusion weight to obtain a second fusion image;
performing pixel-level fusion processing on the preprocessed X-ray image and the preprocessed neutron image according to the third X-ray fusion weight and the third neutron fusion weight to obtain a third fusion image;
and reconstructing from the first fusion image, the second fusion image, and the third fusion image to obtain the target fusion image.
Optionally, in any embodiment above, the preprocessing includes: denoising, scatter correction, and contrast adjustment.
In one embodiment, a neutron and X-ray based multi-level decision fusion system includes: an information extraction module 1101, a fusion weight calculation module 1102, and a fusion module 1103;
the information extraction module 1101 is configured to obtain multi-level dominance information based on a neutron image and an X-ray image of an object to be detected, where the multi-level dominance information includes: confidence, boundary gradient value, and pixel value;
the fusion weight calculation module 1102 is configured to calculate fusion weights of the neutron image and the X-ray image in each stage according to the multi-stage dominance information;
the fusion module 1103 is configured to perform fusion processing on the neutron image and the X-ray image according to the fusion weights in the respective stages, so as to obtain a target fusion image.
According to the technical scheme, the multi-level dominance information of the neutron image and the X-ray image is extracted at the feature level, the signal level, and the pixel level; the fusion weights of the neutron image and the X-ray image at each level are calculated according to the multi-level dominance information; and the neutron image and the X-ray image are fused according to the fusion weights at each level to obtain a target fusion image. The fused image is higher in geometric sharpness, signal-to-noise ratio, edge gradient, and other image-quality measures than images produced by traditional fusion algorithms.
Optionally, in any embodiment of the foregoing, the fusion weight calculation module 1102 is specifically configured to extract a confidence level of the X-ray image at a feature level, and allocate a first X-ray fusion weight according to the confidence level;
extracting boundary gradient values of the X-ray image at a signal level, and distributing second X-ray fusion weights according to the boundary gradient values;
and extracting pixel values of the X-ray image at a pixel level, and distributing a third X-ray fusion weight according to the pixel values.
Optionally, in any embodiment of the foregoing, the fusion weight calculation module 1102 is specifically configured to extract a confidence level of the neutron image at a feature level, and allocate a first neutron fusion weight according to the confidence level;
extracting boundary gradient values of the neutron images at a signal level, and distributing second neutron fusion weights according to the boundary gradient values;
and extracting the pixel value of the neutron image at the pixel level, and assigning a third neutron fusion weight according to the pixel value.
Optionally, in any embodiment, the fusion module 1103 is specifically configured to pre-process the X-ray image and the neutron image;
performing feature level fusion processing on the preprocessed X-ray image and the preprocessed neutron image according to the first X-ray fusion weight and the first neutron fusion weight to obtain a first fusion image;
performing signal level fusion processing on the preprocessed X-ray image and the preprocessed neutron image according to the second X-ray fusion weight and the second neutron fusion weight to obtain a second fusion image;
performing pixel-level fusion processing on the preprocessed X-ray image and the preprocessed neutron image according to the third X-ray fusion weight and the third neutron fusion weight to obtain a third fusion image;
and reconstructing from the first fusion image, the second fusion image, and the third fusion image to obtain the target fusion image.
Optionally, in any embodiment above, the preprocessing includes: denoising, scatter correction, and contrast adjustment.
It is to be understood that in some embodiments, some or all of the alternatives described in the various embodiments above may be included.
It should be noted that, the foregoing embodiments are product embodiments corresponding to the previous method embodiments, and the description of each optional implementation manner in the product embodiments may refer to the corresponding description in the foregoing method embodiments, which is not repeated herein.
The reader will appreciate that in the description of this specification, a description of terms "one embodiment," "some embodiments," "examples," "specific examples," or "some examples," etc., means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention. In this specification, schematic representations of the above terms are not necessarily directed to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. Furthermore, the different embodiments or examples described in this specification and the features of the different embodiments or examples may be combined and combined by those skilled in the art without contradiction.
In the several embodiments provided in this application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the method embodiments described above are merely illustrative, e.g., the division of steps is merely a logical function division, and there may be additional divisions of actual implementation, e.g., multiple steps may be combined or integrated into another step, or some features may be omitted or not performed.
The above-described method, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present invention, in essence the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product stored in a storage medium, comprising several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or part of the steps of the methods of the embodiments of the present invention. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
The present invention is not limited to the above embodiments, and various equivalent modifications and substitutions can be easily made by those skilled in the art within the technical scope of the present invention, and these modifications and substitutions are intended to be included in the scope of the present invention. Therefore, the protection scope of the invention is subject to the protection scope of the claims.

Claims (6)

1. A multi-stage judgment fusion method based on neutrons and X rays is characterized by comprising the following steps:
s1, obtaining multi-level advantage information based on neutron images and X-ray images of an object to be detected; multiple levels represent a feature level, a signal level, and a pixel level, and the multiple levels of dominance information includes: confidence, boundary gradient value, and pixel value;
s2, respectively calculating fusion weights of the neutron image and the X-ray image in each stage according to the multi-stage dominance information;
s3, carrying out fusion processing on the neutron image and the X-ray image according to the fusion weights in each stage so as to obtain a target fusion image;
the step S2 specifically comprises the following steps:
extracting the confidence coefficient of the X-ray image at the feature level, and distributing a first X-ray fusion weight according to the confidence coefficient;
extracting boundary gradient values of the X-ray image at a signal level, and distributing second X-ray fusion weights according to the boundary gradient values;
extracting pixel values of an X-ray image at a pixel level, and distributing a third X-ray fusion weight according to the pixel values;
the step S2 further specifically comprises:
extracting the confidence coefficient of the neutron image at the feature level, and distributing a first neutron fusion weight according to the confidence coefficient;
extracting boundary gradient values of the neutron images at a signal level, and distributing second neutron fusion weights according to the boundary gradient values;
and extracting the pixel value of the neutron image at the pixel level, and assigning a third neutron fusion weight according to the pixel value.
2. The multi-level decision fusion method based on neutrons and X rays according to claim 1, wherein S3 specifically comprises:
preprocessing the X-ray image and the neutron image;
performing feature level fusion processing on the preprocessed X-ray image and the preprocessed neutron image according to the first X-ray fusion weight and the first neutron fusion weight to obtain a first fusion image;
performing signal level fusion processing on the preprocessed X-ray image and the preprocessed neutron image according to the second X-ray fusion weight and the second neutron fusion weight to obtain a second fusion image;
performing pixel-level fusion processing on the preprocessed X-ray image and the preprocessed neutron image according to the third X-ray fusion weight and the third neutron fusion weight to obtain a third fusion image;
and reconstructing the target fusion image from the first fusion image, the second fusion image, and the third fusion image.
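The three per-level fusion passes and the final reconstruction in claim 2 can be sketched as follows. The weighted-sum fusion operator at each level and the equal-weight reconstruction are illustrative assumptions, since the claim does not specify the operators:

```python
import numpy as np

def fuse_level(x_img, n_img, w_x, w_n):
    """Weighted fusion of the two preprocessed images at one level."""
    return w_x * x_img + w_n * n_img

def reconstruct(f1, f2, f3):
    """Combine the feature-, signal- and pixel-level fusion results into
    the target fused image; equal-weight averaging is assumed here."""
    return (f1 + f2 + f3) / 3.0

x = np.random.rand(4, 4)  # stand-in for the preprocessed X-ray image
n = np.random.rand(4, 4)  # stand-in for the preprocessed neutron image

f1 = fuse_level(x, n, 0.6, 0.4)  # feature-level fusion (first weights)
f2 = fuse_level(x, n, 0.7, 0.3)  # signal-level fusion (second weights)
f3 = fuse_level(x, n, 0.5, 0.5)  # pixel-level fusion (third weights)
target = reconstruct(f1, f2, f3)
```

Because each per-level weight pair sums to one, every intermediate image and the reconstructed target stay within the intensity range of the inputs.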
3. The neutron and X-ray based multi-level decision fusion method according to claim 2, wherein the preprocessing comprises: denoising, scatter correction, and contrast adjustment.
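A minimal sketch of the claim-3 preprocessing chain, with stand-in choices for each step (a 3x3 mean filter for denoising, flat background subtraction for scatter correction, and min-max stretching for contrast adjustment; none of these specific operators are dictated by the claim):

```python
import numpy as np

def denoise(img):
    """3x3 mean filter: a simple stand-in for the denoising step."""
    h, w = img.shape
    padded = np.pad(img, 1, mode="edge")
    out = np.zeros_like(img, dtype=float)
    for di in range(3):
        for dj in range(3):
            out += padded[di:di + h, dj:dj + w]
    return out / 9.0

def preprocess(img, scatter_background):
    """Denoising, scatter correction, and contrast adjustment.
    The flat-background scatter model and the min-max contrast stretch
    are illustrative assumptions, not the patented steps."""
    den = denoise(img)
    corrected = np.clip(den - scatter_background, 0.0, None)  # scatter correction
    lo, hi = corrected.min(), corrected.max()
    return (corrected - lo) / (hi - lo + 1e-8)  # contrast stretch to [0, 1]

raw = np.random.rand(8, 8) * 200.0       # stand-in radiograph
pre = preprocess(raw, scatter_background=10.0)
```

In practice the scatter correction would use a physics-based scatter estimate rather than a constant background, but the shape of the pipeline (denoise, then correct, then stretch) is the same.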
4. A neutron and X-ray based multi-level decision fusion system, comprising: an information extraction module, a fusion weight calculation module, and a fusion module;
the information extraction module is configured to obtain multi-level dominance information based on a neutron image and an X-ray image of an object to be detected, wherein the multi-level dominance information comprises: confidence, boundary gradient value, and pixel value;
the fusion weight calculation module is configured to calculate fusion weights of the neutron image and the X-ray image at each level according to the multi-level dominance information;
the fusion module is configured to perform fusion processing on the neutron image and the X-ray image according to the fusion weights at each level to obtain a target fusion image;
the fusion weight calculation module is specifically configured to extract the confidence of the X-ray image at the feature level and assign a first X-ray fusion weight according to the confidence;
extract the boundary gradient value of the X-ray image at the signal level, and assign a second X-ray fusion weight according to the boundary gradient value;
extract the pixel value of the X-ray image at the pixel level, and assign a third X-ray fusion weight according to the pixel value;
the fusion weight calculation module is further configured to extract the confidence of the neutron image at the feature level and assign a first neutron fusion weight according to the confidence;
extract the boundary gradient value of the neutron image at the signal level, and assign a second neutron fusion weight according to the boundary gradient value;
and extract the pixel value of the neutron image at the pixel level, and assign a third neutron fusion weight according to the pixel value.
5. The neutron and X-ray based multi-level decision fusion system according to claim 4, wherein the fusion module is specifically configured to preprocess the X-ray image and the neutron image;
performing feature level fusion processing on the preprocessed X-ray image and the preprocessed neutron image according to the first X-ray fusion weight and the first neutron fusion weight to obtain a first fusion image;
performing signal level fusion processing on the preprocessed X-ray image and the preprocessed neutron image according to the second X-ray fusion weight and the second neutron fusion weight to obtain a second fusion image;
performing pixel-level fusion processing on the preprocessed X-ray image and the preprocessed neutron image according to the third X-ray fusion weight and the third neutron fusion weight to obtain a third fusion image;
and reconstruct the target fusion image from the first fusion image, the second fusion image, and the third fusion image.
6. The neutron and X-ray based multi-level decision fusion system according to claim 5, wherein the preprocessing comprises: denoising, scatter correction, and contrast adjustment.
CN202211033724.7A 2022-08-26 2022-08-26 Multi-stage judgment fusion method and system based on neutrons and X rays Active CN115641487B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211033724.7A CN115641487B (en) 2022-08-26 2022-08-26 Multi-stage judgment fusion method and system based on neutrons and X rays

Publications (2)

Publication Number Publication Date
CN115641487A CN115641487A (en) 2023-01-24
CN115641487B true CN115641487B (en) 2023-06-27

Family

ID=84940249

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211033724.7A Active CN115641487B (en) 2022-08-26 2022-08-26 Multi-stage judgment fusion method and system based on neutrons and X rays

Country Status (1)

Country Link
CN (1) CN115641487B (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020007320A1 (en) * 2018-07-03 2020-01-09 清华-伯克利深圳学院筹备办公室 Method for fusing multi-visual angle images, apparatus, computer device, and storage medium
CN113313661A (en) * 2021-05-26 2021-08-27 Oppo广东移动通信有限公司 Image fusion method and device, electronic equipment and computer readable storage medium
CN114882332A (en) * 2022-06-06 2022-08-09 江苏富士特电气技术有限公司 Target detection system based on image fusion

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109978808B (en) * 2019-04-25 2022-02-01 北京迈格威科技有限公司 Method and device for image fusion and electronic equipment

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
An infrared target detection method based on multi-sensor multi-level information fusion; Li Qiuhua, Li Jicheng, Shen Zhenkang; Journal of Electronics &amp; Information Technology (11); pp. 1701-1705 *

Similar Documents

Publication Publication Date Title
CN102203826B (en) Denoising medical images
CN103069432A (en) Non-linear resolution reduction for medical imagery
Wu et al. A novel scheme for infrared image enhancement by using weighted least squares filter and fuzzy plateau histogram equalization
CN107085839B (en) SAR image speckle reduction method based on texture enhancement and sparse coding
CN115641487B (en) Multi-stage judgment fusion method and system based on neutrons and X rays
CN109035228A (en) A kind of radioscopic image processing method of non-uniform thickness component
Wang et al. Detail preserving multi-scale exposure fusion
Jindal et al. Bio-medical image enhancement based on spatial domain technique
Boby et al. Medical Image Denoising Techniques against Hazardous Noises: An IQA Metrics Based Comparative Analysis
Prakoso et al. Enhancement methods of brain MRI images: A Review
Bora Contrast improvement of medical images using advanced fuzzy logic-based technique
Rama Lakshmi et al. A Review on Image Denoising Algorithms for Various Applications
CN114529518A (en) Image pyramid and NLM-based image enhancement method for cryoelectron microscope
Paul et al. MR image enhancement using an extended neighborhood filter
Chen et al. A pilot study on a new image enhancement method using spatial statistics
Kokhan et al. Segmentation criteria in the problem of porosity determination based on CT scans
Shankar et al. Object oriented fuzzy filter for noise reduction of Pgm images
Storozhilova et al. 2.5 D extension of neighborhood filters for noise reduction in 3D medical CT images
AKINTOYE et al. COMPOSITE MEDIAN WIENER FILTER BASED TECHNIQUE FOR IMAGE ENHANCEMENT.
Sreedevi et al. A modified approach for the removal of impulse noise from mammogram images
Han et al. An ICA-domain shrinkage based Poisson-noise reduction algorithm and its application to Penumbral imaging
Saxena et al. Utilizing deep learning techniques to diagnose nodules in lung computed tomography (ct) scan images
CN110246096B (en) Fitting correction method and device for scattered X-ray
Ostojić et al. Recursive radiography image denoising
SHINOHARA et al. Spatial resolution characteristics of image reconstruction with nonlinear filter-based L1 regularization: a simulation study

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: 266043 No. 1, Loushan Road, Licang District, Qingdao, Shandong

Applicant after: Neutron Times (Qingdao) Innovation Technology Co.,Ltd.

Address before: 266043 No. 1, Loushan Road, Licang District, Qingdao, Shandong

Applicant before: Qingdao Yuandongxin Energy Technology Co.,Ltd.

GR01 Patent grant