CN117391985A - Multi-source data information fusion processing method and system - Google Patents


Info

Publication number
CN117391985A
CN117391985A (application CN202311685726.9A)
Authority
CN
China
Prior art keywords
fused
image
pixel point
column
matrix
Prior art date
Legal status
Granted
Application number
CN202311685726.9A
Other languages
Chinese (zh)
Other versions
CN117391985B (en)
Inventor
He Qibing (何奇兵)
Current Assignee
Anhui Jifen Intelligent Technology Co., Ltd.
Original Assignee
Anhui Jifen Intelligent Technology Co., Ltd.
Priority date
Filing date
Publication date
Application filed by Anhui Jifen Intelligent Technology Co., Ltd.
Priority to CN202311685726.9A
Publication of CN117391985A
Application granted
Publication of CN117391985B
Legal status: Active


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 5/00: Image enhancement or restoration
    • G06T 5/50: Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • G06T 2207/00: Indexing scheme for image analysis or image enhancement
    • G06T 2207/20: Special algorithmic details
    • G06T 2207/20212: Image combination
    • G06T 2207/20221: Image fusion; Image merging

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)

Abstract

The invention relates to the technical field of image processing, in particular to a multi-source data information fusion processing method and system. The method comprises the following steps: acquiring an image A to be fused and an image B to be fused; obtaining the matrix of each image from all of its image blocks, and from these matrices obtaining the spatially transformed matrix of each image; acquiring the sharpness validity of the pixel points at each common position of the image A to be fused and the image B to be fused, and from it the validity of each column when the two images are fused; and obtaining the fused matrix of the image A to be fused and the image B to be fused according to the fusion weight of each column together with the first element value and the first function value of each pixel point in each column, thereby completing the multi-source image data fusion. The invention alleviates the loss of high-frequency information and obtains fused data with better sharpness.

Description

Multi-source data information fusion processing method and system
Technical Field
The invention relates to the technical field of image processing, in particular to a multi-source data information fusion processing method and system.
Background
In the current information age, a large number of data sources are emerging, including sensors, social media, weblogs and the like. These data sources provide rich information, but also bring problems of data fragmentation and information redundancy, which pose challenges for data analysis and utilization. When multi-source data are used for SFM (structure-from-motion) three-dimensional reconstruction, multi-source image acquisition is often performed by an unmanned aerial vehicle carrying multiple cameras, obtaining images at three heights (high, medium and low); point cloud data are then acquired by oblique photography to complete the SFM three-dimensional reconstruction. Because different acquired images are focused differently, the sharpness of the same object is inconsistent across images; therefore, in order to obtain accurate, sharp images and build a clearer and more accurate SFM three-dimensional model, the information of images at adjacent coordinate positions is fused.
Existing multi-scale fusion methods rely on singular value decomposition; however, when multi-scale images are fused in this way, the singular-value features of different regions are fused with equal average weights, which may lead to a poor fusion effect.
Disclosure of Invention
In order to solve the problems, the invention provides a multi-source data information fusion processing method and a multi-source data information fusion processing system.
The embodiment of the invention provides a multi-source data information fusion processing method, which comprises the following steps:
acquiring an image A to be fused and an image B to be fused;
acquiring a plurality of image blocks from the image A to be fused and the image B to be fused respectively; obtaining the matrix of the image A to be fused and the matrix of the image B to be fused from their respective image blocks; and performing spatial transformation on the two matrices to obtain the spatially transformed matrix of the image A to be fused and the spatially transformed matrix of the image B to be fused;
determining a difference image EA and a difference image EB for the image A to be fused and the image B to be fused; determining the sharpness validity of the pixel points at each common position of the image A to be fused and the image B to be fused; obtaining the corrected two-dimensional Gaussian function value of each pixel point in the image A to be fused and the image B to be fused according to the sharpness validity of the pixel points at each common position and the difference images EA and EB; and obtaining the validity of each column when the image A to be fused and the image B to be fused are fused according to the corrected two-dimensional Gaussian function value of each pixel point in the two images;
determining the fusion weight of each column when the image A to be fused and the image B to be fused are fused according to the validity of each column; determining the first element value of each pixel point in each column according to the spatially transformed matrices of the image A to be fused and the image B to be fused; determining the first function value of each pixel point in each column according to the corrected two-dimensional Gaussian functions of the image A to be fused and the image B to be fused; determining the fused matrix of the image A to be fused and the image B to be fused according to the fusion weight of each column, the first element value of each pixel point in each column, and the first function value of each pixel point in each column; and obtaining the fused data from the fused matrix.
Preferably, the performing spatial transformation on the matrix of the image A to be fused and the matrix of the image B to be fused to obtain the spatially transformed matrices of the image A to be fused and the image B to be fused comprises the following specific method:
the matrix $C_A$ of the image A to be fused and the matrix $C_B$ of the image B to be fused are each subjected to singular value decomposition to obtain the left singular matrix $U_A$ of the image A to be fused and the left singular matrix $U_B$ of the image B to be fused; the product of the left singular matrix $U_A$ and the matrix $C_A$ is taken as the spatially transformed matrix $D_A$ of the image A to be fused, and the product of the left singular matrix $U_B$ and the matrix $C_B$ is taken as the spatially transformed matrix $D_B$ of the image B to be fused.
Preferably, the determining the sharpness validity of the pixel points at each common position of the image A to be fused and the image B to be fused comprises the following specific method:
for the pixel point at any common position of the image A to be fused and the image B to be fused, the absolute value of the difference between the pixel value $g_A$ of the pixel point in the image A to be fused and the mean $\bar{g}_A$ of all pixel points in its eight-neighborhood is recorded as a first difference; the absolute value of the difference between the pixel value $g_B$ of the pixel point in the image B to be fused and the mean $\bar{g}_B$ of all pixel points in its eight-neighborhood is recorded as a second difference; and the mean of the first difference and the second difference is taken as the sharpness validity of the pixel point; the sharpness validity of the pixel points at every common position of the image A to be fused and the image B to be fused is obtained in the same way.
Preferably, the obtaining the corrected two-dimensional Gaussian function value of each pixel point in the image A to be fused and the image B to be fused according to the sharpness validity of the pixel points at each common position and the difference images EA and EB comprises the following specific method:
the coordinate values and pixel values of all pixel points in the difference image EA are collected to form the three-dimensional data of the image A to be fused; the coordinate values and pixel values of all pixel points in the difference image EB are collected to form the three-dimensional data of the image B to be fused; two-dimensional Gaussian function fitting is performed on the three-dimensional data of the image A to be fused and on the three-dimensional data of the image B to be fused using a GMM (Gaussian mixture model), giving the two-dimensional Gaussian function $f_A(x,y)$ of the image A to be fused and the two-dimensional Gaussian function $f_B(x,y)$ of the image B to be fused;
for any pixel point in the image A to be fused, its two-dimensional Gaussian function value is obtained from $f_A(x,y)$, and the product of this value and the sharpness validity of the pixel point is taken as the corrected two-dimensional Gaussian function value of the pixel point in the image A to be fused;
for any pixel point in the image B to be fused, its two-dimensional Gaussian function value is obtained from $f_B(x,y)$, and the product of this value and the sharpness validity of the pixel point is taken as the corrected two-dimensional Gaussian function value of the pixel point in the image B to be fused.
Preferably, the obtaining the validity of each column when the image A to be fused and the image B to be fused are fused according to the corrected two-dimensional Gaussian function value of each pixel point in the two images comprises the following specific method:
for any column at the common positions of the image A to be fused and the image B to be fused, the maximum of the corrected two-dimensional Gaussian function values of all pixel points in that column is taken as the validity of the column when the image A to be fused and the image B to be fused are fused.
Preferably, the determining the fusion weight of each column when the image A to be fused and the image B to be fused are fused according to the validity of each column comprises the following specific method:
for the $i$-th column when the image A to be fused and the image B to be fused are fused, the ratio of the validity of the $i$-th column to the maximum validity over all columns is taken as the fusion weight of the $i$-th column.
Preferably, the determining the first element value of each pixel point in each column when the image A to be fused and the image B to be fused are fused according to the spatially transformed matrices of the two images comprises the following specific method:
for the $j$-th pixel point in the $i$-th column when the image A to be fused and the image B to be fused are fused, the corrected two-dimensional Gaussian function value of the $j$-th pixel point in the $i$-th column of the image A to be fused is recorded as a second function value, and the corrected two-dimensional Gaussian function value of the $j$-th pixel point in the $i$-th column of the image B to be fused is recorded as a third function value; the maximum of the second function value and the third function value is recorded as a first extremum, the image to be fused corresponding to the first extremum is recorded as a first image, and the element corresponding to the pixel point in the spatially transformed matrix of the first image is taken as the first element value of the pixel point.
Preferably, the determining the first function value of each pixel point in each column when the image A to be fused and the image B to be fused are fused according to the corrected two-dimensional Gaussian functions of the two images comprises the following specific method:
for the $j$-th pixel point in the $i$-th column when the image A to be fused and the image B to be fused are fused, the element corresponding to the pixel point in the spatially transformed matrix of the image A to be fused is recorded as a first element, and the element corresponding to the pixel point in the spatially transformed matrix of the image B to be fused is recorded as a second element; the maximum of the first element and the second element is recorded as a second extremum, the image to be fused corresponding to the second extremum is recorded as a second image, and the corrected two-dimensional Gaussian function value of the pixel point in the second image is taken as the first function value of the pixel point.
Preferably, the determining the fused matrix of the image A to be fused and the image B to be fused according to the fusion weight of each column, the first element value of each pixel point in each column, and the first function value of each pixel point in each column comprises the following specific method:
the element value $R_{i,j}$ in the $i$-th column and the $j$-th row of the fused matrix is calculated by the formula (given as an image in the original publication) in which: $R_{i,j}$ represents the element value in the $i$-th column and the $j$-th row of the fused matrix of the image A to be fused and the image B to be fused; $w_i$ represents the fusion weight of the $i$-th column when the image A to be fused and the image B to be fused are fused; $d_{i,j}$ represents the first element value of the $j$-th pixel point in the $i$-th column; and $h_{i,j}$ represents the first function value of the $j$-th pixel point in the $i$-th column.
The embodiment of the invention provides a multi-source data information fusion processing system, which comprises a data acquisition module, a data characteristic analysis module, a data characteristic acquisition module and a data fusion module, wherein:
the data acquisition module is used for acquiring an image A to be fused and an image B to be fused;
the data characteristic analysis module is used for respectively acquiring a plurality of image blocks from the image A to be fused and the image B to be fused; acquiring matrixes of the image A to be fused and the image B to be fused according to the image blocks of the image A to be fused and the image B to be fused; performing spatial transformation on the matrix of the image A to be fused and the matrix of the image B to be fused to obtain a matrix after spatial transformation of the image A to be fused and the image B to be fused;
the data characteristic acquisition module is used for determining a difference image EA and a difference image EB of the image A to be fused and the image B to be fused; determining the definition effectiveness of pixel points at the same position of each of the image A to be fused and the image B to be fused; acquiring two-dimensional Gaussian function values of each pixel point in the image A to be fused and the image B to be fused after correction according to the definition effectiveness of the pixel points at the same position of the image A to be fused and the image B to be fused and the difference image EA and the difference image EB; acquiring the effectiveness of each row when the image A to be fused and the image B to be fused are fused according to the two-dimensional Gaussian function value corrected by each pixel point in the image A to be fused and the image B to be fused;
the data fusion module is used for determining the fusion weight of each column when the image A to be fused and the image B to be fused are fused according to the effectiveness of each column when the image A to be fused and the image B to be fused are fused; determining a first element value of each pixel point in each column when the image A to be fused and the image B to be fused are fused according to the matrix after the spatial transformation of the image A to be fused and the image B to be fused; according to the corrected two-dimensional Gaussian functions of the image A to be fused and the image B to be fused, determining a first function value of each pixel point in each column when the image A to be fused and the image B to be fused are fused; determining a matrix after fusion of the image A to be fused and the image B to be fused according to the fusion weight of each column, the first element value of each pixel point in each column and the first function value of each pixel point in each column when the image A to be fused and the image B to be fused are fused; and obtaining fused data according to the matrix fused by the image A to be fused and the image B to be fused.
The technical scheme of the invention has the following beneficial effects: when multi-scale images are fused by singular value decomposition, fusing the singular-value features of different regions with equal average weights may lead to a poor fusion effect. Aiming at this problem, the invention obtains the column weights for singular-value-decomposition fusion by constructing a validity feature, which overcomes the inconsistent focus among the multi-source acquired images; data fusion of the importance matrix is then performed according to the column weights and the local high-frequency information, which alleviates the loss of high-frequency information and yields fused data with better sharpness.
Drawings
In order to more clearly illustrate the embodiments of the invention or the technical solutions in the prior art, the drawings required for describing the embodiments or the prior art are briefly introduced below. The drawings described below are obviously only some embodiments of the invention, and a person skilled in the art can obtain other drawings from them without inventive effort.
FIG. 1 is a flow chart showing steps of a multi-source data information fusion processing method of the present invention;
FIG. 2 is a block diagram of a multi-source data information fusion processing system according to the present invention.
Detailed Description
In order to further explain the technical means and effects adopted by the present invention to achieve its intended purpose, the specific implementation, structure, features and effects of the multi-source data information fusion processing method and system according to the present invention are described in detail below with reference to the accompanying drawings and the preferred embodiments. In the following description, different references to "one embodiment" or "another embodiment" do not necessarily refer to the same embodiment. Furthermore, the particular features, structures or characteristics of one or more embodiments may be combined in any suitable manner.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs.
The following specifically describes a specific scheme of a multi-source data information fusion processing method and system provided by the invention with reference to the accompanying drawings.
Referring to fig. 1, a flowchart of steps of a multi-source data information fusion processing method according to an embodiment of the invention is shown, and the method includes the following steps:
step S001: and obtaining an image A to be fused and an image B to be fused.
Among the acquired images, because different images are focused differently at acquisition, the sharpness of the same object is inconsistent across images; therefore, in order to obtain accurate, sharp images and build a clearer and more accurate SFM three-dimensional model, this embodiment fuses the information of images at adjacent coordinate positions.
Specifically, in order to implement the multi-source data information fusion processing method provided in this embodiment, an image a to be fused and an image B to be fused need to be acquired first, and the specific process is as follows:
the unmanned aerial vehicle carrying the multi-camera module and the GPS sensor is subjected to path planning, and then the unmanned aerial vehicle is controlled to fly according to a preset path, so that images acquired by all unmanned aerial vehicles can be obtained; wherein the multi-camera model is a fixed camera module. Secondly, acquiring any group of adjacent images in images acquired by all unmanned aerial vehicles by utilizing GPS data, namely dividing the images acquired by the two unmanned aerial vehicles with the smallest Euclidean distance between GPS coordinates into a group of adjacent images; finally, after a group of adjacent images are obtained, two images and images in the group of adjacent images are used as images to be fused; obtaining an image A to be fused and an image B to be fused; the embodiment describes that the acquired image to be fused a and the acquired image to be fused B have a size of 400×400.
So far, the image A to be fused and the image B to be fused are obtained through the method.
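As an illustrative sketch of the grouping step above (not part of the original disclosure), the following Python pairs the two captures whose GPS coordinates are closest in Euclidean distance. The data layout, a list of (path, (x, y)) tuples with planar coordinates, and all names are assumptions made for the example.

```python
import itertools
import math

def pair_adjacent_images(captures):
    """Return the pair of image paths whose GPS coordinates are closest.

    captures: list of (image_path, (x, y)) tuples, coordinates in metres.
    """
    best_pair, best_dist = None, math.inf
    for (pa, ga), (pb, gb) in itertools.combinations(captures, 2):
        dist = math.hypot(ga[0] - gb[0], ga[1] - gb[1])  # Euclidean distance
        if dist < best_dist:
            best_pair, best_dist = (pa, pb), dist
    return best_pair

captures = [("img_001.jpg", (0.0, 0.0)),
            ("img_002.jpg", (1.5, 0.2)),
            ("img_003.jpg", (40.0, 55.0))]
print(pair_adjacent_images(captures))  # ('img_001.jpg', 'img_002.jpg')
```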
Step S002: obtaining the matrix of the image A to be fused and the matrix of the image B to be fused from all of their image blocks, and performing singular value decomposition on the two matrices to obtain the spatially transformed matrices of the image A to be fused and the image B to be fused.
1. Acquiring a plurality of image blocks of the image A to be fused and a plurality of image blocks of the image B to be fused.
It should be noted that, since singular value decomposition is a linear decomposition tool, if the selected area is too large, features of the whole image may dominate the decomposition result, leading to a poor fusion effect. Therefore, to ensure the accuracy of image fusion, this embodiment partitions the image A to be fused and the image B to be fused into corresponding block images; since the two images are shot by the same camera module, they have the same size.
A window-size parameter $k$ is preset. The specific value used in this embodiment is given only by way of example and is not particularly limited; $k$ may be set according to the specific implementation.
Specifically, a sliding window of preset size $k \times k$ is slid over the image A to be fused and the image B to be fused with step length $k$, giving a plurality of windows for each image; each window of the image A to be fused is taken as an image block of the image A to be fused, and each window of the image B to be fused is taken as an image block of the image B to be fused, thereby obtaining a plurality of image blocks for each image. In order that the sliding window partitions the image A to be fused and the image B to be fused evenly into whole windows, the window size $k$ is set to a common factor of the image sizes of the two images.
So far, a plurality of image blocks of the image A to be fused and a plurality of image blocks of the image B to be fused are obtained.
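A minimal sketch of this partition, assuming a grayscale NumPy array and a window size k that is a common factor of both image sides, as required above; the function name is illustrative.

```python
import numpy as np

def split_into_blocks(img, k):
    """Partition an (H, W) grayscale image into non-overlapping k-by-k
    windows, enumerated left-to-right, top-to-bottom."""
    h, w = img.shape
    assert h % k == 0 and w % k == 0, "k must be a common factor of H and W"
    return (img.reshape(h // k, k, w // k, k)
               .transpose(0, 2, 1, 3)
               .reshape(-1, k, k))
```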
2. Acquiring the spatially transformed matrix of the image A to be fused and the spatially transformed matrix of the image B to be fused.
Specifically, all image blocks of the image A to be fused are labelled from left to right and from top to bottom according to their positions in the image A to be fused. For any image block of the image A to be fused, the pixel values of all pixel points in the block are expanded in a Z-shaped scanning order to form a one-dimensional vector; all image blocks of the image A to be fused thus form a plurality of one-dimensional vectors, which are arranged as rows in the order of the block labels to obtain the matrix $C_A$ of the image A to be fused. The matrix $C_B$ of the image B to be fused is obtained in the same way.
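The sketch below builds the matrix $C$ from the blocks, reusing split_into_blocks from the sketch above. The serpentine row traversal is one plausible reading of the Z-shaped scanning order; the exact traversal is not spelled out, so this choice is an assumption.

```python
import numpy as np

def z_scan_flatten(block):
    # Serpentine scan: even rows left-to-right, odd rows right-to-left.
    rows = [row if i % 2 == 0 else row[::-1] for i, row in enumerate(block)]
    return np.concatenate(rows)

def block_matrix(img, k):
    """Matrix C: one flattened image block per row, in block-label order."""
    return np.stack([z_scan_flatten(b) for b in split_into_blocks(img, k)])
```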
The matrix $C_A$ of the image A to be fused and the matrix $C_B$ of the image B to be fused are each subjected to singular value decomposition, giving the left singular matrix $U_A$ of the image A to be fused and the left singular matrix $U_B$ of the image B to be fused. The product of the left singular matrix $U_A$ and the matrix $C_A$ is taken as the spatially transformed matrix $D_A$ of the image A to be fused, and the product of the left singular matrix $U_B$ and the matrix $C_B$ is taken as the spatially transformed matrix $D_B$ of the image B to be fused.
It should be noted that, when the spatially transformed matrices of the image A to be fused and the image B to be fused are obtained, the positions of the left singular matrix $U$ and the matrix $C$ must not be interchanged; each column of the spatially transformed matrix $D$ represents the principal information obtained by stretching the grey-level data of the pixel points in the image block corresponding to that column of the matrix $C$. Singular value decomposition is prior art and is not described in detail here.
So far, the matrix after the spatial transformation of the image A to be fused and the matrix after the spatial transformation of the image B to be fused are obtained through the method.
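Read literally, the spatially transformed matrix is the product of the left singular matrix and the block matrix. The sketch below follows that reading with a full (square, orthogonal) $U$, which keeps the later reconstruction step, multiplying by the inverse of the fused left singular matrix, dimensionally consistent; this is an interpretation, not a confirmed detail of the patent.

```python
import numpy as np

def spatial_transform(C):
    """Return the left singular matrix U of C and D = U @ C.

    For C of shape (m, n), the default full_matrices=True gives a square
    orthogonal U of shape (m, m), so C can be recovered as U.T @ D."""
    U, _, _ = np.linalg.svd(C)
    return U, U @ C
```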
Step S003: acquiring the sharpness validity of the pixel points at each common position of the image A to be fused and the image B to be fused, and acquiring the validity of each column when the image A to be fused and the image B to be fused are fused.
It should be noted that the conventional singular-value image fusion method only uses the average value for data fusion, so the fused image cannot guarantee that the fused information is the desired information. The principal information tends to be low-frequency, while details such as edge texture tend to be high-frequency; moreover, regions at different distances from the focal point differ in sharpness, so the column data corresponding to the higher sharpness should receive more attention during fusion.
1. Acquiring the sharpness validity of the pixel points at each common position of the image A to be fused and the image B to be fused.
It should be noted that the different placements of the cameras in the camera module lead to different focal positions when the images are captured, so sharpness is inconsistent across different areas. In order to fuse the sharp parts of the two images, this embodiment subtracts the Gaussian-blurred image from the original image, obtaining the areas of the image with pronounced gradient distribution and hence the approximate location of the focal point; this characterises how sharp each pixel point is, serves as the validity of the grey-level data of the different pixel points, and yields the sharpness validity of the pixel points at each common position of the two images to be fused.
A variance parameter $\sigma^2$ for the Gaussian filter kernel is preset. The specific value used in this embodiment is given only by way of example and is not particularly limited; it may be chosen according to the specific implementation.
Specifically, the image A to be fused and the image B to be fused are each Gaussian-filtered with a Gaussian kernel of variance $\sigma^2$, giving the filtered image A and the filtered image B; the filtered image A and the filtered image B are subtracted from the image A to be fused and the image B to be fused respectively, giving the difference image EA and the difference image EB. The pixel value of each pixel point of a difference image is the absolute value of the difference between the pixel values of the corresponding pixel points of the filtered image and the image to be fused; the size of the Gaussian filter kernel can be adjusted by the implementer according to the specific scenario, and the filtering operation is not described in detail here.
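A sketch of the difference-image computation, assuming SciPy is available; sigma is the standard deviation of the Gaussian kernel (the square root of the preset variance).

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def difference_image(img, sigma):
    """|original - Gaussian-blurred|: large values mark sharp regions."""
    img = np.asarray(img, dtype=np.float64)
    return np.abs(img - gaussian_filter(img, sigma=sigma))
```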
It should be noted that a pixel point with a larger value in the difference image EA or EB indicates a large grey-level change after blurring; that is, a position with a large before-and-after difference is more likely to be a sharp part of the acquired image. Since the sizes of the image A to be fused and the image B to be fused are consistent, where the local grey levels are highly uniform the difference value is small, and the grey-level difference within the two images is small there regardless of whether the part is sharp or blurred.
Specifically, for the pixel point at any common position of the image A to be fused and the image B to be fused, the absolute value of the difference between the pixel value $g_A$ of the pixel point in the image A to be fused and the mean $\bar{g}_A$ of all pixel points in its eight-neighborhood is recorded as a first difference; the absolute value of the difference between the pixel value $g_B$ of the pixel point in the image B to be fused and the mean $\bar{g}_B$ of all pixel points in its eight-neighborhood is recorded as a second difference; and the mean of the first difference and the second difference is taken as the sharpness validity of the pixel point. In the same way, the sharpness validity of the pixel points at every common position of the image A to be fused and the image B to be fused is obtained.
So far, the sharpness validity of the pixel points at each common position of the image A to be fused and the image B to be fused has been obtained.
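One way to compute the sharpness-validity map, vectorised over the whole image: the eight-neighborhood mean is recovered from a 3x3 box sum minus the centre pixel. The boundary handling (SciPy's default reflection) is an assumption.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def sharpness_validity(img_a, img_b):
    """Mean over the two images of |pixel - eight-neighborhood mean|."""
    def local_contrast(img):
        img = np.asarray(img, dtype=np.float64)
        # 3x3 box sum minus the centre, divided by 8 = eight-neighbor mean.
        neigh_mean = (uniform_filter(img, size=3) * 9.0 - img) / 8.0
        return np.abs(img - neigh_mean)
    return 0.5 * (local_contrast(img_a) + local_contrast(img_b))
```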
2. Acquiring the validity of each column when the image A to be fused and the image B to be fused are fused.
It should be noted that when an image is captured, defocus blur usually leaves the focal centre sharp, with the surrounding pixels blurring progressively outwards; accordingly, this embodiment fits a single two-dimensional Gaussian function to each image, obtaining the two-dimensional Gaussian functions corresponding to the two images, which respectively represent the approximate distribution of the in-focus sharp part in the corresponding image.
Specifically, the coordinate values and pixel values of all pixel points in the difference image EA are collected to form the three-dimensional data of the image A to be fused, and the coordinate values and pixel values of all pixel points in the difference image EB are collected to form the three-dimensional data of the image B to be fused. Two-dimensional Gaussian function fitting is performed on the three-dimensional data of the image A to be fused and on the three-dimensional data of the image B to be fused using a GMM (Gaussian mixture model), giving the two-dimensional Gaussian function $f_A(x,y)$ of the image A to be fused and the two-dimensional Gaussian function $f_B(x,y)$ of the image B to be fused.
For any pixel point in the image A to be fused, its two-dimensional Gaussian function value is obtained from $f_A(x,y)$, and the product of this value and the sharpness validity of the pixel point is taken as the corrected two-dimensional Gaussian function value of the pixel point; the corrected two-dimensional Gaussian function values of all pixel points in the image A to be fused are obtained in the same way, and likewise the corrected two-dimensional Gaussian function values of all pixel points in the image B to be fused.
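A sketch of the fitting and correction step. Instead of a full GMM fit it moment-matches a single axis-aligned two-dimensional Gaussian to the difference image, treating pixel values as weights, which is what a one-component mixture fit reduces to; the axis-aligned covariance is a simplifying assumption.

```python
import numpy as np

def fit_gaussian2d(diff):
    """Moment-match one axis-aligned 2-D Gaussian to a difference image."""
    h, w = diff.shape
    ys, xs = np.mgrid[0:h, 0:w]
    total = diff.sum()
    mx, my = (xs * diff).sum() / total, (ys * diff).sum() / total
    vx = (diff * (xs - mx) ** 2).sum() / total
    vy = (diff * (ys - my) ** 2).sum() / total
    amp = diff.max()
    return lambda x, y: amp * np.exp(
        -0.5 * ((x - mx) ** 2 / vx + (y - my) ** 2 / vy))

def corrected_values(diff, validity):
    """Corrected value per pixel: fitted Gaussian times sharpness validity."""
    h, w = diff.shape
    ys, xs = np.mgrid[0:h, 0:w]
    return fit_gaussian2d(diff)(xs, ys) * validity
```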
For any column at the common positions of the image A to be fused and the image B to be fused, the maximum of the corrected two-dimensional Gaussian function values of all pixel points in that column is taken as the validity of the column when the image A to be fused and the image B to be fused are fused; the validity of every column is obtained in the same way.
So far, the validity of each column when the image A to be fused and the image B to be fused are fused has been obtained through the above method.
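With the corrected values stored as (H, W) arrays, the column validity reduces to a column-wise maximum over the two maps; a minimal sketch:

```python
import numpy as np

def column_validity(corr_a, corr_b):
    """Largest corrected Gaussian value in each column of either image."""
    return np.maximum(corr_a, corr_b).max(axis=0)
```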
Step S004: obtaining the fused matrix of the image A to be fused and the image B to be fused according to the fusion weight of each column and the first element value and first function value of each pixel point in each column, thereby completing the fusion of the multi-source acquired image data.
It should be noted that, after the validity of each column at fusion is obtained, the column weights for fusing $D_A$ and $D_B$ could be assigned directly according to the validity, completing the fusion weight distribution between the different columns of $D_A$ and $D_B$. However, the validity is obtained only from the distribution of the focal position, whereas in the importance distribution of $D_A$ and $D_B$ the low-frequency local principal features rank towards the front while the local details, which are often high-frequency information, rank towards the back; the data of each column therefore need a further fusion-weight adjustment during fusion, and neither a simple maximum nor a simple average can be adopted.
For columns with greater validity, more high-frequency information should be selected, that is, mainly from the image with the greater sharpness; for columns with lower validity, which are biased towards blur, selecting more high-frequency information may cause local distortion, so that the high-frequency data appear more like noise after fusion.
Specifically, for the $i$-th column when the image A to be fused and the image B to be fused are fused, the ratio of the validity of the $i$-th column to the maximum validity over all columns is taken as the fusion weight of the $i$-th column; the fusion weight of every column when the image A to be fused and the image B to be fused are fused is obtained in the same way.
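The weight assignment is then a normalisation by the largest column validity:

```python
import numpy as np

def column_fusion_weights(validity):
    """Fusion weight of column i: validity[i] / max over all columns."""
    return validity / validity.max()
```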
For the $j$-th pixel point in the $i$-th column when the image A to be fused and the image B to be fused are fused, the corrected two-dimensional Gaussian function value of the $j$-th pixel point in the $i$-th column of the image A to be fused is recorded as a second function value, and the corrected two-dimensional Gaussian function value of the $j$-th pixel point in the $i$-th column of the image B to be fused is recorded as a third function value; the maximum of the second function value and the third function value is recorded as a first extremum, the image to be fused corresponding to the first extremum is recorded as a first image, and the element corresponding to the pixel point in the spatially transformed matrix of the first image is taken as the first element value of the pixel point. In the same way, the first element value of each pixel point in each column when the image A to be fused and the image B to be fused are fused is obtained.
For the $j$-th pixel point in the $i$-th column when the image A to be fused and the image B to be fused are fused, the element corresponding to the pixel point in the spatially transformed matrix of the image A to be fused is recorded as a first element, and the element corresponding to the pixel point in the spatially transformed matrix of the image B to be fused is recorded as a second element; the maximum of the first element and the second element is recorded as a second extremum, the image to be fused corresponding to the second extremum is recorded as a second image, and the corrected two-dimensional Gaussian function value of the pixel point in the second image is taken as the first function value of the pixel point. In the same way, the first function value of each pixel point in each column when the image A to be fused and the image B to be fused are fused is obtained.
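A sketch of the two per-position selections. For illustration it assumes the spatially transformed matrices can be indexed alongside the corrected-value maps; the patent indexes pixels within matrix columns, so this alignment is an assumption.

```python
import numpy as np

def first_values(d_a, d_b, corr_a, corr_b):
    """d: element from the image with the larger corrected Gaussian value;
    h: corrected Gaussian value of the image with the larger element."""
    first_elem = np.where(corr_a >= corr_b, d_a, d_b)   # first element value
    first_func = np.where(d_a >= d_b, corr_a, corr_b)   # first function value
    return first_elem, first_func
```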
In summary, the fused matrix of the image A to be fused and the image B to be fused is obtained according to the fusion weight of each column, the first element value of each pixel point in each column, and the first function value of each pixel point in each column, as follows:
the first of the matrix after the fusion of the image A to be fused and the image B to be fusedColumn->The element value calculation method of the row comprises the following steps:
in the method, in the process of the invention,representing +.sup.th in the matrix after fusion of image A to be fused and image B to be fused>Column->Element values of rows;representing the +.sup.th of fusion of image A to be fused and image B to be fused>Fusion weights of columns; />Representing the +.sup.th of fusion of image A to be fused and image B to be fused>Column +.>A first pixel value of each pixel point; />Representing the +.sup.th of fusion of image A to be fused and image B to be fused>Column +.>A first function value for each pixel point.
So far, the fused matrix $R$ is obtained from all of the element values of the matrix fused from the image A to be fused and the image B to be fused. Data fusion of the importance matrix is carried out according to the column weights and the local high-frequency information to obtain the fused matrix, which alleviates the loss of high-frequency information and yields fused data with better sharpness.
The fused left singular matrix is obtained by taking the mean of the elements at the same positions of the left singular matrix $U_A$ of the image A to be fused and the left singular matrix $U_B$ of the image B to be fused; the inverse of the fused left singular matrix is then multiplied by the fused matrix $R$ to obtain the fused data.
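A sketch of this reconstruction. R stands for the fused matrix, whose per-element formula is given only as an image in the original publication, so it is passed in rather than recomputed here.

```python
import numpy as np

def reconstruct(U_a, U_b, R):
    """Element-wise mean of the left singular matrices, then multiply the
    inverse of the averaged matrix by the fused matrix R."""
    U_fused = 0.5 * (U_a + U_b)
    return np.linalg.pinv(U_fused) @ R  # pinv guards against singularity
```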
Through the above steps, the fusion of the multi-source acquired image data is completed.
Referring to fig. 2, a block diagram of a multi-source data information fusion processing system according to an embodiment of the present invention is shown, where the system includes the following modules:
the data acquisition module is used for acquiring an image A to be fused and an image B to be fused;
the data characteristic analysis module is used for respectively acquiring a plurality of image blocks from the image A to be fused and the image B to be fused; acquiring matrixes of the image A to be fused and the image B to be fused according to the image blocks of the image A to be fused and the image B to be fused; performing spatial transformation on the matrix of the image A to be fused and the matrix of the image B to be fused to obtain a matrix after spatial transformation of the image A to be fused and the image B to be fused;
the data characteristic acquisition module is used for determining a difference image EA and a difference image EB of the image A to be fused and the image B to be fused; determining the definition effectiveness of pixel points at the same position of each of the image A to be fused and the image B to be fused; acquiring two-dimensional Gaussian function values of each pixel point in the image A to be fused and the image B to be fused after correction according to the definition effectiveness of the pixel points at the same position of the image A to be fused and the image B to be fused and the difference image EA and the difference image EB; acquiring the effectiveness of each row when the image A to be fused and the image B to be fused are fused according to the two-dimensional Gaussian function value corrected by each pixel point in the image A to be fused and the image B to be fused;
the data fusion module is used for determining the fusion weight of each column when the image A to be fused and the image B to be fused are fused according to the effectiveness of each column when the image A to be fused and the image B to be fused are fused; determining a first element value of each pixel point in each column when the image A to be fused and the image B to be fused are fused according to the matrix after the spatial transformation of the image A to be fused and the image B to be fused; according to the corrected two-dimensional Gaussian functions of the image A to be fused and the image B to be fused, determining a first function value of each pixel point in each column when the image A to be fused and the image B to be fused are fused; determining a matrix after fusion of the image A to be fused and the image B to be fused according to the fusion weight of each column, the first element value of each pixel point in each column and the first function value of each pixel point in each column when the image A to be fused and the image B to be fused are fused; and obtaining fused data according to the matrix fused by the image A to be fused and the image B to be fused.
When multi-scale images are fused by singular value decomposition, fusing the singular-value features of different regions with equal average weights may lead to a poor fusion effect. Aiming at this problem, the invention obtains the column weights for singular-value-decomposition fusion by constructing a validity feature, which overcomes the inconsistent focus among the multi-source acquired images; data fusion of the importance matrix is then performed according to the column weights and the local high-frequency information, which alleviates the loss of high-frequency information and yields fused data with better sharpness.
The above description is only of the preferred embodiments of the present invention and is not intended to limit the invention, but any modifications, equivalent substitutions, improvements, etc. within the principles of the present invention should be included in the scope of the present invention.

Claims (10)

1. The multi-source data information fusion processing method is characterized by comprising the following steps of:
acquiring an image A to be fused and an image B to be fused;
acquiring a plurality of image blocks from the image A to be fused and the image B to be fused respectively; obtaining the matrix of the image A to be fused and the matrix of the image B to be fused from their respective image blocks; and performing spatial transformation on the two matrices to obtain the spatially transformed matrix of the image A to be fused and the spatially transformed matrix of the image B to be fused;
determining a difference image EA and a difference image EB for the image A to be fused and the image B to be fused; determining the sharpness validity of the pixel points at each common position of the image A to be fused and the image B to be fused; obtaining the corrected two-dimensional Gaussian function value of each pixel point in the image A to be fused and the image B to be fused according to the sharpness validity of the pixel points at each common position and the difference images EA and EB; and obtaining the validity of each column when the image A to be fused and the image B to be fused are fused according to the corrected two-dimensional Gaussian function value of each pixel point in the two images;
determining the fusion weight of each column when the image A to be fused and the image B to be fused are fused according to the validity of each column; determining the first element value of each pixel point in each column according to the spatially transformed matrices of the image A to be fused and the image B to be fused; determining the first function value of each pixel point in each column according to the corrected two-dimensional Gaussian functions of the image A to be fused and the image B to be fused; determining the fused matrix of the image A to be fused and the image B to be fused according to the fusion weight of each column, the first element value of each pixel point in each column, and the first function value of each pixel point in each column; and obtaining the fused data from the fused matrix.
2. The multi-source data information fusion processing method according to claim 1, wherein the performing spatial transformation on the matrix of the image A to be fused and the matrix of the image B to be fused to obtain the spatially transformed matrices of the image A to be fused and the image B to be fused comprises the following specific method:
the matrix $C_A$ of the image A to be fused and the matrix $C_B$ of the image B to be fused are each subjected to singular value decomposition to obtain the left singular matrix $U_A$ of the image A to be fused and the left singular matrix $U_B$ of the image B to be fused; the product of the left singular matrix $U_A$ and the matrix $C_A$ is taken as the spatially transformed matrix $D_A$ of the image A to be fused, and the product of the left singular matrix $U_B$ and the matrix $C_B$ is taken as the spatially transformed matrix $D_B$ of the image B to be fused.
3. The multi-source data information fusion processing method according to claim 1, wherein the determining the sharpness validity of the pixel points at each common position of the image A to be fused and the image B to be fused comprises the following specific method:
for the pixel point at any common position of the image A to be fused and the image B to be fused, the absolute value of the difference between the pixel value $g_A$ of the pixel point in the image A to be fused and the mean $\bar{g}_A$ of all pixel points in its eight-neighborhood is recorded as a first difference; the absolute value of the difference between the pixel value $g_B$ of the pixel point in the image B to be fused and the mean $\bar{g}_B$ of all pixel points in its eight-neighborhood is recorded as a second difference; and the mean of the first difference and the second difference is taken as the sharpness validity of the pixel point; the sharpness validity of the pixel points at every common position of the image A to be fused and the image B to be fused is obtained in the same way.
4. The multi-source data information fusion processing method according to claim 1, wherein the obtaining the corrected two-dimensional Gaussian function value of each pixel point in the image A to be fused and the image B to be fused according to the sharpness validity of the pixel points at each common position and the difference images EA and EB comprises the following specific method:
the coordinate values and pixel values of all pixel points in the difference image EA are collected to form the three-dimensional data of the image A to be fused; the coordinate values and pixel values of all pixel points in the difference image EB are collected to form the three-dimensional data of the image B to be fused; two-dimensional Gaussian function fitting is performed on the three-dimensional data of the image A to be fused and on the three-dimensional data of the image B to be fused using a GMM (Gaussian mixture model), giving the two-dimensional Gaussian function $f_A(x,y)$ of the image A to be fused and the two-dimensional Gaussian function $f_B(x,y)$ of the image B to be fused;
for any pixel point in the image A to be fused, its two-dimensional Gaussian function value is obtained from $f_A(x,y)$, and the product of this value and the sharpness validity of the pixel point is taken as the corrected two-dimensional Gaussian function value of the pixel point in the image A to be fused;
for any pixel point in the image B to be fused, its two-dimensional Gaussian function value is obtained from $f_B(x,y)$, and the product of this value and the sharpness validity of the pixel point is taken as the corrected two-dimensional Gaussian function value of the pixel point in the image B to be fused.
5. The multi-source data information fusion processing method according to claim 1, wherein the obtaining the validity of each column when the image A to be fused and the image B to be fused are fused according to the corrected two-dimensional Gaussian function value of each pixel point in the two images comprises the following specific method:
for any column at the common positions of the image A to be fused and the image B to be fused, the maximum of the corrected two-dimensional Gaussian function values of all pixel points in that column is taken as the validity of the column when the image A to be fused and the image B to be fused are fused.
6. The multi-source data information fusion processing method according to claim 1, wherein the determining the fusion weight of each column when the image A to be fused and the image B to be fused are fused according to the validity of each column comprises the following specific method:
for the $i$-th column when the image A to be fused and the image B to be fused are fused, the ratio of the validity of the $i$-th column to the maximum validity over all columns is taken as the fusion weight of the $i$-th column.
7. The multi-source data information fusion processing method according to claim 1, wherein the determining the first element value of each pixel point in each column when the image A to be fused and the image B to be fused are fused according to the spatially transformed matrices of the two images comprises the following specific method:
for the $j$-th pixel point in the $i$-th column when the image A to be fused and the image B to be fused are fused, the corrected two-dimensional Gaussian function value of the $j$-th pixel point in the $i$-th column of the image A to be fused is recorded as a second function value, and the corrected two-dimensional Gaussian function value of the $j$-th pixel point in the $i$-th column of the image B to be fused is recorded as a third function value; the maximum of the second function value and the third function value is recorded as a first extremum, the image to be fused corresponding to the first extremum is recorded as a first image, and the element corresponding to the pixel point in the spatially transformed matrix of the first image is taken as the first element value of the pixel point.
8. The multi-source data information fusion processing method according to claim 1, wherein the determining the first function value of each pixel point in each column when the image A to be fused and the image B to be fused are fused according to the corrected two-dimensional Gaussian functions of the two images comprises the following specific method:
for the $j$-th pixel point in the $i$-th column when the image A to be fused and the image B to be fused are fused, the element corresponding to the pixel point in the spatially transformed matrix of the image A to be fused is recorded as a first element, and the element corresponding to the pixel point in the spatially transformed matrix of the image B to be fused is recorded as a second element; the maximum of the first element and the second element is recorded as a second extremum, the image to be fused corresponding to the second extremum is recorded as a second image, and the corrected two-dimensional Gaussian function value of the pixel point in the second image is taken as the first function value of the pixel point.
9. The method for processing the multi-source data information fusion according to claim 1, wherein the determining the matrix after the fusion of the image a and the image B according to the fusion weight of each column, the first element value of each pixel point in each column, and the first function value of each pixel point in each column when the image a and the image B are fused comprises the following specific steps:
the element value in the j-th column and i-th row of the matrix after fusion of the image A to be fused and the image B to be fused is calculated by combining the fusion weight of the column with the first element value and the first function value of the pixel point, wherein:
R(j, i) represents the element value in the j-th column and i-th row of the matrix after fusion of the image A to be fused and the image B to be fused; Q(j) represents the fusion weight of the j-th column when the image A to be fused and the image B to be fused are fused; Y(j, i) represents the first element value of the i-th pixel point in the j-th column when the image A to be fused and the image B to be fused are fused; G(j, i) represents the first function value of the i-th pixel point in the j-th column when the image A to be fused and the image B to be fused are fused.
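The source formula itself survives only as an image, so the exact combination of Q(j), Y(j, i) and G(j, i) is not recoverable from the text; the sketch below ASSUMES a simple multiplicative weighting purely for illustration (function names hypothetical, numpy import as in the earlier sketches):

```python
def fuse_columns(weights, first_elements, first_functions):
    """Combine per-column fusion weights Q(j) with per-pixel first element
    values Y(j, i) and first function values G(j, i).
    weights: shape (W,); first_elements, first_functions: shape (H, W).
    A multiplicative weighting is assumed here, not taken from the patent."""
    return weights[np.newaxis, :] * first_elements * first_functions
```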
10. A multi-source data information fusion processing system, comprising the following modules:
the data acquisition module is used for acquiring an image A to be fused and an image B to be fused;
the data characteristic analysis module is used for respectively acquiring a plurality of image blocks from the image A to be fused and the image B to be fused; acquiring the matrices of the image A to be fused and the image B to be fused according to the image blocks of the image A to be fused and the image B to be fused; and performing spatial transformation on the matrices of the image A to be fused and the image B to be fused to obtain the matrices of the image A to be fused and the image B to be fused after spatial transformation;
the data characteristic acquisition module is used for determining a difference image EA and a difference image EB of the image A to be fused and the image B to be fused; determining the definition effectiveness of the pixel points at the same position in each of the image A to be fused and the image B to be fused; acquiring the corrected two-dimensional Gaussian function value of each pixel point in the image A to be fused and the image B to be fused according to the definition effectiveness of the pixel points at the same position and the difference image EA and the difference image EB; and acquiring the effectiveness of each column when the image A to be fused and the image B to be fused are fused according to the corrected two-dimensional Gaussian function value of each pixel point in the image A to be fused and the image B to be fused;
the data fusion module is used for determining the fusion weight of each column when the image A to be fused and the image B to be fused are fused according to the effectiveness of each column when the image A to be fused and the image B to be fused are fused; determining a first element value of each pixel point in each column when the image A to be fused and the image B to be fused are fused according to the matrix after the spatial transformation of the image A to be fused and the image B to be fused; according to the corrected two-dimensional Gaussian functions of the image A to be fused and the image B to be fused, determining a first function value of each pixel point in each column when the image A to be fused and the image B to be fused are fused; determining a matrix after fusion of the image A to be fused and the image B to be fused according to the fusion weight of each column, the first element value of each pixel point in each column and the first function value of each pixel point in each column when the image A to be fused and the image B to be fused are fused; and obtaining fused data according to the matrix fused by the image A to be fused and the image B to be fused.
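Chaining the module steps together, a sketch of how the fusion stage might run end to end, reusing the hypothetical helpers sketched under claims 5 to 9; the upstream steps (image blocks, spatial transformation, difference images, corrected Gaussian values) are assumed to have already produced the inputs and are not reimplemented here:

```python
def fuse_images(mat_a, mat_b, gauss_a, gauss_b):
    """End-to-end sketch of the fusion pipeline. All inputs are assumed to
    be H x W float arrays: the spatially transformed matrices of images A
    and B and their corrected two-dimensional Gaussian function values."""
    eff = column_effectiveness(gauss_a, gauss_b)   # effectiveness per column
    q = fusion_weights(eff)                        # fusion weight per column
    y = first_element_values(mat_a, mat_b, gauss_a, gauss_b)
    g = first_function_values(mat_a, mat_b, gauss_a, gauss_b)
    return fuse_columns(q, y, g)                   # fused matrix
```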
CN202311685726.9A 2023-12-11 2023-12-11 Multi-source data information fusion processing method and system Active CN117391985B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311685726.9A CN117391985B (en) 2023-12-11 2023-12-11 Multi-source data information fusion processing method and system


Publications (2)

Publication Number Publication Date
CN117391985A 2024-01-12
CN117391985B (en) 2024-02-20

Family

ID=89463495

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311685726.9A Active CN117391985B (en) 2023-12-11 2023-12-11 Multi-source data information fusion processing method and system

Country Status (1)

Country Link
CN (1) CN117391985B (en)

Patent Citations (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103186894A (en) * 2013-03-22 2013-07-03 南京信息工程大学 Multi-focus image fusion method for self-adaptive partitioning
CN103455991A (en) * 2013-08-22 2013-12-18 西北大学 Multi-focus image fusion method
CN105678723A (en) * 2015-12-29 2016-06-15 内蒙古科技大学 Multi-focus image fusion method based on sparse decomposition and differential image
CN108419009A (en) * 2018-02-02 2018-08-17 成都西纬科技有限公司 Image definition enhancing method and device
CN108830818A (en) * 2018-05-07 2018-11-16 西北工业大学 A kind of quick multi-focus image fusing method
CN109509163A (en) * 2018-09-28 2019-03-22 洛阳师范学院 A kind of multi-focus image fusing method and system based on FGF
CN110211077A (en) * 2019-05-13 2019-09-06 杭州电子科技大学上虞科学与工程研究院有限公司 A kind of more exposure image fusion methods based on Higher-order Singular value decomposition
CN110533632A (en) * 2019-07-18 2019-12-03 数字广东网络建设有限公司 Image obscures altering detecting method, device, computer equipment and storage medium
CN112184646A (en) * 2020-09-22 2021-01-05 西北工业大学 Image fusion method based on gradient domain oriented filtering and improved PCNN
CN112258464A (en) * 2020-10-14 2021-01-22 宁波大学 Full-reference remote sensing image fusion quality evaluation method
KR102388831B1 (en) * 2021-02-09 2022-04-21 인천대학교 산학협력단 Apparatus and Method for Fusing Intelligent Multi Focus Image
CN113487526A (en) * 2021-06-04 2021-10-08 湖北工业大学 Multi-focus image fusion method for improving focus definition measurement by combining high and low frequency coefficients
CN113744257A (en) * 2021-09-09 2021-12-03 展讯通信(上海)有限公司 Image fusion method and device, terminal equipment and storage medium
CN115760665A (en) * 2022-11-18 2023-03-07 深圳小湃科技有限公司 Multi-scale registration fusion method and device for images, terminal equipment and storage medium
CN116152454A (en) * 2023-02-15 2023-05-23 中铁水利信息科技有限公司 Water conservancy real-time monitoring management system based on GIS and three-dimensional modeling
CN116630220A (en) * 2023-07-25 2023-08-22 江苏美克医学技术有限公司 Fluorescent image depth-of-field fusion imaging method, device and storage medium

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
YU LIU: "Multi-focus image fusion with dense SIFT", Information Fusion, 21 May 2014 (2014-05-21), pages 139-155 *
ZHANHUI HU: "An improved multi-focus image fusion algorithm based on multi-scale weighted focus measure", Applied Intelligence, 4 January 2021 (2021-01-04), pages 4453, XP037485344, DOI: 10.1007/s10489-020-02066-8 *
LIU Bo: "Research on multi-focus image fusion based on multi-scale transform and deep learning", China Master's Theses Full-text Database (Information Science and Technology), no. 11, 15 November 2021 (2021-11-15), pages 138-30 *

Also Published As

Publication number Publication date
CN117391985B (en) 2024-02-20

Similar Documents

Publication Publication Date Title
Yoon et al. Light-field image super-resolution using convolutional neural network
CN103826033B (en) Image processing method, image processing equipment, image pick up equipment and storage medium
US9412151B2 (en) Image processing apparatus and image processing method
CN110363116B (en) Irregular human face correction method, system and medium based on GLD-GAN
CN103973989B (en) Obtain the method and system of high-dynamics image
CN111145134B (en) Block effect-based microlens light field camera full-focus image generation algorithm
CN111415310B (en) Image processing method and device and storage medium
CN107845145B (en) Three-dimensional reconstruction system and method under electron microscopic scene
DE102014117120A1 (en) IMAGING DEVICE
CN110363170B (en) Video face changing method and device
KR20200021891A (en) Method for the synthesis of intermediate views of a light field, system for the synthesis of intermediate views of a light field, and method for the compression of a light field
CN114612352A (en) Multi-focus image fusion method, storage medium and computer
CN106023189A (en) Light field data depth reconstruction method based on matching optimization
CN112348972A (en) Fine semantic annotation method based on large-scale scene three-dimensional model
CN116188859A (en) Tea disease unmanned aerial vehicle remote sensing monitoring method based on superdivision and detection network
CN109218706B (en) Method for generating stereoscopic vision image from single image
CN117391985B (en) Multi-source data information fusion processing method and system
CN117058183A (en) Image processing method and device based on double cameras, electronic equipment and storage medium
CN110599403A (en) Image super-resolution reconstruction method with good high-frequency visual effect
CN108564559B (en) Multi-focus image fusion method based on two-scale focus image
CN115699073A (en) Neural network supported camera image or video processing pipeline
CN113240573A (en) Local and global parallel learning-based style transformation method and system for ten-million-level pixel digital image
CN111985535A (en) Method and device for optimizing human body depth map through neural network
CN111127514A (en) Target tracking method and device by robot
Sakurikar et al. Focal stack representation and focus manipulation

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant