CN107492086B - Image fusion method and system - Google Patents

Image fusion method and system

Info

Publication number
CN107492086B
CN107492086B CN201710852872.4A
Authority
CN
China
Prior art keywords
image
component
images
fused
color
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201710852872.4A
Other languages
Chinese (zh)
Other versions
CN107492086A (en)
Inventor
Mingyue Ding (丁明跃)
Xiaoqing Li (李晓庆)
Wenjie Deng (邓文杰)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huazhong University of Science and Technology
Original Assignee
Huazhong University of Science and Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huazhong University of Science and Technology filed Critical Huazhong University of Science and Technology
Priority to CN201710852872.4A priority Critical patent/CN107492086B/en
Publication of CN107492086A publication Critical patent/CN107492086A/en
Application granted granted Critical
Publication of CN107492086B publication Critical patent/CN107492086B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/50Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/25Fusion techniques

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Artificial Intelligence (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • General Engineering & Computer Science (AREA)
  • Image Processing (AREA)
  • Image Analysis (AREA)
  • Ultrasonic Diagnosis Equipment (AREA)

Abstract

The invention discloses an image fusion method and system. The method comprises the following steps: (1) acquiring three component images, namely a topography image, an acoustic amplitude image, and an acoustic phase image, with a high-frequency ultrasonic composite scanning probe microscope; (2) normalizing the three acquired component images and then fusing them into one color image with an RGB color fusion model; the fused image can display all external and internal information of the sample. The image fusion system of the high-frequency ultrasonic composite scanning probe microscope system comprises an image acquisition module, an image normalization module, an image fusion module, and an image background adjustment module. The invention solves the problem of fusion analysis of different component images and achieves the purpose of jointly analyzing the different component images acquired by the high-frequency ultrasonic composite scanning probe microscope system.

Description

Image fusion method and system
Technical Field
The invention belongs to the field of image fusion, and particularly relates to a method for fusing images of different components of the same structure under a high-frequency ultrasonic composite scanning probe microscope.
Background
The high-frequency ultrasonic composite scanning probe microscope is developed from the atomic force microscope. It can acquire a topography image and a probe-fluctuation image that represent the external form of a sample, as well as a scanning acoustic amplitude image and an acoustic phase image that represent its internal structure. However, there is currently no good fusion method for processing these different component images: existing methods cannot simply and clearly combine them so that the external and internal images are expressed on one image for convenient observation and analysis. Therefore, establishing a fusion method that is easy to observe and analyze is important.
Disclosure of Invention
In response to the need for improvement, the present invention provides a method for fusing images of different components. The method is inspired by the fluorescence staining model in biological fluorescence systems, in which different colored dye markers yield images that represent the cytoskeleton, organelles, and cell nucleus.
In order to achieve the above object, according to one aspect of the present invention, there is provided a method for fusing images obtained by a high frequency ultrasound composite scanning probe microscope system, comprising the steps of:
(1) acquiring different component images to be fused of the same structure at the same position by using a high-frequency ultrasonic composite scanning probe microscope;
(2) normalizing the three component image data obtained in step (1), namely the topography image, acoustic amplitude image, and acoustic phase image to be fused at the same position;
(3) fusing the different component images normalized in step (2) into one color image; first, the three normalized component images are assigned to one of the full permutations of the R, G, and B components, i.e., each is set as the red, green, or blue component; then the three images are fused into a color image whose pixel color coordinates satisfy the formula D = (R, G, B); D is the color coordinate of each pixel on the fused color image, R is the gray value of each pixel of the red-component image, G is the gray value of the corresponding pixel of the green-component image, and B is the gray value of the corresponding pixel of the blue-component image;
(4) adjusting the background of the fused color image obtained in step (3) to maximize the contrast between the target image and the background; the fused color image can display all external appearances and internal structures of the three component images.
Preferably, the image data to be fused in step (1) include a topography image, an acoustic amplitude image, and an acoustic phase image.
Preferably, the value of each individual pixel in the normalization processing of step (2) is calculated as:
j=(i/max)*256;
where i is the value of each pixel in the image, max is the maximum pixel value in the whole image, and j is the normalized value of the pixel. Because the image acquired by the high-frequency ultrasonic composite scanning probe microscope is an index image, it must be converted into a grayscale image by normalization before fusion.
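The normalization formula above can be sketched as follows; the function name is ours, and the final clip to [0,255] is our addition so that the result of j = (i/max)*256 fits an 8-bit grayscale image (the pixel holding the maximum would otherwise map to 256):

```python
import numpy as np

def normalize_to_gray(img):
    """Normalize an index image to an 8-bit grayscale image.

    Implements j = (i / max) * 256 from the text; the clip to [0, 255]
    is our addition so every value fits the uint8 gray range.
    """
    img = np.asarray(img, dtype=np.float64)
    j = img / img.max() * 256.0          # j = (i / max) * 256
    return np.clip(j, 0, 255).astype(np.uint8)
```

Applied to each of the three component images, this yields the grayscale inputs required by the RGB fusion step.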
Preferably, after comparing the six combinations of color channels assigned to the component images in step (2), the topography image is set as the R component, the acoustic amplitude image as the G component, and the acoustic phase image as the B component; this is the optimal choice, yielding the greatest amount of contour and internal-structure information and the clearest edges in the fused target. The six combinations are as follows:
topography image-R component, acoustic amplitude image-G component, acoustic phase image-B component;
topography image-R component, acoustic amplitude image-B component, acoustic phase image-G component;
topography image-G component, acoustic amplitude image-R component, acoustic phase image-B component;
topography image-G component, acoustic amplitude image-B component, acoustic phase image-R component;
topography image-B component, acoustic amplitude image-G component, acoustic phase image-R component;
topography image-B component, acoustic amplitude image-R component, acoustic phase image-G component.
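With the preferred assignment (topography to R, acoustic amplitude to G, acoustic phase to B), the fusion D = (R, G, B) is simply per-pixel channel stacking of the three normalized grayscale images; a minimal sketch, with a function name of our choosing:

```python
import numpy as np

def fuse_rgb(topo, amp, phase):
    """Stack three normalized grayscale images into one color image.

    Channel assignment follows the preferred combination in the text:
    topography -> R, acoustic amplitude -> G, acoustic phase -> B.
    Each pixel of the result is then the color coordinate D = (R, G, B).
    """
    return np.dstack([topo, amp, phase])
```

Any of the other five permutations is obtained by reordering the arguments.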
preferably, the adjusting of the background of the fused color image in step (4) is performed by respectively modifying the histograms, i.e. the gray distribution, of the three original images, i.e. performing gray compression on the three original images, and compressing the gray distribution of the single image from [0,255] to [130,255 ]. After the fused image is obtained by the preliminary color fusion, the contrast ratio of the target and the background is observed to be not large, and the histogram correction cannot be directly carried out because the fused image is composed of three color channels, so that the histogram correction is carried out on the three gray level images before the fusion is returned, and the RGB model is used for fusion to obtain the color image with the maximum contrast ratio of the target and the background.
Preferably, the same site in step (1) is the same cell.
Preferably, the same position in step (1) is the same part of a notched-groove sample on a silicon wafer.
Preferably, the surface of the silicon wafer is coated with gold particles.
In another aspect, the present invention provides an image fusion system of a high-frequency ultrasound composite scanning probe microscope system, comprising:
an image acquisition module: used for acquiring the three component images to be fused at the same position, namely a topography image, an acoustic amplitude image, and an acoustic phase image;
an image normalization module: used for normalizing the three component image data of the topography image, acoustic amplitude image, and acoustic phase image to be fused at the same position;
an image fusion module: used for fusing the normalized different component images into one color image; first, the three normalized component images are assigned to one of the full permutations of the R, G, and B components, i.e., each is set as the red, green, or blue component; then the three images are fused into a color image whose pixel color coordinates satisfy the formula D = (R, G, B); D is the color coordinate of each pixel on the fused color image, R is the gray value of each pixel of the red-component image, G is the gray value of the corresponding pixel of the green-component image, and B is the gray value of the corresponding pixel of the blue-component image;
an image background adjustment module: used for adjusting the background of the fused color image to maximize the contrast between the target image and the background; the fused color image can display all external appearances and internal structures of the three component images.
Preferably, the background of the fused color image is adjusted by correcting the histograms (gray distributions) of the three original images, i.e., gray-compressing each original image so that its gray distribution is compressed from [0,255] to [130,255].
The technical scheme of the invention can obtain the following beneficial effects:
(1) Because different component images provide different external and internal information, a method for fusing multi-component images into one image is explored theoretically and experimentally. External information and internal structure are displayed simultaneously on one image, so the edge information and internal information of the target can be analyzed more intuitively.
(2) The normalization of the images guarantees that a color image representing all structures can be obtained, and has an important influence on optimizing the result.
(3) After color fusion, correcting the image histogram is very important for obtaining a clearer color image with a more prominent structure; the contrast between the target and the background in the grayscale image becomes more obvious.
Drawings
FIG. 1 is a fusion flow diagram of the present invention;
FIG. 2 is a graph of the effect of normalization processing;
FIG. 3 is a gray scale image after RGB fusion;
FIG. 4 is a comparison of a bioluminescent staining system and an RGB fused image.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
The method for fusing the images aiming at different components comprises the following steps:
(1) acquiring image data of different components to be fused
Using a high-frequency ultrasonic composite scanning probe microscope, acquire the different component images that need to be fused, obtaining images of different components of the same position, including a topography image, an acoustic amplitude image, and an acoustic phase image;
(2) image fusion method for different components
Normalize the different component image data obtained in step (1), construct an RGB color fusion model, fuse the three components into one color image, and adjust the background color so that it is close to black; the fused color image can display all external appearances and internal structures of the three component images.
The image data to be fused in step (1) are different components of the same structure, comprising a topography image, an acoustic amplitude image, and an acoustic phase image. The high-frequency ultrasonic composite scanning probe microscope is used directly to acquire the different component data to be fused.
In this image fusion method for different components, the value of each individual pixel in the normalization processing of step (2) is calculated as:
j=(i/max)*256;
where i is the value of each pixel in the image, max is the maximum pixel value in the whole image, and j is the normalized value of the pixel.
In step (2), the multi-component fused color map is obtained from the RGB model according to the following method:
S1, setting the normalized images of the three components as the three color channels;
S2, after setting the three color channels, fusing them to obtain a color image and adjusting the background of the color image to obtain the fused multi-component image.
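Steps S1-S2, together with the normalization and the background adjustment described elsewhere in the text, can be sketched end to end; the helper names and the linear [130,255] compression are our assumptions:

```python
import numpy as np

def fuse_pipeline(topo, amp, phase):
    """End-to-end sketch: normalize each component image, compress its
    gray range to [130, 255], then stack the three channels into one
    RGB color image (topography -> R, amplitude -> G, phase -> B).
    """
    def norm(img):
        # j = (i / max) * 256, clipped so values fit 8-bit gray
        img = np.asarray(img, dtype=np.float64)
        return np.clip(img / img.max() * 256.0, 0, 255)

    def compress(img, lo=130.0, hi=255.0):
        # background adjustment: [0, 255] -> [lo, hi]
        return lo + img * (hi - lo) / 255.0

    channels = [compress(norm(c)) for c in (topo, amp, phase)]
    return np.dstack(channels).astype(np.uint8)
```

The result is a color image whose every pixel is a coordinate D = (R, G, B) with each channel lying in [130, 255].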
In this image fusion method for different components, after the six combinations of color channels corresponding to the component images in step (2) are compared, the topography image is set as the R component, the acoustic amplitude image as the G component, and the acoustic phase image as the B component; this is the optimal choice, yielding the greatest amount of contour and internal-structure information and the clearest edges in the fused target. The combinations are as follows:
topography image-R component, acoustic amplitude image-G component, acoustic phase image-B component;
topography image-R component, acoustic amplitude image-B component, acoustic phase image-G component;
topography image-G component, acoustic amplitude image-R component, acoustic phase image-B component;
topography image-G component, acoustic amplitude image-B component, acoustic phase image-R component;
topography image-B component, acoustic amplitude image-G component, acoustic phase image-R component;
topography image-B component, acoustic amplitude image-R component, acoustic phase image-G component.
the image fusion method of different components in step (2) is characterized in that image fusion is carried out after three component channels are set, the value of the output image pixel point corresponds to a specific color in a color table, and the color coordinate of the color image pixel point meets the following formula: d ═ (R, G, B);
wherein D is the color coordinate of each pixel point on the fused color image, the color coordinate corresponds to a specific color in the RGB color system, R is the gray value of each pixel point of the red component image, G is the gray value of each pixel point corresponding to the green component image, and B is the gray value of each pixel point corresponding to the blue component image.
In this method for fusing images of different components, the background color is adjusted after the color image is obtained in step (2), and the adjustment is completed by correcting the image histograms. The correction is performed on the histograms (gray distributions) of the three original images, i.e., each original image is gray-compressed so that its gray distribution is compressed from [0,255] to [130,255]. After the preliminary color fusion, the contrast between the target and the background is observed to be small; because the fused image is composed of three color channels, its histogram cannot be corrected directly. Therefore, the histogram correction is applied to the three grayscale images before fusion, and the RGB model is then used to fuse them into the color image with the maximum target-background contrast.
Example 1
The method for fusing the images aiming at different components comprises the following steps:
(1) acquiring different image data to be fused using a high frequency ultrasound compound scanning probe microscope
The sample is the breast cancer cell line MDA-MB-231; its topography image, acoustic amplitude image, and acoustic phase image are obtained by scanning with the high-frequency ultrasonic composite scanning probe microscope.
(2) Normalize the different component image data to be fused obtained in step (1); set the topography image as the R component, the acoustic amplitude image as the G component, and the acoustic phase image as the B component; fuse the three components into one color image with the RGB color fusion model; and adjust the background color to maximize the contrast between the target and the background. The fused color image can display all external morphology and internal structure of the three component images.
(3) The multi-component fused color image is obtained from the RGB model according to the following method:
S1, normalize the acquired topography, acoustic amplitude, and acoustic phase images of the MDA-MB-231 cells, converting each from an index image into a grayscale image with pixel values in [0,255];
S2, set the normalized images of the three components as the three color channels R, G, and B;
S3, after setting the three color channels, fuse them to obtain a color image and adjust the background of the color image to obtain the fused multi-component image.
In this image fusion method for different components, the value of each individual pixel in the normalization processing of step (2) is calculated as:
j=(i/max)*256;
where i is the value of each pixel in the image, max is the maximum pixel value in the whole image, and j is the normalized value of the pixel.
In this image fusion method for different components, after the six combinations of color channels corresponding to the component images in step (2) are compared, the topography image is set as the R component, the acoustic amplitude image as the G component, and the acoustic phase image as the B component; this is the optimal choice, yielding the greatest amount of contour and internal-structure information and the clearest edges in the fused target. The combinations are as follows:
topography image-R component, acoustic amplitude image-G component, acoustic phase image-B component;
topography image-R component, acoustic amplitude image-B component, acoustic phase image-G component;
topography image-G component, acoustic amplitude image-R component, acoustic phase image-B component;
topography image-G component, acoustic amplitude image-B component, acoustic phase image-R component;
topography image-B component, acoustic amplitude image-G component, acoustic phase image-R component;
topography image-B component, acoustic amplitude image-R component, acoustic phase image-G component.
In the image fusion method for different components in step (2), image fusion is performed after the three component channels are set; the value of each output pixel corresponds to a specific color in the color table, and the color coordinate of each pixel of the color image satisfies the formula D = (R, G, B);
where D is the color coordinate of each pixel on the fused color image (corresponding to a specific color in the RGB color system), R is the gray value of each pixel of the red-component image, G is the gray value of the corresponding pixel of the green-component image, and B is the gray value of the corresponding pixel of the blue-component image.
This method for fusing images of different components is characterized in that the background color is adjusted after the color image is obtained in step (2), and the adjustment is completed by correcting the image histograms. The correction is performed on the histograms (gray distributions) of the three original images: each original image is gray-compressed so that its gray distribution is compressed from [0,255] to [130,255]. After the preliminary color fusion, the contrast between the target and the background is observed to be small; because the fused image is composed of three color channels, its histogram cannot be corrected directly. Therefore, the histogram correction is applied to the three grayscale images before fusion, and the RGB model is then used to fuse them into the color image with the maximum target-background contrast.
The additional information provided by the fused image is evaluated as follows:
The fused RGB color image is compared with the original AFM topography image and with the cell image under optical fluorescence. The fused image shows more information than the pre-fusion AFM topography image, and compared with the optically fluorescence-stained cell image, the fused image represents part of the internal structure of the cell more clearly.
Fig. 2 shows the intermediate image-processing step: the original index images are normalized so that pixel values lie between 0 and 255, converting the original images into grayscale images before fusion. After normalization, the original pseudo-color images become grayscale images, which makes the resulting fused image more convincing. Fig. 2a is the grayscale topography image after normalization, which clearly shows the external morphology of the cell; Fig. 2b is the grayscale acoustic amplitude image after normalization; Fig. 2c is the grayscale acoustic phase image after normalization; the two acoustic images reveal part of the internal structure of the cell.
Figs. 3a-c are the grayscale topography, acoustic amplitude, and acoustic phase images of an MDA-MB-231 breast cancer cell, which are then fused into a color map (shown here in grayscale). In Fig. 3d the gray distribution of the three component images is 0-255 with no histogram correction, so the background of the color map is cluttered and the contrast between the target and the background in the grayscale map is small. After histogram correction, the gray distribution of the image in Fig. 3e is 60-255 and that in Fig. 3f is 130-255; comparing Figs. 3d-f, the target information in the color map becomes clearer and more prominent after correction, the target-background contrast is larger, and the various kinds of target information are easier to identify. As the fused image of Fig. 3f shows, the fused color map superimposes the three different component images so that all the information appears on one image: it displays not only the information of the topography image but also that of the acoustic images, and it markedly highlights the external contour and part of the internal structure of the cell. In our example, the outer contour of the cell is much clearer and easier to identify than in the topography image alone, while the two highlighted dots in the grayscale image mark a nucleus-like region inside the cell.
FIG. 4 compares a fused image of the present invention with a bioluminescent staining image. Figs. 4a-c are confocal images of MDA-MB-231 breast cancer cells after fluorescent staining: Fig. 4a shows the cytoskeleton, Fig. 4b the cell nucleus, and Fig. 4c the contour and nucleus position of the intact cells after fusion of the fluorescently stained images. Figs. 4d-f are the grayscale topography, acoustic amplitude, and acoustic phase images of the MDA-MB-231 cell. Fig. 4h is the grayscale image after fusion of the bioluminescent staining images, and Fig. 4i is the grayscale version of the color image fused with the RGB model. Fig. 4h shows that under the bioluminescent staining system the nucleus of the MDA-MB-231 cell occupies about half of the cell; a similar nucleus-like region is visible in Fig. 4i, obtained by fusing the component cell images acquired by the high-frequency ultrasonic composite scanning probe microscope, and its proportion is substantially consistent with that in the bioluminescent staining system. Therefore, to some extent, the fused RGB color image can show part of the internal structure of the cell, with proportions substantially consistent with the bioluminescent staining result. A single component image cannot represent the external and internal structures of the cell simultaneously, whereas after fusion a result similar to bioluminescent staining is obtained and multiple kinds of structural information are represented on one image.
The above experimental results and comparisons show that the present invention acquires images of different components of cells with a high-frequency ultrasonic composite scanning probe microscope and, after normalization and RGB fusion, can represent all the structural information of the component images on one image, obtaining an image with richer information content in which clear cell outlines and nucleus-like internal structures can be distinguished.
It will be understood by those skilled in the art that the foregoing is only a preferred embodiment of the present invention, and is not intended to limit the invention, and that any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the scope of the present invention.

Claims (6)

1. A method for fusing images obtained by a high-frequency ultrasonic composite scanning probe microscope system is characterized by comprising the following steps:
(1) acquiring three component images of a morphology image, an acoustic amplitude image and an acoustic phase image to be fused at the same position by using a high-frequency ultrasonic composite scanning probe microscope; the same part is the same cell;
(2) normalizing three component image data of the topography image, the acoustic amplitude image and the acoustic phase image to be fused at the same position obtained in the step (1);
(3) fusing the different component images normalized in step (2) into one color image; first, the three normalized component images are assigned to one of the full permutations of the R, G, and B components, i.e., each is set as the red, green, or blue component; then the three images are fused into a color image whose pixel color coordinates satisfy the formula D = (R, G, B); D is the color coordinate of each pixel on the fused color image, R is the gray value of each pixel of the red-component image, G is the gray value of the corresponding pixel of the green-component image, and B is the gray value of the corresponding pixel of the blue-component image;
(4) adjusting the background of the fused color image obtained in step (3) to maximize the contrast between the target image and the background, wherein the target image is a cell image, to obtain the fused color image.
2. The method for fusing images obtained by a high-frequency ultrasonic composite scanning probe microscope system according to claim 1, wherein the formula for calculating the value of each individual pixel point in the image in the normalization processing in the step (2) is as follows:
j=(i/max)*256;
wherein i is the value of each pixel point in the image, max is the maximum value of the pixel points in the whole image, and j is the value of each single pixel point after normalization.
3. The method for fusing images obtained by a high-frequency ultrasonic composite scanning probe microscope system according to claim 2, wherein the topography image is set as the R component, the acoustic amplitude image as the G component, and the acoustic phase image as the B component.
4. The method for fusing images obtained by a high-frequency ultrasonic composite scanning probe microscope system according to claim 1, wherein the step (4) of adjusting the background of the fused color image is performed by correcting the histograms (gray distributions) of the three original images, i.e., gray-compressing each original image so that its gray distribution is compressed from [0,255] to [130,255].
5. An image fusion system of a high-frequency ultrasound composite scanning probe microscope system, comprising:
an image acquisition module: used for acquiring the three component images to be fused at the same position, namely a morphology image, an acoustic amplitude image and an acoustic phase image; the same position is the same cell;
an image normalization module: used for normalizing the data of the three component images to be fused at the same position, namely the morphology image, the acoustic amplitude image and the acoustic phase image;
an image fusion module: used for fusing the normalized component images into a color image; first, the three normalized component images are assigned to the R component, G component and B component according to one of the full permutations, that is, each normalized component image is set as one of the red, green and blue components; then the three images are fused into one color image, and the color coordinates of the pixel points of the fused color image satisfy the following formula: D = (R, G, B); where D is the color coordinate of each pixel point of the fused color image, R is the gray value of each pixel point of the red component image, G is the gray value of the corresponding pixel point of the green component image, and B is the gray value of the corresponding pixel point of the blue component image;
an image background adjustment module: used for adjusting the background of the fused color image so as to maximize the contrast between the target image and the background, the target image being a cell image; the fused color image can display all of the external morphology and internal structure present in the three component images.
6. The image fusion system of claim 5, wherein the background of the fused color image is adjusted by separately modifying the histograms, i.e. the gray-level distributions, of the three original images: each of the three original images is gray-level compressed, its gray distribution being compressed from [0, 255] to [130, 255].
CN201710852872.4A 2017-09-20 2017-09-20 Image fusion method and system Active CN107492086B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710852872.4A CN107492086B (en) 2017-09-20 2017-09-20 Image fusion method and system


Publications (2)

Publication Number Publication Date
CN107492086A CN107492086A (en) 2017-12-19
CN107492086B true CN107492086B (en) 2020-05-19

Family

ID=60653104

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710852872.4A Active CN107492086B (en) 2017-09-20 2017-09-20 Image fusion method and system

Country Status (1)

Country Link
CN (1) CN107492086B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108765399B (en) * 2018-05-23 2022-01-28 平安科技(深圳)有限公司 Lesion site recognition device, computer device, and readable storage medium
CN112907492A (en) * 2019-12-03 2021-06-04 顺丰科技有限公司 Object motion track generation method and generation system
CN114815207A (en) * 2022-05-24 2022-07-29 宾盛科技(武汉)有限公司 Image depth-of-field fusion method for microscopic imaging automatic focusing and related equipment
CN117058014B (en) * 2023-07-14 2024-03-29 北京透彻未来科技有限公司 LAB color space matching-based dyeing normalization system and method

Citations (2)

Publication number Priority date Publication date Assignee Title
CN105107067A (en) * 2015-07-16 2015-12-02 执鼎医疗科技江苏有限公司 Venipuncture system with infrared guidance and ultrasonic location
CN105286917A (en) * 2015-11-03 2016-02-03 华中科技大学 Three-dimensional ultrasonic real-time imaging method and system based on area-array probes

Family Cites Families (1)

Publication number Priority date Publication date Assignee Title
US8784318B1 (en) * 2005-07-22 2014-07-22 Zonare Medical Systems, Inc. Aberration correction using channel data in ultrasound imaging system


Also Published As

Publication number Publication date
CN107492086A (en) 2017-12-19

Similar Documents

Publication Publication Date Title
CN107492086B (en) Image fusion method and system
JP6791245B2 (en) Image processing device, image processing method and image processing program
CN104636726B (en) A kind of image color recognition methods, device and terminal
US9224193B2 (en) Focus stacking image processing apparatus, imaging system, and image processing system
CN106296635B (en) A kind of fluorescence in situ hybridization (FISH) image Parallel Processing and analysis method
JP5508792B2 (en) Image processing method, image processing apparatus, and image processing program
JP2020119588A (en) Information processing system, display control system, and program
KR20170139590A (en) Colony contrast collection
JP2014110797A (en) Method of chromogen separation-based image analysis
CN107545550B (en) Cell image color cast correction method
CN105447878A (en) Image quality test analysis method and system
CN110060229A (en) A kind of cell automatic positioning dividing method of myeloplast
KR102431217B1 (en) Image analysis apparatus, method and program
CN109341524A (en) A kind of optical fiber geometric parameter detection method based on machine vision
CN107525768A (en) A kind of method of quality control of DNA ploidy body analytical equipment
CN106558044A (en) The resolution measuring method of image module
CN109102510B (en) Breast cancer pathological tissue image segmentation method based on semi-supervised k-means algorithm
CN112184696B (en) Cell nucleus and organelle counting and area calculating method and system thereof
CN116883323A (en) Multiple immunofluorescence image analysis system based on computer vision
CN103927544A (en) Machine vision grading method for ginned cotton rolling quality
CN111007020B (en) Double-frame four-spectrum imaging method and application
CN117058014B (en) LAB color space matching-based dyeing normalization system and method
JP2018185695A (en) Information processing device, information processing method and program
Kammerer et al. A visualization tool for comparing paintings and their underdrawings
CN112001288B (en) Quick detection method for dark gray aircraft by single multispectral remote sensing image

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant